The Cloud Monitoring Console should provide a great start on analyzing your storage needs: https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Admin/MonitoringLicenseUsage#Monitor_the_Storage_Summary_dashboard

The key concept you must be familiar with is the Splunk bucket lifecycle. Buckets are the smallest unit of storage in Splunk, and the lifecycle greatly impacts how and when your buckets move from active searchable to archive. I wouldn't overcomplicate it with compression: while Splunk does compress data, your entitlements are on raw data ingested, so just closely analyze your daily ingest in your biggest indexes and poke around with the `dbinspect` command and the Monitoring Console to ensure your bucket health is good.

Data onboarding and data quality are key, so that bad timestamps don't pollute your buckets with events way in the past or future, because a bucket can only migrate to archive when ALL events in the bucket meet the time/size criteria. https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Admin/MonitoringHealth#Health_indicator_information_and_additional_resources:~:text=Search%20Manual.-,Bucket%20size%20and%20range,-An%20index%20typically https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Admin/MonitoringIndexing#Verify_data_quality

Also, even going back and reading the Splunk Enterprise docs on "smartstore" will help provide you with some good background, or work with your account team to go through it and ensure you have a good handle on it.
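For example, a quick bucket-health sketch with `dbinspect` (the index name `main` is just a placeholder; swap in your big indexes):

```spl
| dbinspect index=main
| eval span_days = round((endEpoch - startEpoch) / 86400, 1)
| stats count AS buckets, sum(sizeOnDiskMB) AS size_mb, max(span_days) AS widest_bucket_days BY state
```

An unusually large `widest_bucket_days` for a bucket state is a common sign of timestamping problems keeping buckets from rolling.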
1. Is there a way to directly install custom apps/add-ons (that are originally built for Splunk Enterprise) in Splunk Cloud? We were thinking about compatibility issues, and whether the apps would work the same way.

Yes, see "Installing private apps" on Splunk Cloud Platform: https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Admin/PrivateApps

2. Is there a way to gauge whether or not the quantity of data that we want to send from external sources would require us to install a Heavy/Universal Forwarder? (We are trying to avoid additional costs by taking Splunk Cloud, so we were wondering if we could do without them.)

As long as your cloud deployment is sized correctly around how much ingest and search you plan to do, you can absolutely use Cloud without the need for HFs or on-premises infra. It's always an option when needed. The main place to get familiar with is the Splunk Cloud Service Description. It lays out the service and any limits or recommendations we have. For example, https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Service/SplunkCloudservice#Experience_designations compares modular and scripted inputs across the experience designations:

Victoria: Modular and scripted inputs can now run directly on the search tier without the additional overhead of a separate IDM instance. Review the pull-based service limits: up to 500 GB/day for an entitlement of less than 166 SVC or 1 TB; up to 1.5 TB/day for more than 166 SVC or 1 TB.

Classic: Modular and scripted inputs must run on a separate IDM instance or customer-managed heavy forwarder.

In short, Victoria runs the inputs on the SH tier to allow self-service, while Classic runs the "HF"s for you as "IDM"s, with way less self-service. So it depends on what you value more.

The free cloud trial instances won't be what you want for actual testing etc. Have your Sales Engineer spin up a demo stack internally and you can play with them, or do a full-blown POC. How much ingest do you plan to do?
Check out the Splunk Cloud Migration Assessment app for help translating requirements. https://splunkbase.splunk.com/app/4974 Hope that helps! Feel free to join others on splunk cloud in the splunk_cloud room on community slack too! splk.it/slack
I have this vulnerability on all our instances on the latest version of splunkforwarder:

"The version of OpenSSL installed on the remote host is prior to 1.0.2zf. It is, therefore, affected by a vulnerability as referenced in the 1.0.2zf advisory, identified in CVE-2022-1292, the OpenSSL rehash command line tool. Fixed in OpenSSL 3.0.4 (affected 3.0.0-3.0.3). Fixed in OpenSSL 1.1.1p (affected 1.1.1-1.1.1o). Fixed in OpenSSL 1.0.2zf (affected 1.0.2-1.0.2ze). (CVE-2022-2068)"

Any recommendations here?
It's been such a long time since I last did this that I can't be 100% sure it was that way. But at least https://docs.splunk.com/Documentation/Splunk/latest/Indexer/ConfiguresearchheadwithCLI doesn't mention anything else you need to do for individual peers. Also, https://docs.splunk.com/Documentation/Splunk/latest/DMC/Addinstancesassearchpeers says: "Do not add clustered indexers. If you are monitoring an indexer cluster and you are hosting the monitoring console on an instance other than the cluster manager, you must add the cluster manager as a search peer and configure the monitoring console instance as a search-head in that cluster."
The way to go is with the OpenTelemetry Helm chart. I wrote a lil quickstart here: https://github.com/matthewmodestino/otel-quickstart/blob/main/kubernetes/0-quickstart-home.md#kubernetes-otel-quickstart-home

See the docs and validated architecture for more:
https://docs.splunk.com/Documentation/Splunk/9.1.2/Data/OtelCollectorKubernetes
https://docs.splunk.com/Documentation/SVA/current/Architectures/OTelKubernetes

If you run into issues, reach out to your SE (we have workshops), or jump into the community Slack channel, splk.it/slack! Holler at me in the kubernetes or opentelemetry channels (mattymo).
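As a rough sketch of the chart values involved (the cluster name, HEC endpoint, token, and index below are all placeholders; the quickstart and validated architecture above cover the full set), a minimal values file for sending Kubernetes logs to Splunk might look like:

```yaml
# values.yaml for the splunk-otel-collector Helm chart
# (endpoint, token, index, and cluster name are placeholders)
clusterName: my-cluster
splunkPlatform:
  endpoint: https://http-inputs.example.splunkcloud.com:443/services/collector
  token: "00000000-0000-0000-0000-000000000000"
  index: main
```

Then install with something like `helm install my-collector splunk-otel-collector-chart/splunk-otel-collector -f values.yaml` after adding the chart repo.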
That's interesting, because I added all the search heads to the new MC, plus the current cluster master, and I don't see the indexers listed in distributed mode. I guess they may appear after I've completed the setup of distributed mode, but I need to make the new instance a search head first according to the documentation, so I'll start there.
Hi @splunksumman -
I’m a Community Moderator in the Splunk Community. This question was posted 1 year ago, so it might not get the attention you need for your question to be answered. We recommend that you post a new question so that your issue can get the visibility it deserves. To increase your chances of getting help from the community, follow these guidelines in the Splunk Answers User Manual when creating your post.
Thank you!
Hi all, we are moving to Splunk Cloud and want to keep the LDAP search capability in the cloud as well. Today we have the app installed on a search head, with working commands. I know how to forward the data to Splunk Cloud from a HF, but what about the ldap commands, like ldapgroup etc.? Do we need to install the app in Cloud as well to get the commands to work? //Jan
I have an enterprise network and we have a Splunk Enterprise license. Question: while troubleshooting a sourcetype or host, the dashboard needs to show the past history of a particular user or source. By past history I mean how many alerts the same user has triggered, with those alert details, and clicking the link must show the past troubleshooting history.
Hi Nasser I am taking the same course I tried multiple queries nothing worked can you help me
source="3--المصدر-الداعم-الثالثسجل-الملفات.csv" host="Ghaidas-MBP" index="main" sourcetype="stc_logs" action="blocked"
I also used this query to count by action:
source="3--المصدر-الداعم-الثالثسجل-الملفات.csv" host="Ghaidas-MBP" index="main" sourcetype="stc_logs" | stats count by action
but neither query has yielded any results.
Yes, you said that you want to use it to modify a report, but you didn't define how it should modify it. Basically, add normal SPL after your report and use the token with $data$ however you want to use it.
Hi, you don't say how you want to use that token. Without that information we cannot tell you directly! You can find out how to use tokens in dashboards from https://docs.splunk.com/Documentation/Splunk/latest/Viz/tokens. I'm not sure whether you are trying to use the token in that report (checkpoint1) or not? Unfortunately, you cannot use tokens in places other than this dashboard or other linked outputs/dashboards. One app which you should install and use when you are developing dashboards with tokens is https://splunkbase.splunk.com/app/1603. With it you can automatically see what tokens you have defined and what values they have. r. Ismo
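For instance, a sketch of applying a time input token named `data` to an inline search in Simple XML (the index, sourcetype, and fields here are made up; a report referenced with `ref` uses its own stored time range, so to drive it by the token you'd inline its search instead):

```xml
<row>
  <panel>
    <table>
      <search>
        <query>index=main sourcetype=firewall action=blocked | stats count by src_ip</query>
        <!-- earliest/latest sub-tokens set by <input type="time" token="data"> -->
        <earliest>$data.earliest$</earliest>
        <latest>$data.latest$</latest>
      </search>
    </table>
  </panel>
</row>
```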
Any ideas how I can use foreach to "collect" all changes (using mvappend)? My current attempt works only if I restrict foreach to one specific field (e.g. "a"), and even then it shows just one change per id: | foreach a [ eval changed = if ( previous_<> != <> , mvappend(changed, "<>") , 0) ] | search changed!=0 | stats values(changed) values(id) by _time
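Not knowing the real field list, here is one sketch (the fields `a b c` are placeholders). Two things to note: `foreach` substitutes each field name wherever the `<<FIELD>>` token appears, and the else branch must return `changed` unchanged rather than 0, otherwise every non-matching field resets the accumulator:

```spl
| foreach a b c
    [ eval changed = if('previous_<<FIELD>>' != '<<FIELD>>', mvappend(changed, "<<FIELD>>"), changed) ]
| where isnotnull(changed)
| stats values(changed) AS changed_fields, values(id) AS ids BY _time
```

mvappend ignores null arguments, so the first appended field name simply starts the multivalue list.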
"I need help with this XML for a dashboard; essentially, I need to call a token that modifies data within a report, having already created the token with the name 'data'. How can I do this?"

<form version="1.1">
  <label>Lista IP da bloccare</label>
  <fieldset submitButton="true" autoRun="false">
    <input type="time" token="data">
      <label></label>
      <default>
        <earliest>rt-24h</earliest>
        <latest>rt</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search ref="checkpoint1"></search>
        <option name="drilldown">none</option>
      </table>
    </panel>
  </row>
</form>
Hey, does anyone know what the updated code for the v2 endpoint is? I am trying to deploy apps with this config to Splunk Cloud, but the vetting fails because the new version of Splunk doesn't support the old endpoint. I tried adapting the line of code for the button using the documentation, but it doesn't download anything and redirects to a 404 page.
1. Please post your data/code samples in a pre-formatted way (using either the "preformatted" style or the code sample control in the editor). It makes the sample easier to read.
2. It's not clear what you want to get from this data.
3. Unless you have a very good reason and a strong use case, you should not parse data _into fields_ while indexing (in other words, create indexed fields). Most parsing in Splunk is done at search time.
4. Unless you have a very, very good reason (an even better one than for the indexed fields), you should not use SHOULD_LINEMERGE=true. It gives you a huge performance hit.
We have exactly the same problem here. Tested today on Windows 2016/2019, UFW update from 9.1.1 to 9.1.3. But a new installation is out of the question for us, as you would lose all checkpoints and a re-read of everything would be the result.
If I understand you correctly, you want the settings you defined on your DS to propagate to forwarders across your environment (or at least to some designated UF(s)). You did the first step correctly: you created an app in etc/deployment-apps (I hope the "deploymentapps" in your description is just a typo). But now you have to define a server class tying the app(s) to deployment client(s), and reload the deployment server. See https://docs.splunk.com/Documentation/Splunk/latest/Updating/Aboutdeploymentserver (read thoroughly the pages about creating server classes and deploying apps).
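As a rough sketch (the server class name, app name, and hostname pattern below are made-up examples), a minimal serverclass.conf on the deployment server could look like:

```
# $SPLUNK_HOME/etc/system/local/serverclass.conf on the deployment server
[serverClass:all_uf]
# match deployment clients by hostname pattern
whitelist.0 = uf-*

[serverClass:all_uf:app:my_uf_settings]
# my_uf_settings is the app directory name under etc/deployment-apps
restartSplunkd = true
```

Then reload with `splunk reload deploy-server`, or do the whole thing through Forwarder Management in the UI.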
Answers to most of those questions can be found in the Splunk documentation. Some of them require a bit of experience with the system. Knowing the docs plus experience adds up to the product knowledge and abilities required to work with the product. Unless it's a very entry-level position where you're expected to learn everything from scratch (but for such a role you shouldn't get such questions in an interview), you should know this before trying to administer Splunk environments. Otherwise you can do some very costly damage to your (potential) employer's installation.