All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello Experts, I currently have a CSV file that contains fields such as ID, IP, OS, _time, status, etc. I need to create a metric index. Do I need to change the field names in the CSV file to align with Splunk's expectations, or can I import the data as is? I'd appreciate any guidance or examples on how to achieve this. Thanks in advance.
Hi, I have a request to restore 40 weeks of logs from Dynamic Data Archive storage for one of the indexes on Splunk Cloud. May I know the process and best practices, if any?
Thanks @gcusello  The inclusive condition also worked...
SSL is enabled and cannot be changed when using the Splunk Cloud free trial. Where can I find/download the certificate?
index="starshield" source="http-requests" "firewallSource" IN ("WAF","RATE_LIMIT") "botscore"<10
| stats count values(client.ip) as ip, values(firewallSource) by client.ip, clientRequest.httpHost
How can I write a Splunk query to find which IPs have made requests continuously for more than 3 days? Also, since the firewallSource field has multiple values, how can we find which IPs have both WAF and RATE_LIMIT in their requests during a certain time period?
Hi @gcusello  This time it runs without error, but no results found.

index="XXXX" "Genesys system is available"
| rename "response_details.response_payload.entities{}.onlineStatus" as status
| where name="YYYY"
| stats count(eval(status="offline")) AS offline_count count(eval(status="online")) AS online_count earliest(eval(if(status="offline",_time,""))) AS offline earliest(eval(if(status="online",_time,""))) AS online
| fillnull value=0 offline_count
| fillnull value=0 online_count
| eval condition=case(
    offline_count=0 AND online_count>0, "Online",
    offline_count>0 AND online_count=0, "Offline",
    offline_count>0 AND online_count>0 AND online>offline, "Offline but newly online",
    offline_count>0 AND online_count>0 AND online>offline, "Offline",
    offline_count=0 AND online_count=0, "No data")
| search condition="Offline" OR condition="Offline but newly online"
| table condition
I think this SPL tacked on to the end of your search will work, assuming the versioning follows the Semantic Versioning convention.

| stats dc(host) as dc_hosts by Version
| eval major_version=mvindex(split(Version, "."), 0),
    minor_version=mvindex(split(Version, "."), 1),
    patch_version=mvindex(split(Version, "."), 2),
    minor_patch_version=mvindex(split(Version, "."), 3)
| sort 0 -major_version, -minor_version, -patch_version, -minor_patch_version
| fields - *_version
| eventstats first(Version) as latest_version
| where NOT 'Version'=='latest_version'
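The same sort-newest-first logic can be sketched outside SPL in Python. This is a hypothetical helper, not the poster's search, and it assumes purely numeric major.minor.patch version strings:

```python
# Sort semver-style strings numerically (not lexically) and keep
# the hosts that are behind the latest version seen.
def version_key(version):
    # "9.1.10" -> (9, 1, 10), so "9.1.10" sorts after "9.1.2"
    return tuple(int(part) for part in version.split("."))

def hosts_behind_latest(host_versions):
    """host_versions: dict mapping host -> version string."""
    latest = max(host_versions.values(), key=version_key)
    return {host: ver for host, ver in host_versions.items()
            if version_key(ver) != version_key(latest)}

print(hosts_behind_latest({"hostA": "9.1.2", "hostB": "9.1.10", "hostC": "9.1.2"}))
```

Comparing tuples of integers is what makes "9.1.10" rank above "9.1.2", which a plain string sort would get wrong.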
I'm currently working on crafting a Splunk Query to identify systems that have been inactive for a specified duration (which can vary based on user requirements). My intention is to utilize "Windows event logs" as the data source, focusing on EventCode=4624. Primarily, I'll be manipulating the default field "_time" as there isn't another relevant field available. I'd appreciate any guidance or suggestions you might have in this regard.
@yuanliu  Using addcoltotals:
1) How do I round the score 1129.36 to 1129?
2) How do I remove No 11, but keep No 1-10?
See below. Thank you so much for your help.
Upon receiving the SSL certificate, I reviewed its contents and separated the relevant stanzas into distinct .pem and .key files. I then copied these files to both instances in our distributed environment: the Search Head and the Heavy Forwarder. If you have any questions or suggestions regarding this approach, feel free to share your insights.
Assuming the data you shared is _raw, I think this SPL will do what you are looking for.

| makeresults
| fields - _time
| eval _raw="INFO 2023-12-11 17:06:01,726 [[Runtime].Pay for NEW_API : [ { \"API_NAME\": \"wurfbdjd\", \"DEP_DATE\": \"2023-12-08T00:00:00\" }, { \"API_NAME\": \"mcbhsa\", \"DEP_DATE\": \"2023-12-02T00:00:00\" }, { \"API_NAME\": \"owbaha\", \"DEP_DATE\": \"2023-12-02T00:00:00\" }, { \"API_NAME\": \"pdjna7aha\", \"DEP_DATE\": \"2023-11-20T00:00:00\" } ]"
``` Extract the entire array of JSON objects ```
| rex max_match=0 "NEW\_API\s+:\s+(?<json_array>\[(?:(?:.*)\n?)+\])"
``` Parse out each individual JSON object from the array as a multivalue field ```
| eval json_objects=spath(json_array, "{}")
| fields - json_array, _raw
``` mvexpand the multivalue JSON ```
| mvexpand json_objects
``` Extract all fields from the JSON blobs ```
| spath input=json_objects
| fields - json_objects
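The same extract-then-expand idea can be sketched outside Splunk in Python; the log line here mirrors the sample above and is illustrative only:

```python
import json
import re

log_line = ('INFO 2023-12-11 17:06:01,726 [[Runtime].Pay for NEW_API : '
            '[ { "API_NAME": "wurfbdjd", "DEP_DATE": "2023-12-08T00:00:00" }, '
            '{ "API_NAME": "mcbhsa", "DEP_DATE": "2023-12-02T00:00:00" } ]')

# Grab the JSON array that follows "NEW_API :", mirroring the rex step
match = re.search(r"NEW_API\s*:\s*(\[.*\])", log_line, re.DOTALL)
records = json.loads(match.group(1)) if match else []

# Each object becomes its own "row", like mvexpand + spath
for record in records:
    print(record["API_NAME"], record["DEP_DATE"])
```

As in the SPL, the trick is to isolate the array first, then let a real JSON parser split it into objects rather than regexing individual fields.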
Not sure if I am interpreting your question correctly, but I gave it a shot. So given that there are many different fieldnames with dot notation, you are trying to get a final table of something like this? I was able to achieve this by utilizing a foreach loop.

| makeresults
| eval "tmp.exe"="value1"
| append [ | makeresults | eval "noop.spl"="value2" ]
| append [ | makeresults | eval "tmp.spl"="value3" ]
| append [ | makeresults | eval "foo.exe"="value4" ]
| append [ | makeresults | eval "tmp.tgz"="value5" ]
| append [ | makeresults | eval "foo.tgz"="value6", "tmp.exe"="value7" ]
``` Gather unique fieldnames as values of a new field ```
| foreach *.* [ | eval existing_fieldname=if(isnotnull('<<FIELD>>'), mvappend('existing_fieldname', "<<FIELD>>"), 'existing_fieldname') ]
``` Parse out prefix and suffix of the new field ```
| eval prefix=case(
    isnull(existing_fieldname), null(),
    mvcount(existing_fieldname)==1, mvindex(split(existing_fieldname, "."), 0),
    mvcount(existing_fieldname)>1, mvmap(existing_fieldname, mvindex(split(existing_fieldname, "."), 0))),
  suffix=case(
    isnull(existing_fieldname), null(),
    mvcount(existing_fieldname)==1, mvindex(split(existing_fieldname, "."), 1),
    mvcount(existing_fieldname)>1, mvmap(existing_fieldname, mvindex(split(existing_fieldname, "."), 1)))
``` Use the chart function to display unique combos of prefix/suffix from inherited fieldnames ```
| chart limit=50 count as count over prefix by suffix
``` Replace numbers in the table with "X" to signify that the prefix/suffix combo was found in the data ```
| foreach * [ | eval <<FIELD>>=if(NOT "<<FIELD>>"=="prefix", if('<<FIELD>>'>0, "X", null()), '<<FIELD>>') ]
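The prefix-by-suffix cross-tab itself is easy to sketch in plain Python; the fieldnames below are copied from the makeresults sample, and the shape of the table is the point, not the exact SPL output:

```python
from collections import defaultdict

# Dotted fieldnames as they appear in the sample events above
fieldnames = ["tmp.exe", "noop.spl", "tmp.spl", "foo.exe",
              "tmp.tgz", "foo.tgz", "tmp.exe"]

# Build a prefix-by-suffix table; "X" marks a combo seen in the data
table = defaultdict(dict)
for name in fieldnames:
    prefix, _, suffix = name.partition(".")
    table[prefix][suffix] = "X"

for prefix in sorted(table):
    print(prefix, table[prefix])
```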
@shocko  I love the question. The answers are complicated. I'll respond below for Simple XML Dashboards. For Dashboard Studio, please submit a different and detailed question to the Splunk Community.
1) Themes out of the box are light and dark. You can look through the docs on how to create your own, if you wish: https://dev.splunk.com/enterprise/docs/developapps/createapps/buildapps/adduithemes/
2) If you search your Splunk filesystem for bootstrap-dark.css, you'll find the file that provides the dark theme.
2.5) If you don't have access to the filesystem, you can use your browser dev tools to get the URL to bootstrap-dark.css.
2.7) Be warned: it's a BIG file. Formatted pretty, it comes to >7300 lines.
3) You're better off, IMHO, changing the font with CSS overrides. To read more:
https://docs.splunk.com/Documentation/Splunk/latest/AdvancedDev/UseCSS
https://docs.splunk.com/Documentation/Splunk/9.1.2/Viz/PanelreferenceforSimplifiedXML
Finally, note in that last example that you can include CSS style inside an HTML panel within a dashboard. Searching this larger Splunk Answers Community for "html css style dashboard panel" should yield plenty of examples. Here's a great one from my friend Niket to get you started with nearly ready copy/paste code: https://community.splunk.com/t5/Dashboards-Visualizations/How-to-add-style-to-panel-titles-in-my-dashboard/m-p/405736
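As a minimal sketch of that last idea, a Simple XML dashboard can carry a style block inside an HTML panel; the selector and values below are assumptions to adapt with your browser's dev tools, not a confirmed recipe:

```xml
<dashboard>
  <label>Styled panel example</label>
  <row>
    <panel>
      <html>
        <style>
          /* Illustrative override only: verify the selector against
             your Splunk version with browser dev tools */
          .dashboard-panel h2.panel-title {
            font-family: "Courier New", monospace;
            color: #5c33ff;
          }
        </style>
      </html>
    </panel>
  </row>
</dashboard>
```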
If I am understanding your question correctly: I usually parse out an array of JSON objects as a multivalue field first, and then use mvexpand against that MV field. After this you can spath each JSON object individually so its contents end up on its own row. This also prevents the situation where some JSON objects have keys with null values and don't align properly in the final output. Here is an example:

| makeresults
| eval event_id=sha256(tostring(random())), json_object="[{\"field1\": \"value_a\", \"field2\": \"value_b\", \"field3\": \"value_c\"},{\"field1\": \"value_x\", \"field2\": \"value_y\", \"field3\": \"value_z\"},{\"field1\": \"value_q\", \"field2\": \"value_r\", \"field3\": \"value_s\"},{\"field1\": \"value_a\", \"field2\": \"value_r\", \"field3\": \"value_c\", \"field4\": \"value_w\"},{\"field2\": \"value_a\", \"field3\": \"value_b\", \"field4\": \"value_s\"}]"
| eval mv_json_object=spath(json_object, "{}")
| fields - json_object
| mvexpand mv_json_object
| spath input=mv_json_object
| fields - mv_json_object
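A quick Python sketch of why per-object expansion keeps columns aligned when some objects are missing keys; the data is illustrative, not the poster's:

```python
import json

# Two objects with different key sets, as in the example above
raw = ('[{"field1": "value_a", "field2": "value_b"},'
       ' {"field2": "value_y", "field3": "value_z"}]')
rows = json.loads(raw)

# Expanding each object onto its own row keeps missing keys empty
# instead of shifting values into the wrong column
columns = ["field1", "field2", "field3"]
expanded = [[row.get(col, "") for col in columns] for row in rows]
for row in expanded:
    print(row)
```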
Hello, I am trying to determine why my table in Dashboard Studio is showing "No Data" while it shows in Dashboard Classic. I referenced the report in the code and I am using a token for data input. Whenever I open it in search, it pulls up all the data I need, but it just does not show in the dashboard.

{
    "type": "ds.savedSearch",
    "options": {
        "ref": "E.1_Malicious_Emails_Inbound"
    }
}

I also checked the app permissions: the report and dashboard are in the same app and readable by each other. Just No Data. Has anyone run into an issue like this?
Hello, I'm trying to install Splunk ITSI 4.17.1 in a Search Head Cluster with Splunk Enterprise 9.1.2. I already extracted the .spl into the directory $SPLUNK_HOME/etc/shcluster/apps, but when I execute the command splunk apply shcluster-bundle, it reports that everything deployed correctly. However, when I go to the Search Heads, none of the ITSI apps are deployed. I just made a test deploying another simple app for testing purposes, and it worked. Do you have any idea?
If you plan on using a deployment server to update your TA or apps, that would be the easiest route. There's a lot to cover on the deployment server if you haven't used it before; give the link below a read if you can:
https://docs.splunk.com/Documentation/Splunk/9.1.2/Updating/Deploymentserverarchitecture
Splunk also covers the deployment server in this training: Splunk Enterprise System Administration
https://www.splunk.com/en_us/pdfs/training/splunk-enterprise-system-administration-course-description.pdf
https://www.splunk.com/en_us/training/course-catalog.html?filters=filterGroup2SplunkEnterpriseCertifiedAdmin
The gist of a deployment server: your non-clustered Splunk instances check into your deployment server (DS) to retrieve any apps you want to deploy. The TAs/apps all live on your DS (etc/deployment-apps), and you manage which apps your Splunk instances get with the DS serverclass.conf.
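As a minimal sketch, a serverclass.conf on the DS might look like this; the class name, hostname pattern, and app name here are made-up examples, not anything from your environment:

```ini
# Match deployment clients whose hostnames start with "linuxfwd"
[serverClass:my_linux_hosts]
whitelist.0 = linuxfwd*

# Deploy this app (from etc/deployment-apps) to matching clients
[serverClass:my_linux_hosts:app:Splunk_TA_nix]
restartSplunkd = true
stateOnClient = enabled
```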
@brdr @nashnexagate Hello, did any of you find a solution for this? I have the same problem
That's not what I was expecting.  I expected a stats values command that was globbing field values together.  Can you share a sample event?  How many events are in the sample output?