All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


How can Splunk query which IPs have been requested continuously for more than 3 days? Also, the firewallSource field has multiple values; how can we find which IPs have both WAF and ATE in their requests during a given time period?
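One possible starting point, assuming a src_ip field and that firewallSource is already extracted as a multivalue field (the index and field names are guesses): count distinct active days per IP over daily bins, and check the collected sources for both values.

```
index=firewall earliest=-7d
| bin _time span=1d
| stats dc(_time) AS active_days values(firewallSource) AS sources by src_ip
| where active_days >= 3
    AND isnotnull(mvfind(sources, "WAF"))
    AND isnotnull(mvfind(sources, "ATE"))
```

mvfind returns the index of the first matching value or NULL, so the isnotnull checks require that both WAF and ATE appear among that IP's sources.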
Hi @gcusello. This time it runs without error, but no results are found.

index="XXXX" "Genesys system is available"
| rename "response_details.response_payload.entities{}.onlineStatus" as status
| where name="YYYY"
| stats count(eval(status="offline")) AS offline_count
        count(eval(status="online")) AS online_count
        earliest(eval(if(status="offline",_time,null()))) AS offline
        earliest(eval(if(status="online",_time,null()))) AS online
| fillnull value=0 offline_count online_count
| eval condition=case(
    offline_count=0 AND online_count>0, "Online",
    offline_count>0 AND online_count=0, "Offline",
    offline_count>0 AND online_count>0 AND online>offline, "Offline but newly online",
    offline_count>0 AND online_count>0 AND online<offline, "Offline",
    offline_count=0 AND online_count=0, "No data")
| search condition="Offline" OR condition="Offline but newly online"
| table condition
I think this SPL tacked on to the end of your search will work, assuming the versioning follows the Semantic Versioning convention.

| stats dc(host) as dc_hosts by Version
| eval major_version=mvindex(split(Version, "."), 0),
       minor_version=mvindex(split(Version, "."), 1),
       patch_version=mvindex(split(Version, "."), 2),
       minor_patch_version=mvindex(split(Version, "."), 3)
| sort 0 -major_version, -minor_version, -patch_version, -minor_patch_version
| fields - *_version
| eventstats first(Version) as latest_version
| where NOT 'Version'=='latest_version'
I'm currently working on crafting a Splunk Query to identify systems that have been inactive for a specified duration (which can vary based on user requirements). My intention is to utilize "Windows event logs" as the data source, focusing on EventCode=4624. Primarily, I'll be manipulating the default field "_time" as there isn't another relevant field available. I'd appreciate any guidance or suggestions you might have in this regard.
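A minimal sketch of that kind of search, under assumptions (the index name and the 7-day threshold are placeholders you would parameterize): take the latest successful logon per host, then keep only hosts whose last logon is older than the cutoff.

```
index=wineventlog EventCode=4624
| stats latest(_time) AS last_logon by host
| eval days_inactive=round((now() - last_logon) / 86400, 1)
| where days_inactive > 7
| convert ctime(last_logon)
| sort - days_inactive
```

Because this relies on _time of the most recent logon event, make sure the search time range extends back further than the inactivity threshold, or genuinely inactive hosts will drop out entirely instead of appearing with a large days_inactive.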
@yuanliu Using addcoltotals: 1) How do I round the score 1129.36 to 1129? 2) How do I remove row No 11 but keep Nos 1-10? See below. Thank you so much for your help.
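Without seeing the exact search, a hedged sketch (assuming the column is named score): round() trims the decimal, and since addcoltotals appends its totals row last, limiting to the first 10 rows beforehand keeps Nos 1-10.

```
| head 10
| addcoltotals labelfield=No label="Total" score
| eval score=round(score, 0)
```

If instead you want no totals row at all, drop the addcoltotals line and keep just head 10 and the round().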
After receiving the SSL certificate, I reviewed its contents and separated the relevant stanzas into distinct .pem and .key files, which keeps the certificate handling clear and precise. I then copied these files to both instances, the Search Head and the Heavy Forwarder, so that SSL certificate management stays consistent across our distributed environment. If you have any questions or suggestions regarding this approach, feel free to share your insights.
Assuming the data you shared is _raw, I think this SPL will do what you are looking for.

| makeresults
| fields - _time
| eval _raw="INFO 2023-12-11 17:06:01,726 [[Runtime].Pay for NEW_API : [ { \"API_NAME\": \"wurfbdjd\", \"DEP_DATE\": \"2023-12-08T00:00:00\" }, { \"API_NAME\": \"mcbhsa\", \"DEP_DATE\": \"2023-12-02T00:00:00\" }, { \"API_NAME\": \"owbaha\", \"DEP_DATE\": \"2023-12-02T00:00:00\" }, { \"API_NAME\": \"pdjna7aha\", \"DEP_DATE\": \"2023-11-20T00:00:00\" } ]"
``` Extract the entire array of JSON objects ```
| rex max_match=0 "NEW\_API\s+:\s+(?<json_array>\[(?:(?:.*)\n?)+\])"
``` Parse out each individual JSON object from the array as a multivalue field ```
| eval json_objects=spath(json_array, "{}")
| fields - json_array, _raw
``` mvexpand the multivalue JSON ```
| mvexpand json_objects
``` Extract all fields from the JSON blobs ```
| spath input=json_objects
| fields - json_objects
Not sure if I am interpreting your question correctly, but I gave it a shot. So, given that there are many different fieldnames with dot notation, you are trying to get a final table something like this? I was able to achieve it with a foreach loop:

| makeresults
| eval "tmp.exe"="value1"
| append [| makeresults | eval "noop.spl"="value2"]
| append [| makeresults | eval "tmp.spl"="value3"]
| append [| makeresults | eval "foo.exe"="value4"]
| append [| makeresults | eval "tmp.tgz"="value5"]
| append [| makeresults | eval "foo.tgz"="value6", "tmp.exe"="value7"]
``` Gather unique fieldnames as values of a new field ```
| foreach *.*
    [| eval existing_fieldname=if(isnotnull('<<FIELD>>'), mvappend('existing_fieldname', "<<FIELD>>"), 'existing_fieldname')]
``` Parse out the prefix and suffix of the new field ```
| eval prefix=case(
    isnull(existing_fieldname), null(),
    mvcount(existing_fieldname)==1, mvindex(split(existing_fieldname, "."), 0),
    mvcount(existing_fieldname)>1, mvmap(existing_fieldname, mvindex(split(existing_fieldname, "."), 0))),
  suffix=case(
    isnull(existing_fieldname), null(),
    mvcount(existing_fieldname)==1, mvindex(split(existing_fieldname, "."), 1),
    mvcount(existing_fieldname)>1, mvmap(existing_fieldname, mvindex(split(existing_fieldname, "."), 1)))
``` Use the chart command to display unique prefix/suffix combos from the inherited fieldnames ```
| chart limit=50 count as count over prefix by suffix
``` Replace numbers in the table with "X" to signify that the prefix/suffix combo was found in the data ```
| foreach *
    [| eval <<FIELD>>=if(NOT "<<FIELD>>"=="prefix", if('<<FIELD>>'>0, "X", null()), '<<FIELD>>')]
@shocko I love the question. The answers are complicated. I'll respond below for Simple XML dashboards. For Dashboard Studio, please submit a separate, detailed question to the Splunk Community.

1) Themes out of the box are light and dark. You can look through the docs on how to create your own, if you wish: https://dev.splunk.com/enterprise/docs/developapps/createapps/buildapps/adduithemes/
2) If you search your Splunk filesystem for bootstrap-dark.css, you'll find the file that provides the dark theme.
2.5) If you don't have access to the filesystem, you can use your browser dev tools to get the URL to bootstrap-dark.css.
2.7) Be warned: it's a BIG file. Formatted pretty, it comes to more than 7,300 lines.
3) You're better off, IMHO, changing the font with CSS overrides. To read more:
https://docs.splunk.com/Documentation/Splunk/latest/AdvancedDev/UseCSS
https://docs.splunk.com/Documentation/Splunk/9.1.2/Viz/PanelreferenceforSimplifiedXML

Finally, note in that last example that you can include CSS style inside an HTML panel within a dashboard. Searching this larger Splunk Answers Community for "html css style dashboard panel" should yield plenty of examples. Here's a great one from my friend Niket to get you started, with nearly ready copy/paste code: https://community.splunk.com/t5/Dashboards-Visualizations/How-to-add-style-to-panel-titles-in-my-dashboard/m-p/405736
If I am understanding your question correctly: I usually parse out an array of JSON objects as a multivalue field first, and then use mvexpand against that MV field. After this you can spath each JSON object individually so its contents end up on their own row. This also prevents situations where some JSON objects have keys with null values that would otherwise fail to align properly in the final output. Here is an example:

| makeresults
| eval event_id=sha256(tostring(random())),
       json_object="[{\"field1\": \"value_a\", \"field2\": \"value_b\", \"field3\": \"value_c\"},{\"field1\": \"value_x\", \"field2\": \"value_y\", \"field3\": \"value_z\"},{\"field1\": \"value_q\", \"field2\": \"value_r\", \"field3\": \"value_s\"},{\"field1\": \"value_a\", \"field2\": \"value_r\", \"field3\": \"value_c\", \"field4\": \"value_w\"},{\"field2\": \"value_a\", \"field3\": \"value_b\", \"field4\": \"value_s\"}]"
| eval mv_json_object=spath(json_object, "{}")
| fields - json_object
| mvexpand mv_json_object
| spath input=mv_json_object
| fields - mv_json_object
Hello, I am trying to determine why my table in Dashboard Studio shows "No Data" when it shows fine in a Classic dashboard. I referenced the report in the code and I am using a token for data input. Whenever I open it in search, it pulls up all the data I need, but it just does not show in the dashboard.

{
    "type": "ds.savedSearch",
    "options": {
        "ref": "E.1_Malicious_Emails_Inbound"
    }
}

I also checked the app permissions; the report and dashboard are in the same app and readable by each other. Just "No Data". Has anyone run into an issue like this?
Hello, I'm trying to install Splunk ITSI 4.17.1 in a Search Head Cluster with Splunk Enterprise 9.1.2. I already extracted the .spl into the directory $SPLUNK_HOME$/etc/shcluster/apps, but when I execute the command splunk apply shcluster-bundle, it reports that everything deployed correctly; yet when I go to the Search Heads, none of the ITSI apps are deployed. I just deployed another simple app for testing purposes, and that worked. Do you have any idea?
If you plan on using a deployment server to update your TAs or apps, that would be the easiest route. There's a lot to cover on the deployment server if you haven't used it before; give the link below a read if you can:
https://docs.splunk.com/Documentation/Splunk/9.1.2/Updating/Deploymentserverarchitecture

Splunk also covers the deployment server in this training: Splunk Enterprise System Administration
https://www.splunk.com/en_us/pdfs/training/splunk-enterprise-system-administration-course-description.pdf
https://www.splunk.com/en_us/training/course-catalog.html?filters=filterGroup2SplunkEnterpriseCertifiedAdmin

The gist of a deployment server: your non-clustered Splunk instances check into your deployment server (DS) to retrieve any apps you want to deploy. The TAs/apps all live on your DS (etc/deployment-apps), and you manage which apps each Splunk instance gets with the DS serverclass.conf.
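As an illustration of that serverclass.conf mapping (the class name, hostname pattern, and app name here are made up), a minimal stanza set on the DS might look like this:

```
# Which clients belong to this server class
[serverClass:linux_forwarders]
whitelist.0 = linux-host-*

# Which app (from etc/deployment-apps) those clients receive
[serverClass:linux_forwarders:app:Splunk_TA_nix]
stateOnClient = enabled
restartSplunkd = true
```

restartSplunkd = true restarts the client after the app lands, which most TAs need for new inputs to take effect.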
@brdr @nashnexagate Hello, did either of you find a solution for this? I have the same problem.
That's not what I was expecting.  I expected a stats values command that was globbing field values together.  Can you share a sample event?  How many events are in the sample output?
Here is a snippet of the URL I am sending, and the time format it needs to be in:

startTime=2023-12-01T16%3A27%3A45.000Z&endTime=2023-12-01T16%3A32%3A45.000Z

However, when I try to send "latesttime" or "earliesttime", Splunk sends them in epoch format. How do I get the time into the proper format for the URL within the workflow action? Thanks!
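One possible workaround sketch (the field names here are made up): build pre-formatted time fields in the search itself with strftime, then reference those fields in the workflow action URI instead of the raw epoch values.

```
| eval ws_start=strftime(earliest_time_epoch, "%Y-%m-%dT%H:%M:%S.%3NZ"),
       ws_end=strftime(latest_time_epoch, "%Y-%m-%dT%H:%M:%S.%3NZ")
```

The workflow action URI would then use $ws_start$ and $ws_end$; verify how your workflow action handles percent-encoding of the colons (%3A), since that may depend on how the URI field substitution is configured.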
Yes it was installed.
Yes, I believe it can be done. The first search should be followed by a <done> element within which a token is set to one of the result fields of the search. That token is then referenced in the second search.

<search>
  <query>blahblahblah</query>
  <done>
    <set token="foo">$result.foo$</set>
  </done>
</search>

If the second search expects the contents of the token to be in a particular format, then the query should generate that format, or you may be able to use an <eval> element within <done> to produce the desired structure.
Hello, I'm trying to find information on how to use Splunk with Visual Studio Code. I have an authentication token on my development instance. I've installed the Visual Studio Code Extension for Splunk from GitHub. I'm lost from here on. What do I enter in the url and webRoot fields in the launch.json file?

"configurations": [
    {
        "type": "chrome",
        "request": "launch",
        "name": "Launch Chrome against localhost",
        "url": "https://<host name>:8080",
        "webRoot": "${workspaceFolder}"
    }
]

This opens Splunk in my Chrome browser, but it is an empty search field. I created a .splnb file in VS Code, but when I run it, I receive ERROR: Unauthorized. Thanks in advance for any direction provided. God bless, Genesius
Hi @1ueshkil. I worked on this use-case material a while back, so I'm a little rusty now; these are my educated guesses. Let us know if you have the Security Essentials App (https://splunkbase.splunk.com/app/3435).

>>> We have already integrated Linux, Palo Alto, SAP log sources.
Nice, most of the problems are solved. You don't need to worry about the data/logs required for use-case creation. Now you need to focus only on use-case creation.

>>> Just looking to create Linux, Palo Alto, SAP use cases based on the MITRE framework or any attack-pattern use cases, as we don't have that much knowledge to create SPL use cases.
Please select a simple use case to start with. Let's say a DDoS attack on Linux systems; then we can work on the use-case creation step by step.
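To make that first step concrete, a starting-point sketch for a volumetric (DDoS-style) detection on Linux web logs: flag sources generating an unusually high request rate in a short window. The index, sourcetype, and threshold below are placeholders you would tune to your environment.

```
index=linux_web sourcetype=access_combined
| bin _time span=1m
| stats count AS requests by _time, src_ip
| where requests > 1000
```

Once a simple threshold version works, you can replace the fixed cutoff with a baseline (for example, comparing against an average computed by eventstats) and map the finished detection to the relevant MITRE ATT&CK technique.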