All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


How can I change a 1-week span so it runs from Monday to Friday? Usually span=1w shows data from Monday 00:00 hrs to Sunday 23:59 hrs. Can someone help with this query? Thanks in advance.
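One possible approach (just a sketch; the index name is a placeholder) is to keep the weekly bucket but drop weekend events before aggregating, using the day-of-week from _time:

```
index=your_index
| eval wday=strftime(_time, "%a")
| where NOT wday IN ("Sat", "Sun")
| bucket _time span=1h
| stats count by _time
```

This filters Saturday and Sunday out entirely, so any weekly rollup built on top of it only reflects Monday-to-Friday data.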
Hello, I have nested JSON log messages like the one below being forwarded to Splunk:

{
  "timeStamp": "2021-03-11T07:45:49.780000+00:00",
  "status": "deactive",
  "deviceId": "uuid12345",
  "details": {
    "Device:Information": {
      "Type": "Apple",
      "Content": {
        "uuid12345": {
          "Name": "IOS",
          "Version": "14.4"
        }
      }
    }
  }
}

I'd like to generate a table like the one below out of all such log messages:

deviceId   Name     Version
uuid12345  IOS      14.4
uuid12346  Android  8.1

I am aware that a table of fields can easily be created using the table command or stats (to get counts by Name and Version). However, the problem with this log message structure is that the nested JSON path `details.Device:Information.Content` contains a key whose value (`uuid12345`) is dynamic. Therefore, a query like the one below doesn't work as I need, since the wildcard character seems to create one column for each interpreted value, such as `details.Device:Information.Content.uuid12345.Name`, `details.Device:Information.Content.uuid12346.Name`, `details.Device:Information.Content.uuid12345.Version`, and `details.Device:Information.Content.uuid12346.Version`:

| table deviceId, details.Device:Information.Content.*.Name, details.Device:Information.Content.*.Version

Is it possible to get this information extracted into a table like I described above? Would it be possible to extract `Name` and `Version` as fields so that I don't have to use the full JSON path in the table or stats command? Thanks for your help in advance.
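One possible sketch (assuming a Splunk version where the eval json_extract function is available): since deviceId already holds the dynamic key, it can be concatenated into the extraction path at search time:

```
base_search
| spath path=details.Device:Information.Content output=content
| eval Name=json_extract(content, deviceId . ".Name")
| eval Version=json_extract(content, deviceId . ".Version")
| table deviceId Name Version
```

Here `base_search` is a placeholder for the search that retrieves these events; the point is that the path passed to json_extract is built per event from deviceId, so no wildcard columns are created.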
Hi, I'm a Splunk beginner. When I read the Splunk documentation (https://docs.splunk.com/Documentation/SplunkCloud/latest/Viz/tokens), I had a question. Using a token with job properties, a panel can be hidden.

So my dashboard code is below:

<search>
  <query>
    | makeresults | eval test = "123"
  </query>
  <progress>
    <condition match = "'job.resultCount' == 1">
      <set token = "show_html">True</set>
    </condition>
  </progress>
</search>

What I understand is that resultCount == 1 (the result is the _time and test fields with one row, so resultCount is 1), so that panel should be hidden, I guess. Is my understanding wrong? Thanks.
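For what it's worth, setting the token only makes it available; the panel itself has to declare that its visibility depends on it. A sketch (the panel content is a placeholder, and an else-condition unsets the token so the panel disappears when the count differs):

```
<search>
  <query>| makeresults | eval test = "123"</query>
  <done>
    <condition match="'job.resultCount' == 1">
      <set token="show_html">true</set>
    </condition>
    <condition>
      <unset token="show_html"></unset>
    </condition>
  </done>
</search>

<panel depends="$show_html$">
  <html>Shown only when the search returns exactly one result</html>
</panel>
```

With depends, the panel is shown while the token is set and hidden while it is unset; use rejects instead of depends to invert that behavior.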
Hello, I am attempting to create a search that utilises two dropdowns. The first dropdown is populated from a query which lists servers:

<label>Server List</label>
<description>Projects the future disk usage of all</description>
<fieldset autoRun="false" submitButton="true">
  <input type="dropdown" token="Host" searchWhenChanged="true">
    <label>Server Name</label>
    <fieldForLabel>Host</fieldForLabel>
    <fieldForValue>Host</fieldForValue>
    <search>
      <query>| inputlookup windows_hostmon_system | regex Host="regexgoeshere" | dedup Host</query>
    </search>
    <default>default server goes here</default>
    <initialValue>InitialServerName</initialValue>
  </input>

I then want to use the selection from the above dropdown and commit it to a token to be used in the next dropdown:

<input type="dropdown" token="Disk">
  <label>Disk</label>
  <fieldForLabel>Disk</fieldForLabel>
  <fieldForValue>Name</fieldForValue>
  <search>
    <query>eventtype=hostmon_windows source=Disk host="$HostChoice$" sourcetype="Winhostmon" Name="*" | stats count(Name) by Name</query>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </search>
</input>

Is it possible to capture the output of a dynamic dropdown and use that output as a variable in other dynamic dropdowns? Kind regards.
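If I read the XML right, the first dropdown already stores its selection in the token named Host, so the second dropdown can reference $Host$ directly; the $HostChoice$ token in the second query is never set anywhere. A sketch of the second input with the token name aligned:

```
<input type="dropdown" token="Disk">
  <label>Disk</label>
  <fieldForLabel>Disk</fieldForLabel>
  <fieldForValue>Name</fieldForValue>
  <search>
    <query>eventtype=hostmon_windows source=Disk host="$Host$" sourcetype="Winhostmon" Name="*" | stats count(Name) by Name</query>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </search>
</input>
```

Cascading inputs like this are supported: the second dropdown's population search reruns whenever the referenced token changes.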
The first change condition is working fine, but the second one, where I set a token with a different value, is not. What I want to do is change the search query when the value is "All", and when specific categories are selected, add the where clause to the query. Let me show what I have:

<input type="multiselect" token="categories" searchWhenChanged="false">
  <label>Select Categories</label>
  <fieldForLabel>category</fieldForLabel>
  <fieldForValue>category</fieldForValue>
  <search>
    <query>index=abc "allcategories"</query>
  </search>
  <valuePrefix>"</valuePrefix>
  <choice value="*">All</choice>
  <change>
    <condition value="*">
      <set token="search_filter1">index=abc</set>
    </condition>
    <condition>
      <set token="search_filter1">index=abc | where category IN ($categories$)</set>
    </condition>
  </change>
</input>

Then I have a panel like this:

<panel>
  <title>Panel by categories</title>
  <table>
    <search>
      <query>$search_filter1$ | stats count by category</query>
    </search>
  </table>
</panel>
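One common workaround (a sketch, not the only option) is to avoid the token-swapping conditions entirely: the search command, unlike where, treats * as a wildcard, so the "All" choice with value="*" just matches everything and a single query covers both cases:

```
<panel>
  <title>Panel by categories</title>
  <table>
    <search>
      <query>index=abc | search category IN ($categories$) | stats count by category</query>
    </search>
  </table>
</panel>
```

With multiselect inputs, change conditions also only compare against a single value, which is another reason the second condition may never fire once multiple categories are selected.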
Hello there, I would like to add a label to my dashboard using HTML. The label is based on my search query using Splunk SPL. I am able to add the label, but my query shows the raw source code on my dashboard. How can I get my search query to show the actual results of the query and not the source code? Below is my code. Any help is welcome:

<html>
  <body>
    <h1>Sports Report</h1>
    <search>
      <query>
        | rest /servicesNS/-/-/data/lookup-table-files search="*_Sports_Report.csv"
        | eval updated=strptime(updated,"%FT%T%:z")
        | eval desired_time=strftime(updated, "%B %d, %Y")
        | rename desired_time as "Last Updated" title as Team
        | table "Last Updated", Team
      </query>
    </search>
  </body>
</html>
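A search element is not rendered inside an html panel; one sketch of a workaround (field names without spaces are my own choice, since tokens are easier to reference that way) is to run the search separately, capture values from the first result row into tokens, and reference the tokens in the HTML:

```
<search>
  <query>
    | rest /servicesNS/-/-/data/lookup-table-files search="*_Sports_Report.csv"
    | eval updated=strptime(updated,"%FT%T%:z")
    | eval last_updated=strftime(updated, "%B %d, %Y")
    | table last_updated title
  </query>
  <done>
    <set token="last_updated">$result.last_updated$</set>
    <set token="team">$result.title$</set>
  </done>
</search>

<html>
  <h1>Sports Report</h1>
  <p>Team: $team$, last updated: $last_updated$</p>
</html>
```

Note the caveat that $result.*$ only exposes the first row of the results, so this suits single-value labels rather than full tables.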
I've never used |regex, but I use |where match() quite often. Is the former just syntactic sugar, or is there a difference?
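For comparison, the two forms look like this (the pattern is a placeholder):

```
... | regex _raw="fail(ed|ure)"

... | where match(_raw, "fail(ed|ure)")
```

As far as I know, both filter events with a PCRE pattern and behave equivalently for a simple field match; the practical differences are that regex is terser and supports negation via !=, while match() is an eval function and can therefore be combined with other boolean logic in the same where clause (e.g. `| where match(uri, "/api/.*") AND status>499`).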
Hi, I know there are other ways to get this through the deployment server, but I'm trying to find an SPL search that shows which of my Splunk UF clients currently have a specific deployment app. I have been able to use this SPL to find all deployment apps on all my Splunk UF clients:

| rest /services/deployment/server/clients splunk_server=splunkdeploy01*
| table hostname applications*.stateOnClient
| untable hostname applications value
| eval applications=replace(applications,"applications\.(\w+)\.stateOnClient","\1")
| stats values(applications) as applications by hostname

However, I'm looking for an SPL search that searches across all Splunk UF clients for a specific deployment app: "all_splunk_uf". Thanks!
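Since the app name is known up front, one sketch is to skip the untable step and simply keep clients where the corresponding stateOnClient field exists (app name hard-coded into the field path):

```
| rest /services/deployment/server/clients splunk_server=splunkdeploy01*
| where isnotnull('applications.all_splunk_uf.stateOnClient')
| table hostname applications.all_splunk_uf.stateOnClient
```

The single quotes around the field name in where are required because the name contains dots.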
We're seeing severe memory issues with the Splunk Add-on for Microsoft Cloud Services, where a single Python process can consume well over 200 GB of memory, ultimately being reaped by the OOM killer. These are Azure Firewall collections, rather critical to our logging infrastructure, and we have no information as to why this is happening. It doesn't happen to all collections, but it does happen to many. You can see below one child process consuming over 60 GB, and splunkd having grown to 275 GB of virtual memory:

splunk 17311 125 67318812 62146568 /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/bin/mscs_storage_blob.py
splunk 15910 227 274447324 82630100 splunkd -p 8089 restart

Splunk version 8.0.4.1, Splunk_TA_microsoft-cloudservices version 4.1.2.
I'm just trying to feed AWS instance data to Splunk. The output of "aws ec2 describe-instances" is in JSON format, and I'm placing the result in a variable. Now I'm trying to use the variable in a curl command:

asset = aws ec2 describe-instances
echo $asset
curl -k -H "Authorization: Splunk Auth_token" https://splunkcloud.com:8088/services/collector/raw -d {"sourcetype": "some-fields", "fields":'$asset'}"

and the output looks like:

curl: (6) Could not resolve host: "disabled"
curl: (3) [globbing] unmatched close brace/bracket in column 1

Could someone give me a better solution?
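A sketch of the corrected quoting (the AWS call is stubbed with a literal here, and the token and URL remain placeholders): shell assignments take no spaces around =, command output is captured with $( ), and the JSON payload should be built in a properly quoted variable before being handed to curl as a single argument.

```shell
# Capture command output into a variable; note: no spaces around '='
asset='{"Reservations": []}'   # stand-in for: asset=$(aws ec2 describe-instances)

# Build the payload inside double quotes so $asset expands as one string
payload="{\"sourcetype\": \"some-fields\", \"event\": $asset}"
echo "$payload"

# Then pass the quoted variable to curl (token and host are placeholders):
# curl -k -H "Authorization: Splunk <token>" \
#      "https://splunkcloud.com:8088/services/collector/event" -d "$payload"
```

The original errors come from the unquoted braces: the shell split the -d argument at whitespace, so curl treated fragments of the JSON as extra URLs.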
Hello, I'm trying to distinguish successful searches from unsuccessful ones. I define a successful search as one with a click. I then want to sort by the top queries without clicks to identify content gaps. I have two logs: one where action=search and another where action=resultClicked. Both logs have the query value and a userId. Maybe there is a better way, but I'm struggling to bin and match. Every resultClicked log corresponds to an action=search log where the userId is the same and within 1h, so we can be certain that all searches without such a match are unsuccessful. How would you approach this?
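One way to sketch it (index name is a placeholder; field names taken from the description): count both actions per user, query, and hourly bin, then keep the combinations that never got a click:

```
index=app_logs (action=search OR action=resultClicked)
| bin _time span=1h
| stats count(eval(action="search")) as searches,
        count(eval(action="resultClicked")) as clicks
        by _time, userId, query
| where searches > 0 AND clicks = 0
| stats sum(searches) as unsuccessful by query
| sort - unsuccessful
```

The hourly bin approximates the "within 1h" rule; a click near a bin boundary could land in the next bin, so a transaction- or streamstats-based match would be stricter if that matters.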
Will the Splunk Add-on for Microsoft Windows DNS app be Python 3 ready soon?
Hello, I am trying to extract the full line from the raw log data matching a pattern in the line. Sample data:

blah blah
24 packages updated on 5th may 2021 3:00 pm
blah blah

I am able to use a regex to extract everything after a pattern, let's say "packages updated", using the regex below, but I am not able to extract the full line including the number (24 in this case) at the beginning of the line.

base search | rex field=_raw "(?m)packages updated\s(?<pkg_count>.*)"

With this regex, I get a new field named pkg_count with a value of "on 5th may 2021 3:00 pm", but I'd like to get a field containing the full line: "24 packages updated on 5th may 2021 3:00 pm". Thanks!
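A small tweak may do it: anchor the pattern to the start and end of the line and put the whole line inside the capture group (field name pkg_line is my own):

```
base search
| rex field=_raw "(?m)^(?<pkg_line>.*packages updated.*)$"
```

With the (?m) flag, ^ and $ match at line boundaries inside the multi-line event, so pkg_line captures the entire matching line including the leading number.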
Are there any plans to make Splunk DB Connect Python 3 ready? I recently upgraded to version 3.5.1. With the Python Upgrade Readiness App I got the message that this app is not ready for Python 3. Thanks for your help. Regards, Christian
I have a summary index that records hourly event counts for all devices (de_count). I have the following search to get the max number of hours without events for myindex=router:

myindex=router
| bucket _time span=1h
| stats sum(de_count) as event_count by _time        (get hourly event count by _time)
| search event_count!=0
| delta _time as mydelta                             (get max number of hours without events)
| eval number_of_zeros=floor(mydelta/3600.00)-1
| stats max(number_of_zeros)
| rename "max(number_of_zeros)" as maxgap
| table myindex maxgap

How can I get the max number of hours without events for all indexes, myindex=*?
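One sketch: delta has no by clause, but streamstats does, so the previous bucket's _time can be carried per index and the gap computed from it:

```
myindex=*
| bucket _time span=1h
| stats sum(de_count) as event_count by _time, myindex
| where event_count != 0
| sort myindex, _time
| streamstats current=f last(_time) as prev_time by myindex
| eval number_of_zeros=floor((_time - prev_time)/3600) - 1
| stats max(number_of_zeros) as maxgap by myindex
```

With current=f, streamstats takes last(_time) from the preceding row within each myindex group, which reproduces what delta did for the single-index case.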
I've created a dashboard for searching and filtering events, and it consists of two panels for presenting the results:

a table to show a summary of the events based on search criteria; the columns are Time, CorrelationId, Service Name, Log Level, and Message, which are shared attributes among all events

an Events panel to show the entirety of an event, which includes attributes that are specific to an event and are not shared with other events, e.g. Stack Trace for errors

The table's drilldown is set to "row", and when a row is clicked it sets some tokens that are used to search again to find that event and show it in the Events panel. My goal is to avoid the second search, because the event has already been retrieved by the table panel. I've tried passing _raw from the table panel to the Events panel and using makeresults, but that command creates a table row that can only be viewed under the Statistics/Table tab and shows nothing when the Events/List tab is selected. What I need is to view the event in the format shown in the screenshot below.

I know that renaming a JSON field to _raw will deserialize it, but that requires a result set of events to begin with, e.g.

* | head 1 | eval tmp="{\"key\":\"value\"}" | rename tmp as _raw

will show the new JSON instead of the original event, but the query below with makeresults does not give the same result:

| makeresults | eval tmp="{\"key\":\"value\"}" | rename tmp as _raw

Only the Statistics tab shows results.

To summarize, I want to get the event in the format that can be seen in the first screenshot above, but without running a search, because I already have the entire event, including its _raw. Any help is appreciated!
I am using EC2 to run my search heads. I want to configure the Splunk add-on to deliver alerts to a specific SNS topic when they trigger (ideally on a per-alert basis). Where do I do this?
I have events like the one below in my logs, and I want to capture "temp" and table it:

received_time="2021-05-25T15:51:22.181+00:00"] 37 pollAcu20:830 ACU: PSU: Connected: true Output voltage: 4775 0.01V, Output current: 36 0.01A Critical temp: 426 0.1 Deg C Status: 0x3 Fault: false
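A sketch using rex (the field names temp and temp_degC are my own, and the /10 conversion assumes the "0.1 Deg C" in the event means the value is in tenths of a degree):

```
base search
| rex "Critical temp:\s+(?<temp>\d+)"
| eval temp_degC = temp / 10
| table _time temp temp_degC
```

For the sample event this would extract temp=426, i.e. 42.6 degrees C after conversion.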
Hi guys, I just set up this app https://splunkbase.splunk.com/app/2878/ on our Splunk Cloud deployment. I followed the instructions: "In order to setup the app, navigate to 'Settings' -> 'Alert actions'. Click on 'Setup Slack Alerts'. On the setup screen you'll want to supply a Webhook URL. You can obtain this URL by configuring a custom integration for your Slack workspace." All I get is a page-not-found error. Also, if I go to "Manage Apps" and click on "Set up" instead, I get the same error page. Any clue? Thanks, guys.
I have an expired certificate I am trying to update, and Splunk Web does not come up after I apply the new certificate. These are the steps I'm running:

1. /opt/splunk/bin/splunk cmd openssl genrsa -aes256 -out mySplunkWebPrivateKey.key 2048
2. /opt/splunk/bin/splunk cmd openssl rsa -in mySplunkWebPrivateKey.key -out mySplunkWebPrivateKey.key
3. /opt/splunk/bin/splunk cmd openssl req -new -key mySplunkWebPrivateKey.key -out mySplunkWebCert.csr
4. Generate the certificate in the Amdocs certificate tool (371336.cer is created)
5. /opt/splunk/bin/splunk cmd openssl x509 -in 371336.cer -outform PEM -out ilissplfwd05.pem
6. Update /opt/splunk/etc/system/local/web.conf:

privKeyPath = etc/auth/splunkweb/ilissplfwd05.key
serverCert = etc/auth/splunkweb/ilissplfwd05.pem
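One thing that stands out: the steps generate mySplunkWebPrivateKey.key, but web.conf points at ilissplfwd05.key, which these steps never create. A sketch of a consistent web.conf, assuming the key is the one produced in step 2 and that the files have been copied into etc/auth/splunkweb:

```
[settings]
enableSplunkWebSSL = true
privKeyPath = etc/auth/splunkweb/mySplunkWebPrivateKey.key
serverCert = etc/auth/splunkweb/ilissplfwd05.pem
```

If Splunk Web still fails to start after aligning the filenames, the web_service.log under /opt/splunk/var/log/splunk usually names the certificate or key it could not load.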