All Topics

Hi! I've created a free trial of Splunk Enterprise and I was trying to run the following:

    curl -k -u admin:pass https://localhost:8089/services/messages

But it returns:

    <?xml version="1.0" encoding="UTF-8"?>
    <response>
      <messages>
        <msg type="ERROR">Unauthorized</msg>
      </messages>
    </response>

I didn't change any of the default ports, by the way. Any idea why?
Hello. Is there documentation with a full visual list of how many and which icons Splunk Enterprise includes in its default installation that a user can use in their own dashboard panels inside custom HTML code? I just want to know whether any built-in icons could make a dashboard look better without uploading new images/icons. Thanks.
Hi, how can I write this statement?

    | eval protocolUsed = case(
        regex consumerkey="[a-z0-9]{8}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{12}", "O1",
        regex consumerkey="^[a-z0-9A-Z]{2,}$", "O2"))
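A likely rewrite, sketched with eval's match() function, since the regex search command cannot be embedded inside case(); the consumerkey field and the "O1"/"O2" labels are taken from the question, and the true() fallback is an optional default:

    | eval protocolUsed = case(
        match(consumerkey, "^[a-z0-9]{8}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{12}$"), "O1",
        match(consumerkey, "^[a-zA-Z0-9]{2,}$"), "O2",
        true(), "unknown")

case() returns the label for the first matching condition, so put the more specific pattern first if the patterns can overlap.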
Hi, I'm trying to get logs from Rapid7 InsightVM into my Splunk server. I have downloaded the Rapid7 InsightVM add-on and set it up. I have the index created, but no logs are getting dumped to my index. Is there anything else outside of the add-on setup instructions that I need to do? The input is showing the status "false". I have checked the internal logs; they are showing the error "pid=30350. Thanks.
Hello, is it possible to do conditional inline field extraction in Splunk for the following sample data?

Sample data (3 events):

    tR3225256009BMFTH77770977DF74S58628201804533FGRT
    fR6225256009BMFFT77779977TG76S58628201804633TSRD
    gR1225256004BMGHL7090997YJK66S58628201804833EDAR

I have done:

    (^)(?i)(?<DeprtCode>.{1})(?i)(?<mode_no>.{2})(?i)(?<StudentId>.{9})(?i)(?<BuildingCode>.{5})(?i)(?<Code>.{2})(?i)

I need help extracting a field under the following conditions: if characters 20-25 (6 characters) are all numeric, extract those 6 characters as Account_no; if those 6 characters are not all numeric (like sample event 3), extract all characters from 20-46 as no_Account. Is it possible? Any recommendations will be highly appreciated. Thank you!
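This looks doable at search time; a sketch using substr() and match(), assuming the positions in the question are 1-based offsets into _raw (the candidate field name is invented here):

    | eval candidate = substr(_raw, 20, 6)
    | eval Account_no = if(match(candidate, "^\d{6}$"), candidate, null())
    | eval no_Account = if(isnull(Account_no), substr(_raw, 20, 27), null())
    | fields - candidate

substr(_raw, 20, 27) covers characters 20 through 46 inclusive; shift the offsets by one if the positions were actually counted from 0.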
When I manually run a Splunk search via the API as follows:

    curl "https://host:8089/services/search/v2/jobs" -d search='search query...' -d max_count=0 -d earliest_time=xxx -d latest_time=now
    curl "https://host:8089/services/search/v2/jobs/jobid/results/" --get -d output_mode=csv -d count=0

I get timestamps like this for the _time column: "2023-02-02T00:06:34.000-08:00". When I run the same query, just as a saved search:

    curl "https://host:8089/servicesNS/nobody/search/search/v2/jobs/export?output_mode=csv" -d search='savedsearch "Saved Search"'

I get timestamps like this for the _time column: "2023-02-06 00:00:00.000 PST". How can I make the latter look like the former so Excel can ingest it properly?
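One approach, sketched as an SPL addition rather than an API option, is to have the exported search emit a pre-formatted string column. strftime's %z gives -0800 rather than -08:00, so a colon is spliced into the offset with substr(); the column name time_str is invented here:

    savedsearch "Saved Search"
    | eval time_str = strftime(_time, "%Y-%m-%dT%H:%M:%S.%3N%z")
    | eval time_str = substr(time_str, 1, len(time_str) - 2) . ":" . substr(time_str, -2)

The CSV then carries a time_str column in the first format; point Excel at it instead of _time.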
Hi, I have a key named ick=2c27194g-af5e-4f7d-9847-07cd5c4c70af and I want to search all the ick values using regex. I tried:

    regex ick="="([a-z0-9]{8}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{12})""

It is not giving any results. Can someone help?
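A sketch of a likely fix, assuming ick is not already an extracted field: extract it with rex first, after which the stray outer quotes and the extra = in the attempt are unnecessary:

    | rex "ick=(?<ick>[a-z0-9]{8}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{12})"
    | where isnotnull(ick)

If ick is already extracted, the regex command form would be:

    | regex ick="^[a-z0-9]{8}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{4}-[a-z0-9]{12}$"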
However, we are trying to achieve ping status in a dashboard. The pingcommand app is not supported in Splunk, and we are not allowed to use apps like Network Toolkit. How can we ping a server without using any apps?
Hi User,

Thanks for the reply. Below is the raw text that has been received in the Splunk user interface.

{"timestamp": "2023-01-24T08:06:29.621490Z", "level": "INFO", "filename": "splunk_sample_csv.py", "funcName": "main", "lineno": 38, "message": "Dataframe row : {\"_c0\":{\"0\":null,\"1\":\"266\",\"2\":\"267\",\"3\":\"268\"},\"_c1\":{\"0\":\"Timestamp\",\"1\":\"2023\\/01\\/10 13:31:19\",\"2\":\"2023\\/01\\/10 13:31:19\",\"3\":\"2023\\/01\\/10 13:31:19\"},\"_c2\":{\"0\":\"application\",\"1\":\"DWHEAP\",\"2\":\"DWHEAP\",\"3\":\"DWHEAP\"},\"_c3\":{\"0\":\"ctm\",\"1\":\"LNDEV02\",\"2\":\"LNDEV02\",\"3\":\"LNDEV02\"},\"_c4\":{\"0\":\"cyclic\",\"1\":\"False\",\"2\":\"False\",\"3\":\"False\"},\"_c5\":{\"0\":\"deleted\",\"1\":\"False\",\"2\":\"False\",\"3\":\"False\"},\"_c6\":{\"0\":\"description\",\"1\":\"Job to populate data to RDV for SK SOURCE SALES_EVENT\",\"2\":\"Job to populate data to RDV for SK SOURCE SALES_HIERARCHY\",\"3\":\"Job to populate data to RDV for SK SOURCE SALES_EVENT\"},\"_c7\":{\"0\":\"endTime\",\"1\":null,\"2\":null,\"3\":null},\"_c8\":{\"0\":\"estimatedEndTime\",\"1\":\"[u'20230110144400']\",\"2\":\"[u'20230110123200']\",\"3\":\"[u'20230110123200']\"},\"_c9\":{\"0\":\"estimatedStartTime\",\"1\":\"[u'20230110122700']\",\"2\":\"[u'20230110122700']\",\"3\":\"[u'20230110122700']\"},\"_c10\":{\"0\":\"folder\",\"1\":\"DWHEAP_RDV_SKBACKEND\",\"2\":\"DWHEAP_RDV_SKBACKEND\",\"3\":\"DWHEAP_RDV_SKBACKEND_TEST\"},\"_c11\":{\"0\":\"folderId\",\"1\":\"LNDEV02:\",\"2\":\"LNDEV02:\",\"3\":\"LNDEV02:\"},\"_c12\":{\"0\":\"held\",\"1\":\"False\",\"2\":\"False\",\"3\":\"False\"},\"_c13\":{\"0\":\"host\",\"1\":\"fraasdwhbdd1.de.db.com\",\"2\":\"fraasdwhbdd1.de.db.com\",\"3\":\"fraasdwhbdd1.de.db.com\"},\"_c14\":{\"0\":\"jobId\",\"1\":\"LNDEV02:5jtzl\",\"2\":\"LNDEV02:5jtzi\",\"3\":\"LNDEV02:5jtho\"},\"_c15\":{\"0\":\"logURI\",\"1\":\"https:\\/\\/lnemd.uk.db.com:8443\\/automation-api\\/run\\/job\\/LNDEV02:5jtzl\\/log\",\"2\":\"https:\\/\\/lnemd.uk.db.com:8443\\/automation-api\\/run\\/job\\/LNDEV02:5jtzi\\/log\",\"3\":\"https:\\/\\/lnemd.uk.db.com:8443\\/automation-api\\/run\\/job\\/LNDEV02:5jtho\\/log\"},\"_c16\":{\"0\":\"name\",\"1\":\"SALES_EVENT_RDV\",\"2\":\"SALES_HIERARCHY_RDV\",\"3\":\"SALES_EVENT_RDV\"},\"_c17\":{\"0\":\"numberOfRuns\",\"1\":\"0\",\"2\":\"0\",\"3\":\"0\"},\"_c18\":{\"0\":\"orderDate\",\"1\":\"230106\",\"2\":\"230106\",\"3\":\"230106\"},\"_c19\":{\"0\":\"outputURI\",\"1\":\"Job did not run, it has no output\",\"2\":\"Job did not run, it has no output\",\"3\":\"Job did not run, it has no output\"},\"_c20\":{\"0\":\"startTime\",\"1\":null,\"2\":null,\"3\":null},\"_c21\":{\"0\":\"status\",\"1\":\"Wait Condition\",\"2\":\"Wait Condition\",\"3\":\"Wait Condition\"},\"_c22\":{\"0\":\"subApplication\",\"1\":\"RDV_SKBACKEND\",\"2\":\"RDV_SKBACKEND\",\"3\":\"RDV_SKBACKEND_TEST\"},\"_c23\":{\"0\":\"type\",\"1\":\"Command\",\"2\":\"Command\",\"3\":\"Command\"}} ", "process": 2819, "processName": "MainProcess"}

In the above raw text there are jobIds:

    \"_c14\":{\"0\":\"jobId\",\"1\":\"LNDEV02:5jtzl\",\"2\":\"LNDEV02:5jtzi\",\"3\":\"LNDEV02:5jtho\"}

We need to extract those jobIds from the raw text and add them as a separate field in the events using SPL in the user interface. Please help me with this.
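A minimal sketch for that extraction, assuming every jobId carries the LNDEV02: prefix shown above: rex with max_match=0 collects all matches into a multivalue field, and mvdedup drops the duplicates that also occur inside the logURI values:

    | rex max_match=0 "(?<jobId>LNDEV02:[a-z0-9]+)"
    | eval jobId = mvdedup(jobId)

If the prefix varies across events, the pattern would need to be widened to whatever shape the IDs actually share; anchoring on the escaped "jobId" label in the JSON is another option.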
Hi there, we are trying to collect data from the Azure Monitor REST API using the REST API Modular Input. The config looks like below: [screenshot not included] The problem we are having is that the refresh token doesn't work as we expected. Below is the internal log generated by the REST API app: [screenshot not included] There is no log coming from the Azure Monitor API; we can see the attempt to execute the HTTP request, but nothing happened. Any tips would be appreciated, thank you.
I have a metric index with a hierarchical structure (maybe all metric indexes are like this):

    SuperCategory.Category1.metric1
    SuperCategory.Category1.metric2
    SuperCategory.Category1.metric3
    SuperCategory.Category2.metric1
    SuperCategory.Category2.metric2

There is a many-to-one relationship between categories. I've tried many different combination methods, but my starting point was:

    | mstats avg(Vmax.StorageGroup.HostIOs) as IOPs avg(Vmax.StorageGroup.AllocatedCapacity) as SgCapacity avg(Vmax.Array.UsableCapacity) as ArrayCap avg(Vmax.Array.HostIOs) as ArrayIOPS WHERE index=storage_metrics by Array_Name, Loc, sgname span=1d
    | eval SgIOPs = round(IOPs, 2), SgCapacity = round(SgCapacity, 2), SgCapPct=round((SgCapacity/ArrayCap)*100, 2), SgIOPct=round((IOPs/ArrayIOPS)*100, 2)
    | table sgname Array_Name Loc SgIOPs ArrayIOPS SgIOPct SgCapacity ArrayCap SgCapPct _time

Nothing is returned for any of the Vmax.Array metrics. There are many 'sgname' values for any single 'Array_Name'. As you can probably tell, I'm trying to calculate what percentage of an array's total an sgname is using. I find myself in this situation quite often and don't really know how to handle it. I appreciate any help anyone can offer.
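A sketch of one workaround, assuming the Vmax.Array.* metrics carry no sgname dimension (so grouping them by sgname returns nothing): pull the array-level numbers in a second mstats and join them back on the shared dimensions and time bucket:

    | mstats avg(Vmax.StorageGroup.HostIOs) as IOPs avg(Vmax.StorageGroup.AllocatedCapacity) as SgCapacity WHERE index=storage_metrics by Array_Name, Loc, sgname span=1d
    | join type=left Array_Name, Loc, _time
        [| mstats avg(Vmax.Array.UsableCapacity) as ArrayCap avg(Vmax.Array.HostIOs) as ArrayIOPS WHERE index=storage_metrics by Array_Name, Loc span=1d]
    | eval SgCapPct = round((SgCapacity / ArrayCap) * 100, 2), SgIOPct = round((IOPs / ArrayIOPS) * 100, 2)

join is subject to subsearch limits on large result sets; a union of the two mstats followed by eventstats over Array_Name, Loc, and _time avoids that, at the cost of a less direct query.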
Hi, how can I make this search display the peak by day?

    index=* sourcetype=Perfmon:Memory host=*
    | timechart span=7d
    | stats sparkline(avg(windows_mem_free)) as Trend avg(windows_mem_free) as Average, max(windows_mem_free) as Peak, latest(windows_mem_free) as Current, latest(_time) as "Last Updated" by host
    | convert ctime("Last Updated")
    | eval Peak=round((Peak)/1000,2)
    | eval Current=round((Current)/1000,2)
    | eval Average=round((Average)/1000,2)

Thank you,
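If "peak by day" means the daily maximum per host, a minimal sketch; the bare timechart in the middle of the original aggregates nothing and consumes the events the later stats would need, so it is dropped here:

    index=* sourcetype=Perfmon:Memory host=*
    | timechart span=1d max(windows_mem_free) as DailyPeak by host

With a by clause, timechart spreads hosts into columns, one row per day; any /1000 rounding can be applied afterwards with per-column evals or foreach.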
I have been using the Splunk ODBC driver v3.11 (64-bit) to pull time-series event data from a saved search (report) into Excel for manipulation. It has been working perfectly for over a month, but today Excel suddenly began complaining about missing columns when executing a refresh: "[Expression.Error] The column 'XXXX' of the table wasn't found." Sure enough, the preview in Power Query shows those columns missing. Sometimes I can refresh and they'll reappear, but mostly they don't. The search on the Splunk side runs fine and includes all the columns Excel complains about. I've been troubleshooting by time-bounding the search to see if some bad data has crept in, but am not finding any pattern yet. Is there any way to enable debug logging for the ODBC driver? Nothing obvious jumps out at me in the Windows ODBC configuration GUI. I'm open to other ideas as well; I may have to switch back to a two-step CSV export and subsequent load into Excel, which is less than ideal. Thanks in advance!
Still working on this. I want to create a single-value dashboard panel with a trend indicator. The value will display the current count of events over the last 3 hours, and the trendline will display the deviation from the average count over the same 3-hour timespan over the last week.
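A rough base-search sketch for such a panel, with index=main as a placeholder: bucket the last week into 3-hour spans, so the Single Value visualization can render the newest bucket as the number and the full series as the trend/sparkline:

    index=main earliest=-7d@h
    | timechart span=3h count

Computing the deviation from the weekly average explicitly would be an eventstats avg(count) over these rows followed by an eval of the difference, surfaced in a separate panel or tooltip.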
As a newcomer to Splunk, I am currently seeking a deeper understanding of Splunk apps and their associated benefits. While I am familiar with the process of packaging and deploying an app, I remain uncertain about one particular aspect: is it possible to bundle configuration related to the search head and apply it to the entire search head, as opposed to only a specific app? My difficulty in understanding the specifics of this process has led me to question whether, upon deploying the packaged configuration, it will indeed only be applied to that specific app and not to the wider Splunk environment. I would greatly appreciate it if you could point me towards any relevant documents or resources too.
Convert 2023-03-15T17:25:18.832-0400 to YYYY-MM-DD HH:MM:SS.millisec:

    2023-03-15T17:25:18.832-0400  ->  2023-03-15 17:25:18.832

Once converted to the asked format, I need to calculate the difference between EndTime and StartTime.
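A sketch, assuming StartTime and EndTime are string fields in the original format: strptime() (which understands the -0400 offset and %3N subseconds) converts each to epoch, the subtraction gives the difference in seconds, and strftime() produces the requested display format; the _epoch and _display names are invented:

    | eval start_epoch = strptime(StartTime, "%Y-%m-%dT%H:%M:%S.%3N%z")
    | eval end_epoch = strptime(EndTime, "%Y-%m-%dT%H:%M:%S.%3N%z")
    | eval duration_sec = round(end_epoch - start_epoch, 3)
    | eval StartTime_display = strftime(start_epoch, "%Y-%m-%d %H:%M:%S.%3N")
    | eval EndTime_display = strftime(end_epoch, "%Y-%m-%d %H:%M:%S.%3N")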
I am a Splunk newbie needing help. I had successfully set up my Windows PCs to log printing events, specifically event ID 307, and I set up a Splunk report that would list any activity with this event ID. I noticed that one PC that shared a printer stopped reporting. I can see in the PC's event log that it is logging print jobs, but they are not showing up in the Splunk index that collects the logs. Other events from that PC are being logged in the index. I ran some tests on other PCs, doing print jobs "to PDF". It shows up in the index on one other PC but does not for others. I am using the Splunk agent on all my PCs. Is there a way to track an event being logged on a PC and making its way through the Splunk agent onto the Splunk server and into the index? Also, the historic print events that used to show up in the index no longer do when I choose "All Time" as the time reference.
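Two checks that may help localize the break; the index and host names below are placeholders. The first confirms whether the events reach the index at all over a wide time range; the second looks at the forwarder's own internal logs (forwarders ship these to the _internal index) for warnings or errors around the time a print job was logged:

    index=wineventlog host="PC-NAME" EventCode=307 earliest=-30d

    index=_internal host="PC-NAME" sourcetype=splunkd (log_level=ERROR OR log_level=WARN)

If the second search returns nothing at all for that host, the forwarder is not talking to the indexer, which would also explain the missing print events.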
I have two different queries that return exactly the same result:

    value | chart count(status) by request_method, http_cloudid

and

    value | chart count by request_method, http_cloudid

Does it mean that the field specified in the `count` function actually does nothing when using `chart`?
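Not quite: count(status) counts only events where status is non-null, while the bare count counts every event in the group, so the two agree exactly when status is present on every event. A minimal runnable sketch (all field names invented) where they differ:

    | makeresults count=3
    | streamstats count as n
    | eval request_method="GET", http_cloudid="c1"
    | eval status=if(n=1, null(), "200")
    | chart count(status) as with_field, count as bare_count by request_method

Here with_field is 2 and bare_count is 3. With two split-by fields, as in the original queries, chart only accepts a single aggregation, which is why the sketch groups by one field.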
Hi, I have a new Node app and am trying to install the AppDynamics package. This fails with a 403 when downloading some files. This is from a Windows 11 machine using Node 18.15.0. We also have Mac M1 developers who need to be able to install this package. Seeing as this is a 403, is there some kind of authentication needed? Or is the file just not available over the VPN? If I paste the download URL into a browser, I also get a 403. Thanks.

    installing package appdynamics-libagent-napi-native v18.15.0
    Testing: BaseUrl set is: https://cdn.appdynamics.com/packages/nodejs/
    Agent version 23.3.0
    Get agent version result is: 23.3.0.0
    Agent version 23.3.0
    baseUrl is: https://cdn.appdynamics.com/packages/nodejs/
    DownloadURL is: https://cdn.appdynamics.com/packages/nodejs/23.3.0.0/appdynamics-libagent-napi-native-win32-x64-v18.tgz
    https://cdn.appdynamics.com/packages/nodejs/23.3.0.0/appdynamics-libagent-napi-native-win32-x64-v18.tgz
    Agent version 23.3.0
    Downloading https://cdn.appdynamics.com/packages/nodejs/23.3.0.0/appdynamics-libagent-napi-native-win32-x64-v18.tgz
    Saving to C:\Users\alanm\AppData\Local\Temp\appdynamics-23.3.0.0\appdynamics-libagent-napi-native-win32-x64-v18.tgz
    user agent: yarn/1.22.19 npm/? node/v18.15.0 win32 x64
    Receiving...
    https://cdn.appdynamics.com/packages/nodejs/23.3.0.0/appdynamics-libagent-napi-native-win32-x64-v18.tgz
    Error requesting archive.
    Status: 403
    Request options:
    {
      "uri": "https://cdn.appdynamics.com/packages/nodejs/23.3.0.0/appdynamics-libagent-napi-native-win32-x64-v18.tgz",
      "followRedirect": true,
      "headers": {
        "User-Agent": "yarn/1.22.19 npm/? node/v18.15.0 win32 x64"
      },
      "gzip": true,
      "encoding": null,
      "strictSSL": true,
      "hostname": "cdn.appdynamics.com",
      "path": "/packages/nodejs/23.3.0.0/appdynamics-libagent-napi-native-win32-x64-v18.tgz"
    }
    Response headers:
    {
      "content-type": "application/xml",
      "transfer-encoding": "chunked",
      "connection": "close",
      "date": "Wed, 15 Mar 2023 21:13:36 GMT",
      "server": "nginx/1.16.1",
      "x-cache": "Error from cloudfront",
      "via": "1.1 9910b161083ec8200ad24e6d6beec168.cloudfront.net (CloudFront)",
      "x-amz-cf-pop": "SYD1-C1",
Hello All - I'm fairly new to Splunk and I've been racking my brain for the past 8 hours trying to create a table for comparing two sets of runs. My search string is as follows:

    index="fe" source="regress_rpt" pipeline="soc" version IN ("(version="23ww09a" OR version="23ww10b")") dut="*" cthModel="*" (testlist="*") (testName="*") status=*
    | rex field=testPath "/regression/(?<regressionPath>.*)"
    | search regressionPath="*"
    | search regressionPath="***"
    | stats values(cyclesPerCpuSec) values(version) by testPath
    | reverse

This generates a table similar to:

    testPath  values(cyclesPerCpuSec)  values(version)
    test.6    719.91                   23ww10b
    test.5    742.44                   23ww10b
    test.4    563.46                   23ww10b
    test.6    694.36                   23ww10a
    test.5    423.23                   23ww10a
    test.4    146.34                   23ww10a

However, I want to display a table that looks like:

    testPath  23ww10a  23ww10b
    test.6    694.36   719.91
    test.5    423.23   742.44
    test.4    146.34   563.46

Essentially, I'm trying to create a table so I can compare the results of a specific test across different model versions. Any help is greatly appreciated! Thanks, Phil
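A sketch of the pivot, assuming a single cyclesPerCpuSec value per testPath/version pair: replace the final stats/reverse with chart, which spreads the by-field (version) into columns:

    ... | chart values(cyclesPerCpuSec) over testPath by version

An equivalent spelling is | stats values(cyclesPerCpuSec) as cycles by testPath, version | xyseries testPath version cycles.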