All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, I have a traefik access log in JSON format with the values below.

--- Value Samples ---
Duration: 109249593 ==> [The total time taken (in nanoseconds) to process the response]
time: 2021-03-20T09:30:01-07:00 ==> [The request time]
RequestAddr: example.domain.com ==> [The HTTP Host header]
------

I am trying to use these 3 key/value pairs to calculate TPS, and also percentiles (using Duration as the response-time metric). I have the query below to compute TPS for now.

---
index=myindex sourcetype=access_log RequestAddr=*.domain.com
| eval count=1
| timechart per_second(count) as TPS by RequestAddr
----

I assume the query above will not give the actual TPS, since logs are buffered before being written to file and are then pushed by the Splunk forwarder. Still a noob figuring out Splunk. Please let me know any ideas for calculating TPS and percentiles using the value samples. Thank you!
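A possible direction, assuming the JSON fields (Duration, time, RequestAddr) are auto-extracted: if _time is parsed from the event's own time field rather than the arrival time, forwarder buffering should not skew the per-second counts. A minimal sketch (index/sourcetype names taken from the post; the millisecond conversion is an assumption for readability):

```
index=myindex sourcetype=access_log RequestAddr=*.domain.com
| timechart span=1s count AS TPS by RequestAddr
```

For percentiles, a separate sketch using Duration converted from nanoseconds to milliseconds:

```
index=myindex sourcetype=access_log RequestAddr=*.domain.com
| eval duration_ms = Duration / 1000000
| stats perc50(duration_ms) AS p50_ms perc95(duration_ms) AS p95_ms perc99(duration_ms) AS p99_ms by RequestAddr
```

Note that timechart with a split-by field accepts only one aggregation, which is why the two calculations are shown as separate searches.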
Hello, the Valor ("Value") and Frete ("Shipping") fields contain commas and refer to currency values. Is there a way to transform this data before indexing, so that the "," is treated as a "."? The commas make it impossible to sum and average the values. Ex:

Original:
IP=189.41.40.129,Produto="Test1",Valor=179,00,Categoria=Banho,Campanha=1,Vendeu=1,MetododeCompra=1,Bandeira=1,Transportadora=2,Frete=18,57,"_time"2021/01/25 19:20:37.374"
IP=201.0.205.197,Produto="Test2",Valor=123,98,Categoria=Jogos,Campanha=1,Vendeu=0,MetododeCompra=0,Bandeira=0,Transportadora=5,Frete=12,58,"_time"2021/01/25 19:20:38.977"
IP=187.125.147.178,Produto="Teste3",Valor=139,90,Categoria=Cozinha,Campanha=1,Vendeu=1,MetododeCompra=1,Bandeira=1,Transportadora=3,Frete=14,27,"_time"2021/01/25 19:20:38.977"
IP=187.115.202.233,Produto="Test4",Valor=139,90,Categoria=Cozinha,Campanha=1,Vendeu=1,MetododeCompra=1,Bandeira=1,Transportadora=2,Frete=14,51,"_time"2021/01/25 19:20:39.579"
IP=187.111.15.221,Produto="Test5",Valor=164,00,Categoria=Banho,Campanha=2,Vendeu=1,MetododeCompra=1,Bandeira=1,Transportadora=1,Frete=16,81,"_time"2021/01/25 19:20:40.580"

Changed:
IP=189.41.40.129,Produto="Test1",Valor=179.00,Categoria=Banho,Campanha=1,Vendeu=1,MetododeCompra=1,Bandeira=1,Transportadora=2,Frete=18.57,"_time"2021/01/25 19:20:37.374"
IP=201.0.205.197,Produto="Test2",Valor=123.98,Categoria=Jogos,Campanha=1,Vendeu=0,MetododeCompra=0,Bandeira=0,Transportadora=5,Frete=12.58,"_time"2021/01/25 19:20:38.977"
IP=187.125.147.178,Produto="Teste3",Valor=139.90,Categoria=Cozinha,Campanha=1,Vendeu=1,MetododeCompra=1,Bandeira=1,Transportadora=3,Frete=14.27,"_time"2021/01/25 19:20:38.977"
IP=187.115.202.233,Produto="Test4",Valor=139.90,Categoria=Cozinha,Campanha=1,Vendeu=1,MetododeCompra=1,Bandeira=1,Transportadora=2,Frete=14.51,"_time"2021/01/25 19:20:39.579"
IP=187.111.15.221,Produto="Test5",Valor=164.00,Categoria=Banho,Campanha=2,Vendeu=1,MetododeCompra=1,Bandeira=1,Transportadora=1,Frete=16.81,"_time"2021/01/25 19:20:40.580"
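One index-time option is a SEDCMD in props.conf on the parsing tier (indexer or heavy forwarder), which rewrites the raw event before it is indexed. A sketch, assuming a sourcetype named your_sourcetype (hypothetical placeholder) and that only Valor and Frete ever carry decimal commas:

```
[your_sourcetype]
SEDCMD-fix_decimal_comma = s/(Valor|Frete)=(\d+),(\d+)/\1=\2.\3/g
```

Restricting the pattern to those two field names avoids accidentally rewriting the commas that separate fields; test on a sample file before applying to live data.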
Hi, my search window is Feb 19 - Feb 23. I would like to isolate Feb 19, i.e. have my events start on that date. My time range gives me all the dates, but I would like them to start with Feb 19 (and the two days after), while still keeping my time range search of Feb 19 - Feb 23. I am using the where clause Feb_19>=PlayTime because I would like the events to start on Feb_19. Is my concept correct? I just need to start on Feb_19 using a less-than or greater-than-or-equal-to comparison.

(index="Example") OR (index="Blah")
| eval SundayTime=case(area="23", effortsTimeStamp), PlayTime=case(eventType="Fun", loggedHrofEvent)
| eval date="2021-02-19 00:00:00.00"
| eval Feb19=strptime(date,"%Y-%m-%d %H:%M:%S.%6N")
| eval Feb_19=strftime(Feb19,"%Y-%m-%d %H:%M:%S")
| stats values(documents) as documents, values(index) as index, latest(PlayTime) as PlayTime, latest(SundayTime) as SundayTime, values(Feb_19) as Feb_19 by orders
| where isnull(PlayTime) AND Feb_19>=PlayTime
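Two observations on the query above: `isnull(PlayTime) AND Feb_19>=PlayTime` can never be true, because a comparison against a null field fails; and comparing formatted strings works only lexicographically. It is usually safer to compare epoch values. A sketch, assuming (hypothetically) that PlayTime is a string like "2021-02-19 13:45:00"; adjust the strptime format to your actual data:

```
(index="Example") OR (index="Blah")
| eval PlayTime_epoch = strptime(PlayTime, "%Y-%m-%d %H:%M:%S")
| eval start = strptime("2021-02-19 00:00:00", "%Y-%m-%d %H:%M:%S")
| where PlayTime_epoch >= start AND PlayTime_epoch < relative_time(start, "+3d")
```

This keeps the search window Feb 19 - Feb 23 untouched while filtering the results to events whose PlayTime falls on Feb 19 through Feb 21.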
When I run the Python script, the above error is thrown, as shown in the screenshot. Python 3 is installed on the environment. Could someone help me with this?
Trying to monitor performance data on macOS; I downloaded the Splunk Add-on for Unix and Linux. After clicking 'Save' for the settings in Splunk Enterprise Web, it shows a page that says: 'Safari Can't Connect to the Server. Safari can't open the page "localhost:8000/en-US/app/Splunk_TA_nix/ta_nix_configuration" because Safari can't connect to the server "localhost".' Please find the attachment for reference. Any help would be appreciated! Thanks!

ref:
https://docs.splunk.com/Documentation/AddOns/released/UnixLinux/About
https://splunkbase.splunk.com/app/833/
https://lantern.splunk.com/hc/en-us/articles/360048491734-Operating-system-performance-data-
How can I, from an IP address, obtain its location so that I can report by region? In the example below I only have the IP column; I need the country and region for each address.

IP="189.80.213.213",Produto="Chuveiro Ducha Advanced Eletronica Turbo Lorenzetti",Valor="164,00",Categoria=Banho,Campanha="2",Vendeu="1",MetododeCompra="1",Bandeira="1",Transportadora="4",Frete="17,26",Time=2021/01/26 19:06:32.179"
IP="177.184.142.26",Produto="Crepeira Eletrica 4 Cavidades Antiaderente",Valor="99,90",Categoria=Cozinha,Campanha="2",Vendeu="1",MetododeCompra="1",Bandeira="1",Transportadora="1",Frete="10,24",Time=2021/01/26 19:06:31.579"
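Splunk ships a built-in iplocation command that looks up an IP field against a bundled GeoIP database and adds fields such as Country, Region, and City. A minimal sketch (the index name is a placeholder):

```
index=your_index
| iplocation IP
| table IP, Country, Region, City
```

Accuracy depends on the bundled GeoIP database, and private/internal addresses will not resolve.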
I would like to create an alert for whenever I open my email, whether I give the correct password or the wrong password, no matter what. Is this possible using the Splunk tool?
Are there indexer-specific conf files, especially for props.conf? On Linux, how can we identify the indexer-specific conf files for a particular index?
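One common way to see which files contribute each setting on a given indexer is btool, which merges all layers (etc/system, etc/apps/<app>/default and local) and, with --debug, prints the source file path next to every line. A sketch run on the indexer itself (the index name is a placeholder):

```
$SPLUNK_HOME/bin/splunk btool props list --debug
$SPLUNK_HOME/bin/splunk btool indexes list your_index_name --debug
```

The file paths in the output show whether a setting comes from a shared app or from that indexer's own local directory.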
When using the schedule-and-export option for PDF generation, the graph is truncated. However, the print-then-save-as-PDF option works correctly. Is this an issue with limits.conf? I am referring to the default value of max_rows_per_table, which is 1000. Please suggest the best way to fix this issue, or confirm whether this is expected behavior of PDF generation with no solution available.
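If the truncation really is row-count related, limits.conf has a [pdf] stanza controlling the PDF renderer. A sketch under the assumption that max_rows_per_table is the limiting setting here (raise the value cautiously, since very large tables slow rendering):

```
[pdf]
max_rows_per_table = 5000
```

A restart (or at least a reload of the affected component) is typically needed for limits.conf changes to take effect; if the graph is truncated visually rather than by rows, the cause is more likely the fixed page layout of scheduled PDF export than this setting.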
Here is the data for illustration. (To facilitate experimentation, the query snippet below recreates the data in Splunk.)

| makeresults
| eval _raw="Date Time DEVICE ATTRIBUTE STATE TagExpected
2021-03-19 11:56:22.449 K30 SOR_B_STATUS.STATE SOR_FAILED 1
2021-03-19 12:16:17.564 K30 SOR_A_STATUS.STATE SOR_FAILED 1
2021-03-19 12:17:55.191 K30 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_ON 1
2021-03-19 12:21:16.659 K30 SOR_A_STATUS.STATE SOR_FAILED 2
2021-03-19 12:32:42.247 K30 SOR_B_STATUS.STATE SOR_FAILED 2
2021-03-19 12:51:21.456 A60 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_ON 2
2021-03-19 12:51:52.949 A60 SOR_A_STATUS.STATE SOR_FAILED 1
2021-03-19 12:54:01.077 A60 SOR_B_STATUS.STATE SOR_FAILED 1
2021-03-19 13:01:26.367 A60 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_ON 1
2021-03-19 13:01:26.818 K30 SOR_A_STATUS.STATE SOR_FAILED 3
2021-03-19 13:02:41.142 K30 SOR_B_STATUS.STATE SOR_FAILED 3
2021-03-19 13:08:19.694 A60 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_ON 2
2021-03-19 13:09:14.433 K30 SOR_B_STATUS.STATE SOR_FAILED 4
2021-03-19 13:10:19.149 W34 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_OFF 1
2021-03-19 13:16:12.847 A60 SOR_B_STATUS.STATE SOR_FAILED 3
2021-03-19 13:24:59.420 A60 SOR_A_STATUS.STATE SOR_FAILED 3
2021-03-19 13:24:59.870 A60 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_ON 3
2021-03-19 13:25:48.068 A60 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_OFF
2021-03-19 13:35:47.614 A60 SOR_A_STATUS.STATE SOR_FAILED 4
2021-03-19 13:38:19.632 A90 SOR_B_STATUS.STATE SOR_FAILED 1
2021-03-19 13:46:10.118 R20 SOR_B_STATUS.STATE SOR_FAILED 1
2021-03-19 13:50:30.328 R50 SOR_A_STATUS.STATE SOR_FAILED 1
2021-03-19 13:54:58.831 W20 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_ON 1
2021-03-19 13:55:30.622 W20 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_OFF
2021-03-19 13:56:38.060 A60 SOR_A_STATUS.STATE SOR_FAILED 5
2021-03-19 14:02:19.102 K30 SOR_B_STATUS.STATE SOR_FAILED 5
2021-03-19 14:08:51.212 R50 SOR_A_STATUS.STATE SOR_FAILED 2
2021-03-19 14:09:47.657 R20 SOR_B_STATUS.STATE SOR_FAILED 2
2021-03-19 14:11:10.387 C30 SOR_B_STATUS.STATE SOR_FAILED 1
2021-03-19 15:01:15.315 C30 SOR_B_STATUS.STATE SOR_FAILED 2
2021-03-19 15:02:33.670 R65 SOR_A_STATUS.STATE SOR_FAILED 1
2021-03-19 15:06:56.258 C50 SOR_B_STATUS.STATE SOR_FAILED 1
2021-03-19 15:09:32.583 R50 SOR_A_STATUS.STATE SOR_FAILED 3
2021-03-19 15:09:33.484 R50 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_ON 3
2021-03-19 15:09:40.240 R50 SOR_RESTRICT_STATUS.STATE SOR_SPEED_RESTRICT_OFF
2021-03-19 15:36:17.104 A90 SOR_B_STATUS.STATE SOR_FAILED 1"
| multikv forceheader=1
| eval combined=Date." ".Time
| eval _time=strptime('combined', "%Y-%m-%d %H:%M:%S.%Q")
| fields _time DEVICE ATTRIBUTE STATE TagExpected

In the above records, I'd like to tag or group the events by the following rules: For a given DEVICE value, e.g. K30, I'd like to tag the first occurrences of SOR_B_STATUS.STATE or SOR_A_STATUS.STATE (their order does not matter) and the SOR_RESTRICT_STATUS.STATE that follows with the same tag value, forming one group. For the same DEVICE value, the next occurrences of SOR_B_STATUS.STATE or SOR_A_STATUS.STATE (order does not matter) and the SOR_RESTRICT_STATUS.STATE after them would get tag 2, forming a different group, and so on. A subset of such an event group is also considered a separate group. The events in a group do not need to be adjacent in time. I don't insist on these tag values; any mechanism to group the events would be fine. I feel that using streamstats or transaction in some fashion might solve the problem, but I need help wrapping my brain around it. I'd really appreciate any help or hints. Thanks in advance, and have a nice weekend!
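A first-cut sketch of the streamstats idea: treat each SOR_RESTRICT_STATUS.STATE event as closing the current group for its DEVICE, so status events seen after it fall into the next group. This reproduces the first groups in the sample but not the subset groups that have no closing RESTRICT event, so it is a starting point rather than a complete answer:

```
| sort 0 _time
| streamstats count(eval(ATTRIBUTE=="SOR_RESTRICT_STATUS.STATE")) AS restricts_seen by DEVICE
| eval tag = restricts_seen + if(ATTRIBUTE=="SOR_RESTRICT_STATUS.STATE", 0, 1)
```

Status events before the first RESTRICT get tag 1 (restricts_seen=0, plus 1), the RESTRICT itself gets tag 1 (restricts_seen=1, plus 0), and the next status events advance to tag 2.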
Hi everyone, I have two events like those below, in the same index. I captured all the fields with the rex command, but I am unable to join them and produce the desired output. Kindly help. Thank you.

index=abc

Event 1: caseStatus in update case :: CaseStatusToUpdate [caseId=12345, caseStatus=Active, timeStamp=Fri Mar 19 18:49:39 UTC 2021]
Event 2: caseDetails :: [caseID=12345, type=Credit]

Desired output: caseID, caseStatus, type, timeStamp
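Since both events carry the same case number (just with different field-name casing, caseId vs caseID), a stats-based merge is usually simpler than join: normalize the two names into one field, then aggregate by it. A sketch (the rex patterns are assumptions based on the sample lines; adjust to your actual raw text):

```
index=abc ("CaseStatusToUpdate" OR "caseDetails")
| rex "caseId=(?<caseId>\d+),\s*caseStatus=(?<caseStatus>\w+),\s*timeStamp=(?<timeStamp>[^\]]+)"
| rex "caseID=(?<caseId2>\d+),\s*type=(?<type>\w+)"
| eval caseId = coalesce(caseId, caseId2)
| stats values(caseStatus) AS caseStatus, values(type) AS type, values(timeStamp) AS timeStamp by caseId
```

This avoids join's subsearch limits and works regardless of which event arrives first.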
Guys, I need support. I need to upload these files and process the data, but I need them to be indexed by the _time field; I was unable to use any of the standard pre-defined source types. Can you help me? Another detail: the Valor and Frete fields contain commas and refer to currency. Is there a way to transform this data before indexing, so that the comma is treated as a "."? Example file: https://www.dropbox.com/s/exh7g1glumxcetr/log.txt?dl=0
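A custom sourcetype in props.conf can address both points: point the timestamp extractor at the text following the literal "_time" marker, and rewrite the decimal commas at index time. A sketch, assuming a placeholder sourcetype name and the event layout shown in similar samples ("_time"2021/01/25 19:20:37.374"); verify against the actual file first:

```
[your_sourcetype]
TIME_PREFIX = "_time"
TIME_FORMAT = %Y/%m/%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 30
SEDCMD-fix_decimal_comma = s/(Valor|Frete)=(\d+),(\d+)/\1=\2.\3/g
```

TIME_PREFIX is a regex matched against the raw event, so the quotes around _time are matched literally; the SEDCMD restricts the comma-to-dot rewrite to the two currency fields.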
I am trying to create a dashboard for users who need to be able to set a time/date range manually. But in the time picker that is created by default in the dashboard, the latest time is always 'now'. The only ways around this are the presets or the Advanced options, but I can't expect users to work out the necessary times in Advanced. I've tried looking through the documentation and forums, but haven't found anything yet that indicates how to get around this. Does anyone have a way around it?
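In Simple XML you can give the time input a default other than now, and users can still switch to the picker's Date Range tab to choose explicit calendar dates with "Between". A sketch of a time input whose default window ends a day ago (token name and range are illustrative):

```
<input type="time" token="time_tok">
  <label>Time Range</label>
  <default>
    <earliest>-7d@d</earliest>
    <latest>-1d@d</latest>
  </default>
</input>
```

Panels then reference $time_tok.earliest$ and $time_tok.latest$ in their search elements.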
However, so far I can't derive anything meaningful for building the dashboards. I would like to set up Splunk to monitor the host operating system's log files and/or performance data on macOS. I get data in from sources including '/var/log' and '/Library/Logs', but don't see anything meaningful in the data once certain field values are filtered. I would also like to monitor performance data, but I'm not sure where it is located or how to filter the values. Any help would be appreciated! Thanks!

System Log Folder: /var/log
System Log: /var/log/system.log
Mac Analytics Data: /var/log/DiagnosticMessages
System Application Logs: /Library/Logs
System Reports: /Library/Logs/DiagnosticReports
User Application Logs: ~/Library/Logs (in other words, /Users/NAME/Library/Logs)
User Reports: ~/Library/Logs/DiagnosticReports (in other words, /Users/NAME/Library/Logs/DiagnosticReports)
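For the log-file side, the paths listed above can be covered with monitor stanzas in inputs.conf. A sketch, assuming an "os" index has already been created (the index name and sourcetype are placeholders, not requirements):

```
[monitor:///var/log/system.log]
index = os
sourcetype = syslog
disabled = 0

[monitor:///Library/Logs]
index = os
disabled = 0
```

Performance data, by contrast, is not a file to monitor: the Splunk Add-on for Unix and Linux collects it via scripted inputs (cpu.sh, iostat.sh, vmstat.sh, and similar) that must be enabled in that add-on's inputs.conf before any performance events appear.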
Hello, I am looking for documentation about how to integrate Splunk add-ons and apps on a Splunk platform (installation, configuration files to update, indexes.conf or server.conf for example, connector needs, ...). Thanks!
What is the best strategy to deal with / fix time zone (time sync) issues between Splunk servers and hosts scattered around the US? How does one prepare a report of the hosts that have time zone differences, and what are the ways to fix the issues?
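One way to surface suspect hosts is to compare each event's parsed time (_time) with the time it was indexed (_indextime): hosts whose events are consistently offset by whole hours are likely having their timestamps parsed in the wrong time zone. A sketch (the index filter and thresholds are illustrative):

```
index=* earliest=-4h
| eval lag_sec = _indextime - _time
| stats avg(lag_sec) AS avg_lag_sec, count by host
| eval offset_hours = round(avg_lag_sec / 3600)
| where abs(offset_hours) >= 1
| sort - offset_hours
```

Hosts flagged this way can then be corrected with a per-host TZ setting in props.conf on the parsing tier (e.g. a [host::...] stanza with TZ = America/Chicago), alongside fixing NTP sync on the hosts themselves.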
Hi everyone, requesting a small bit of help with configuring props.conf so that it breaks my multiline events correctly. There are two types of events I am trying to ingest: for the first one (pretty-printed), either only a part is ingested or the event is broken up; the second one (on a single line) ingests without any issues. I tried the props.conf below, but no luck; I am just a newbie, therefore requesting your help. For BREAK_ONLY_BEFORE I added the regex so that it can capture and break both types of events.

[testing]
BREAK_ONLY_BEFORE={(\s+|)"transaction-id"(\s+|):(\s+|)"
SHOULD_LINEMERGE=false
NO_BINARY_CHECK=1
TRUNCATE=0
MAX_EVENTS=1024

Sample events:

{
  "transaction-id" : "steve-123",
  "usecase-id" : "123",
  "timestamp" : "2021-03-07T06:51:27,188+0100",
  "timestamp-out" : "2021-03-07T06:51:27,188+0100",
  "component" : "A",
  "payload" : "{\"error\":\"Internal server error\",\"message\":\"Internal server error\",\"description\":\"The server encountered an unexpected condition that prevented it from fulfilling the request\"}",
  "country-code" : "IN",
  "status" : "error",
  "error-code" : "500",
  "error" : "Internal Server Error",
  "message-size" : 176,
  "logpoint" : "response"
}

{"transaction-id":"steve-456","usecase-id":"456","timestamp":"2021-03-07T06:51:27,188+0100","timestamp-out":"2021-03-07T06:51:27,188+0100","component":"B","payload":"{\"error\":\"Internalservererror\",\"message\":\"Internalservererror\",\"description\":\"The server encountered an unexpected condition that prevented it from fulfilling the request\"}","country-code":"IN","status":"error","error-code":"500","error":"Internal Server Error","message-size":176,"logpoint":"response"}

Thanks,
Sunny
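The likely culprit: BREAK_ONLY_BEFORE is only consulted when SHOULD_LINEMERGE=true, so with SHOULD_LINEMERGE=false it is ignored and Splunk falls back to the default LINE_BREAKER (any newline), which tears the pretty-printed event apart. Keeping SHOULD_LINEMERGE=false and defining LINE_BREAKER instead should handle both shapes. A sketch to test on a sample first:

```
[testing]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\{\s*"transaction-id"
TRUNCATE = 0
MAX_EVENTS = 1024
NO_BINARY_CHECK = 1
```

Only the first capturing group (the newlines) is discarded at the boundary, so the opening {"transaction-id" text stays with the event that follows it; \s* covers both the compact form and the pretty-printed form where a newline sits between { and "transaction-id".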
I have an app that configures data inputs with columns for "Name" and "Destination". Once there is data in the sourcetype, is it possible to know which "Name" it belongs to? Hope that makes sense. Thanks! David
I am trying to define a query where the earliest time is 2 days ago at 22:20:45 and the latest time is 1 day ago at 22:20:45. I tried the different formats below:

earliest=-2d@:22h:20m:45s latest=-1d@d+20h+30m+45s

But I am not sure whether these are correct, and also how I can check whether they translate to the date and time I am trying to set. Thank you.
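Splunk time modifiers take the shape offset, then snap-to (@unit), then further offsets, so "2 days ago at 22:20:45" would be written by snapping to midnight two days back and adding the time of day: earliest=-2d@d+22h+20m+45s latest=-1d@d+22h+20m+45s. You can check what a modifier resolves to with relative_time, without running the real search:

```
| makeresults
| eval e = relative_time(now(), "-2d@d+22h+20m+45s")
| eval l = relative_time(now(), "-1d@d+22h+20m+45s")
| eval earliest_check = strftime(e, "%Y-%m-%d %H:%M:%S"), latest_check = strftime(l, "%Y-%m-%d %H:%M:%S")
| table earliest_check, latest_check
```

If the two human-readable columns show the boundaries you intended, the same modifier strings can be used directly as earliest= and latest= in the real query.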