All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, We are using SaaS controller build 23.4.0-1559, and all our browser jobs running on our local PSA fail with the error "Invalid measurement status state when publishing results: Failed to save result". We have tried using both an older PSA agent and the latest one, but get the same error. Are there any workarounds for this issue? Thanks, RB
Hello, I have events with key/value pairs assigned by "=" (highlighted in bold) and separated by the special character "^". Some key/value pairs don't have values (for example, Application data). How would I perform field extraction for these events? 4 sample events are provided below. Any help will be highly appreciated. Thank you so much.

4 Sample Events

Application data=^Provider=TEST^ClientID=7acc2a917-309d-461b-806a-8b34ea6232aed8^givenName=XYZ^surName=ABC^dateofBirth=1970-10-05^address=1940n Xaybas St Apt 1^city=ABC^state=NY^zip=50000^email=xyz@gmail.com^phone=10974173700^alevel=IAB7^SESSIONID=eZdfasRaMfTSSG2EDGUuT1UaYnWvk5rk=^AppID=OBA^TransD=4de099545e02-493s5-4720-9094-cef80cd71f7r3||2023-05-25T15:25:38.150Z||||||12.209.9.173|

Application data=^Provider=TEST^givenName=XYZ^surName=ABC^dateofBirth=1970-10-05^address=1940n Xaybas St Apt 1^city=ABC^state=NY^zip=50000^email=ayz@gmail.com^phone=10974173700^alevel=IAB7^sub=20216defba5c04b6c8481eca9d174d43cas^isass=https://api.test/oasidc^aud=[a255e650b9a8194b0264468854b57b41]^exp=Thu May 25 21:25:37 EDT 2023^iat=Thu May 25 11:25:37 EDT 2023^AppID=OBA^TransID=49df545e02-493d5-4720-90df94-cef80cd71f7bv3||2023-05-25T15:25:38.097Z||||||10.208.9.173|

Application data=^Provider=TEST^givenName=XYZ^surName=ABC^dateofBirth=1970-10-05^address=1940n Xaybas St Apt 1^city=ABC^state=NY^zip=50000^email=xyz@gmail.com^phone=10974173700^alevel=IAB7^sub=20216defse5c04b6c8481eca9d174d43c^isass=https://api.test/oidfdc^aud=[a255e650dfdb9a8194b0264468854b57b41]^exp=Thu May 25 21:25:37 EDT 2023^iat=Thu May 25 11:25:37 EDT 2023^AppID=OBA^TransID=49sd545e02-493j5-4720-9ds094-cef80cd71fvv73||2023-05-25T15:25:38.094Z||||||12.208.9.173|

Application data=^Provider=eTEST^ID=9bsa5263e3-7423-4f01-a00f-4esse16e1693a8^givenName=MAAAMA^surName=SODA^dateofBirth=1968:10:20^address=92 Barca Boulevard^city=TAC^state=NH^zip=60000^email=mksds@gmail.com^phone=629-337-2349^alevel=D^SSIONID=pdffT2awi0gYTLbJo9kUtvosJsLnXNM=^AppID=OBA||2023-05-25T15:24:54.795Z||||||10.208.10.170|
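Outside of Splunk, the intended extraction can be prototyped to confirm the delimiter logic before writing a rex or a props/transforms stanza. This is a minimal sketch, assuming "^" separates pairs and "=" binds key to value, and that the trailing "||...||" block is pipe-delimited metadata (an assumption based on the samples):

```python
def extract_fields(event):
    """Split an event on '^' and keep key=value pairs; keys with no
    value (like 'Application data=') get an empty-string value."""
    fields = {}
    # Drop the trailing "||...||" metadata block if present (assumption).
    body = event.split("||")[0]
    for segment in body.split("^"):
        if "=" in segment:
            key, _, value = segment.partition("=")
            if key:
                fields[key] = value
    return fields

sample = "Application data=^Provider=TEST^givenName=XYZ^zip=50000"
print(extract_fields(sample))
```

In SPL, the built-in extract command with pairdelim/kvdelim options may achieve something similar, though keys containing spaces (like "Application data") can need extra handling.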
Hi there, I'm a noob. I'm looking to generate a report containing a list of events per host for a specific timeframe (e.g. 5 mins), grouped by host, and with a heading per host, like this:

----------------------------------------
Host: host1.somedomain.com
----------------------------------------
2023-05-26T15:36:46.000001+10:00 [2023-05-26T15:36:46+10:00] host1.somedomain.com - kernel: <blah1>
2023-05-26T15:36:46.012345+10:00 [2023-05-26T15:36:46+10:00] host1.somedomain.com - kernel: <blah2>
----------------------------------------
Host: host2.somedomain.com
----------------------------------------
2023-05-26T15:36:46.004567+10:00 [2023-05-26T15:36:46+10:00] host2.somedomain.com - kernel: <blah3>
2023-05-26T15:36:46.005678+10:00 [2023-05-26T15:36:46+10:00] host2.somedomain.com - kernel: <blah4>
etc. etc.

I have got to the point where I'm able to generate a report containing all events for the timeframe using this search, but there is no grouping by host, and therefore no heading per host:

index=myindex | sort 0 host, _time

Can anyone suggest how I might achieve the above? Many thanks.
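The desired report is a classic sort-then-group pass. As a sketch of the logic outside Splunk (hostnames and messages here are the examples from the post; this is not an SPL answer, just the shape of the grouping):

```python
from itertools import groupby
from operator import itemgetter

def build_report(events):
    """events: list of (host, timestamp, message) tuples.
    Returns a text report grouped by host, with a dashed heading per host."""
    lines = []
    for host, group in groupby(sorted(events, key=itemgetter(0, 1)),
                               key=itemgetter(0)):
        lines += ["-" * 40, f"Host: {host}", "-" * 40]
        lines += [f"{ts} {host} - {msg}" for _, ts, msg in group]
    return "\n".join(lines)

events = [
    ("host2.somedomain.com", "2023-05-26T15:36:46.004567+10:00", "kernel: <blah3>"),
    ("host1.somedomain.com", "2023-05-26T15:36:46.012345+10:00", "kernel: <blah2>"),
    ("host1.somedomain.com", "2023-05-26T15:36:46.000001+10:00", "kernel: <blah1>"),
]
print(build_report(events))
```

Within Splunk itself, something like | stats list(_raw) by host gets close to this grouping, though it won't render literal dashed headings.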
For example, I want to upload a log file to Splunk using the universal forwarder. But that log file contains a lot of log data I don't want to use and don't want to put into Splunk. Can I filter it in the UF or on the Splunk side? My end goal is to parse the logs into JSON so I can build a dashboard in Splunk.
Dear All, I want to publish an application to Splunk, but I want to clarify some things first: 1. How do I publish my own program to Splunk? 2. What is the query efficiency of the Splunk log interface? For example, if I want to achieve a QPS of 9000, what is the minimum configuration my machine should meet? 3. What is the minimum machine configuration for the most basic performance? Thank you~
Hello, I am trying to get a field extraction working, and have written regex that the field extractor seems to like. The raw logs are a list of quote-encapsulated fields separated by commas:

"field1","field2","field3",...

Certain fields can have multiple values, where the values are separated only by a comma but quotes enclose the entire list of values. For example:

"field1","field2","field3value1,field3value2,field3value3",...

To complicate matters, values that belong to a certain field can contain multiple words separated by other characters, such as "Software/Technology" or "Business and Industry", so the entire field may look something like this:

"Software/Technology,Business Services,Application,Business and Industry,Computers and Internet"

That field needs to be extracted and displayed exactly as it is shown. The regex I have attempted for this is as follows:

"(?<categories>[^\"]+|)
"(?<categories_again>[\w\s\/\,]+|)

Although the field extractor, the rex function, and regex101 all like both of these extractions and they work exactly as expected, when I search I get each word from within the field as its own independent value, which is not what I need: Software Technology Business Services Application and Industry

At this point I'm out of ideas as to regex modifications or other workarounds that could fix this. Has anyone else encountered this problem, and if so, were you able to fix it and how? Otherwise I think I have to bring this to Splunk support. Thank you
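As a sanity check on the parsing itself (separate from whatever Splunk's search-time tokenization is doing), a standard CSV parser that honours quotes shows the multi-word field should come out as one whole value. A minimal sketch, using the example string from the post:

```python
import csv
import io

raw = ('"field1","field2",'
       '"Software/Technology,Business Services,Application,'
       'Business and Industry,Computers and Internet"')

# csv honours the enclosing quotes, so the whole quoted
# comma-separated list stays a single field.
row = next(csv.reader(io.StringIO(raw)))
print(row[2])
```

If the parser keeps the field whole but Splunk search shows individual words, the splitting is likely happening downstream of extraction (e.g. how the values are tokenized or displayed), not in the regex.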
I have a sourcetype with events like:

fieldname.field1=value1,fieldname.field2=value1 value2 value3 value4,fieldname.field3=value1

To extract the fields, I use the following transformation, which works correctly:

[extract_key_value]
FORMAT = $1::$2
REGEX = fieldname.([^=]+)=([^\,]+)

I want to convert field2 into a multivalue field by splitting it on spaces. If I do the following, it works correctly:

[field2]
SOURCE_KEY = field2
MV_ADD = 1
FORMAT = field_test::$1
REGEX = ([^\s]+)

However, if I try to overwrite the field, it doesn't work:

[field2]
SOURCE_KEY = field2
MV_ADD = 1
FORMAT = field2::$1
REGEX = ([^\s]+)

What could be my mistake? Thank you in advance.

PS: My props.conf:

[mysourcetype]
KV_MODE = None
REPORT-extract_key_value = extract_key_value
REPORT-extract_mv_fields = field2
SHOULD_LINEMERGE = false
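Whether a transform may write back into its own SOURCE_KEY is the open question here, but the intended multivalue split itself can be verified outside Splunk. A sketch of what REGEX = ([^\s]+) with MV_ADD = 1 is meant to produce from field2's value:

```python
import re

value = "value1 value2 value3 value4"

# Each non-whitespace run becomes one value of the multivalue field,
# mirroring REGEX = ([^\s]+) applied repeatedly with MV_ADD = 1.
multivalue = re.findall(r"[^\s]+", value)
print(multivalue)
```

Since the split works when the target is a differently named field (field_test), the regex is not the problem; the overwrite case is where to focus.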
Hello! I am trying to collect Google Workspace Alert Center events using this app: https://splunkbase.splunk.com/app/5556   The input for the Alert Center successfully connects to the API, but there is a strange, significant delay (a few hours) between an alert appearing in Google and its ingestion into our Splunk Cloud. From what I can see in the app logs, the app works correctly and successfully queries the API every 300 seconds. I have now enabled debug logging; maybe it will help me understand what goes wrong. Any ideas on troubleshooting are appreciated. Second question: there is an ADDON-61892 issue (GWS Alert Center: 'Gmail Phishing' source inputs not working as expected) mentioned in the release notes. Is it possible to get more information on this issue, or is the Splunk issue tracker private?
The splunkd.log on a Windows host shows the following errors:

05-22-2023 15:31:34.452 -0400 ERROR FrameworkUtils [15508 ExecProcessor] - Incorrect path to script: \.\bin\rectify_hostname.sh. Script must be located inside $SPLUNK_HOME\bin\scripts.
05-22-2023 15:31:34.452 -0400 ERROR ExecProcessor [15508 ExecProcessor] - Ignoring: "\.\bin\rectify_hostname.sh"

How can I fix this? I cannot find the "\.\bin\rectify_hostname.sh" path on the host.
Hi Friends, when I use |addinfo in my search I can retrieve data successfully, but our client can't view the data with this query, even though they can access that index successfully. Only |addinfo fails to return results for them. Could you please tell me which capability is related to this command? Which capability do I need to grant them so they can use the |addinfo command?

My query:

index=pg_idx_whse_snow_prod sourcetype="snow:incident" source="https://pgglobalenterprise.service-now.com/"
| addinfo
| eval earliest=strftime(info_min_time,"%Y-%m-%d %H:%M:%S"), latest=strftime(info_max_time,"%Y-%m-%d %H:%M:%S")
| where (sys_created_on>=earliest)
| dedup ticket_id
| stats count
Hello, I have a table in Dashboard Studio with 3 columns: userid, timestamp, and eventtype. I want to filter the table by the first 3 characters of users' IDs when selected from the dropdown. For example, user IDs are abc123, abc234, xyz123, and I want the filter to have "abc", "xyz", and "all" as the dropdown options. The issue I'm running into is that the userid is mapped from a lookup, so it is not in the event data. Any suggestion would be welcome. Thank you!
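The filter logic itself is a simple prefix match with an "all" wildcard. A sketch of that logic (the function name and sample IDs are illustrative; in SPL this would typically be a substr comparison against the dropdown token, applied after the lookup so the userid field exists):

```python
def matches_prefix(userid, selected):
    """True when the dropdown choice matches the first 3 characters
    of the user ID; 'all' acts as a wildcard."""
    return selected == "all" or userid[:3] == selected

users = ["abc123", "abc234", "xyz123"]
print([u for u in users if matches_prefix(u, "abc")])
```

The key point from the post stands: because userid comes from a lookup, the comparison must run after the lookup enriches the events, not against the raw event data.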
I am looking for the table to be in decreasing order and with the Total row on top. This is my current search:

index=jia source="/hptc_cluster/splunk/Reports/PBS/splunkresults.csv" host="gridmetrics" sourcetype="JiaGridMetrics"
| stats count as Slots by JobName
| rename Slots as CPU
| addcoltotals label=Total labelfield=JobName
| sort count desc

This is the output.
We have a log file that is split into multiple events. In these events we need to count the number of occurrences where Event XXX > 0 and Event YYY > 0 for each source file. So finding 1 match of XXX and YYY in one particular source file would be counted as 1 for this purpose. Splunk search:

SEARCH (patterns matching Events of type A) OR (patterns matching Events of type B)
| eval isDEP=if(match(NAME, "(?i).*(XXX).*"), 1, 0)
| eval isPERF=if(match(NAME, ".*(YYY).*"), 1, 0)
| stats list(NAME),list(isDEP),list(isPERF),count by source

In the search part of the query I find the type of events of interest, then determine the count of matches for XXX and YYY. This works fine; the problem is that I do not know how to tell Splunk: if XXX > 0 AND YYY > 0 for a particular source file (aggregated by source), then count this as 1. This is an example output for the above query: The issue seems to be that Splunk works on a per-event basis, so each result is tied to the event and not the source file. Any ideas on how to do this?
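The aggregation being asked for is: sum the per-event flags by source, then count the sources where both sums are positive. A sketch of that two-stage aggregation (the sample events are made up for illustration):

```python
from collections import defaultdict

events = [
    {"source": "a.log", "NAME": "XXX deploy"},
    {"source": "a.log", "NAME": "YYY perf"},
    {"source": "b.log", "NAME": "XXX deploy"},
]

# Stage 1: per-source totals of the two flags, like stats sum(...) by source.
counts = defaultdict(lambda: [0, 0])   # source -> [XXX matches, YYY matches]
for e in events:
    name = e["NAME"].upper()
    if "XXX" in name:
        counts[e["source"]][0] += 1
    if "YYY" in name:
        counts[e["source"]][1] += 1

# Stage 2: count sources where both totals are positive.
both = sum(1 for dep, perf in counts.values() if dep > 0 and perf > 0)
print(both)
```

In SPL, one common pattern (untested here) replaces list() with sum(): | stats sum(isDEP) as dep, sum(isPERF) as perf by source | where dep>0 AND perf>0 | stats count.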
My application logs JSON objects. Sample logs look like this:

{"ts":"05 25 2023 14:57:05.114","msg":"Listeners is invoked"
}
{"ts":"05 25 2023 15:05:00.031","msg":"jvm.memory.used{area=nonheap,id=Metaspace} value=117.305855 MiB"
}
{"ts":"05 25 2023 15:05:00.031","msg":"jvm.memory.used{area=nonheap,id=CodeHeap 'profiled nmethods'} value=41.941772 MiB"
}
{"ts":"05 25 2023 15:05:00.031","msg":"jvm.memory.used{area=nonheap,id=CodeHeap 'non-profiled nmethods'} value=18.53479 MiB"
}
{"ts":"05 25 2023 15:05:00.031","msg":"jvm.memory.used{area=heap,id=G1 Old Gen} value=82.355469 MiB"
}

As you can see, my application prints the closing } on the next line, along with extra tabs (\t). In Splunk, these logs are not treated as JSON; all of the above text is shown on one line. I learned about LINE_BREAKER and tried the following settings, but nothing worked:

1) SHOULD_LINEMERGE=false
   LINE_BREAKER=([\t]+{)

2) SHOULD_LINEMERGE=false
   LINE_BREAKER=([\n\t]+{)

3) SHOULD_LINEMERGE=false
   BREAK_ONLY_BEFORE=\{"ts":
   SEDCMD-add_closing_bracket=s/\"$/"}/g

#3 works, but Splunk shows an extra closing bracket with tabs (}). I want Splunk to treat every JSON object as a single event, regardless of the tabs and the closing bracket } on the next line. Please help.
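The breaking rule the log needs is: split only where a closing } (padded with newlines/tabs) is followed by the next {"ts" object, so that braces inside msg values like jvm.memory.used{area=heap} are left alone. A sketch of that rule on a mock two-event stream (in Splunk, LINE_BREAKER's first capture group is consumed as the delimiter, so a setting along the lines of LINE_BREAKER = ([\r\n\t ]+)\{"ts" may be what's needed; that is an assumption, untested here):

```python
import re

stream = (
    '{"ts":"05 25 2023 14:57:05.114","msg":"Listeners is invoked"\n\t}\n'
    '\t{"ts":"05 25 2023 15:05:00.031",'
    '"msg":"jvm.memory.used{area=heap,id=G1 Old Gen} value=82.355469 MiB"\n\t}'
)

# Split only between a closing } and the next {"ts" object; the inner
# braces in the msg value are not followed by {"ts", so they survive.
events = re.split(r'(?<=\})\s+(?=\{"ts")', stream)
print(len(events))
```

Since JSON tolerates whitespace before the closing }, each resulting chunk is already a valid JSON object, with no SEDCMD surgery required.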
I want to create a new identity, but it gives me the error in the photo.
Hello, I have a field called Bootuptime that is displayed like 20230521050657.500000-300. It is not a string field, and I have used a command like | eval Boot=strptime(Bootuptime, "%Y-%m-%d %H:%M:S"), which returns nothing, or converts it to UNIX time (which does work). The part I care about is 20230521050657, which needs to display as 2023-05-21 05:06:57. There is no converting of numbers; I just need to add the appropriate dashes and colons and remove the part after the period. Any help is appreciated. Thanks
I have a dashboard with a dropdown selector for the same type of data for different VLANs. For some reason, one of my latest additions says that it cannot find the object id=<vlan name> when I select it from the dropdown. Every other VLAN in the dropdown shows data, and info for that specific VLAN shows up on different dashboards, just not that one.
The home dashboard does not appear as the home dashboard for everybody.
Hi All, I need a little help. I need to find the EPS/GB of my existing data. How do I find that out? Is there an SPL search for this, or how is it done? Thanks
Count  error_manager
1      System
2      System
3      System
4      System
5      System
6      System

How do I delete the last row in a table?