All Posts



It looks like it might be milliseconds, so divide by 1000 before using strftime() to format it into human readable format. | eval start_time=strftime(startTime/1000,"%F %T.%3N")
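To sanity-check the divide-by-1000 logic outside Splunk, here is a quick Python sketch using the epoch value from the question (the format string below corresponds roughly to SPL's "%F %T"):

```python
from datetime import datetime, timezone

# Epoch value from the question, in milliseconds
start_time_ms = 1699148280000

# Divide by 1000 to get seconds before converting, just like
# strftime(startTime/1000, ...) in SPL
dt = datetime.fromtimestamp(start_time_ms / 1000, tz=timezone.utc)
print(dt.strftime("%Y-%m-%d %H:%M:%S"))  # 2023-11-05 01:38:00 (UTC)
```

If you skip the division, fromtimestamp() is handed a value about a thousand times too large, which is the same symptom as feeding raw milliseconds to strftime() in SPL.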
Hi Team, below is my raw log: 2023-09-29 14:10:05.598 [ERROR] [Thread-3] CollateralFileGenerator - *****************************************FAILURE in sending control file collateral files to ABS Suite!!!***************************************** I want to extract "FAILURE in sending control file collateral files to ABS Suite!!!" as my ERROR message. Can someone guide me on this?
@gcusello  yes its true for both
Hello, I have a field called "startTime": startTime: 1699148280000 I would like to convert it to a human-readable format, but I'm not having any luck. Any help would be appreciated.
I reviewed the _internal index and discovered that the heartbeat writes to the _internal index (the SplunkHecExporter code shows this too). My HEC token doesn't allow that index and was erroring. This explains why the heartbeat wasn't working, but I'm still unable to determine why my OTLP logs aren't making it to Splunk via the exporter. I reviewed the other _internal logs but am unable to find anything. For additional context, I'm reusing an existing Splunk heavy forwarder that already has many logs going through it. This is the first time I've used the SplunkHecExporter in the OpenTelemetry Collector. The debug logs show that splunk_hec is registered to export logs.
Thanks for the reply. Actually my JSON file contains two stages, as below:

per_stage_info_vendor_data: [
  { Stage: stage1, WallClockTime: 0h:30m:23s }
  { Stage: stage2, WallClockTime: 0h:52m:36s }
]

With the following regular expression we are able to get the hours, minutes, and seconds:

| rex field="per_stage_info_vendor_data{}.WallClockTime" max_match=0 "((?<hours>\d+)h:(?<minutes>\d+)m:(?<seconds>\d+)s)"

But when I tried | eval stagetime=hours*3600+minutes*60+seconds it's not working; when I checked further, none of the arithmetic operations on these three fields (hours, minutes, seconds) work. Do I need to convert these fields to another format?
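The usual culprit here is that rex captures are strings (and multivalue, with max_match=0), so in SPL you generally need tonumber() (and mvindex() per stage) before doing arithmetic. The conversion itself can be sketched in Python, using sample values copied from the post:

```python
import re

# Sample WallClockTime strings, mirroring the two stages in the post
times = ["0h:30m:23s", "0h:52m:36s"]

def to_seconds(t):
    m = re.match(r"(?P<hours>\d+)h:(?P<minutes>\d+)m:(?P<seconds>\d+)s", t)
    # int() is essential: the captures are strings, which is the same reason
    # the SPL arithmetic silently fails without tonumber()
    return int(m["hours"]) * 3600 + int(m["minutes"]) * 60 + int(m["seconds"])

print([to_seconds(t) for t in times])  # [1823, 3156]
```

This is only an illustration of the string-to-number issue, not a drop-in for the SPL; in the search itself, wrapping each capture in tonumber() is the analogous fix.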
but I want to get the most unique SHA256HashData in the last 24h for example. and then fwd to summary index and start static about it, so for that I need to get more data
Both stats and rare are transforming commands, meaning only the fields used in or produced by the command are available to later commands. So the only fields available after stats are count and SHA256HashData, and the only fields available after rare are SHA256HashData, count, and percent. To get additional fields out of stats, include them in the command. To work around rare, use sort and tail.

index=myindex
| stats count, max(_time) as _time by SHA256HashData
| sort - count
| tail 10
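The sort-then-tail workaround can be illustrated outside SPL with a small Python sketch over made-up hash counts; sorting descending by count and taking the tail leaves the least frequent values, which is the behavior rare's inverse would give:

```python
from collections import Counter

# Made-up stand-ins for SHA256HashData values
hashes = ["aaa"] * 5 + ["bbb"] * 3 + ["ccc"] * 2 + ["ddd"] * 1
counts = Counter(hashes)

# Equivalent of "| sort - count | tail 2": sort descending by count,
# then keep the last entries, i.e. the rarest values
rarest = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[-2:]
print(rarest)  # [('ccc', 2), ('ddd', 1)]
```

The upside over rare is that any extra columns carried through stats (like max(_time)) survive the sort and tail steps.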
I have events that return different structured fields depending on the value of a field called TYPE. This all comes from the same sourcetype. For example:

if type=TYPE1, I might have fields called: TYPE1.exe, TYPE1.comm, TYPE1.path, TYPE1.filename
if type=TYPE2, I might have fields called: TYPE2.comm, TYPE2.path, TYPE2.host

As you can see, each type brings a different set of base fields. We are using data model searches, so I want to get these base fields into CIM compliance. Is there a way to create stanzas in props.conf or transforms.conf that will allow me to field-alias these values based on the TYPE value? I tried straight field aliasing in props.conf, only to find I was actually overwriting values due to the precedence/order of my field alias commands. Thanks in advance,
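One common approach in Splunk itself is a calculated field with eval/coalesce rather than plain FIELDALIAS stanzas, since coalesce picks whichever type-specific field exists without overwriting. The per-type mapping logic can be sketched in Python; all of the target field names below (process_exec, process_path) are hypothetical CIM-style examples, not taken from the post:

```python
# Hypothetical mapping from type-specific fields to common (CIM-style) names
ALIAS_MAP = {
    "TYPE1": {"TYPE1.exe": "process_exec", "TYPE1.path": "process_path"},
    "TYPE2": {"TYPE2.comm": "process_exec", "TYPE2.path": "process_path"},
}

def normalize(event):
    """Copy type-specific fields onto common names without overwriting."""
    out = dict(event)
    for src, dst in ALIAS_MAP.get(event.get("TYPE"), {}).items():
        if src in event and dst not in out:
            out[dst] = event[src]
    return out

event = {"TYPE": "TYPE2", "TYPE2.comm": "bash", "TYPE2.path": "/bin/bash"}
print(normalize(event)["process_exec"])  # bash
```

The "dst not in out" guard is the part that was missing from the overwriting FIELDALIAS attempt: only the fields present for the event's TYPE contribute to the common name.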
Hi @gcusello. Below is the query. First, I want all the values aligned in a single row; currently there are 3 different rows for one particular result. Then I want to apply a sum of the columns.

index="" source IN "" "input params" OR "sqs sent count" OR "Total messages published to SQS successfully"
| rex "\"objectType\":\"(?<objectType>[^\"]+)"
| rex "\"objectIdsCount\":\"(?<objectIdsCount>[^\"]+)"
| rex "\"sqsSentCount\":\"(?<sqsSentCount>[^\"]+)"
| rex "\"totalMessagesPublishedToSQS\":\"(?<totalMessagesPublishedToSQS>[^\"]+)"
| table objectType, objectIdsCount, sqsSentCount, totalMessagesPublishedToSQS
Hello! I have this search, and I want to add more parameters like time etc. The thing is, when I'm using rare it shows only the SHA256HashData and count:

```
index=myindex
| stats count by SHA256HashData
| rare SHA256HashData
```

Any idea? Thanks!
I have a problem where the cluster master and deployment server in our distributed Splunk environment cannot be logged into via the GUI; both state that there are no users. How do I set up a user?
Firstly, why are you using "CallDuration" as your text anchor in the regex when your sample shows "TimeTaken"? Secondly, your JSON array has multiple entries, so if you know how many entries (Stages) will be present, you can include that in your regex and extract each one separately, or you can use max_match=0 to extract them all into a multi-value field. Thirdly, for it to be valid JSON, there would be double quotes in the string, which would need to be escaped in the regex.

| rex field="_raw" max_match=0 "\"TimeTaken\": \"(?<hours>\d+)h:(?<minutes>\d+)m:(?<seconds>\d+)s\""

Fourthly, if it is JSON, why aren't you using spath to extract the fields (assuming you haven't extracted them as part of the sourcetype definition)?
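The spath suggestion amounts to parsing the JSON instead of regexing it. A minimal Python sketch of the same idea, using a hypothetical raw event shaped like the sample in the thread:

```python
import json

# Hypothetical raw event shaped like the per_stage_data sample in the thread
raw = ('{"per_stage_data": ['
       '{"Stage": "S1", "TimeTaken": "0h:30m:23s"}, '
       '{"Stage": "S2", "TimeTaken": "0h:52m:36s"}]}')

# Parsing the JSON (roughly what spath does in SPL) yields the values
# directly, with no quote-escaping in a regex needed
stages = json.loads(raw)["per_stage_data"]
taken = [s["TimeTaken"] for s in stages]
print(taken)  # ['0h:30m:23s', '0h:52m:36s']
```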
In this blog post, we'll take a look at key terms, best practices, and tools commonly used for server monitoring. Splunk can monitor the performance of all your servers, containers, and apps in real time with Splunk Infrastructure Monitoring. The term "server monitoring" is complex because of the exceptionally wide range of servers that exist. For more, visit the official website: https://www.splunk.com/en_us/blog/learn/server-monitoring/ServiceNow/.html Regards, @marksmith991
I have a Splunk array:

per_stage_data: [
  { Stage: S1, TimeTaken: 0h:30m:23s }
  { Stage: S2, TimeTaken: 0h:52m:36s }
]

I am implementing a Splunk dashboard, and I want to convert per_stage_data{}.TimeTaken into seconds. I have tried multiple ways, but none of them worked. I tried the solution below, but it's not giving any output:

| rex field="_raw" "CallDuration: (?<hours>\d+)h:(?<minutes>\d+)m:(?<seconds>\d+)s"
| eval CallDurationInSeconds = ((hours*60*60)+(minutes*60)+(seconds))

I appreciate any inputs. Thanks in advance.
Hi, I have a PowerShell script that makes an API call to an external service. How can I use this script in Phantom?
You can use addtotals to sum the numbers. Is this what you are after? If not, how do you correlate different events to generate separate rows with their respective counts?
Assuming your dropdown token name is "environment", try this env=$environment$ `app_logs(application_name)` "my unique text"
Hi @nithys, could you share some samples of your logs and your search (please both in text format, not screenshot)? Ciao. Giuseppe
Hi @aditsss, are you sure that the additional conditions ("Message successfully sent to Cornerstone" source!="/var/log/messages") are true for both the sourcetypes? Maybe you have to use parentheses to separate the conditions. Ciao. Giuseppe