All Posts
Here is a snippet of the URL I am sending, showing the time format it needs to be in: startTime=2023-12-01T16%3A27%3A45.000Z&endTime=2023-12-01T16%3A32%3A45.000Z However, when I send "latesttime" or "earliesttime", Splunk sends them in epoch format. How do I get the time into the proper format for the URL within the workflow action? Thanks!
Yes it was installed.
Yes, I believe it can be done. The first search should be followed by a <done> element within which a token is set to one of the result fields of the search. That token is then referenced in the second search.

<search>
  <query>blahblahblah</query>
  <done>
    <set token="foo">$result.foo$</set>
  </done>
</search>

If the second search expects the contents of the token to be in a particular format, then the query should generate that format, or you may be able to use an <eval> element within <done> to produce the desired structure.
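For concreteness, a minimal Simple XML sketch of the pattern (the index names, the first_host field, and the token name are hypothetical, not from the original post):

<panel>
  <table>
    <search>
      <query>index=first_index | stats latest(host) as first_host</query>
      <done>
        <!-- Copy a field from the first result row into a token -->
        <set token="first_host">$result.first_host$</set>
      </done>
    </search>
  </table>
</panel>
<panel>
  <table>
    <search>
      <!-- The token is substituted before this search runs -->
      <query>index=second_index host="$first_host$" | stats count by source</query>
    </search>
  </table>
</panel>

The second panel stays blank until the token is set, so it naturally waits for the first search to finish.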
Hello, I'm trying to find information on how to use Splunk with Visual Studio Code. I have an authentication token on my development instance, and I've installed the Visual Studio Code Extension for Splunk from GitHub. I'm lost from here on. What do I enter in the url and webRoot fields in the launch.json file?

"configurations": [
    {
        "type": "chrome",
        "request": "launch",
        "name": "Launch Chrome against localhost",
        "url": "https://<host name>:8080",
        "webRoot": "${workspaceFolder}"
    }
]

This opens Splunk in my Chrome browser, but it is an empty search field. I created a .splnb file in VS Code, but when I run it, I receive ERROR: Unauthorized. Thanks in advance for any direction provided. God bless, Genesius
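For reference, the "chrome" launch configuration only starts a browser debugging session; the extension's connection to Splunk is configured in VS Code settings instead. A sketch of settings.json under that assumption (the setting keys are my best reading of the extension's README and should be verified there; the host and token are placeholders):

{
    // splunkd REST endpoint (management port, usually 8089, not the web port)
    "splunk.commands.splunkRestUrl": "https://<host name>:8089",
    // Authentication token created on the Splunk instance
    "splunk.commands.token": "<your token>"
}

An "Unauthorized" error when running a .splnb notebook usually points at this token/URL pairing rather than at launch.json.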
Hi @1ueshkil .. I did this use-case work quite a while back, so I'm a little hazy now; I am just giving you my educated guesses. Let us know if you have the Security Essentials App ( https://splunkbase.splunk.com/app/3435 ).

>>> We have already integrated linux, palo alto, SAP log sources.

Nice, that solves most of the problem. You don't need to worry about the data/logs required for use-case creation; now you need to focus only on creating the use cases.

>>> Just looking to create Linux, Palo alto, SAP use cases which is based on MITRE framework or any attack pattern use cases, as we don't have that much knowledge to create SPL use cases.

Please select a simple use case to start with, say a DDoS attack on Linux systems. Then we can work on the use-case creation step by step.
Hi @Pooja1 .. pls give us more details..

>>> I am facing issue with "no recent logs found for the sourcetype=abc:xyz"

Do you see these lines in the Splunk internal logs?

>>> Like we are able to see the logs till 25th of Nov

So you were able to see logs (please let us know how you are able to see them; if you searched, please provide the SPL), but you get warning messages saying otherwise. Is that right?
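As a starting point, a quick SPL check of when that sourcetype last sent data (the sourcetype name is taken from the post; adjust the index as needed):

| tstats max(_time) as last_seen where index=* sourcetype="abc:xyz" by host
| eval last_seen=strftime(last_seen, "%Y-%m-%d %H:%M:%S")

If last_seen stops at 25 Nov for every host, the break is on the sending side; if only some hosts are stale, the problem is host-specific.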
>>> MongoDB 4.2 running on my Splunk hosts

May we know if you have integrated MongoDB 4.2 with Splunk? I assume yes, you have. May we also know if you use the Splunk DBX Add-on for MongoDB JDBC? https://splunkbase.splunk.com/app/7095 Please suggest, thanks.
Hello, everyone! Currently, I have the Splunk Add-on for Unix and Linux version 8.1.0 installed on my heavy forwarder. However, I need to upgrade it to the latest version, and I am seeking recommendations on how to carry out this process. Additionally, I would appreciate guidance on utilizing the deployment server to distribute the update to the Universal Forwarders. God bless. Regards
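For the deployment-server part of the question, the usual pattern is to place the new Splunk_TA_nix under $SPLUNK_HOME/etc/deployment-apps on the deployment server and map it to a server class. A minimal serverclass.conf sketch (the class name and whitelist are hypothetical; tighten the pattern for your estate):

[serverClass:nix_forwarders]
# Match the forwarders that should receive the add-on
whitelist.0 = *

[serverClass:nix_forwarders:app:Splunk_TA_nix]
# Restart the UF after delivery so the new inputs take effect
restartSplunkd = true
stateOnClient = enabled

After editing, run `splunk reload deploy-server` on the deployment server to push the change.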
There was a typo in my solution - try this

basesearch
| bin _time span=1d
| eval days_ago = ((relative_time(now(), "@d") - _time) / 86400) + 1
| stats sum(days_ago) as day_flag by User_Id file
| where day_flag < 3
Does the custom index exist on all of the indexers? If not, then you won't find data in it.

If the data is not onboarded properly, then finding it may be a challenge. In particular, the timestamp field must be accurate. If events are dated in the future (easy to do), then most searches will not find them.

Splunk tsidx files are not meant to be read by humans. The contents are stored in a proprietary format. Use Splunk to read the files (not directly, but by running a search).
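To rule out future-dated events, an all-time check like this can help (the index name is a placeholder):

| tstats count where index=your_custom_index earliest=1 latest=+10y by _time span=1d

Any buckets with _time later than today point at a timestamp-parsing problem on ingest.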
I wanted to share a solution on behalf of @saurav.kosankar. We just have to add the ulimit settings in the /etc/security/limits.conf file on the Event Service server and restart the server for the changes to take effect.
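For illustration, entries in /etc/security/limits.conf follow a <domain> <type> <item> <value> layout; the user name and values below are hypothetical, so use whatever your installation requires:

# <user>  <type>  <item>   <value>
appduser  soft    nofile   65535
appduser  hard    nofile   65535
appduser  soft    nproc    8192
appduser  hard    nproc    8192

The limits apply to new login sessions of that user after the restart.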
Hi, sorry, please try this:

index="XXXX"
| rename "response_details.response_payload.entities{}" as status
| where name="YYYY"
| stats count(eval(status="offline")) AS offline_count count(eval(status="online")) AS online_count earliest(eval(if(status="offline",_time,null()))) AS offline earliest(eval(if(status="online",_time,null()))) AS online
| fillnull value=0 offline_count
| fillnull value=0 online_count
| eval condition=case(
    offline_count=0 AND online_count>0, "Online",
    offline_count>0 AND online_count=0, "Offline",
    offline_count>0 AND online_count>0 AND online>offline, "Offline but newly online",
    offline_count>0 AND online_count>0 AND online<offline, "Offline",
    offline_count=0 AND online_count=0, "No data")
| search condition="Offline" OR condition="Offline but newly online"
| table condition

Ciao. Giuseppe
I tried this and it seems to return no results. What I am trying to do is compare the files received the previous day with those received today, and return the actual file name. For example, say file names abc.1 and abc.2 were received the previous day; today it is expected that the same file names and counts are received. If, for some reason, abc.1 is not received, we want to display abc.1.

Current SPL:

basesearch
| bin _time span=1d
| eval days_ago = ((relative_time(now(), "@d") - _time) / 84600) + 1
| stats sum(days_ago) as day_flag by User_Id file
| where day_flag < 3
Hi @gcusello

I tried the code you gave; it is not working and throws an error: "Error in 'EvalCommand': Type checking failed. 'AND' only takes boolean arguments"

index="XXXX"
| rename "response_details.response_payload.entities{}" as status
| where name="YYYY"
| stats count(eval(status="offline")) AS offline_count count(eval(status="online")) AS online_count earliest(eval(if(status="offline",_time,""))) AS offline earliest(eval(if(status="online",_time,""))) AS online
| fillnull value=0 offline_count
| fillnull value=0 online_count
| eval condition=case(
    offline_count=0 AND online_count>0, "Online",
    offline_count>0 AND online_count=0, "Offline",
    offline_count>0 AND online_count>0 AND online>offline, "Offline but newly online"),
    offline_count>0 AND online_count>0 AND online>offline, "Offline"),
    offline_count=0 AND online_count=0, "No data")
| search condition="Offline" OR condition="Offline but newly online"
| table condition
Thanks @richgalloway 
Hello! Our Splunk server receives DC logs on a daily basis from another network team. Under Files & Directories in Data Inputs, I have the file path for those logs configured to be continuously monitored, since we receive those logs from another organization. I set a custom index for those logs, and it's not showing any data in that index. I've verified that it's not a permissions issue. I decided to manually upload one of those files into Splunk and noticed that they are .tsidx files. After uploading, I wasn't able to read any of the data in the .tsidx file. Is that normal? Am I doing anything incorrectly? We need to be able to audit those DC logs. Thanks in advance!
Thanks for the reply! I have read through that, but I don't see anything related to why we wouldn't be receiving direct notifications. To clarify, I'm hoping for Slack direct notifications (not phone app notifications or just Slack channel messages). Is this something that needs to be configured in annotations, or somewhere else?
The presence of "Z" on the end of the timestamp means the time is in UTC.  If that is incorrect then the application writing the logs should be changed to use the correct time zone designation. A workaround would be to remove "%Z" from the TIME_FORMAT setting so Splunk ignores the time zone.  It will default to the local (to the HF) time zone.
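As an illustration, the workaround on the HF would look roughly like this in props.conf (the sourcetype name and time zone are placeholders; pick the TZ that matches the events):

[your:sourcetype]
# Stop before the literal Z so it is not read as a UTC designator
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N
# Optionally pin the zone explicitly instead of relying on the HF's local zone
TZ = Europe/Berlin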
Thanks for your answers. I tried both options. I removed everything about time from the SHs and indexers, put your config on the HF (changed the timezone), restarted the splunkd process, and generated a few new events. And again, they were indexed incorrectly. I don't know why, but the HF still thinks the time inside the event is UTC. As I understand it, that is because the timestamp inside the event has "Z"; but that is wrong, because the timestamp is in my time zone, and Splunk converts this correct timestamp into my time zone because it thinks it is UTC. The time zone I used is from this URL: https://en.wikipedia.org/wiki/List_of_tz_database_time_zones
Hello,

I'm working on testing something, but I'm not sure exactly what would be the best solution. What I am trying to do is, using the time picker, have a panel that loads IDs. Then I'd like another panel to search over the same timespan, in a different dataset, but only for the IDs from the first panel. Is there a way to pass the results of a search that runs on page load to another search, maybe with a token(s)? The catch is that there may be a single ID or there may be many IDs. It would have to be a boolean of some sort, I believe, unless there's a better way to search one-to-many instances of something.

My thinking is something like

search 1:

<base search> | stats count by MSGID | fields - count

that populates a <tok> on page load (or time selection), but the results would have to be formatted like

654165464 OR MSGID=584548549494 OR MSGID=54654645645

search 2:

<base search2> MSGID=<tok> | stats count by MSGID | fields - count

Is this something that can be done? What might I have to do to accomplish this? Thanks for the assistance!
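One common way to do this (a sketch, not from the original post: the index names are hypothetical, and it leans on SPL's format command, which renders results as a parenthesized OR expression): have the first search emit the OR clause as a single field, set a token from it in <done>, and splice the token into the second search.

<search>
  <query>index=first_index | stats count by MSGID | fields MSGID | format</query>
  <done>
    <!-- "search" is the field the format command produces,
         e.g. ( ( MSGID="654165464" ) OR ( MSGID="584548549494" ) ) -->
    <set token="msgid_filter">$result.search$</set>
  </done>
</search>

<panel>
  <table>
    <search>
      <query>index=second_index $msgid_filter$ | stats count by MSGID</query>
    </search>
  </table>
</panel>

Because format already quotes and ORs each MSGID, the second search works the same whether the first returns one ID or many.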