All Posts

Hi @livehybrid, could you please help me out with this issue? Thanks!
Hey bowesmana, thanks for replying. Use case: I would like to back up indexed Splunk logs into a third-party backup/restore system, say on a daily or weekly basis, and they need to be in a human-readable format. That way they can be retrieved at any time in the future from the backup/restore system and read with Notepad or a similar tool if needed. Would love to know your thoughts on the above.
@Splunkduck09  The simplest approach is to use Splunk’s search interface. Run a search query for the data you want (e.g., index=your_index), then export the results. In the Splunk Web UI, after running your search, click the "Export" button, choose a format like CSV, JSON, or raw text, and download the file. This will give you the raw event data in a readable format that you can open in a text editor.

https://docs.splunk.com/Documentation/Splunk/9.4.1/Search/ExportdatausingSplunkWeb
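If this needs to happen on a schedule (as in the weekly backup use case elsewhere in this thread), the same idea can be set up as a saved/scheduled search that writes results to CSV on the search head. A minimal SPL sketch only, assuming a hypothetical index name your_index and file name weekly_backup.csv (adjust both; note that outputcsv writes the file under $SPLUNK_HOME/var/run/splunk/csv on the search head):

index=your_index earliest=-7d@d latest=@d
| table _time, host, source, sourcetype, _raw
| outputcsv weekly_backup.csv

The CSV keeps _raw as plain text, so the exported events stay readable in Notepad or any other text editor.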
Hi, we have Splunk ITSI and the team wants to set up batch job monitoring using Splunk ITSI capabilities. To that end, are there any pre-built dashboards or service/KPI SPL queries available that you can share with us to start from? We would appreciate your guidance on this. Looking forward to your response.
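While waiting for pointers to official content, a KPI base search for batch jobs is usually just a stats aggregation over your scheduler's logs. A hedged sketch only, using a hypothetical index batch_idx and hypothetical fields job_name, status, and duration that would need to be mapped to whatever your scheduler actually logs:

index=batch_idx sourcetype=scheduler:log
| stats count AS total_runs, count(eval(status="FAILED")) AS failed_runs, avg(duration) AS avg_duration_sec BY job_name

Each aggregate (failed_runs, avg_duration_sec) could then back its own KPI in an ITSI service, split by job_name as the entity.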
You can search them and see the raw data in the search window, you can export them as CSV files, but what's your use case? Seems like an odd thing to want to do...
@sylim_splunk  After applying STEP 9 (changing the replication factor back to what it was), the KV store went back to STARTING. Every other step worked to get the KV store to ready, but it went back to starting after changing the replication factor back to 3, which was the original value. Is there any Splunk solution for this KV store issue? It is very painful not to have a workaround for it.
Hi All, I would like to read Splunk indexed logs/data using a text editor tool (like Notepad, etc.). I understand Splunk indexed logs are compressed and further processed into a format only Splunk can read (not human-readable), which works well for searching and Splunk's other capabilities. But I am wondering: is there a way to convert these indexed logs into a human-readable format?
Hi @ITWhisperer, I will try to find this out with our Splunk enterprise team. But if that is true, it should also happen in this other case, where the space between the date and the hours is replaced by ':'. That version runs fine, but I am not sure whether it is running for the correct time range, which I mean to be 12-March-2025 to 13-March-2025.
It is not clear where the _time>= and _time< are coming from, but these are where the issue is being introduced. Do you have any restrictions etc. associated with the role?
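One way to check for role-level restrictions is to look at the role definitions themselves. A hedged sketch using the rest command (requires sufficient permissions to read the authorization endpoint; the field names follow authorize.conf):

| rest /services/authorization/roles splunk_server=local
| table title srchFilter srchTimeWin srchIndexesAllowed

A non-empty srchFilter or srchTimeWin on your role would be one plausible source of extra constraints being appended to your search.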
Hi @ITWhisperer, I am observing one thing: when I change to the following format, giving ':' instead of a space (highlighted in red), it runs but I do not get values for earliest and latest. Not sure if this is the correct way to display the values.

index=aws_np [| makeresults | eval earliest=strptime("12/03/2025:13:00","%d/%m/%Y %H:%M") | eval latest=relative_time(earliest,"+1d") | table earliest, latest] | table earliest, latest
It is showing an error.
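For what it's worth, strptime only returns a value when the format string matches the input character for character, so the space or colon has to appear in both the string and the format. A small sketch of the two matched variants, using the same literal values as in the posts above:

| makeresults | eval earliest=strptime("12/03/2025 13:00","%d/%m/%Y %H:%M") | eval latest=relative_time(earliest,"+1d") | table earliest latest

| makeresults | eval earliest=strptime("12/03/2025:13:00","%d/%m/%Y:%H:%M") | eval latest=relative_time(earliest,"+1d") | table earliest latest

In the query in the previous post, the input string has a colon but the format still has a space, which would leave earliest and latest null.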
What about your search job inspector and search log (particularly the expanded index search)?
That seems unlikely. What does the search job inspector show?
Could there be a possibility that this is playing a role in the error?
index="aws_np" [| makeresults | eval earliest=strptime("12/03/2025 13:00","%d/%m/%Y %H:%M") | eval latest=relative_time(earliest,"+1d") | table earliest latest]

Even with this bare minimum I am getting the issue when I run it. Am I really making some trivial mistake? Could there be a possibility that the "last 24 hours" range which got selected is playing a role in the error?
I don't think Splunk parsing has changed that much since 9.1 (I'm using 9.4). Please share your full search (obfuscated as little as possible) so we can figure out where that error might be coming from.
This does not produce a parsing error for me. Which version of Splunk are you using?
index="aws_np" [| makeresults | eval earliest=strptime("12/03/2025 13:00","%d/%m/%Y %H:%M") | eval latest=relative_time(earliest,"+1d") | table earliest latest]
| rex field=_raw "messageGUID\": String\(\"(?<messageGUID>[^\"]+)"
| rex field=_raw "source\": String\(\"(?<source>[^\"]+)"
| rex field=_raw "type\": String\(\"(?<type>[^\"]+)"
| rex field=_raw "addBy\": String\(\"(?<addBy>[^\"]+)"
| where type="Contact"
| stats count by source

I tried it exactly the same way and got: Error in 'search' command: Unable to parse the search: 'AND' operator is missing a clause on the left hand side.
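If the subsearch expansion turns out to be the problem (the job inspector's expanded search, mentioned earlier in the thread, would show this), one cross-check worth trying is dropping the subsearch and passing the same window with earliest/latest time modifiers directly. A hedged sketch only, with the same index and dates as in the post, using Splunk's %m/%d/%Y:%H:%M:%S modifier format:

index="aws_np" earliest="03/12/2025:13:00:00" latest="03/13/2025:13:00:00"
| rex field=_raw "type\": String\(\"(?<type>[^\"]+)"
| rex field=_raw "source\": String\(\"(?<source>[^\"]+)"
| where type="Contact"
| stats count by source

If that runs cleanly, the parser error is coming from whatever the subsearch (or a role-level filter) is appending, not from the rex/stats part of the pipeline.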