All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Thank you for the feedback!  I will take your suggestions into consideration!
Thanks Luca, this works!  Appreciated!
Is there a way to send all log data to an NFS file system for required log retention from the Splunk OpenTelemetry Collector?
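One possible direction (a sketch, not a tested configuration): the OpenTelemetry Collector contrib distribution includes a file exporter that writes telemetry to a path on disk, which could point at an NFS mount. The mount path below is an assumption, and rotation/retention would still need external tooling.

```yaml
# Sketch: send the logs pipeline to a file exporter writing on an NFS mount.
# /mnt/nfs/otel is a hypothetical mount point.
exporters:
  file:
    path: /mnt/nfs/otel/logs.json
service:
  pipelines:
    logs:
      receivers: [filelog]
      exporters: [file]
```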
I have a JSON that looks like this:

{ "Field1": [ { "id": 1234, "name": "John" }, { "id": 5678, "name": "Mary", "occupation": { "title": "lawyer", "employer": "law firm" } } ] }

I want to extract the value of the "name" field from the object that contains an "occupation" field (it could be any of them). In this case I want to get "Mary" and store it in a variable. How would I do this using the Splunk search language?
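A sketch of one way to do this in SPL, assuming the JSON above is in _raw and the field names match the sample (untested):

```
| spath input=_raw path=Field1{} output=entry
| mvexpand entry
| spath input=entry
| where isnotnull('occupation.title')
| eval person_name='name'
```

SPL has no variables as such; the resulting field (person_name here) plays that role and can be carried into later pipeline stages.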
I configured a macro named securemsg(1), and I use this macro in the following search: ....| eval log_info=_raw | 'securemsg(log_info)' | .... When I run this search I get the following error: Error in 'SearchParser': Missing a search command before '''. Error at position '264' of search query 'search index="linuxos" sourcetype="syslog" host="C...{snipped} {errorcontext = fo=_raw | 'securemsg(}'. Please help. Thanks
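One detail worth checking, since it is the usual cause of that parser error: Splunk macros are invoked with backticks, not single quotes. With straight quotes the parser sees a quoted string where it expects a search command:

```
... | eval log_info=_raw | `securemsg(log_info)` | ...
```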
Yeah I did that and it works for them. In the logs, I see the following error: 03-06-2024 05:05:01.427 -0800 ERROR ScriptRunner [1509543 AlertNotifierWorker-0] - stderr from '/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/search/bin/sendemail.py "results_link=https://splunksrver:8000/app/search/@go?sid=scheduler_bHVkQGVjbi5vZW0uZG9lLmdvdg__search__RMD500f838a99e0e9d56_at_1709730300_4407" "ssname=Test Windows Encode" "graceful=True" "trigger_time=1709730300" results_file="/opt/splunk/var/run/splunk/dispatch/scheduler_bHVkQGVjbi5vZW0uZG9lLmdvdg__search__RMD500f838a99e0e9d56_at_1709730300_4407/results.csv.gz" "is_stream_malert=False"':  _csv.Error: line contains NUL
How would I add a permanent search or field to a sourcetype? For example: I have a set of data from which I have been able to snag a field using this search:

sourcetype="collectedevents" | rex field=_raw "<Computer>(?<Computer>[^<]+)</Computer>"

Our sourcetype is "collectedevents", and I found a way to pull the <Computer> field from the XML data into a field "Computer". What I would like is to make that field permanent, or to rewrite "host=" so it is not the host of the WEC but the host of the origin server the event came from. Long story short, we have servers we don't want the Splunk Forwarder on, because we know it can execute scripts, which creates a vulnerability on these servers. Any help is appreciated, thank you!
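A hedged sketch of how this is commonly set up (stanza placement and file locations are assumptions): a search-time extraction in props.conf makes the rex permanent, and an index-time transform can rewrite host from the event body.

```
# props.conf -- search-time extraction (search head)
[collectedevents]
EXTRACT-computer = <Computer>(?<Computer>[^<]+)</Computer>

# props.conf -- index-time host rewrite (indexer or heavy forwarder)
[collectedevents]
TRANSFORMS-sethost = set_origin_host

# transforms.conf (same instance as the index-time props.conf)
[set_origin_host]
REGEX = <Computer>([^<]+)</Computer>
DEST_KEY = MetaData:Host
FORMAT = host::$1
```

Note the two [collectedevents] stanzas live on different instances: search-time settings on the search head, index-time settings where parsing happens.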
Thanks for the response! With the test query, I'm seeing both sources. For further research I counted all the field_AB by sourcetype and found that there are significantly more source_2 than source_1; not sure if that affects anything, though.

sourcetype (last 5 min): source_1 -> count 147, source_2 -> count 66359

In my initial query, I do get events from both sources. It's just not populating the fields in each event the way I want.

Initial Query Results:
field_D     field_AB  field_C  field_E
DeviceType  UniqueID           Up
DeviceType  UniqueID           Down
            UniqueID  Data_2
            UniqueID  Data_1

Expected Query Results:
field_D     field_AB  field_C  field_E
DeviceType  UniqueID  Data_1   Up
DeviceType  UniqueID  Data_2   Down
DeviceType  UniqueID  Data_2   Down
DeviceType  UniqueID  Data_1   Down
Hi @karthi2809, for this sourcetype use INDEXED_EXTRACTIONS = json in the sourcetype definition (for more info see http://docs.splunk.com/Documentation/Splunk/9.2.0/admin/Propsconf); otherwise, use the spath command: https://docs.splunk.com/Documentation/Splunk/9.2.0/SearchReference/Spath Ciao. Giuseppe
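For reference, a minimal sketch of such a sourcetype stanza (the stanza name is an assumption; the KV_MODE/AUTO_KV_JSON lines avoid double extraction when indexed extractions are enabled):

```
# props.conf on the instance doing the parsing
[your_json_sourcetype]
INDEXED_EXTRACTIONS = json
KV_MODE = none
AUTO_KV_JSON = false
```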
Hi @mappu, check with the following search:

index=your_index | eval diff=_indextime-_time | eval indextime=strftime(_indextime,"%Y-%m-%d %H:%M:%S") | table _time indextime diff

If you have large differences between _time and indextime, you have a queue issue; if not, the problem is something else. About the timestamp: check whether the logs you are losing have the timestamp definition or not, but with the formats you described you shouldn't have this issue. Ciao. Giuseppe
Hello @yuanliu, I tried your suggestion and modified the column name "Grades" to "Grade", and it worked fine. I accepted your solution. So, we had to split the data manually. Note that there is still an issue with snap-to shifting the data, as discussed in my other post. I appreciate your assistance. Thank you
Hello, we have been investigating missing 30% of Splunk logs in our production environment. I think it may be due to TIME_FORMAT or to high-volume logs in production. Can you please let me know what the key-value for TIME_FORMAT in the props.conf file should be? The lagsec value is 1.5 seconds on the source logs, and the Splunk forwarder log sourcetype we are checking has 1.13 s. Additionally, source logs have the format 05/Mar/2024, while SplunkForwarder logs have the format 2024-03-05; it is 2048kbps in both the dev and prod config files. We also have ignoreOlderThan=1d, so I'm looking to remove this parameter, fix TIME_FORMAT, and check again. Can you please help or provide additional information to check?
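For the 05/Mar/2024 source, a hedged props.conf sketch (the stanza name, the time-of-day portion of the format, and the lookahead are assumptions; adjust them to the actual event prefix):

```
# props.conf for the source emitting 05/Mar/2024-style timestamps
[your_source_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %d/%b/%Y %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 30
```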
@shakti  You can also refer to some of Simple XML JavaScript extension examples on Splunk Dashboard Examples app on Splunkbase ( https://splunkbase.splunk.com/app/1603/  ) Or the Splunk Web Framework tutorial: http://dev.splunk.com/view/webframework-tutorials/SP-CAAAERB  https://answers.splunk.com/answers/579537/how-to-use-javascript-code-in-splunk-cloud-dashboa.html  https://community.splunk.com/t5/Dashboards-Visualizations/Creating-an-quot-About-this-dashboard-quot-popup-modal-view-when/m-p/426913  "Happy Splunking!!!"
@toporagno Remember that savedsearches.conf is a per-app/user configuration file, and the order of precedence matters. Configuration file precedence - Splunk Documentation
Glad to hear this is what you needed.    You can accept this solution to indicate the question was answered to your liking.  thanks!
@toporagno allow_skew value should be in the savedsearches.conf. You can set the value here.  For reference the link to the official documentation : Offset scheduled search start times - Splunk Documentation 
I was able to solve this by adding a single quote before and after $result.fieldname$, for example: '$result.fieldname$'
@anandhalagaras1 You can apply it on the HFs, if you have them.
Thanks in advance.

1. I have a JSON object as content.payload{} and need to extract the values inside the payload. Splunk already extracts the field content.payload{}, and the result is "AP Import flow related results : Extract has no AP records to Import into Oracle". But I want to extract all the details inside content.payload. How can I extract them with a Splunk query or from the props.conf file? I tried spath but wasn't able to get it.

2. How do I rename the wildcard value of content.payload{}*?

"content" : {
  "jobName" : "AP2",
  "region" : "NA",
  "payload" : [
    {
      "GL Import flow processing results" : [
        {
          "concurBatchId" : "4",
          "batchId" : "6",
          "count" : "50",
          "impConReqId" : "1",
          "errorMessage" : null,
          "filename" : "CONCUR_GL.csv"
        }
      ]
    },
    "AP Import flow related results : Extract has no AP records to Import into Oracle"
  ]
},
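A sketch for the first part, assuming the JSON is in _raw (untested): spath can pull the array elements out one by one so their inner fields get extracted.

```
| spath input=_raw path=content.payload{} output=payload
| mvexpand payload
| spath input=payload
```

For the second part, rename accepts wildcards, so something like | rename "content.payload{}*" AS "payload*" should strip the prefix; the target prefix payload* is an arbitrary choice.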
Hello @yuanliu, thank you for the assistance. Comparing | bin _time span=1w (left) and | bin _time span=1w@w (right): when the span was changed from 1w to 1w@w, it looks like the data was shifted from 2024-02-08 to 2024-02-04. Why did Splunk shift the data? Is this normal behavior? I expect the data for 2024-02-04 to be NULL. Is there a way to leave the data as is (not shifted) when moving the start date to 2024-02-04? Please suggest. I appreciate your help.