All Posts


0 0-21 * * *
Hi @kiran_panchavat, yes, you are correct, but my requirement is that, across the 24 hours, I don't want to receive the alert at 10pm and 11pm only. How can I do that?
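For reference, a cron hour field like 0-21 covers hours 0 through 21 inclusive, so a schedule of 0 0-21 * * * fires at minute 0 of every hour except 22 (10pm) and 23 (11pm). A minimal sketch to sanity-check this outside Splunk (the helper name and simplified parser are my own; it handles only "*", plain values, and ranges, not step syntax like */2):

```python
def expand_cron_hours(hour_field: str) -> list[int]:
    """Expand a cron hour field ("*", "0-21", "1,5") into the list of hours it covers."""
    hours = []
    for part in hour_field.split(","):
        if part == "*":
            hours.extend(range(24))
        elif "-" in part:
            lo, hi = map(int, part.split("-"))
            hours.extend(range(lo, hi + 1))
        else:
            hours.append(int(part))
    return sorted(set(hours))

# "0 0-21 * * *" fires at minute 0 of each covered hour; 22 and 23 are excluded.
covered = expand_cron_hours("0-21")
skipped = [h for h in range(24) if h not in covered]
print(skipped)  # → [22, 23]
```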
Thank you for illustrating input in text format. But please make sure JSON is conformant when doing mockups. Speaking of JSON, I always say: do not treat structured data as text. Regex is not a suitable tool for structured data in most cases. Splunk's robust, QA-tested tools will save you countless hours down the road. The traditional tool for this is spath. Since 9.0, Splunk has also added fromjson, which can simplify this work. I'll begin with the simpler one. You didn't say which field the JSON is in, so I'll assume it's _raw in the following.

| fromjson _raw
| mvexpand Field1
| fromjson Field1

This gives you:

Field1 | id | name | occupation
{"id":1234,"name":"John"} | 1234 | John |
{"id":5678,"name":"Mary","occupation":{"title":"lawyer","employer":"law firm"}} | 5678 | Mary | {"title":"lawyer","employer":"law firm"}

The spath alternative is (again assuming the JSON is in _raw):

| spath path=Field1{}
| mvexpand Field1{}
| spath input=Field1{}

This gives:

Field1{} | id | name | occupation.employer | occupation.title
{ "id": 1234, "name": "John" } | 1234 | John | |
{ "id": 5678, "name": "Mary", "occupation": { "title": "lawyer", "employer": "law firm" } } | 5678 | Mary | law firm | lawyer

There can be many variants in between. But the essence is to extract elements of the JSON array, then handle the array as a multivalue field as a whole. If, for example, there are too many elements and you worry about RAM, you can use mvfilter to get data about Mary, as you are not interested in the other entries:

| fromjson _raw
| eval of_interest = mvfilter(json_extract(Field1, "name") == "Mary")

(Note you need 8.0 to use json_extract.) You get:

Field1 (multivalue):
  {"id":1234,"name":"John"}
  {"id":5678,"name":"Mary","occupation":{"title":"lawyer","employer":"law firm"}}
of_interest:
  {"id":5678,"name":"Mary","occupation":{"title":"lawyer","employer":"law firm"}}

Hope this helps.
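Outside SPL, the same pattern (expand the array, then keep only entries matching a predicate) can be sketched with Python's standard json module, using a conformant version of the thread's mock data. This only mirrors the mvexpand + mvfilter/json_extract idea, not Splunk's actual implementation:

```python
import json

# Conformant version of the mock data from the thread.
raw = ('{"Field1": [{"id": 1234, "name": "John"},'
       ' {"id": 5678, "name": "Mary",'
       ' "occupation": {"title": "lawyer", "employer": "law firm"}}]}')

entries = json.loads(raw)["Field1"]  # ~ fromjson _raw: Field1 becomes a list of objects

# ~ mvfilter(json_extract(Field1, "name") == "Mary")
of_interest = [e for e in entries if e.get("name") == "Mary"]

print(of_interest[0]["occupation"]["title"])  # → lawyer
```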
By the way, the conformant form of your mock data is:

{
  "Field1": [
    { "id": 1234, "name": "John" },
    { "id": 5678, "name": "Mary", "occupation": { "title": "lawyer", "employer": "law firm" } }
  ]
}

You can play with the following emulation and compare with real data:

| makeresults
| eval _raw = "{ \"Field1\" : [ { \"id\": 1234, \"name\": \"John\" }, { \"id\": 5678, \"name\": \"Mary\", \"occupation\": { \"title\": \"lawyer\", \"employer\": \"law firm\" } } ] }"
``` data emulation above ```
It looks like you are using single quotes around the macro rather than backquotes. Are you sure the macro expands correctly? Try using <Ctrl><Shift>E to expand the macro.
| spath Field1{} output=Field1
| mvexpand Field1
| spath input=Field1 occupation
| where isnotnull(occupation)
| spath input=Field1 name
| table name
Thank you for the feedback!  I will take your suggestions into consideration!
Thanks Luca, this works!  Appreciated!
Is there a way to send all log data to an NFS file system for required log retention from Splunk OpenTelemetry?
I have a JSON that looks like this:

{ "Field1" : [ { "id": 1234 "name": "John" }, { "id": 5678 "name": "Mary" "occupation": { "title": "lawyer", "employer": "law firm" } } ] }

I want to extract the value of the "name" field from the object that contains an occupation field (could be any). In this case I want to get "Mary" and store it inside a variable. How would I do this using Splunk search language?
I configured a macro named securemsg(1). I use this macro in the following search:

....| eval log_info=_raw | 'securemsg(log_info)' | ....

When I run this search I get the following error:

Error in 'SearchParser': Missing a search command before '''. Error at position '264' of search query 'search index="linuxos" sourcetype="syslog" host="C...{snipped} {errorcontext = fo=_raw | 'securemsg(}'.

Please help. Thanks.
Yeah, I did that and it works for them. In the logs, I see the following error:

03-06-2024 05:05:01.427 -0800 ERROR ScriptRunner [1509543 AlertNotifierWorker-0] - stderr from '/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/search/bin/sendemail.py "results_link=https://splunksrver:8000/app/search/@go?sid=scheduler_bHVkQGVjbi5vZW0uZG9lLmdvdg__search__RMD500f838a99e0e9d56_at_1709730300_4407" "ssname=Test Windows Encode" "graceful=True" "trigger_time=1709730300" results_file="/opt/splunk/var/run/splunk/dispatch/scheduler_bHVkQGVjbi5vZW0uZG9lLmdvdg__search__RMD500f838a99e0e9d56_at_1709730300_4407/results.csv.gz" "is_stream_malert=False"': _csv.Error: line contains NUL
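The `_csv.Error: line contains NUL` at the end of that stderr is Python's csv module refusing input containing NUL bytes, which often sneak into exported results from UTF-16-encoded data such as Windows event logs. A minimal reproduction and the common strip-the-NULs workaround (the sample field names here are made up):

```python
import csv
import io

# A CSV payload with a stray NUL byte, like the scheduler's results file.
data = "host,message\nsrv01,hello\x00world\n"

# Parsing text that contains NUL bytes raises the same error sendemail.py hit:
err = None
try:
    list(csv.reader(io.StringIO(data)))
except csv.Error as e:
    err = e

# Stripping NUL bytes before parsing lets the reader proceed:
rows = list(csv.reader(io.StringIO(data.replace("\x00", ""))))
print(err)      # csv.Error mentioning NUL
print(rows[1])  # → ['srv01', 'helloworld']
```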
How would I add a permanent search or field to a sourcetype? For example: I have a set of data from which I have been able to extract a field using this search:

sourcetype="collectedevents" | rex field=_raw "<Computer>(?<Computer>[^<]+)</Computer>"

Our sourcetype is "collectedevents", and I found a way to pull the <Computer> field from the XML data into a field "Computer". But what I would like is for that field to be permanent, or to transpose the "host =" to be not the host of the WEC but the host of the origin server it came from. Long story short, we have servers that we don't want the Splunk Forwarder on, because we know it can execute scripts, creating a vulnerability on these servers. Any help is appreciated, thank you!
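One common approach, sketched here with the regex from the post (the transform name is made up, and this is untested against your environment): make the extraction permanent with a search-time EXTRACT in props.conf, and override host per event at index time with a transform writing to MetaData:Host. The index-time part must live on the indexer or heavy forwarder that first parses the data.

```
# props.conf
[collectedevents]
EXTRACT-computer = <Computer>(?<Computer>[^<]+)</Computer>
TRANSFORMS-set_origin_host = wec_origin_host

# transforms.conf
[wec_origin_host]
REGEX = <Computer>([^<]+)</Computer>
DEST_KEY = MetaData:Host
FORMAT = host::$1
```

With this in place, host should reflect the origin server named in the event's <Computer> tag rather than the WEC collector.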
Thanks for the response! With the test query, I'm seeing both sources. For further research I counted all the field_AB by sourcetype and found that there are significantly more source_2 than source_1; not sure if that affects anything though.

sourcetype (last 5 min):
source_1 -> count 147
source_2 -> count 66359

In my initial query, I do get events from both sources. It's not populating the fields in each event the way I want, though.

Initial query results:

field_D | field_AB | field_C | field_E
DeviceType | UniqueID | | Up
DeviceType | UniqueID | | Down
| UniqueID | Data_2 |
| UniqueID | Data_1 |

Expected query results:

field_D | field_AB | field_C | field_E
DeviceType | UniqueID | Data_1 | Up
DeviceType | UniqueID | Data_2 | Down
DeviceType | UniqueID | Data_2 | Down
DeviceType | UniqueID | Data_1 | Down
Hi @karthi2809, for this sourcetype use INDEXED_EXTRACTIONS = json in the sourcetype definition (for more info see http://docs.splunk.com/Documentation/Splunk/9.2.0/admin/Propsconf); otherwise, use the spath command: https://docs.splunk.com/Documentation/Splunk/9.2.0/SearchReference/Spath Ciao. Giuseppe
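For reference, the sourcetype stanza suggested above might look like this in props.conf (the stanza name is a placeholder; KV_MODE = none is commonly paired with indexed extractions to avoid extracting the same fields twice at search time):

```
[your_json_sourcetype]
INDEXED_EXTRACTIONS = json
KV_MODE = none
```

Note that INDEXED_EXTRACTIONS takes effect where the data is first parsed, typically the universal or heavy forwarder, so the setting needs to be deployed there as well.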
Hi @mappu, check with the following search:

index=your_index
| eval diff=_indextime-_time
| eval indextime=strftime(_indextime,"%Y-%m-%d %H:%M:%S")
| table _time indextime diff

If you have high differences between _time and indextime, you have a queue issue; if not, the problem is something else. About the timestamp: check whether the missing logs have the timestamp definition or not, but using the formats you described, you shouldn't have this issue. Ciao. Giuseppe
Hello @yuanliu, I tried your suggestion and modified the column name "Grades" to "Grade", and it worked fine. I accepted your solution. So, we had to split the data manually. Note that there is still an issue with snap-to shifting the data, as discussed in my other post. I appreciate your assistance. Thank you.
Hello, we have been investigating the loss of 30% of Splunk logs in our production environment. I'm thinking it may be due to TIME_FORMAT or due to high-volume logs in production. Can you please let me know what the key-value for TIME_FORMAT should be in the props.conf file? The lagsec value is 1.5 seconds on source logs, and the Splunk forwarder log sourcetype we are checking has 1.13s. Additionally, source logs have the format 05/Mar/2024, while SplunkForwarder logs have the format 2024-03-05. 2048kbps on both dev and prod config files. Also, we have ignoreOlderThan=1d, so I'm looking to remove this parameter, fix TIME_FORMAT, and check. Can you please help or provide additional information to check?
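For a timestamp like 05/Mar/2024 at the start of the event, a props.conf sketch might look like the following (the stanza name is a placeholder, and TIME_PREFIX / MAX_TIMESTAMP_LOOKAHEAD depend on where the timestamp actually sits in your events, so adjust before deploying):

```
[your_source_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %d/%b/%Y
MAX_TIMESTAMP_LOOKAHEAD = 15
```

Here %d/%b/%Y is the strptime pattern for day/abbreviated-month/year (e.g. 05/Mar/2024); the 2024-03-05 forwarder format would instead be %Y-%m-%d.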
@shakti  You can also refer to some of Simple XML JavaScript extension examples on Splunk Dashboard Examples app on Splunkbase ( https://splunkbase.splunk.com/app/1603/  ) Or the Splunk Web Framework tutorial: http://dev.splunk.com/view/webframework-tutorials/SP-CAAAERB  https://answers.splunk.com/answers/579537/how-to-use-javascript-code-in-splunk-cloud-dashboa.html  https://community.splunk.com/t5/Dashboards-Visualizations/Creating-an-quot-About-this-dashboard-quot-popup-modal-view-when/m-p/426913  "Happy Splunking!!!"
@toporagno Remember that savedsearches.conf is a per-app/user configuration file, and the order of precedence matters. Configuration file precedence - Splunk Documentation
Glad to hear this is what you needed. You can accept this solution to indicate the question was answered to your liking. Thanks!