All Posts

If your problem is resolved, then please click the "Accept as Solution" button to help future readers.
I set my client test script to connect to 9997 on the IF server that currently has ~1000 clients connected to it, and it immediately fails to establish any more connections. I have since built a new server using the exact same IF configuration, and my client script connects to 9997 with up to 16000 connections as expected. I don't understand this; I guess the splunk process with the 1000 connections knows there is data traversing and that it is too busy to accept more connections. In metrics.log I see this every minute:

<date time> Metrics - group=queue, name=parsingqueue, blocked=true, max_size_kb=512, current_size_kb=511, current_size=1217, largest_size=1217, smallest_size=0

I don't know what this means or if it is significant, so I will start researching it. I am also not sure what you mean by adding a pipeline, so I will look into that too.
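In case it helps while researching: the blocked parsingqueue message means events are backing up before parsing, and "adding a pipeline" usually refers to the parallelIngestionPipelines setting in server.conf. A minimal sketch, assuming the busy instance has spare CPU cores (the value 2 is just an illustration):

# server.conf on the instance whose parsing queue is blocking
[general]
parallelIngestionPipelines = 2

Restart Splunk after the change; each pipeline adds its own set of queues and consumes additional CPU.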
I believe this has something to do with the lookup having time_field set in the transforms.conf. e.g. "time_field = d"
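For reference, a transforms.conf stanza along these lines (names mirror the example in the question; treat it as a sketch) becomes a time-bounded lookup as soon as time_field is present, which is why an otherwise matching row can return nothing:

[myfile]
filename = myfile.csv
time_field = d

Removing (or commenting out) the time_field line makes it behave as an ordinary CSV lookup again.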
Yup @richgalloway, it works now. I have set it to 1 and tested. Thanks
Running a lookup where I have verified the fields exist and match, and it's not returning an output field. So I verified by running the lookup on itself and it still doesn't match. I have checked permissions and ran the search from the app the lookup belongs to. I can view the lookup with "| inputlookup <name>".

Example running the lookup on itself:

| inputlookup myfile | table a, b | lookup myfile a OUTPUT b AS c | table a, b, c

c always shows as empty for this one lookup.
Hi. Thanks for your reply, but the input is AWS Firehose and we don't have inputs. Is it possible for you to review my props.conf and, if you can, test it in a dummy environment?
Tell us more about the partial match on LKUP_DSN.  What is it matched against?  What part needs to match?
Another question: these lowercase sourcetypes are creating multiple sourcetypes in Splunk. The latest TA that I downloaded from Splunkbase has only the XmlWinEventLog & WinEventLog sourcetypes. We want to use just those, because multiple different formats are creating issues for our detection cases. I was going to upgrade this app because the latest version has XmlWinEventLog & WinEventLog. Will this create any issues? I don't think so, but checking.
It looks good to me.  Perhaps there are no users who haven't logged in for 30 days.  Try changing 30 to 1 as a test.
Hi @TeflonJohn, Visit the Free Trials and Downloads section on splunk.com and download the latest version of Splunk Enterprise for your operating system. Follow the installation instructions provided on the website for your specific operating system. Once you install Splunk Enterprise, an Enterprise Trial license is automatically generated for that instance. This trial license is valid for 60 days and allows you to index up to 500 MB of data per day. If you exceed this limit, you will receive a license warning. If you are an app developer, you can request a 10GB or 50GB *dev only* license: https://docs.splunk.com/Documentation/Splunk/9.4.0/Admin/TypesofSplunklicenses#Splunk_developer_licenses Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards Will
Hi @JJCO  Ensure that you reference your `alert_message` field using the correct token syntax. In the email alert settings, use `$result.alert_message$`. This assumes that `alert_message` is part of the result set. Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards Will
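For example, the token would appear in the alert's email action along these lines. A minimal savedsearches.conf sketch; the stanza name and recipient are placeholders for illustration, and the same token also works in the Subject field and in the UI's message box:

# savedsearches.conf (illustrative names)
[Log Source Check-in Alert]
action.email = 1
action.email.to = soc@example.com
action.email.message.alert = $result.alert_message$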
A Trial license is included in the Splunk Enterprise download.
Unless there was an error copying into the post, the event does not parse because it is not well-formed JSON. Also, the sourcetype attribute in props.conf is better placed in inputs.conf.  It's redundant in props.
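To illustrate the second point, the sourcetype belongs on the input itself. A sketch assuming a simple file monitor (the path and names are illustrative; for a Firehose/HEC input, the sourcetype would similarly be set on the HEC token or input rather than in props.conf):

# inputs.conf
[monitor:///var/log/aws/firehose.json]
sourcetype = json_splunk_logs
index = main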
Hi @lavs  Yes, it's normal for the Splunk Add-on for Windows (version 8.8.0) to use lowercase representations for sourcetypes such as `xmlwineventlog` and `wineventlog`. This change brings them in line with other sourcetypes in Splunk. The `XmlWinEventLog` and `WinEventLog` sourcetypes you are familiar with have simply been renamed in this version of the add-on. If you are transitioning from older versions or have existing configurations that reference the previous sourcetype names, these should still work fine, as field values in SPL are not case sensitive. If you run into any issues with the new sourcetypes, feel free to reach out for help with querying, dashboard creation, or other operational aspects related to this change. Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards Will
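A quick way to confirm the case-insensitive matching, as a sketch (assuming your Windows events land in an index named wineventlog): both of these searches return the same events regardless of how the sourcetype value is cased,

index=wineventlog sourcetype=XmlWinEventLog
index=wineventlog sourcetype=xmlwineventlog

and appending `| stats count by sourcetype` to either will show which literal sourcetype values are actually present in your data.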
If this is your literal search, you're not assigning a field correctly with eval. The eval command must have a destination field name. The if and case functions just return a value; you have to assign that value to a field. And you're using if with case syntax.
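As a sketch of what that assignment looks like with case (field names and values are taken from the search in question; the parenthesized grouping and the "status" field name are assumptions):

index=kafka-np sourcetype="KCON" connName="CCNGBU_*" (ERROR=ERROR OR ERROR=WARN) (Action="restart failed" OR Action="disconnected" OR Action="Task threw an uncaught an unrecoverable exception")
| eval status=case(Action=="restart failed", "restart failed", Action=="disconnected", "disconnected", true(), "OK")
| table Action status host connName

Here case takes condition/value pairs, true() acts as the catch-all default, and the returned value is assigned to the status field.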
The Splunk Add-on for Windows 8.8.0 renames the sourcetypes to lowercase xmlwineventlog and wineventlog. Is this normal? I am familiar with using the XmlWinEventLog & WinEventLog formatted sourcetypes.
Have you tried $result.alert_message$?
Howdy, I'm building out some alerting in Splunk ES and created a new correlation search. That is all working, but I'm unable to pass my eval as a value into the email alert. What I have:

| eval alert_message=range.":".sourcetype." log source has not checked in ".'Communicated Minutes Ago'." minutes. On index=".index.". Latest Event:".'Latest Event'
| table alert_message

Just running the search works; the table is there and looks correct. I've tried variations of $alert_message$ with and without quotes, but the alert_message never gets passed to the email alert. I haven't tried to generate a notable, but I'm guessing I'll have the same issue. I feel like I'm missing something easy here...
So I have been trying to use if statements, but I don't seem to be getting the if statement correct:

index=kafka-np sourcetype="KCON" connName="CCNGBU_*" ERROR=ERROR OR ERROR=WARN Action="restart failed" OR Action="disconnected" OR Action="Task threw an uncaught an unrecoverable exception"
| eval if(Action="restart failed", "restart failed", "OK", Action="disconnected","disconnected","OK", Action="Task threw an uncaught an unrecoverable exception", "ok")
| table Action host connName

I've tried several different formats for the if, but it keeps telling me the if statements are wrong. What am I not seeing here?
Hi All,   I am trying to parse raw data with json elements to proper JSON format in Splunk. I have tried multiple props.conf but failed to parse it as per expected output. Below I have attached the data coming as a single event on Splunk and expected data what we want to see. Can someone please correct my props.conf ?   Events on Splunk with default sourcetype   {"messageType":"DATA_MESSAGE","owner":"381491847064","logGroup":"tableau-cluster","logStream":"SentinelOne Agent Logs","subscriptionFilters":["splunk"],"logEvents":[{"id":"38791169637844522680841662226148491272212438883591651328","timestamp":1739456206172,"message":"[2025-02-13 15:16:41.413885] [110775] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24388.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24388.tmp: No such file or directory\n[2025-02-13 15:16:42.213970] [110823] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/hyper_transient.112335.24390.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/hyper_transient.112335.24390.tmp: No such file or directory\n[2025-02-13 15:16:42.214870] [110830] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24389.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24389.tmp: No such file or directory\n[2025-02-13 15:16:42.218488] [110823] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24391.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24391.tmp: No such file or directory\n[2025-02-13 15:16:43.815051] [110827] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24392.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24392.tmp: No such file or directory\n[2025-02-13 15:16:44.617525] [110773] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24394.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24394.tmp: No such file or directory\n[2025-02-13 15:16:45.413954] [110823] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24393.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24393.tmp: No such file or directory"},{"id":"38791169749325947928296247310685546917181598051987750913","timestamp":1739456211171,"message":"[2025-02-13 15:16:47.014642] [110770] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/hyper_transient.112335.24395.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/hyper_transient.112335.24395.tmp: No such file or directory\n[2025-02-13 15:16:47.813934] [110823] 
[error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24396.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24396.tmp: No such file or directory\n[2025-02-13 15:16:47.814459] [110828] [warning] DV process create: Couldn't fetch grandparent process of process 26395 from the data model\n[2025-02-13 15:16:47.815399] [110828] [warning] DV process create: Couldn't fetch grandparent process of process 26396 from the data model\n[2025-02-13 15:16:47.816855] [110827] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24397.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24397.tmp: No such file or directory\n[2025-02-13 15:16:48.616944] [110825] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream   Expected Output with fiedls extraction   { "messageType": "DATA_MESSAGE", "owner": "381491847064", "logGroup": "tableau-cluster", "logStream": "SentinelOne Agent Logs", "subscriptionFilters": ["splunk"], "logEvents": [ { "id": "38791169637844522680841662226148491272212438883591651328", "timestamp": 1739456206172, "message": "[2025-02-13 15:16:41.413885] [110775] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24388.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24388.tmp: No such file or directory\n[2025-02-13 15:16:42.213970] [110823] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/hyper_transient.112335.24390.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/hyper_transient.112335.24390.tmp: No such file or directory\n[2025-02-13 15:16:42.214870] [110830] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24389.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24389.tmp: No such file or directory\n[2025-02-13 15:16:42.218488] [110823] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24391.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24391.tmp: No such file or directory\n[2025-02-13 15:16:43.815051] [110827] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24392.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24392.tmp: No such file or directory\n[2025-02-13 15:16:44.617525] [110773] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24394.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24394.tmp: No such file or directory\n[2025-02-13 
15:16:45.413954] [110823] [error] full_file_overwrite_flag: failed to stat /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24393.tmp: stat failed on path: /app/tableau/tableau_data/data/tabsvc/temp/hyper_0.20233.24.0718.1630/copyexternalstream.112335.24393.tmp: No such file or directory" } ] }   Props.conf   [json_splunk_logs] # Define the source type for the logs sourcetype = json_splunk_logs # Time configuration - Parse the timestamp in your message TIME_FORMAT = %Y-%m-%d %H:%M:%S.%6N TIME_PREFIX = \["message"\] \[ # Specify how to break events in the multiline message SHOULD_LINEMERGE = false LINE_BREAKER = ([\r\n]+) # Event timestamp extraction DATETIME_CONFIG = NONE # JSON parsing - This tells Splunk to extract fields from JSON automatically KV_MODE = json # The timestamp is embedded in the message, so the following configuration is necessary for time extraction. EXTRACT_TIMESTAMP = \["messageType":"DATA_MESSAGE","owner":"\d+","logGroup":"\w+","logStream":"\w+","subscriptionFilters":\[\\"splunk\\"\],\s"timestamp":(\d+),".*?
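Since the question asks for a corrected props.conf, here is a sketch of one way to approach it, assuming the goal is to keep each Firehose record as a single event, extract the JSON fields at search time, and take the event time from the epoch-millisecond "timestamp" field. The sourcetype assignment would live in inputs.conf (or on the HEC token) rather than here, and EXTRACT_TIMESTAMP is not a standard props.conf setting, so it is dropped:

[json_splunk_logs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = "timestamp":
TIME_FORMAT = %s%3N
MAX_TIMESTAMP_LOOKAHEAD = 20
TRUNCATE = 0
KV_MODE = json

Note that KV_MODE = json only produces clean field extraction when each event is well-formed JSON; if records arrive truncated (as the sample above appears to be), the JSON parser will still fail on those events.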