All Topics

In my indexer cluster, one of my indexers has inflight files in cold and warm storage that range from 1.5 to 2 months old. There don't seem to be any inflight files newer than these, and these directories/files are not empty. One index has old inflight files amounting to 1.8 TiB of storage. 1.) Why are these files sticking around for so long? Why isn't Splunk cleaning them up? 2.) How can I get Splunk to take care of these files (i.e., delete them)?
Hi everyone, I've added a txt file to the SA-Eventgen samples folder and wrote the configuration in the eventgen.conf file as follows:

[mihealth-https_error]
mode = sample
interval = 15
earliest = -15s
latest = now
count = 25
hourOfdayRate = { "0": 0.8, "1": 1.0: "2": 0.9, "3":0.7, "4":0.7, "5":0.7, "6":0.7, "7":0.7, "8":0.7, "9":0.7, "10":0.7, "11":0.7, "12":0.7, "13":0.7, "14":0.7, "15":0.7, "16":0.7, "17":0.7, "18":0.7, "19":0.7, "20":0.7, "21":0.7, "22":0.7, "23":0.7 }
dayOfWeekRate = { "0": 0.7, "1": 0.7, "2": 0.7, "3": 0.6, "4": 0.8, "5": 1.0, "6": 0.9 }
randomizeCount = 0.2
randomizeEvents = true
outputMode = modinput
sourcetype = eventgen_test3
source = eventgendemo3
index = eventgen
token.0.token = \[(\w+\s\w+\s\d+\s\d+:\d+:\d+.\d+\s\d+)\]
token.0.replacementType = timestamp
token.0.replacement = %a %b %d %H:%M:%S.%6N %Y
token.1.token = \(\w+\s\w+.(\w+).\w+:\d+\)
token.1.replacementType = file
token.1.replacement = $SPLUNK_HOME/etc/apps/SA-Eventgen/samples/orderType.sample

The txt data looks like this in the samples folder:

[Thu Jun 04 09:37:31.838874 2020] [ssl:info] [pid 24583] [client 10.10.10.1:39900] NC00000: Connection to child 8 established (server core.Company.com:443)

It is not generating any events. Could you please help me? Thanks in advance.
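One thing worth checking in the stanza above (an observation, not an official eventgen diagnosis): the hourOfdayRate map is not valid JSON, because `"1": 1.0:` has a colon where a comma belongs, and eventgen parses these rate maps as JSON, as far as I recall. Note too that the documented key name is `hourOfDayRate` with a capital D, if memory serves. A minimal check in Python, using an abbreviated copy of the map:

```python
import json

# Abbreviated copies of the hourOfdayRate value from the stanza above.
# The stray colon after "1": 1.0 (where a comma belongs) is preserved.
broken = '{ "0": 0.8, "1": 1.0: "2": 0.9, "3": 0.7 }'
fixed = '{ "0": 0.8, "1": 1.0, "2": 0.9, "3": 0.7 }'

def is_valid_json(text):
    """Return True if text parses as JSON, False otherwise."""
    try:
        json.loads(text)
        return True
    except ValueError:
        return False

print(is_valid_json(broken))  # False
print(is_valid_json(fixed))   # True
```

If the parser silently rejects the stanza on that line, no events would be generated, which matches the symptom described.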
Is there a free trial available for Splunk ITSI?
After reloading Splunk Enterprise version 8.0.3, CSV files emailed out via alerts have an extra line between rows of data; prior to the reload they did not. If the result of the alert is downloaded via the Job Manager, the file is as expected. If the two versions of the CSV are opened with Notepad++, I can see that the file normally has "CRLF" at the end of each line, while the incorrect one has "CRCRLF". The alerts are the result of a simple |table type search and have worked for months without issue. The server was reloaded by our IT people due to unrelated issues with the KV store. Splunk is set to use Python 3, which I believe is the same as before the reload. Any help would be appreciated, as the automation fed by these CSVs is failing due to the additional blank lines.
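For what it's worth, the CRCRLF symptom is consistent with a double newline translation: if code writes `\r\n` to a stream that Windows text mode then translates again (`\n` becomes `\r\n`), each line ends with `\r\r\n`. While the root cause is tracked down, a sketch of a workaround (my own, not a Splunk fix) that normalizes the delivered files so downstream automation keeps working:

```python
import re

def normalize_crlf(data: bytes) -> bytes:
    # Collapse any run of CRs before an LF down to a single CRLF.
    return re.sub(rb"\r+\n", b"\r\n", data)

sample = b"col1,col2\r\r\na,b\r\r\nc,d\r\r\n"
print(normalize_crlf(sample))  # b'col1,col2\r\na,b\r\nc,d\r\n'
```

Working on bytes (not text mode) avoids the very translation being repaired.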
Under "Format", there's a setting for "Click Selection". I remember that in Splunk 6 I could set it to "None" (or the moral equivalent at the time), which I did way back when. Now that our installation is running Splunk 8, I noticed that my colleagues get this highlighting behavior, so I promptly tried it out, setting Click Selection to "Full", "Inner", and "Outer". Unfortunately, none of these settings works well for me, and I'd love to disable the setting again. Is that still possible? Attached is a screenshot of the three selection options I see. I could swear there was a "None" or "Off" option in earlier versions. We're running Splunk Enterprise 8.0.1, build 6db836e2fb9e.
How do I ignore a field in a search when its value is null, and search based only on the second input? I have two inputs, and this search works only if both fields have values. I need results even if one value is null.

name="$field4$" OR EmpID="$field5$"

I found a similar question here, but it did not resolve my issue. I appreciate the help in advance. https://community.splunk.com/t5/Getting-Data-In/How-to-omit-a-field-from-search-on-a-text-input-if-the-field-is/m-p/330788
I have XML files I'm trying to break up into individual events based on the following XML format. I need to break these up based on the info between each <issue> and </issue>. Also, these <issue> elements are not the top-level XML tag, which I think is causing my issues.

<issue> <finding info> <more info, etc. etc.> </issue> <issue> next finding info </issue> <issue> third finding info </issue>

I've been trying to use the following with no luck:

[xml-2]
DATETIME_CONFIG = CURRENT
SHOULD_LINEMERGE = false
LINE_BREAKER = \s\s(<\/issue>)\s\s\s(<issue>)

I've also tried BREAK_ONLY_BEFORE as well as KV_MODE = xml with the above configuration. Any help on this would be GREATLY appreciated!
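When iterating on LINE_BREAKER it can help to test candidate regexes against a sample file before touching props.conf. Splunk breaks events at the first capture group of the match and discards that group's text; the sketch below is a rough Python approximation of that rule (the sample data is made up), not Splunk's actual parser:

```python
import re

def split_events(data, pattern):
    """Approximate Splunk's LINE_BREAKER: break events at the first capture
    group of each match; the group's matched text is discarded."""
    events, pos = [], 0
    for m in re.finditer(pattern, data):
        events.append(data[pos:m.start(1)])
        pos = m.end(1)
    events.append(data[pos:])
    return events

sample = (
    "<report>\n"
    "  <issue>first finding</issue>\n"
    "  <issue>second finding</issue>\n"
    "  <issue>third finding</issue>\n"
    "</report>"
)

# Break between issues: whitespace between </issue> and <issue> is discarded.
events = split_events(sample, r"</issue>(\s+)<issue>")
for e in events:
    print(e)
```

With a pattern shaped like this, `</issue>` stays at the end of one event and `<issue>` opens the next; a second capture group (as in the props.conf attempt above) would not do what one might hope, since only the first group marks the boundary.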
Hello all, our SH in Splunk Cloud will be upgraded from v7.0.13 to 7.2.10.2. Our on-prem SH is on 7.0.5. As per the document here: "The search head must be at the same or a higher level than the search peers. See the note later in this section for a precise definition of "level" in this context." Would it be safe to say we do not have to upgrade the on-prem search head for now? Thanks for the input.
I am trying to create a DB Connect input for an Oracle DB in which the rising column has TIMESTAMP WITH LOCAL TIME ZONE format. Sample value from the rising column field: 2020-10-30 07:32:35.015828 America/New_York. I used the format below for the Datetime Format of the rising column while creating the DB input, but it throws an error that says "'Invalid argument(s) in call'": yyyy-MM-dd HH:mm:ssZ. What would the correct datetime format be?
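A guess at why the pattern errors: in Java-style patterns, `Z` expects a numeric offset like `-0400`, not a region ID such as `America/New_York`, and the pattern above also omits the fractional seconds. Something closer to `yyyy-MM-dd HH:mm:ss.SSSSSS VV` may be needed (`VV` is java.time's zone-ID field; treat this as an assumption to verify against your DB Connect version). To illustrate the shape of the value, a Python parse of the sample, splitting off the region ID by hand since strptime has no directive for it:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

raw = "2020-10-30 07:32:35.015828 America/New_York"
stamp, zone = raw.rsplit(" ", 1)

# %f consumes the microseconds; the zone is attached separately because
# strptime cannot parse region IDs like America/New_York.
dt = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=ZoneInfo(zone))
print(dt.isoformat())  # 2020-10-30T07:32:35.015828-04:00
```

The printed offset (-04:00) shows the region ID carries DST-aware offset information that a plain `Z` field cannot represent.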
Hi, looking forward to learning from you guys. I am stuck on this calculation: total number of products in contract. I made a simple dataset to illustrate my data.

| makeresults | eval date = "2017-01-30" , source = "a", id="111" | makemv delim="," id | makemv delim="," date | makemv delim="," source
| append [| makeresults | eval date = "2017-01-30" , source = "b", id="222" | makemv delim="," id | makemv delim="," date | makemv delim="," source]
| append [| makeresults | eval date = "2019-08-20" , source = "a", id="333" | makemv delim="," id | makemv delim="," date | makemv delim="," source]
| append [| makeresults | eval date = "2020-01-20" , source = "a", id="444" | makemv delim="," id | makemv delim="," date | makemv delim="," source]
| append [| makeresults | eval date = "2020-03-20" , source = "b", id="555" | makemv delim="," id | makemv delim="," date | makemv delim="," source]

INPUT (imagine _time is the date/time of the buying record):

_time                 signed contract date   id    source
2020-10-30 14:55:55   2017-01-30             111   a
2020-10-30 14:55:55   2017-01-30             222   b
2020-08-30 14:55:55   2019-08-20             333   a
2020-01-30 14:55:55   2020-01-20             444   a
2020-09-30 14:55:55   2020-03-20             555   b

Expected output — count of total products in contract from 12/2019 to 03/2020:

Time      total_nb_product   Source
12/2019   2                  a
12/2019   1                  b
01/2020   3                  a
01/2020   1                  b
02/2020   3                  a
02/2020   1                  b
03/2020   3                  a
03/2020   2                  b

Thanks for your time; I hope to receive your suggestions. Have a great weekend!
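Reading the expected output, total_nb_product for a month appears to be the running count of contracts whose signed date falls in or before that month, per source. A plain-Python sketch (my reading of the requirement, using the five sample rows) that reproduces the expected table:

```python
from datetime import date

# (signed contract date, source) pairs taken from the sample dataset above.
contracts = [
    (date(2017, 1, 30), "a"),
    (date(2017, 1, 30), "b"),
    (date(2019, 8, 20), "a"),
    (date(2020, 1, 20), "a"),
    (date(2020, 3, 20), "b"),
]

# Months to report on: 12/2019 through 03/2020.
months = [(2019, 12), (2020, 1), (2020, 2), (2020, 3)]

rows = []
for year, month in months:
    for source in ("a", "b"):
        # Count contracts signed in or before this month for this source.
        total = sum(
            1
            for signed, s in contracts
            if s == source and (signed.year, signed.month) <= (year, month)
        )
        rows.append((f"{month:02d}/{year}", total, source))

for row in rows:
    print(row)
```

In SPL this shape usually calls for a running total per source after counting signings by month (a streamstats-style accumulation); the sketch above is just to pin down the arithmetic.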
Hi, I have a question about restricting user/role access to apps. Today I have about 80 apps on my SH. I would like one specific role to be able to access and see just one app. Is there any way to do this without having to change the "Everyone" read/write permissions on every app? Thanks.
My current Splunk search stops after 5 errors of "Streamed search execute failed because: Error in 'rex' command:". It also says "exceeded configured match_limit", so I think on some log lines the regex takes too many steps. Up to here it's pretty clear to me, but the search does succeed for thousands to millions of lines before stopping completely. I would like to know which line it fails on so I can investigate the problem and fix it. Does anyone know how to investigate? The query (the error is reported on the regex line with "ExecutionContext", line 3):

host="abc*-solutionname-*" "Elapsed:" "Finished" "ExecutionContext:"
| rex "Elapsed: (?<elapsed>[^\s]+)"
| rex "ExecutionContext:(?<method>.*)Elapsed:"
| eval stamp=strptime(elapsed, "%H:%M:%S.%Q.")-strptime(strftime(now(),"%m/%d/%y"),"%m/%d/%y")

A few example lines being processed:

2020-10-30 12:15:37.0373Z|TRACE|ServiceLog|UID:[]. CID:[fb749af30b9e2b]. PID:[15]. SID:[4cfda0137]. ExecutionContext:Help.Index. Elapsed: 00:00:00.0006980. Finished. Message: blabla.|
2020-10-30 12:15:37.0373Z|TRACE|ServiceLog|UID:[]. CID:[fb749af30b9e2b]. PID:[15]. SID:[4cfda0137]. ExecutionContext:HelpController.Index. Elapsed: 00:00:00.0000061. Finished. |
2020-10-30 12:15:37.0287Z|TRACE|ServiceLog|UID:[]. CID:[5b0f8455873d1a]. PID:[10]. SID:[d41870240]. ExecutionContext:Help.Index. Elapsed: 00:00:00.0007222. Finished. Message: blabla.|
2020-10-30 12:15:37.0287Z|TRACE|ServiceLog|UID:[]. CID:[5b0f8455873d1a]. PID:[10]. SID:[d41870240]. ExecutionContext:HelpController.Index. Elapsed: 00:00:00.0000061. Finished. |

I do know THESE lines work fine, but of course there are quite a few other possibilities, so I would like to know EXACTLY where it goes wrong instead of getting "yeah, something went wrong and I'm not telling you where" errors.
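One way to surface the offending events in Splunk itself: since `| regex` filters rather than extracts, running the base search with `| regex _raw!="ExecutionContext:.*Elapsed:"` should keep only lines the pattern cannot match (hedged: a pathological line might still be expensive there too). The extraction can also be replayed offline in Python against exported raw lines; the sketch below uses a lazy quantifier, which backtracks less than the greedy `.*` in the original rex and trims the trailing ". " from method, a deliberate change on my part, not an equivalent pattern:

```python
import re

# Two of the sample lines from the post.
lines = [
    "2020-10-30 12:15:37.0373Z|TRACE|ServiceLog|UID:[]. CID:[fb749af30b9e2b]. "
    "PID:[15]. SID:[4cfda0137]. ExecutionContext:Help.Index. "
    "Elapsed: 00:00:00.0006980. Finished. Message: blabla.|",
    "2020-10-30 12:15:37.0373Z|TRACE|ServiceLog|UID:[]. CID:[fb749af30b9e2b]. "
    "PID:[15]. SID:[4cfda0137]. ExecutionContext:HelpController.Index. "
    "Elapsed: 00:00:00.0000061. Finished. |",
]

# Lazy variant of the rex pattern: stop at the first ". Elapsed:".
pattern = re.compile(r"ExecutionContext:(?P<method>.*?)\. Elapsed:")

failures = [line for line in lines if not pattern.search(line)]
for line in lines:
    m = pattern.search(line)
    if m:
        print(m.group("method"))

print(failures)  # [] -- both sample lines match
```

Any line that lands in `failures` when run over a real export is a candidate for the one blowing the match_limit budget.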
Does anyone have any queries or dashboards they would be willing to share for use with data being ingested via syslog and leveraging the Splunk Add-on for McAfee ePO Syslog? I started to write my own but would rather not reinvent the wheel if I don't need to. Thanks in advance!
Hi, I'm banging my head against a wall with this. Some background: I had to recreate a standalone indexer/search head on version 8, and after some head-scratching I managed to import the old data. I want to use the Splunk App for Windows Infrastructure, but I get stuck when going through the guided setup, even though all prereqs were passed. I have followed the guide and deployed the Splunk_TA_Windows app to one of my DCs. On further investigation, and after restarting the Splunk services, I noticed that none of the indexes the guided setup is trying to search exist. I'm obviously out of my depth with this and am going around in circles. What have I missed? Any help appreciated, because I'm stumped. Regards, M
Hi everyone, I'd like to know how I can parse data sent via POST to a dashboard. I have two dashboards; from the first dashboard, I send a large amount of data using JS to the second dashboard. How do I parse this data in the second dashboard? Thanks for your help!
I am attempting to mask sensitive information using SEDCMD. However, it does not seem to take effect. I've run btool, but am not sure what to make of the output. It does not seem like an issue with Splunk conf file precedence.

%SPLUNK_HOME%\bin> .\splunk.exe btool props --app=my_app list --debug
%SPLUNK_HOME%\etc\apps\my_app\local\props.conf SEDCMD-cred_mask1 = s/-varPass\s(.+?)\;/xxxxxxxx/g
%SPLUNK_HOME%\etc\apps\my_app\local\props.conf SEDCMD-cred_mask2 = s/-sensitivePass=(.+)/xxxxxxxx/g

Running a search with rex in sed mode with the same regex works as expected:

<base search>
| rex field=Process_Command_Line mode=sed "s/-varPass\s(.+?)\;/xxxxxxxx/g"
| rex field=Process_Command_Line mode=sed "s/--sensitivePass=(.+)/xxxxxxxx/g"
| table _time host source Process_Command_Line
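Two things stand out here (observations, not a confirmed diagnosis): SEDCMD is applied at index time on the parsing tier, so it only affects events indexed after the config is in place on the indexer or heavy forwarder, not existing data; and the btool output shows `-sensitivePass=` with a single leading dash while the working rex uses `--sensitivePass=` with two. Replaying both substitutions in Python against a made-up command line shows the regexes themselves behave:

```python
import re

# Hypothetical command line; the program name and values are made up.
cmd = "app.exe -varPass secret123; --sensitivePass=hunter2"

# The two substitutions from the rex search above, applied in order.
masked = re.sub(r"-varPass\s(.+?);", "xxxxxxxx", cmd)
masked = re.sub(r"--sensitivePass=(.+)", "xxxxxxxx", masked)
print(masked)  # app.exe xxxxxxxx xxxxxxxx
```

The single-dash SEDCMD variant would still match inside a `--sensitivePass` argument (leaving one stray dash behind), but it's worth making the two configs identical to rule out the discrepancy.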
Hi all, I installed the Splunk Add-on for ServiceNow, and the configuration and inputs were created, but I am not receiving any logs in the console. In Messages I see "Some configurations are missing in Splunk Add-on for ServiceNow. Fix the configuration to resume data collection." But in the add-on configuration everything is fine, I believe. Please help. @isoutamo @thambisetty
Hi, I have created a dashboard where I can save queries by entering them in an input field. It works fine when I enter a simple query:

sourcetype = WinEventLog EventCode = 4624 | stats count (EventCode) by host

When I run the following query I get the error below:

sourcetype="pan:traffic" user!="xxxx" earliest=-14d
| bucket _time span=5m
| stats sum(bytes_out) by user, _time
| anomalydetection "sum(bytes_out)" "user" action=annotate
| eval isOutlier = if(probable_cause != "", "1", "0")
| where isOutlier=1
| table "sum(bytes_out)" "user", "_time", probable_cause, isOutlier
| stats count by user
| sort -count
| head 20

How can I escape the query in a way that lets me save it in the lookup file? Or is there a better way to save a query in a dashboard?

The dashboard:

<form>
  <label>Threat Hunting Query</label>
  <search>
    <query>| makeresults | eval Panel=$tokPanel|s$ </query>
    <done>
      <condition match="$result.Panel$==&quot;1&quot;">
        <set token="tokPanelSelected">1</set>
        <set token="pan1"></set>
        <unset token="pan2"></unset>
      </condition>
      <condition match="$result.Panel$==&quot;2&quot;">
        <set token="pan2"></set>
        <unset token="pan1"></unset>
        <set token="tokPanelSelected">2</set>
      </condition>
      <condition>
        <unset token="tokPanelSelected"></unset>
      </condition>
    </done>
  </search>
  <fieldset submitButton="true">
    <input type="text" searchWhenChanged="false" token="discription_query">
      <label>Omschrijving query:</label>
      <default></default>
    </input>
    <input type="text" token="query">
      <label>Query:</label>
      <default></default>
    </input>
    <input type="text" token="user">
      <label>Naam:</label>
      <default></default>
    </input>
    <input type="dropdown" token="tokPanel" searchWhenChanged="false">
      <label></label>
      <choice value="1">Toevoegen</choice>
      <choice value="2">Verwijderen</choice>
      <default>Kies toevoegen of verwijderen</default>
    </input>
  </fieldset>
  <row>
    <panel depends="$pan1$">
      <title>Query is toegevoegd/add query</title>
      <table>
        <search>
          <query>
            <![CDATA[ | inputlookup threat_hunting.csv | append [ | stats count | eval query_discription="$discription_query$" | eval query_q="$query$" | lookup dnslookup clientip As src OUTPUT clienthost AS src_host | lookup dnslookup clientip As dest OUTPUT clienthost AS dest_host | stats count(src) by src src_host | eval tnow=strftime(now(), "%a %m/%d/%Y %H:%M") | eval user="$user$" | eval id=100 ] | stats count by query_discription query_q id tnow user| fields query_discription query_q id tnow user | outputlookup threat_hunting.csv ]]>
          </query>
        </search>
      </table>
    </panel>
    <panel depends="$pan2$">
      <title>Query is verwijderd/delete query</title>
      <table>
        <search>
          <query> | inputlookup threat_hunting.csv | stats count by query_discription query_q id tnow user | fields - count | where query_discription !="$discription_query$" | outputlookup threat_hunting.csv </query>
        </search>
      </table>
    </panel>
  </row>
</form>
Is it possible to get alerts through WhatsApp? If so, how?
Hello everyone, I am trying to delete my modular input and the input inside the Configuration tab --> Account. When trying to add a new account, I am not able to delete the existing one; I get an error message like "Object id=xxx://Test cannot be deleted in config=inputs in splunk". Can anyone please let me know how to delete it? Thank you.