All Topics

After upgrading our environment from 8.1.3 to 8.2.3, some searches return "StatsFileWriterLz4 file open failed". Our environment is one Search Head, one Indexer, and one Cluster Master connected in an indexer cluster, with one Heavy Forwarder and a few Universal Forwarders. All servers run Windows 2019.

A specific error message looks like this:

StatsFileWriterLz4 file open failed file=E:\Splunk\var\run\splunk\srtemp\374915603_1292_at_1637749050.1\statstmp_merged_5.sb.lz4

after running this tstats search:

| tstats max(_time) as time max(_indextime) as indextime where index IN ("operations_script_log","telemetry*") sourcetype=* earliest=-30d@d latest=@d by sourcetype host source date_year date_month date_mday date_hour date_minute

This search completed and returned the desired results before the upgrade, but won't complete afterward. It runs for about nine seconds before the "StatsFileWriterLz4 file open failed" error appears and stops the search. Has anybody encountered this problem before and found a solution?
Hello,

Can you please tell me why the search below does not work?

| rest splunk_server=local servicesNS/-/-/data/ui/views/ | where update > relative_time(now(),"-10d@d")

I want to find the dashboards that were updated in the last 10 days, but it does not seem to return anything. Is it because I need to fix the timestamp format?

Thanks!
Hello, I'm new to Splunk and would like to know whether it is possible to view the logs for one date on each page. Would it be possible to have 4 pages? Since I cannot choose the number of logs to show per page (I do not know how many logs each date might have), it is only a coincidence that there are 2 per date. Thanks to all.
Hello, I'm having trouble finding a proper solution for this problem: I created a custom alert action for Splunk v6.6.3 (I know, it is very old) that executes a Python script that runs an SPL query through the Splunk REST API. At the moment the credentials for the Splunk REST API are written in cleartext in the script; is there a way to encrypt them so they are not visible to other users who can read the script code? Token authentication would have been the proper solution, but it is not available in this old Splunk version. Do you know of any other solutions for this issue?

Thanks in advance, have a good day!
Hello! I love the Lookup File Editor app (https://splunkbase.splunk.com/app/1724/), but for some time it has behaved strangely on our Splunk Cloud instance. Most annoyingly, it doesn't list any of the lookups/KV stores, i.e. /en-GB/app/lookup_editor/lookup_list is completely empty. I can't tell you exactly when it stopped working, but it was likely 4-5 months ago; I just haven't had time to look into it.

There is an information icon in the top right corner that, upon clicking, reads: "Dashboard Updated. This dashboard has been updated. If you have any issues with this dashboard, contact the dashboard's owner. You can also temporarily open a previous view of this dashboard." That previous view used to open a list of the lookups and KV stores. But after updating to the latest version 3.5 today (and subsequently uninstalling and reinstalling the app while troubleshooting), it now just gives me another blank list even when using the temporary link, so it is now completely useless.

Sharing is "Global", permissions are set so "Everyone" has "Read" and "sc_admin" has "Write", and the app is enabled. The health dashboards for "Status" and "Logs" don't show anything. I tested creating lookups and KV stores in multiple apps both before and after; nothing shows in the list.

Any ideas on what is wrong? Any tips on what I can look for while troubleshooting would be appreciated!

Best regards, Victor
Hi, I can't understand why this query works in search but does not work when I insert it in a dashboard. I assign the option chosen in the filter to the mese variable, then I check the choice and, based on what was chosen, return either the sum of the X_MESE_PRECEDENTE columns or the chosen month. The options are "Anno Solare" (solar year), "Anno Fiscale" (fiscal year), and the months. Where am I going wrong? Why does it not work?

Thanks, Antonio

Search version (works):

| loadjob savedsearch="antonio:enterprise:20211025_PASSAGGIO_AGGREGATO_DATE"
| where (sourcetype="fs_ampliamenti_ip" AND OFFERTA="DIRETTA" AND STATO="OK") OR (sourcetype="fs_diretta" AND TIPOLOGIA="SUBNETIP" AND OFFERTA="DIRETTA" AND STATO="OK")
| eval MESEATTUALE=strftime(relative_time(now(), "-0d@d"), "%m")
| eval MESEATTUALE=11
| eval mese="Anno Fiscale" (manual setting of the chosen filter)
| eval ANNOFISCALE=if(MESEATTUALE-3 <= 0, MESEATTUALE-3+12, MESEATTUALE-3)
| rename PROGRESSIVO_MESE as "0_MESE_PRECEDENTE"
| eval SOLARE = mvappend($0_MESE_PRECEDENTE$,$1_MESE_PRECEDENTE$,$2_MESE_PRECEDENTE$,$3_MESE_PRECEDENTE$,$4_MESE_PRECEDENTE$,$5_MESE_PRECEDENTE$,$6_MESE_PRECEDENTE$,$7_MESE_PRECEDENTE$,$8_MESE_PRECEDENTE$,$9_MESE_PRECEDENTE$,$10_MESE_PRECEDENTE$,$11_MESE_PRECEDENTE$,$12_MESE_PRECEDENTE$)
| eval FISCALE=0
| foreach *_MESE_PRECEDENTE [| eval FISCALE = if(<<MATCHSTR>> < ANNOFISCALE, FISCALE + '<<FIELD>>', FISCALE)]
| eval CHI=case( mese="0_MESE_PRECEDENTE",$0_MESE_PRECEDENTE$, mese="1_MESE_PRECEDENTE",$1_MESE_PRECEDENTE$, mese="2_MESE_PRECEDENTE",$2_MESE_PRECEDENTE$, mese="3_MESE_PRECEDENTE",$3_MESE_PRECEDENTE$, mese="4_MESE_PRECEDENTE",$4_MESE_PRECEDENTE$, mese="5_MESE_PRECEDENTE",$5_MESE_PRECEDENTE$, mese="6_MESE_PRECEDENTE",$6_MESE_PRECEDENTE$, mese="7_MESE_PRECEDENTE",$7_MESE_PRECEDENTE$, mese="8_MESE_PRECEDENTE",$8_MESE_PRECEDENTE$, mese="9_MESE_PRECEDENTE",$9_MESE_PRECEDENTE$, mese="10_MESE_PRECEDENTE",$10_MESE_PRECEDENTE$, mese="11_MESE_PRECEDENTE",$11_MESE_PRECEDENTE$, mese="12_MESE_PRECEDENTE",$12_MESE_PRECEDENTE$, mese="Anno Solare",$SOLARE$, mese="Anno Fiscale",$FISCALE$, 1=1, "INV")
| eval RIS = case( mese="Anno Fiscale", FISCALE, mese="Anno Solare", SOLARE, 1=1, CHI)
| stats sum(RIS) as RISULTATO
| table RISULTATO

Dashboard version (does not work):

<query>| loadjob savedsearch="antonio:enterprise:20211025_PASSAGGIO_AGGREGATO_DATE"
| where (sourcetype="fs_ampliamenti_ip" AND OFFERTA="DIRETTA" AND STATO="OK") OR (sourcetype="fs_diretta" AND TIPOLOGIA="SUBNETIP" AND OFFERTA="DIRETTA" AND STATO="OK")
| eval MESEATTUALE=strftime(relative_time(now(), "-0d@d"), "%m")
| eval mese="$previousmonth$" (this is the token, the chosen filter)
| eval ANNOFISCALE=if(MESEATTUALE-3 &lt;= 0, MESEATTUALE-3+12, MESEATTUALE-3)
| rename PROGRESSIVO_MESE as "0_MESE_PRECEDENTE"
| eval SOLARE = mvappend($$0_MESE_PRECEDENTE$$,$$1_MESE_PRECEDENTE$$,$$2_MESE_PRECEDENTE$$,$$3_MESE_PRECEDENTE$$,$$4_MESE_PRECEDENTE$$,$$5_MESE_PRECEDENTE$$,$$6_MESE_PRECEDENTE$$,$$7_MESE_PRECEDENTE$$,$$8_MESE_PRECEDENTE$$,$$9_MESE_PRECEDENTE$$,$$10_MESE_PRECEDENTE$$,$$11_MESE_PRECEDENTE$$,$$12_MESE_PRECEDENTE$$)
| eval FISCALE=0
| foreach *_MESE_PRECEDENTE [| eval FISCALE = if(&lt;&lt;MATCHSTR&gt;&gt; &lt; ANNOFISCALE, FISCALE + '&lt;&lt;FIELD&gt;&gt;', FISCALE)]
| eval CHI=case( mese="0_MESE_PRECEDENTE",$$0_MESE_PRECEDENTE$$, mese="1_MESE_PRECEDENTE",$$1_MESE_PRECEDENTE$$, mese="2_MESE_PRECEDENTE",$$2_MESE_PRECEDENTE$$, mese="3_MESE_PRECEDENTE",$$3_MESE_PRECEDENTE$$, mese="4_MESE_PRECEDENTE",$$4_MESE_PRECEDENTE$$, mese="5_MESE_PRECEDENTE",$$5_MESE_PRECEDENTE$$, mese="6_MESE_PRECEDENTE",$$6_MESE_PRECEDENTE$$, mese="7_MESE_PRECEDENTE",$$7_MESE_PRECEDENTE$$, mese="8_MESE_PRECEDENTE",$$8_MESE_PRECEDENTE$$, mese="9_MESE_PRECEDENTE",$$9_MESE_PRECEDENTE$$, mese="10_MESE_PRECEDENTE",$$10_MESE_PRECEDENTE$$, mese="11_MESE_PRECEDENTE",$$11_MESE_PRECEDENTE$$, mese="12_MESE_PRECEDENTE",$$12_MESE_PRECEDENTE$$, mese="Anno Solare",$$SOLARE$$, mese="Anno Fiscale",$$FISCALE$$, 1=1, "INV")
| eval RIS = case( mese="Anno Fiscale", FISCALE, mese="Anno Solare", SOLARE, 1=1, CHI)
| stats sum(RIS) as RISULTATO
| table RISULTATO</query>
How to sum the values of a multivalue token in Simple XML? Let's say you have a mv token named test1 with values of: 1,2,3 How to achieve something like: <eval token="test2">sum($test1$)</eval> Thanks!
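As far as I know, a Simple XML `<eval>` token expression has no direct way to sum a multivalue token, so the sum usually has to be computed by splitting on the delimiter and adding the parts, typically in the search that sets the token. A minimal Python sketch of that split-and-sum logic, with the token name and comma delimiter taken from the question:

```python
# Sketch of the split-and-sum logic for a comma-separated multivalue
# token such as test1 = "1,2,3".
def sum_token(token_value: str, delim: str = ",") -> float:
    """Split the token string on the delimiter and sum the numeric parts."""
    parts = [p.strip() for p in token_value.split(delim) if p.strip()]
    return sum(float(p) for p in parts)

print(sum_token("1,2,3"))  # → 6.0
```

In a dashboard this is typically done in a search feeding the token rather than in the `<eval>` tag itself.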
Hi all, I have a log with 3 events inside it (you can see it on the screenshot; I pasted the sample logs here: https://regex101.com/r/EvmMeR/1):

1st event - short
2nd event - short and multi-line
3rd event - VERY LONG and multi-line; needs to be dropped per the client.

I managed to drop the 3rd event by finding the logs that are greater than 2000 characters. The problem is that although I dropped the event, Splunk still raises this warning:

11-24-2021 08:02:57.049 +0000 WARN LineBreakingProcessor [6453 parsing] - Truncating line because limit of 2000 bytes has been exceeded with a line length >= 55179 - data_source="SAMPLETOSHARE.txt", data_host="5bfd55dbdcdd", data_sourcetype="sample"

Is there a way to stop Splunk from flagging this issue for logs that were dropped?
Hi, a user is reporting the following: "From hostname1, we are pushing syslog to the Splunk indexer server IP 10.20.30.40 via port 55XY; can you please check whether anything needs to be done on the Splunk end to see the data in Splunk?" Can anyone please help me with this?

Regards, Rahul
Dear Professor, I have two alert searches like this:

1. Search 1:

index="abc" sourcetype="abc" service.name=financing request.method="POST" request.uri="*/applications" response.status="200"
| timechart span=2m count as applicaton_today
| eval mytime=strftime(_time,"%Y-%m-%dT%H:%M")
| eval yesterday_time=strftime(_time,"%H:%M")
| fields _time,yesterday_time,applicaton_today

And here is the output.

2. Search 2:

index="xyz" sourcetype="xyz" "Application * sent to xyz success"
| timechart span=2m count as omni_today
| eval mytime=strftime(_time,"%Y-%m-%dT%H:%M")
| eval yesterday_time=strftime(_time,"%H:%M")
| fields _time,yesterday_time,omni_today

And here is the output.

3. I tried to combine the two searches like this, then calculate the spike:

index="abc" sourcetype="abc" service.name=financing request.method="POST" request.uri="*/applications" response.status="200"
| timechart span=2m count as app_today
| eval mytime=strftime(_time,"%Y-%m-%dT%H:%M")
| eval yesterday_time=strftime(_time,"%H:%M")
| append [search index="xyz" sourcetype="xyz" "Application * sent to xyz" | timechart span=2m count as omni_today]
| fields _time,yesterday_time,app_today,omni_today
| eval spike=if(omni_today < app_today AND _time <= now() - 3*60 AND _time >= relative_time(now(),"@d") + 7.5*3600, 1, 0)

Here is the output. But it shows two time spans (as in the image). How can I combine the two searches onto a single time span?

Thank you for your help.
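The single-axis result comes from aligning the two series on the shared timestamp after combining them. Sketched in plain Python with made-up per-2-minute sample series standing in for the two timecharts:

```python
# Stand-ins for the two timechart outputs, keyed by timestamp.
app_today  = {"08:00": 4, "08:02": 6}
omni_today = {"08:02": 3, "08:04": 7}

# Align on the union of timestamps so each row carries both counts,
# defaulting a missing side to 0: one shared time axis instead of two.
merged = {t: (app_today.get(t, 0), omni_today.get(t, 0))
          for t in sorted(set(app_today) | set(omni_today))}

for t, (a, o) in merged.items():
    print(t, a, o)
```

In SPL, one common way to get the same alignment is to follow the `append` with a stats aggregation by `_time` (e.g. `| stats sum(*) as * by _time`) so both counts land on one row per time bucket; treat that as a sketch, not the only option.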
Can someone help me extract correlation_id from the sample data below? The requirement is to extract correlation_id into its own field.

ys_class_name="Incident",closed_by="",dv_closed_by="",follow_up="",dv_follow_up="",parent_incident="",dv_parent_incident="",reopened_by="",dv_reopened_by="",reassignment_count="1",dv_reassignment_count="1",assigned_to="c8c62ea2db51f090439694d3f39619dc",dv_assigned_to="pusapati dixitulu",u_reopening_reason="",dv_u_reopening_reason="None",sla_due="",dv_sla_due="UNKNOWN",comments_and_work_notes="",u_transfer_between_users="",dv_u_transfer_between_users="",agile_story="",dv_agile_story="",escalation="0",dv_escalation="Normal",upon_approval="proceed",dv_upon_approval="Proceed to Next Task",correlation_id="f725d663-7c62-4f50-82b1-1483df23562e",dv_correlation_id="f725d663-7c62-4f50-82b1-1483df23562e",u_business_area="",dv_u_business_area="None",u_plb="",dv_u_plb="None",u_division="",dv_u_division="",u_bu_code="",dv_u_bu_code="",u_is_escalated="false",dv_u_is_escalated="false",child_incidents="0",dv_child_incidents="0",task_effective_number="INC4750863",dv_task_effective_number="INC4750863",u_last_assignment="2021-11-24 05:49:28",dv_u_last_assignment="2021-11-24 06:49:28",resolved_by="",dv_resolved_by

Thanks
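A sketch of the extraction using Python's `re` module, with a shortened stand-in for the sample event. The `\b` anchor keeps the similarly named `dv_correlation_id` key from matching; in Splunk a `rex` built on the same pattern idea would apply:

```python
import re

# Shortened stand-in for the sample event; the real event has many more keys.
event = ('upon_approval="proceed",correlation_id="f725d663-7c62-4f50-82b1-1483df23562e",'
         'dv_correlation_id="f725d663-7c62-4f50-82b1-1483df23562e"')

# \b does not match between "dv_" and "correlation_id" (underscore is a word
# character), so only the bare correlation_id key is captured.
m = re.search(r'\bcorrelation_id="([^"]+)"', event)
print(m.group(1))  # → f725d663-7c62-4f50-82b1-1483df23562e
```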
Hi, the following is my search:

index=pace ERROR OR FATAL OUI=* Number=* | stats count by OUI Number | sort -count

After executing the above search I get the following results:

OUI     Number          count
9C3DCF  4W12757WA51F6   18
80CC9C  4W15177LA0AD1   10
0836C9  4W150B70A3837   4
100C6B  4W15077PA0682   3
80CC9C  4W151778A0A39   3
80CC9C  4W15177GA0A5D   3

Note: the Number column holds the results I am interested in. I have a separate table named subsdeviceextract.csv with the following columns:

MAC                Model  OUI     Post Code  Serial Number
08:36:C9:9A:F4:6C  V6510  0836C9  2775       4W150B70A012A
08:36:C9:9B:5C:FE  V6510  0836C9  6437       4W150B70A07A8
08:36:C9:9C:A8:20  V6510  0836C9  2641       4W150B70A110A

I would like to look up the Serial Number to get the Model. Please help me, thank you.
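The shape of the operation is a key-to-value mapping from Serial Number to Model. A plain-Python sketch using two rows copied from the table (in SPL this would be a `lookup` against subsdeviceextract.csv; the exact CSV field names are assumed to match the headers shown above):

```python
import csv
import io

# Two rows copied from subsdeviceextract.csv, reduced to the columns we need.
lookup_csv = """Serial Number,Model
4W150B70A012A,V6510
4W150B70A07A8,V6510
"""

# Build the Serial Number -> Model mapping.
serial_to_model = {row["Serial Number"]: row["Model"]
                   for row in csv.DictReader(io.StringIO(lookup_csv))}

# Enrich a serial from the stats results; unknown serials fall through.
print(serial_to_model.get("4W150B70A012A", "unknown"))  # → V6510
```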
Hi, when I try to install the Splunk universal forwarder I get an error. Please help me.

Linux machine:
root@client1:/opt/splunkforwarder/bin# uname -r
4.19.0-18-cloud-amd64

Package:
splunkforwarder-8.2.3-cd0848707637-Linux-armv8.tgz

Error:
root@client1:/opt# cd splunkforwarder
root@client1:/opt/splunkforwarder# cd bin/
root@client1:/opt/splunkforwarder/bin# ./splunk start --accept-license
-bash: ./splunk: cannot execute binary file: Exec format error
root@client1:/opt/splunkforwarder/bin# ./splunk start
-bash: ./splunk: cannot execute binary file: Exec format error
root@client1:/opt/splunkforwarder/bin# ./splunkd start
-bash: ./splunkd: cannot execute binary file: Exec format error
Hi, I am trying to filter data by week using 2 dropdowns; please find the info below. The code throws the error "Error in 'where' command: The operator at '>=2101 and week<=2152' is invalid." Please suggest.

Both inputs are dropdowns. The dropdown for the starting week: token is "from_week_token", label is "From week", fieldForLabel is "week", fieldForValue is "week", default is "2101". Its query is:

source="pdfthroughput_pdf_patches.json" host="LT433534" index="pdf_patches" sourcetype="_json" | eval weekNday=split(planned_stopped_on,".") | eval week=mvindex(weekNday,0) | table week | dedup week | where week>=2101 and week<=2152

The dropdown for the end week: token is "to_week_token", label is "To week", default is "2152", fieldForLabel is "week", fieldForValue is "week". Its query is the same:

source="pdfthroughput_pdf_patches.json" host="LT433534" index="pdf_patches" sourcetype="_json" | eval weekNday=split(planned_stopped_on,".") | eval week=mvindex(weekNday,0) | table week | dedup week | where week>=2101 and week<=2152
Hi, I am trying to speed up a query. When I run:

index=foo | stats values(host) as F_host

it takes less than a minute to return the results. I want to take those results, create an outputlookup, and match host values against another lookup. However, I need to split the stats results into individual values. Something like:

... | makemv delim=" " F_host | outputlookup ...

or maybe:

... | eval D_host = split(F_host, " ")

If I run the original query as:

index=foo | lookup bar-host.csv barHost AS host OUTPUTNEW barHost as match-host | stats values(match-host) by host

it takes forever. In this case, bar-host.csv is the lookup filename and barHost is the field name. Maybe this is just plain old wrong; any advice appreciated. Thank you.
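The speed-up being reached for is "aggregate first, enrich second": collapse the events down to a small set of distinct hosts, then match those few rows against the lookup, instead of running the lookup across every raw event. A plain-Python analog of that ordering (the host names and lookup contents are made up for illustration):

```python
# Stand-ins: raw events (many rows) and bar-host.csv (host -> matched value).
events = [{"host": "a"}, {"host": "b"}, {"host": "a"}, {"host": "c"}]
lookup = {"a": "match-a", "c": "match-c"}

# Step 1: aggregate. Like `stats values(host)`, this collapses the many
# raw events into the distinct hosts.
hosts = sorted({e["host"] for e in events})

# Step 2: enrich the few aggregated rows, not the many raw events.
enriched = [{"host": h, "match_host": lookup.get(h, "")} for h in hosts]
print(enriched)
```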
Hello all, I am trying to set up a search that logs ufw commands while ignoring any ufw status commands. I have tried a number of methods so far but cannot get the COMMAND field to filter appropriately. Here is a version of the search:

```
index="*" host="*dev*" source="/var/log/auth.log" process="sudo" COMMAND="/usr/sbin/ufw" | table _time host user _raw | where COMMAND!="*/usr/sbin/ufw status*"
```

I've tried a number of things, including NOT instead of !, searching for various strings (status, *status*, etc.), filtering on the _raw field instead of COMMAND, using search instead of where, and putting the table after the where. I cannot get the events to filter out; it seems like I either get all the events or none of them, depending on the filter I choose. Any help here? Thank you!
Hello, I would like to ask how to assign a value to another variable and set an alert. I have this data output from Splunk, and I want to send an alert when a value is greater than a threshold like 10 or 20; for example, when the TX_UPS value >= 10, send an alert. How should I approach this in a Splunk alert job?

shipper count
TX_UPS 10
TX_USPS 15
TX_FedEx 5
CO_UPS 5
CO_USPS 9
CO_FedEx 2
MO_UPS 5
MO_USPS 20
MO_FedEx 3
GA_UPS 15
GA_USPS 10
GA_FedEx 5
PA_UPS 9
PA_USPS 21
PA_FedEx 8
NY_UPS 30
NY_USPS 99
NY_FedEx 20

index=main AND "*TRACKING*" | stats count by shipper
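The alert logic itself is just a threshold filter over the stats rows. A Python sketch using a subset of the table above (in Splunk, the usual pattern is to append something like `| where count >= 10` to the search and set the alert to trigger when the number of results is greater than zero):

```python
# (shipper, count) rows from the stats output above (subset shown).
rows = [
    ("TX_UPS", 10), ("TX_USPS", 15), ("TX_FedEx", 5),
    ("CO_UPS", 5), ("NY_UPS", 30), ("NY_USPS", 99),
]
THRESHOLD = 10

# Keep only the rows that should fire an alert.
breaches = [(shipper, count) for shipper, count in rows if count >= THRESHOLD]
print(breaches)  # → [('TX_UPS', 10), ('TX_USPS', 15), ('NY_UPS', 30), ('NY_USPS', 99)]
```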
I have a base search:

index=oswin EventCode=19 SourceName="Microsoft-Windows-WindowsUpdateClient" earliest=-10d ComputerName=*.somedomain.com | rex "\WKB(?<KB>.\d+)\W"

The result populates field KB with a list of values similar to:

5007192
5008601
890830

I need to test whether KB contains one of the following: "5008601", "5008602", "5008603", "5008604", "5008605", "5008606". If a match is found, populate the new field HotFixID with the matched value; if no match is found, populate HotFixID with "NotInstalled".

Using search KB IN (5008601,5008602,5008603,5008604,5008605,5008606) returns matched values only. The case function works only if the matched value is the last one evaluated; otherwise it returns "NotInstalled" even though a match is present.
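The desired logic is a set intersection with a default: collect every KB that appears in the wanted set, and fall back to "NotInstalled" only when nothing matches. A Python sketch of exactly that rule, with the values taken from the question:

```python
# KBs we are looking for, from the question.
WANTED = {"5008601", "5008602", "5008603", "5008604", "5008605", "5008606"}

def hotfix_status(installed_kbs):
    """Return all wanted KBs present, or ["NotInstalled"] when none match."""
    matches = [kb for kb in installed_kbs if kb in WANTED]
    return matches if matches else ["NotInstalled"]

print(hotfix_status(["5007192", "5008601", "890830"]))  # → ['5008601']
print(hotfix_status(["890830"]))                        # → ['NotInstalled']
```

This also suggests why the case() approach misbehaves: case() returns on the first true condition per event, so matching each value of a multivalue field needs a per-value operation (e.g. mvfilter) rather than a single case chain.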
Greetings, has anyone successfully connected to Azure SQL from DB Connect using Azure Active Directory? I have installed the MSSQL JDBC driver version 9.4, which Azure supports. I am using the following connection string:

jdbc:sqlserver://<instance_url>:<instance_port>;database=<db>;encrypt=true;trustServerCertificate=false;hostNameInCertificate=<instance_domain>;loginTimeout=30;authentication=ActiveDirectoryPassword

I am getting the error: "Failed to load MSAL4J Java library for performing ActiveDirectoryPassword authentication."

Thanks in advance.
Hello, I have a dashboard with a text input that has id "text_input". With JavaScript I am listening for changes to that input, and when a change happens I just want to read the new value into a variable. Instead of reading the newly inserted value, though, it reads the previous value. The following code gives an idea of what I am doing:

var def_tok = mvc.Components.get("default");
var sub_tok = mvc.Components.get("submitted");
...
$("#text_input").on('change', function () {
    console.log("Change Detected");
    var sub_tok_input = sub_tok.get("text_input");
    var sub_tok_input_form = sub_tok.get("form.text_input");
    var def_tok_input = def_tok.get("text_input");
    var def_tok_input_form = def_tok.get("form.text_input");
    console.log("sub_tok_input: " + sub_tok_input);
    console.log("sub_tok_input_form: " + sub_tok_input_form);
    console.log("def_tok_input: " + def_tok_input);
    console.log("def_tok_input_form: " + def_tok_input_form);
});

When I fill in the first value from blank, the first print reads blank instead of the value. Then when I update the value it reads the first value, and so on.

What am I doing wrong, and how can I ensure that I get the newly inserted value instead of the previously inserted one?

Thanks! Andrew