All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


When an array of dictionaries is assigned to the output variable of a code block, only the whole array can be used as input in the following blocks; using a data path selector for key selection throws an error.

Value of the output variable:

[
  { "tags": [ "tag1" ], "domain": "example1.com" },
  { "tags": [ "tag2" ], "domain": "example2.com" }
]

Data paths:

Working: function_name:custom_function:result_domains
Not working: function_name:custom_function:result_domains.*.domain

The failing selection throws the following error:

Python Error: Traceback (most recent call last):
  File "../pylib/phantom/decided/internals.py", line 445, in call_playbook_action_callback
  File "../pylib/phantom/decided/internals.py", line 159, in invoke_callback_for_callable
  File "../pylib/phantom/decided/internals.py", line 268, in _invoke_callback_for_callable
  File "<playbook>", line 93, in get_url_calltrace_callback
  File "<playbook>", line 399, in domains
  File "<playbook>", line 499, in code_11
  File "/opt/phantom/usr/python36/lib/python3.6/json/__init__.py", line 354, in loads
    return _default_decoder.decode(s)
  File "/opt/phantom/usr/python36/lib/python3.6/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/opt/phantom/usr/python36/lib/python3.6/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Is this a known limitation of code blocks (as opposed to custom functions) in Splunk SOAR?
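One possible workaround, if the data path selector cannot dig into the array: consume the whole-array data path in a downstream code block and do the key selection in Python. This is a sketch, not confirmed SOAR behavior; `extract_domains` and the parameter name `result_domains` are names invented here for illustration.

```python
def extract_domains(result_domains):
    """Return the 'domain' value from each dictionary in the array,
    instead of relying on the *.domain data path selector."""
    return [entry.get("domain") for entry in result_domains
            if isinstance(entry, dict)]

# Using the example array from the question above:
domains = extract_domains([
    {"tags": ["tag1"], "domain": "example1.com"},
    {"tags": ["tag2"], "domain": "example2.com"},
])
print(domains)  # ['example1.com', 'example2.com']
```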
My requirement is to run this alert with a time range of 12 hours and send an email twice a day (every 12 hours) based on what it finds. Here is my configuration:

Cron Expression: * */12 * * *
Time Range: Last 12 hours
Schedule Priority: Default
Schedule Window: 5 minutes

In my local time it runs between 9:30 AM - 10:30 AM and 9:30 PM - 10:30 PM. But within those windows (say between 9:30 AM and 10:30 AM), it triggers multiple email alerts, at a frequency of roughly one alert every 2 minutes. What I want is for it to send one email per run (i.e. one email every 12 hours). Can anyone advise what to change in the scheduling options to achieve this?
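For what it's worth, in standard cron syntax `* */12 * * *` matches every minute of every 12th hour, which fits the roughly-every-2-minutes behavior described above. Pinning the minute field makes the schedule fire once per 12-hour period. A sketch in savedsearches.conf terms (the stanza name is an assumption):

```
# savedsearches.conf -- sketch
[My 12-hourly alert]
# minute 30 of hours 00 and 12: exactly two runs per day
cron_schedule = 30 */12 * * *
dispatch.earliest_time = -12h
dispatch.latest_time = now
```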
I am getting the error message below after configuring the alert. Could you please suggest what the next step is?

Pathname [9188 AlertNotifierWorker-0] - Pathname 'C:\Program Files\Splunk\bin\Python3.exe C:\Program Files\Splunk\etc\apps\search\bin\sendemail.py

These are the steps I followed:
1) Configured email settings
2) Also enabled the port number
Hi Splunkers, I have a question. We are currently using Splunk Enterprise 8.2.5. This morning, the etc/password file was auto-updated, and the change was detected by a third-party tool (confidential). I never changed the file, so my question is: does Splunk auto-update the $SPLUNK_HOME/etc/password file? Please point me to any relevant Splunk documentation.
I would like to transfer data from the data source to a forwarder via syslog over TLS. Is it possible to use the default SSL certificate provided by Splunk for this transfer? And is it possible to use the default Splunk SSL certificate on non-Splunk equipment?
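For reference, receiving syslog over TLS on a forwarder is configured with a tcp-ssl input plus an [SSL] stanza in inputs.conf. The sketch below shows the shape of that configuration; the port and certificate path are assumptions, and note that the default certificates shipped with Splunk are shared across installations, so Splunk's own documentation advises against using them in production:

```
# inputs.conf on the forwarder -- a sketch
[tcp-ssl:6514]
sourcetype = syslog

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/server.pem
sslPassword = password
```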
Hi community, I am new to Splunk and am considering evaluating it as our enterprise log collection and SIEM setup. If I forward logs to a Splunk forwarder, which then forwards them on to a Splunk server, will the Splunk server be able to parse the real IP address of the original log source? Or will it see the forwarder's IP as the source IP? We want to forward all our server logs to this forwarder and then on to the server, but being able to see real IP addresses is what we are concerned with. Thanks
Hi, I would like to implement a Splunk alert that checks whether a particular event happened after a certain other event, where all the events are grouped by the same request-id. I wonder if you could help with this, thanks.

queryA:

index=app class=ClassA conditionA=aVal | fields rid, _time | table rid, _time

Each result (rid, _time) is unique.

queryB:

index=app class=ClassB conditionB=bVal rid=queryA.rid and _time > queryA._time

I would like to get the alert if queryB has a result. Represented as SQL, it would look like this:

select field1, field2 ...
from queryB as B,
     (select id, _time from queryA where afield1=someval and afield2=val2) as A
where B.id=A.id and B._time > A._time

Any help would be greatly appreciated, thanks.
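One common SPL pattern for this kind of "event B after event A, same id" check is to search both event types in one query and compare times per request-id with stats. A sketch, reusing the index, class, and field names from the question:

```
index=app (class=ClassA conditionA=aVal) OR (class=ClassB conditionB=bVal)
| eval a_time=if(class="ClassA", _time, null()), b_time=if(class="ClassB", _time, null())
| stats min(a_time) as a_time, max(b_time) as b_time by rid
| where isnotnull(a_time) AND isnotnull(b_time) AND b_time > a_time
```

Any rows surviving the final where clause are rids where a ClassB event occurred after the ClassA event, which is the alert condition.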
Hi All, I have 12 months' worth of data contained in individual .csv files in my lookup tables, i.e. | inputlookup JanStats.csv, | inputlookup FebStats.csv, etc. I have a dashboard with about 12 different panels, each showing different stats and visuals, all coming from the one lookup table. For example, one panel shows how many hours were spent in each category:

| inputlookup JanStats.csv | stats sum(Duration) as total by Entry

Instead of creating 12 dashboards (one per month) with all the same panels, I want to use a dropdown input to select the month and have that month's data from the corresponding lookup display across the 12 panels on the one dashboard. I started experimenting with the XML below, which works as far as it goes, but it only drives the one panel's search, whereas I have 12 panels to update each time the dropdown changes. I am struggling to figure out how to add in the extra panels (they use similar searches, e.g. stats sum(Recovery) by User).
<form>
  <label>Drop Down Testing</label>
  <fieldset submitButton="false"></fieldset>
  <row>
    <panel>
      <input type="dropdown" token="Month" searchWhenChanged="true">
        <choice value="1">January</choice>
        <choice value="2">February</choice>
        <choice value="3">March</choice>
        <default></default>
        <change>
          <condition value="1">
            <set token="new_search">| inputlookup JanStats.csv | stats sum(Duration) AS total by Entry | table total</set>
          </condition>
          <condition value="2">
            <set token="new_search">| inputlookup FebStats.csv | stats sum(Duration) AS total by Entry | table total</set>
          </condition>
          <condition value="3">
            <set token="new_search">| inputlookup MarchStats.csv | stats sum(Duration) AS total by Entry | table total</set>
          </condition>
        </change>
      </input>
      <single>
        <search>
          <query>$new_search$</query>
          <earliest>-4h@m</earliest>
          <latest>now</latest>
        </search>
        <option name="colorMode">block</option>
        <option name="drilldown">all</option>
        <option name="rangeColors">["0x53a051","0x0877a6","0xf8be34","0xf1813f","0xdc4e41"]</option>
        <option name="useColors">1</option>
      </single>
    </panel>
  </row>
</form>
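One way to drive many panels from a single dropdown, sketched below: instead of storing one full search per condition, make the token hold only the lookup filename and reference $Month$ inside each panel's own query. The second panel's search is illustrative, borrowing the sum(Recovery) by User example from the question:

```xml
<fieldset submitButton="false">
  <input type="dropdown" token="Month" searchWhenChanged="true">
    <choice value="JanStats.csv">January</choice>
    <choice value="FebStats.csv">February</choice>
    <choice value="MarchStats.csv">March</choice>
  </input>
</fieldset>
<row>
  <panel>
    <single>
      <search>
        <query>| inputlookup $Month$ | stats sum(Duration) AS total by Entry | table total</query>
      </search>
    </single>
  </panel>
  <panel>
    <single>
      <search>
        <query>| inputlookup $Month$ | stats sum(Recovery) AS total by User | table total</query>
      </search>
    </single>
  </panel>
</row>
```

With this shape, adding the remaining panels is just repeating the panel element with a different query; every panel re-runs automatically when the dropdown selection changes.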
We have synthetic tests and use the availability metric in our SLA reporting. Sometimes our tests fail due to a false positive. It would be great to be able to click a Failed Session in the controller UI and mark it as OK.
We are running the latest update of Splunk Enterprise Security, which includes the new "Cloud Security" option. In Cloud Security, I can see some data when using the "Microsoft 365 Security" option. However, no data is shown for the following options:

Security Groups
IAM Activity
Network ACLs
Access Analyzer

Is there some configuration that I have missed? Thanks. Steve Rogers
We are running in Splunk Cloud and have configured the "Splunk Add-on for Microsoft Cloud Services" based on the provided configuration documentation. I am trying to use the Microsoft Azure App for Splunk to view Azure data (which I presumed would be pulled in by the add-on), but the Microsoft Azure App for Splunk shows no data at all. I have verified the add-on configuration, but am still not seeing any data. Does anyone have this app working and displaying results? Best regards, Steve Rogers
Hello, I need to build a search where I can subtract a token from the previous value in a row. Example: I know how to get the first count (800), which is calculated by a query I already have. What I do not know is how to make the token subtract from the value of the cell directly above. Does anyone know how to write this in Splunk query logic?

_time   Count   Notes
05:00   800     Saved token = 100
05:05   700     800-100
05:10   600     700-100
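Since each row in the example is just the previous row minus a constant, the column can be produced without true row-to-row recursion by numbering the rows with streamstats and using the closed form. A sketch, assuming a dashboard token $tok$ holds the value 100 and the initial count of 800 comes from the existing query:

```
| streamstats count as row
| eval Count = 800 - $tok$ * (row - 1)
```

With $tok$ = 100 this yields 800, 700, 600, ... down the rows, matching the table above.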
Looking for the new location to check for Splunk patches; it used to be here: https://www.splunk.com/en_us/product-security/announcements-archive.html. We are required to check for any new patches monthly, and the location has moved.
I'm trying to create a table of availabilities (percent uptime) for a given service across a set of hosts. My desired output is a simple 2-column table of "Host" and "Availability (%)", like the one below (rhnsd availabilities):

Host       Availability
my-db-1    100%
my-db-2    97.5%
my-db-3    100%
my-db-4    72.2%

I have a query I currently use to get the availability of a service for a single host, but I'd like to scale it up to produce the output above. It assumes ps.sh runs every 1800 seconds, takes the number of events found over a given time period (info_max_time - info_min_time), and divides that by the total number of 1800-second intervals that fit in the period, with some conditions for when no host matches or the availability exceeds 100. That query is as follows:

index=os host="my-db-1.mydomain.net" sourcetype=ps rhnsd
| stats count, distinct_count(host) as hostcount
| addinfo
| eval availability=if(hostcount=0,0,if(count>=(info_max_time-info_min_time)/1800,100,count/((info_max_time-info_min_time)/1800))*100)
| table availability

Or, if there's a much easier way to accomplish this that I don't know about, I'm all ears. Any help is greatly appreciated.
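A sketch of the per-host generalization (the host wildcard pattern is an assumption): group by host instead of filtering to one, then apply the same interval arithmetic per row, capping at 100:

```
index=os host="my-db-*.mydomain.net" sourcetype=ps rhnsd
| stats count by host
| addinfo
| eval availability=round(min(100, count/((info_max_time-info_min_time)/1800)*100), 1)
| table host, availability
```

One caveat: hosts that produced zero events in the time range emit no rows at all, so they would not appear as 0% here; covering those would need the expected host list supplied from a lookup.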
I am new to Splunk and am trying to parse an AIDE scan log file so that each line displays as its own event. Currently, Splunk reads all the lines as a single event. I know I may have to build a regex once Splunk is reading the file correctly, but at the moment Splunk isn't breaking events on the newline character. (Sample data was attached as a screenshot.) How can I get Splunk to parse each line instead of reading the entire file as a single event?
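Event breaking of this kind is normally controlled in props.conf on the indexer or heavy forwarder. A sketch of a one-event-per-line sourcetype definition; the sourcetype name aide:scan is an assumption:

```
# props.conf -- sketch
[aide:scan]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
```

SHOULD_LINEMERGE = false stops Splunk from merging lines into multi-line events, and LINE_BREAKER makes each newline a hard event boundary.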
Hello, I am looking for a way to partially join two inputlookups.

Lookup 1: username, name
jsmith, John
jdoe, Joe

Lookup 2: username, status
jsmith-sa, enabled

I would like to return a match of jsmith to jsmith-sa, but have not been able to figure out how to do a partial join, i.e. search for jsmith* against lookup 2 rather than for exact matches. The second lookup may have the entire keyword, or keyword-something.

The search should return: jsmith, jsmith-sa, enabled
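One sketch of the partial join (the lookup file names lookup1.csv and lookup2.csv are assumptions): normalize the second lookup's username down to its base by stripping any -suffix, then join on that base value:

```
| inputlookup lookup2.csv
| rename username as matched_username
| eval base=replace(matched_username, "-.*$", "")
| join type=inner base
    [ | inputlookup lookup1.csv | rename username as base ]
| table base, matched_username, status
```

On the sample data this would yield the row jsmith, jsmith-sa, enabled; a plain jsmith entry in lookup 2 would also match, since stripping a nonexistent suffix leaves the keyword unchanged.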
Hello, We are configuring the Splunk add-on for Microsoft Cloud Services.  Is there a corresponding Splunk app for visualization of the data that is ingested by this add-on?   Steve Rogers
I am new to Splunk and need some serious practice to learn all the cool things Splunk can do. I am trying to load the BOTSV1 JSON dataset into my lab environment so I can start learning the basics of SPL. According to the comments on GitHub, this dataset is 120GB uncompressed, which raises the following two issues:

1) The Splunk web file importer will only load files up to 500MB. How am I supposed to load a 120GB file?
2) The Splunk development license that I received is limited to 10GB, so how am I supposed to load this 120GB file once question #1 is resolved?

I am sure I am not the only one encountering this issue, so forgive me for asking a question that has probably already been answered numerous times.
I have created a lookup table with a filename and the cutoff time by which we have to receive each file. I need to compare the cutoff time to check whether it falls within 30 minutes of the current time, retrieve that particular file name from the lookup, and then search for it. Please help me with the query.
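A sketch of the first half (finding files whose cutoff is within the next 30 minutes), under two assumptions: the lookup file is called files.csv with columns filename and cutofftime, and cutofftime is stored as HH:MM on the current day:

```
| inputlookup files.csv
| eval cutoff_epoch=strptime(strftime(now(), "%Y-%m-%d ") . cutofftime, "%Y-%m-%d %H:%M")
| where cutoff_epoch >= now() AND cutoff_epoch <= now() + 1800
| table filename
```

The resulting filename values could then feed the main search, for example via a subsearch that returns them as search terms.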
I have changed the certificate in server.conf so that port 8089 uses my own cert, the same cert I have been using for port 8000. I changed server.conf according to the documentation, rebooted, and verified the certificate: port 8089 is serving the new cert. However, when trying to sign in on port 8000, I get the following error:

Login failed due to incorrectly configured Multifactor authentication. Contact Splunk support for resolving this issue

Please advise; we are using Duo. This is what the sslConfig portion of my server.conf looks like:

[sslConfig]
sslPassword = ******************************************************************************
privKeyPath = C:\Program Files\Splunk\etc\auth\mycerts\*****************.key
serverCert = C:\Program Files\Splunk\etc\auth\mycerts\*****************.pem
sslRootCAPath = C:\Program Files\Splunk\etc\auth\mycerts\************.pem
requireClientCert = False