All Topics


The new SentinelOne App 5.1.1 no longer shows a rank field for threats under sourcetype="sentinelone:channel:threats". How can we determine the rank of a threat now?
If we have an Enterprise license, can we get rid of [lmpool:auto_generated_pool_download-trial] from our server.conf file? I read the documentation in the server.conf write-up and it is totally lacking; it basically says:

* This is the auto generated pool for the download trial stack
* Field descriptions are the same as that for the "lmpool:auto_generated_pool_forwarder"
We have an indexer cluster, and the indexers have a [clustering] stanza in their server.conf files; the same goes for the search heads. However, do the two deployment servers that we have need a [clustering] stanza in their server.conf files as well? I am new to Splunk, and I ask because I noticed that one deployment server has it while the other does not.
How do I make a read-only user/role? I tried to make a new role, but it inherited capabilities from the default roles. Any suggestions?
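A minimal sketch of a read-only role in authorize.conf, assuming search access to the main index is all that is wanted (adjust the index names to yours); leaving importRoles empty prevents inheriting capabilities from the default roles:

[role_readonly]
# inherit nothing, so no capabilities come along implicitly
importRoles =
# grant only the ability to run searches
search = enabled
srchIndexesAllowed = main
srchIndexesDefault = main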
I have a search string that details the last log entry for all running jobs (shown in ascending order) bar a few jobs (as either completed or of little interest):

index=ee_rpa_uipath_platform AND OrganizationUnitID=19
| dedup RobotName
| table User, RobotName, _time, Message
| sort _time
| search (NOT Message IN("*complete*", "*ended*") AND NOT RobotName="ALWORKER*")

What I now want is to reduce the returned set to logs that are over 30 minutes old, i.e. if the log file has NOT been updated in the past 30 minutes then there is a good chance that the job has hung. I have tried the following command:

index=ee_rpa_uipath_platform AND OrganizationUnitID=19
| dedup RobotName
| table User, RobotName, _time, Message
| sort _time
| search (NOT Message IN("*complete*", "*ended*") AND NOT RobotName="ALWORKER*")
| where earliest=-30m@m

...but I receive the following error:

"Error in 'where' command: The operator at 'm@m' is invalid."

Two questions: what's wrong with my syntax, and/or is there a better way to get the oldest last log entry from a set of job log files?
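A hedged sketch of one fix: earliest is a search-time modifier, not a where operator, so the cutoff has to be expressed as a comparison against _time; relative_time(now(), "-30m@m") yields the epoch time 30 minutes ago, snapped to the minute.

index=ee_rpa_uipath_platform OrganizationUnitID=19
| dedup RobotName
| search NOT Message IN("*complete*", "*ended*") NOT RobotName="ALWORKER*"
| where _time < relative_time(now(), "-30m@m")
| table User, RobotName, _time, Message
| sort _time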
Hi, in Splunk I have test automation result logs with details like test case name, test status, error, duration, date, etc. in multiple events. Each event has nearly 20-25 test case details in an array. I need to fetch each test case as a single record. When I use spath, it fetches the fields, but all the test cases of a single event are written as a single record.

index=jenkins OR source=\test OR job_name:"Dev/TestAutomation/Regression" | spath

I am very new to Splunk. Is there any way by which I can write each test case's details as a single record? With these details, my requirement is to create a regression test automation dashboard. Thanks.
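A hedged sketch of one approach, assuming the test cases live in a JSON array (the path testcases{} and the field names below are placeholders; substitute the actual keys from your events): extract the array as a multivalue field, split it into one event per test case with mvexpand, then run spath again on each element.

index=jenkins job_name="Dev/TestAutomation/Regression"
| spath path=testcases{} output=testcase
| mvexpand testcase
| spath input=testcase
| table TestCaseName TestStatus Error Duration Date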
Hi Team, I have installed the PingAccess (https://splunkbase.splunk.com/app/5368/) and PingFederate (https://splunkbase.splunk.com/app/976/) apps on our Splunk search head, and based on the documentation below we edited the log4j2.xml file on the client machines where PingAccess and PingFederate are installed. After that, a huge number of log files were generated in the /log directory, and I ingested all of them into Splunk.

Documentation links:
https://docs.pingidentity.com/bundle/pingfederate-102/page/qst1564002981075.html
https://docs.pingidentity.com/bundle/pingaccess-62/page/gyx1564006725145.html

But when I navigate to the apps, I cannot see any data in the dashboards; they show "no results" and "search is waiting for inputs". Do I need to configure anything else for the data to appear in the dashboards? Neither app shows any data, so kindly help with this.

FYI: I ingested all the log files into the main index, using sourcetype pingaccess for the PingAccess logs, sourcetype pingfederate for the PingFederate logs, and sourcetype pingconsole for the PingConsole logs. Am I missing anything in the configuration? Please let me know, since I want the dashboards to work as expected.
I have 3 different indexes, and they asked me to search them by document number. The structure of the logs is different in each, including the name of the field that contains the document number.

index=index1 OR index=index2 OR index=index3 1234567

As you know, that query is limited to looking for the number 1234567; I can't tell it to show certain fields in a table using table or stats count. Any suggestions?
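A hedged sketch, assuming the three per-index field names are known (doc_field1, doc_field2, doc_field3 below are placeholders): normalize them into one field with coalesce(), which returns the first non-null value, and then table or stats works across all three indexes.

(index=index1 OR index=index2 OR index=index3) 1234567
| eval doc_number=coalesce(doc_field1, doc_field2, doc_field3)
| stats count by index, doc_number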
Hi all, I'm new to Splunk and have the following problem. We need data from a table depending on the value of a variable, and I think a lookup is the right function for this. The special thing about this CSV file is that the value can be in different columns; the returned data is always the same. The CSV table has the following structure:

column1  column2  column3  column4  column5
441      F205E    77889             22558
441      F204E    77998    44556    33669
442      G2071    88992    66557
442               11223    11559

It's possible there is no value inside a field in the table. In case we don't find a match on column2, we search in column4 and last in column5. I tried it as follows:

| lookup Inventar_SBB column2 as Test_field output "column1" as "value1", "column3" as "value2", "column4" as "value3", "column5" as "value4"
| lookup Inventar_SBB "column4" as Test_field outputnew "column1" as "value1", "column3" as "value2", "column4" as "value3", "column5" as "value4"
| lookup Inventar_SBB "column5" as Test_field outputnew "column1" as "value1", "column3" as "value2", "column4" as "value3", "column5" as "value4"

It looks like the lookup function returns NULL when there is no match. With outputnew it won't overwrite the NULL value, and with output it always overwrites my values with NULL. I also tried using different variable names for "value" in each lookup and comparing them to NULL, but that was not working either. Any idea what the workaround for this problem could be? Thanks in advance for any help.
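A hedged sketch of one workaround: give each lookup attempt its own output field names, then merge them with coalesce(), which takes the first non-null value per row (the v*_* field names below are illustrative):

| lookup Inventar_SBB column2 AS Test_field OUTPUT column1 AS v1_a, column3 AS v2_a, column4 AS v3_a, column5 AS v4_a
| lookup Inventar_SBB column4 AS Test_field OUTPUT column1 AS v1_b, column3 AS v2_b, column4 AS v3_b, column5 AS v4_b
| lookup Inventar_SBB column5 AS Test_field OUTPUT column1 AS v1_c, column3 AS v2_c, column4 AS v3_c, column5 AS v4_c
| eval value1=coalesce(v1_a, v1_b, v1_c), value2=coalesce(v2_a, v2_b, v2_c), value3=coalesce(v3_a, v3_b, v3_c), value4=coalesce(v4_a, v4_b, v4_c)
| fields - v1_a, v2_a, v3_a, v4_a, v1_b, v2_b, v3_b, v4_b, v1_c, v2_c, v3_c, v4_c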
I have learned how to make a list of all apps, including built-in apps, in Splunk Enterprise and ES, but I need to make a list of hidden apps. Thank you for your answer in advance.
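A hedged sketch using the REST endpoint for locally installed apps, assuming "hidden" means apps whose visible flag is off:

| rest /services/apps/local
| search visible=0
| table title, label, version, disabled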
Hey all, I'm trying to separate out the IP address (Source Network Address:) from the Windows event Message field. I'm trying to find every instance of authentication for a certain user account (including services and IIS app pools), but I'm having to parse it from the logs I'm getting from our domain controllers. I'm pretty new to Splunk, so my search is fairly basic. This is all I have so far, and I haven't been able to find info on whether that rex field should equal the new field I'm trying to create, or the field I'm searching, or what:

index=* user=username EventCode=4740 OR EventCode=4648 OR EventCode=4672 OR EventCode=4624
| rex field=ServerIP "Source Network Address:(?<Message>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"

So what I'm trying to do:
Search the Windows event log "Message" field > find the string "Source Network Address:ipaddress"
Create a new field with the value being the above IP address

Appreciate any help!
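A hedged sketch of the corrected rex: field= names the field to read from (the Windows Message field here), and the capture group names the field being created, so the two names in the original search are swapped. The \s* allows whitespace after the colon, and src_ip is just an illustrative name:

index=* user=username (EventCode=4740 OR EventCode=4648 OR EventCode=4672 OR EventCode=4624)
| rex field=Message "Source Network Address:\s*(?<src_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| table _time, user, EventCode, src_ip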
The calculation has to be made on Team Availability: starting from a value of 96000, subtract the current Time Required value and display the result in the next row of Team Availability; the most recent Team Availability value must then be used for the next subtraction with Time Required and displayed in the row after that, and so on for all Time Required values. The last image is the expected one. Help me rectify my doubt and share the query.
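A hedged sketch, assuming the field is named Time_Required and the events arrive in the desired row order: streamstats keeps a running sum, so each row's remaining availability is 96000 minus everything required so far.

| streamstats sum(Time_Required) AS total_required
| eval Team_Availability = 96000 - total_required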
I am looking to run a search and filter out whitelisted exceptions in a lookup file. Two of the fields could contain multiple values, though. Here's the search I'm using:

index=microsoft365 sourcetype IN (azure:aad:signin, o365:management:activity) (action=success OR status=success) NOT Operation=UserLoginFailed
| eval user_id=lower(user_id)
| dedup src user_id date_month
| iplocation src
| search NOT Country IN ("United States", "Canada")
| lookup local=t asn ip AS src
| lookup nonUSlogins.csv ca.user_id AS user_id OUTPUT a.country a.ticket a.user_id
| table user_id src date_month Country Region City asn autonomous_system a.user_id a.country

I tried using match but found you can't use match if there are multiple values in a single field. Here is an example result currently:

user_id  src      date_month  Country  Region           City    asn  autonomous_system  a.user_id    a.country
user1    1.1.1.1  june        Albania  Tirana District  Tirana                          user1 user1  Albania Canada
user1    2.2.2.2  june        Germany  Land Berlin      Berlin                          user1 user1  Albania Canada

I'm trying to eliminate results where the value of user_id matches a.user_id (values in this field will be the same when there are multiple) AND the value of Country matches one of the countries listed in a.country. I would expect to see this in the end:

user_id  src      date_month  Country  Region       City    asn  autonomous_system  a.user_id    a.country
user1    2.2.2.2  june        Germany  Land Berlin  Berlin                          user1 user1  Albania Canada
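A hedged sketch using mvfind(), which scans each value of a multivalue field against a regex and returns null when nothing matches; field names containing dots need single quotes in eval/where. This assumes a.country carries one country per multivalue entry and that Country contains no regex metacharacters:

...
| eval country_hit=mvfind('a.country', "^" . Country . "$")
| where NOT (user_id == mvindex('a.user_id', 0) AND isnotnull(country_hit))
| table user_id src date_month Country Region City asn autonomous_system a.user_id a.country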
Hello, I'm looking to get this result between each start/end time; I hope you can help me. For example:

Start time  End time    Su  M  Tu  W  Th  F  Sa
2021/07/01  2021/07/17  2   2  2   2  3   3  3
2021/07/05  2021/07/20  2   3  3   2  2   2  2

Thanks in advance
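A hedged sketch of one way to count how often each weekday falls in a range, assuming the fields are named StartTime and EndTime in YYYY/MM/DD format and the end date is inclusive: expand each range into one event per day with mvrange()/mvexpand, label each day with its weekday, then chart the counts.

| eval start=strptime(StartTime, "%Y/%m/%d"), end=strptime(EndTime, "%Y/%m/%d")
| eval day=mvrange(start, end + 86400, 86400)
| mvexpand day
| eval dow=strftime(day, "%a")
| chart count over StartTime by dow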
Hi, I have a log file like this:

2021-07-15 00:00:01,869 INFO APP.InEE-p1-1234567 [AppListener] Receive Message[A123]: Q[p1.APP], IID[null], Cookie[{"NODE":"0000aa000"}]
...
2021-07-15 00:00:01,988 INFO APP.InEE-p1-1234567 [AaaPowerManager] Send Message [X0000A0000] to [APP.p2] with IID[null], LTE[00000]
...
2021-07-15 00:00:11,714 INFO APP.InE-p2-9876543 [AppListener] Receive Message[Y000000Z00000]: Q[p2.APP], IID[null], Cookie[null]
...
2021-07-15 00:00:11,747 INFO APP.InEE-P2-9876543_CLIENT.InEE-p1-1234567 [AaaPowerManager] Send Message [A123] to [APP.p1] with IID[null], LTE[00000]
...

I want to calculate the duration of each transaction, like this ("Send Message" - "Receive Message" = duration):

00:00:11,747 - 00:00:01,869 = 00:00:09,878

Expected output:

id       duration
1234567  00:00:09,878

Any idea? Thanks
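A hedged sketch using transaction, assuming the digits after "-p1-" are the transaction id and each id has exactly one matching Receive/Send pair (rex is case-sensitive, so this keys on the lowercase thread names as in the sample):

index=... sourcetype=...
| rex "-p\d-(?<id>\d+)"
| transaction id startswith="Receive Message" endswith="Send Message"
| eval duration_hms=tostring(duration, "duration")
| table id, duration_hms

The transaction command's built-in duration field is the gap between the first and last event in seconds; tostring(..., "duration") renders it as HH:MM:SS.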
Hi all, I'm a very new Splunk user, so I am learning as I go. I simply want to connect Splunk Enterprise to this API, https://api.beta.ons.gov.uk/v1, and start interrogating the data.

I have installed splunk_app_db_connect and am following this guidance: https://www.progress.com/tutorials/jdbc/connect-to-any-rest-api-from-splunk-enterprise

But I am getting an error:

Unable to initialize modular input "server" defined in the app "splunk_app_db_connect": Introspecting scheme=server: script running failed (exited with code 1).

And looking at the log file, I am getting this:

2021-07-16T12:07:42+0100 [WARNING] [settings.py], line 107: java home auto detection failed

I'm struggling to understand what else to tweak to resolve this; it's a desktop installation of Splunk, and I can't see any other apps I am meant to have running in the background. Any ideas? Or is there a better app to use to ingest API data?
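The warning suggests DB Connect could not locate a Java runtime. A loosely hedged sketch, assuming DB Connect 3.x, which reads its Java location from dbx_settings.conf (the stanza/setting names and the path below are assumptions; verify against your DB Connect version's docs, and make sure a supported JRE/JDK is actually installed first):

# $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/dbx_settings.conf
[java]
javaHome = /usr/lib/jvm/java-11-openjdk-amd64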
Is there a corresponding utility to SendToSplunk for Linux? (The Splunk Universal Forwarder is oversized for my requirement.) See https://helgeklein.com/free-tools/sendtosplunk-send-text-data-splunk-tcp-port/ (SendToSplunk – Send Text Data to a Splunk TCP Port).
Hello *, how can I overwrite the default eval definition for the field app in props.conf?

default/props.conf:

...
EVAL-app = "Blue Coat ProxySG"
...

I tried to overwrite this field with the following in local/props.conf:

...
FIELDALIAS-app = x_bluecoat_application_name as app
...

We use a distributed environment, so I changed this in the SH and HF app. But there is no change to the results. What am I doing wrong?
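A hedged sketch of why this happens and one fix: in Splunk's search-time operation order, field aliases (FIELDALIAS-) are applied before calculated fields (EVAL-), so the default EVAL-app overwrites the alias afterwards. Overriding the EVAL itself in local/props.conf should take precedence, since a setting in local wins over the same setting in default (the stanza name below is a placeholder; use the one from the app's default/props.conf):

# local/props.conf — same stanza name as in default/props.conf (assumed here)
[bluecoat:proxysg]
EVAL-app = coalesce(x_bluecoat_application_name, "Blue Coat ProxySG")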
Hi Splunk Community. I have an alert which runs a query regularly, for example hourly, 24*7*365. If the alert is triggered, an email is sent to a distribution list. I need two distinct distribution lists on this alert: one distribution list should receive the email during working hours, Monday 7am to 6pm, and the other distribution list should receive the email outside working hours. Is there a simple way to do it with just one alert, or is it better to create two identical alerts?

Thanks
Michal
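A hedged sketch of the single-alert approach: compute the recipient list inside the search from the trigger time, then put the token $result.email_to$ in the alert's "To" field. The addresses are placeholders, and this assumes working hours mean Monday-Friday 07:00-18:00 (%u is the weekday number, Monday=1):

... your alert search ...
| eval dow=tonumber(strftime(now(), "%u")), hour=tonumber(strftime(now(), "%H"))
| eval email_to=if(dow <= 5 AND hour >= 7 AND hour < 18, "dayshift@example.com", "offhours@example.com")

Note that $result.*$ tokens read from the first result row, so this works best when the alert triggers on at least one row.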
Hi, I don't know if it is possible, but I would like to specify the time range of a join subsearch from a calculated value. I have a similar log record and query.

Log record:

myField=abc, collectionTimeEpoch=1626358999, maxDurationInSeconds=10, items=[id=00000000-00000000-00000000-00000000#content=123,id=myId2#content=456]

The query is similar to the following:

index="..." sourcetype="..." myField=abc
| sort -_time
| head 1
| eval itemList=split(items,",")
| mvexpand itemList
| rex field=itemList "(?<id>[-\w\d]+)#content=(?<content>[-\w\d]+)"
| eval start=(collectionTimeEpoch-maxDurationInSeconds)
| join type=left id [search earliest=-2d@d index="..." sourcetype="..." someField=someValue ]

I would like to replace earliest=-2d@d with something like earliest=start, but that is not working. I have also tried:

| join type=left id [search earliest=[stats count | eval earliest=(collectionTimeEpoch-maxDurationInSeconds) | fields earliest ] index="..." sourcetype="..." someField=someValue ]

Could you help me with this? Thanks in advance
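A hedged sketch of one alternative: a subsearch cannot read a field from the outer search, but map runs one search per incoming row and substitutes $field$ tokens, so the computed epoch can become the inner search's earliest (epoch values are valid time modifiers). Note that map replaces the outer rows with the inner results and is capped by maxsearches:

index="..." sourcetype="..." myField=abc
| sort -_time
| head 1
| eval itemList=split(items, ",")
| mvexpand itemList
| rex field=itemList "(?<id>[-\w\d]+)#content=(?<content>[-\w\d]+)"
| eval start=(collectionTimeEpoch - maxDurationInSeconds)
| map maxsearches=100 search="search earliest=$start$ index=... sourcetype=... someField=someValue id=$id$"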