All Topics

I have a few sourcetypes, looking something like this:

sourcetype=weather: date, location, temperature
sourcetype=actions: date, machine, location, action
sourcetype=repairs: date, machine, replacementPart

I'd like to be able to pick out a specific machine, list all the dates on which it completed actions, and then link those dates and its location at the time to find out what the weather was like when each action was carried out, and also when any repairs were carried out. So for a given machine I'd end up with: date, action, temperature, replacementPart.

I've done a fair bit of googling and have tried methods that seemed like they might help, including the join, coalesce and transaction commands, but nothing I've tried has quite worked. I'm pretty new to Splunk, so any help with a bit of an explanation is very much appreciated!
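One common pattern for this kind of cross-sourcetype correlation is to search the machine-scoped sourcetypes, append the weather events, and roll everything up with stats on a shared day key. A sketch only — "MACHINE_42" is a placeholder, and the field names are taken as-is from the question:

```spl
(sourcetype=actions OR sourcetype=repairs) machine="MACHINE_42"
| append [ search sourcetype=weather ]
| eval day=strftime(_time, "%Y-%m-%d")
| stats values(action) as action, values(temperature) as temperature,
        values(replacementPart) as replacementPart, values(location) as location by day
| where isnotnull(action) OR isnotnull(replacementPart)
```

If the machine can move between locations on the same day, add location to the by-clause (both weather and actions carry it) so the temperature is matched to where the machine actually was.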
Dear Community Members,

In a Splunk Cloud instance, I am trying to get VPN logins and logouts for users in a single table, sorted by username and time. The query is as below:

eventtype="my_eventtype_1" eventtype="my_eventtype_2" (((EventIDValue=gateway-auth OR EventIDValue=clientlessvpn-login) EventStatus=success SourceUserName!="pre-logon") OR Stage=logout)
| stats list(EventIDValue) as Activity, list(_time) as Time by SourceUserName
| rename SourceUserName as username
| convert ctime(Time)
| eval username=upper(username)
| sort username, -Time

The search covers a period of 24 hours. I am getting the data, but along with it I see junk characters (if I may call them so). Kindly help me understand how to resolve this. I also tried adding limit=0 to the stats command, but to no avail. Below is a screenshot of the fields; I have not shown the username field for security reasons. I have used a similar query for another VPN, where it works fine and I don't see these characters!

Regards, Abhishek Singh
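If the junk characters come from convert ctime() running over a multivalue Time column, one workaround is to build a single "activity @ time" string per event before the stats, so each entry stays aligned and no multivalue conversion is needed. A sketch, keeping the filters and field names from the question (coalesce covers logout events that may lack EventIDValue):

```spl
eventtype="my_eventtype_1" eventtype="my_eventtype_2"
    (((EventIDValue=gateway-auth OR EventIDValue=clientlessvpn-login) EventStatus=success
      SourceUserName!="pre-logon") OR Stage=logout)
| eval entry=coalesce(EventIDValue, Stage)." @ ".strftime(_time, "%Y-%m-%d %H:%M:%S")
| stats list(entry) as Activity by SourceUserName
| rename SourceUserName as username
| eval username=upper(username)
| sort username
```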
Hi, I have configured an input through the Splunk AWS add-on to get data from an S3 bucket, but when I search for it, nothing shows up. I am using the trial license (I'm trying a POC with the solution). The connection configuration is OK, since I can list the files in the bucket from the Splunk interface. Thanks for your help, Saïd
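Two quick troubleshooting searches often help here (a sketch; the source patterns are assumptions, since the add-on's internal log names vary by version). The first looks for ingestion errors from the add-on in _internal; the second checks whether the events actually landed, but under an unexpected index or sourcetype:

```spl
index=_internal source=*aws* (ERROR OR WARN)

index=* source="s3://*" | stats count by index, sourcetype
```

If the second search returns events, the input works and only the index/sourcetype in your original search needs adjusting.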
Hi, I have a log file like this:

2021-07-15 00:00:01,869 INFO client.InEE-server1-1234567 [AppListener] Receive Message[A123]: Q[p1.APP], IID[null], Cookie[{"NODE_SRC":"server0"}]
2021-07-15 00:00:01,871 INFO client.InEE-server1-1234567 [AlnProcessorService] Normal Message Received: A[000] B[00000] CD[00000-000000] EF[00:0000] GH[ 0000] SA[client.InEE-server1]
2021-07-15 00:00:01,892 INFO client.InEE-server1-1234567 [TransactionProcessorService] Message Processed: A[000] TA[client.OutEE-server2] Status[OK-GO,NEXT]
2021-07-15 00:00:01,988 INFO APP.InEE-server1-1234567 [AaaPowerManager] Send Message [X0000A0000] to [APP.p2] with IID[null], LTE[00000]
...
2021-07-15 00:00:11,714 INFO APP.InE-p2-9876543 [AppListener] Receive Message[Y000000Z00000]: Q[p2.APP], IID[null], Cookie[null]
2021-07-15 00:00:11,719 INFO client.InEE-server2-9876543_client.InEE-server1-1234567 [TransactionProcessorService] Normal Message Received: A[000] B[00000] CD[00000-000000] EF[00:0000] GH[ 0000] SA[client.InEE-server2]
2021-07-15 00:00:11,736 INFO client.InEE-server2-9876543_client.InEE-server1-1234567 [TransactionProcessorService] Message Processed: A[000] B[00000] CD[00000-000000] EF[00:0000] GH[ 0000] TA[client.OutEE-server1] Status[OK-OUT,null]
2021-07-15 00:00:11,747 INFO APP.InEE-P2-9876543_CLIENT.InEE-server1-1234567 [AaaPowerManager] Send Message [A123] to [APP.p1] with IID[null], LTE[00000]

Here is the flow:
step 1 (receive request): Server0 > Client.InEE-server1 > Client.OutEE-server2
step 2 (reply to request): Client.InEE-server2 > Client.OutEE-server1

Expected result:

id                Source               destination           State      duration
1234567           Server0              Client.InEE-server1   Received   00:00:00:002
1234567           -                    -                     Processed  00:00:00:021
1234567,9876543   -                    Client.InEE-server2   Send       00:00:00:096
9876543           Client.InEE-server2  -                     Receive    00:00:09:726
9876543           -                    -                     Received   00:00:00:005
9876543           -                    -                     Processed  00:00:00:017
9876543,1234567   -                    Client.OutEE-server1  Send       00:00:00:011
Total duration                                                          00:00:09:878

FYI: SA = source address, TA = target address. Any ideas? Thanks.
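One way to approach the per-step durations (a sketch only; the rex pattern is an assumption based on the literal layout of the sample lines, and will need tuning against the real data) is to extract the transaction id from the logger token and then let streamstats compute the gap between consecutive events of the same id:

```spl
sourcetype=my_app_log
| rex field=_raw "INFO\s+(?<component>\S+?)-(?<id>\d{7})(?:_\S+)?\s+\[(?<service>[^\]]+)\]\s+(?<state>Receive Message|Normal Message Received|Message Processed|Send Message)"
| sort 0 _time
| streamstats window=2 range(_time) as duration_sec by id
| table _time, id, component, state, duration_sec
```

The two ids of a linked request/reply (1234567 and 9876543) could then be tied together by extracting both numbers from the combined tokens (e.g. 9876543_...-1234567) and grouping on the pair.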
How can I add a gap between two panels in the same row, without using an empty panel between them?
Hi, I want to download and install the on-premises Controller on a trial basis, but I am unable to find the on-premises Controller setup for Windows in the AppDynamics Download Center. Please let me know where I can download the setup. Kindly help with this.
It seems like all of our Splunk servers are running the Monitoring Console in what I reckon is standalone mode. When we go to the Monitoring Console on the cluster master, it shows the proper roles for all our servers: the indexers are only running the indexer role, the search heads are just search heads, and the deployment servers are just deployment servers. However, on each individual server, its own Monitoring Console shows that server as having all the roles, including license master. I am totally new to Splunk, with less than 2 years of experience, and any time I think I know something, I discover something like this that makes me go "wth". Can someone explain this, or suggest what I should do? I hate just ignoring things. Should I make all the individual Monitoring Consoles match the roles of the servers as represented on the cluster master?
I just completed the Splunk 7.x Fundamentals Part 1 (eLearning) course and passed the exam with a 92% score, but when I look at my profile it tells me that I have completed only 11/14 modules. Before the test I verified that I had completed all the modules, and all of them showed as completed. Could you please look into my profile, check whether I am doing anything wrong, and let me know? Thank you.

Meghnad 240 344 1890
Hello, I am looking to clean up the result data from a Splunk query. How do I remove all the text prior to the user name at the end of the line?

Server1234.prod.outlook.com/Microsoft Exchange Hosted Organizations/MyOrg.onmicrosoft.com/Smith, Joe

I want the results to just return "Smith, Joe". Thoughts?
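One way (a sketch; "full_path" and "user" are illustrative field names, not from the question) is to capture everything after the last slash with rex, since `[^/]+$` can only match the final path segment:

```spl
... | rex field=full_path "/(?<user>[^/]+)$"
| table user
```

An eval-only alternative with the same effect is `| eval user=mvindex(split(full_path, "/"), -1)`.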
Hi there, I have ingested a CSV file via a Splunk UF, and I want to remove certain events that contain a particular field value. For example, with field1 = xyz, abc, pqr, ... and field2 = xyz, I want to send the data to the null queue if field1 = xyz and field2 = xyz. This is my props.conf:

[<sourcetype>]
CHARSET = UTF-8
SHOULD_LINEMERGE = false
NO_BINARY_CHECK = true
LINE_BREAKER = ([\r\n]+)
INDEXED_EXTRACTIONS = csv
KV_MODE = none
category = Structured
disabled = false
pulldown_type = true

Any help would be appreciated. Thanks.
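Null-queue filtering is normally done with a props/transforms pair. A sketch, assuming field1 and field2 are the first two CSV columns (the regex has to work against the raw row, since index-time transforms cannot reference extracted fields):

```
# props.conf
[<sourcetype>]
TRANSFORMS-drop_xyz = drop_xyz_rows

# transforms.conf
[drop_xyz_rows]
REGEX = ^xyz,xyz(,|$)
DEST_KEY = queue
FORMAT = nullQueue
```

One caveat: data parsed with INDEXED_EXTRACTIONS on a UF skips parts of the indexer's parsing pipeline, so where this pair must live depends on your topology; verify against the "Route and filter data" documentation for your version.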
The new SentinelOne App 5.1.1 no longer shows a rank field for threats under sourcetype="sentinelone:channel:threats". How can we determine the rank of a threat now?
If we have an Enterprise license, can we get rid of [lmpool:auto_generated_pool_download-trial] from our server.conf file? I read the write-up in the server.conf documentation and it is totally lacking; basically it says:

* This is the auto generated pool for the download trial stack
* Field descriptions are the same as that for the "lmpool:auto_generated_pool_forwarder"
We have an indexer cluster, and the indexers have a [clustering] stanza in their server.conf files; the same goes for the search heads. However, do the two deployment servers that we have need a [clustering] stanza in their server.conf files as well? I am new to Splunk, and I ask because I noticed that one deployment server has it while the other does not.
How do I make a read-only user/role? I tried to make a new role, but it inherited capabilities from the default roles. Any suggestions?
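A minimal sketch of a read-only role in authorize.conf (the role and index names are placeholders): leaving importRoles empty avoids inheriting anything, so the role only gets the capabilities explicitly enabled on it.

```
[role_readonly]
importRoles =
srchIndexesAllowed = main
srchIndexesDefault = main
search = enabled
```

The same effect is achievable in the UI by creating the role with no inherited roles and ticking only the search capability plus the allowed indexes.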
I have a search string that shows the last log entry for all running jobs (in ascending order), bar a few jobs (either completed or of little interest):

index=ee_rpa_uipath_platform AND OrganizationUnitID=19
| dedup RobotName
| table User, RobotName, _time, Message
| sort _time
| search (NOT Message IN("*complete*", "*ended*") AND NOT RobotName="ALWORKER*")

What I now want is to reduce the returned set to logs that are over 30 minutes old, i.e. if the log file has NOT been updated in the past 30 minutes, then there is a good chance that the job has hung. I tried appending:

| where earliest=-30m@m

...but I receive the following error: "Error in 'where' command: The operator at 'm@m' is invalid."

Two questions: what's wrong with my syntax, and/or is there a better way to get the oldest last log entry from a set of job log files?
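earliest is a time-range modifier for the initial search, not a field usable in where. Inside where, the same cutoff can be expressed with relative_time(). A sketch reusing the fields from the question:

```spl
index=ee_rpa_uipath_platform OrganizationUnitID=19
| dedup RobotName
| search (NOT Message IN("*complete*", "*ended*") AND NOT RobotName="ALWORKER*")
| where _time < relative_time(now(), "-30m@m")
| table User, RobotName, _time, Message
| sort _time
```

This keeps only events whose latest entry (after dedup) is older than 30 minutes, which is exactly the "probably hung" set.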
Hi, in Splunk I have test automation result logs with details like test case name, test status, error, duration, date, etc., spread over multiple events. Each event holds the details of roughly 20-25 test cases in an array, and I need to fetch each test case as a single record. When I use spath, it fetches the fields, but all the test cases of a single event come out as a single record:

index=jenkins OR source=\test OR job_name:"Dev/TestAutomation/Regression" | spath

I am very new to Splunk. Is there any way I can write each test case's details as a single record? With these details, my requirement is to create a regression test automation dashboard. Thanks
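The usual pattern for splitting a JSON array into one row per element is spath into a multivalue field, mvexpand, then spath again on each element. A sketch; "testcases{}" and the final field names are assumptions, since the real JSON path will differ in your events:

```spl
index=jenkins job_name="Dev/TestAutomation/Regression"
| spath output=tc path=testcases{}
| mvexpand tc
| spath input=tc
| table name, status, error, duration, date
```

After the mvexpand, each test case is its own result row, which is the shape a dashboard panel can aggregate (e.g. `| stats count by status`).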
Hi team, I have installed the PingAccess (https://splunkbase.splunk.com/app/5368/) and PingFederate (https://splunkbase.splunk.com/app/976/) apps on our Splunk search head, and based on the documentation below we edited the log4j2.xml file on the client machines where PingAccess and PingFederate are installed. After that, a huge number of log files were generated in the /log directory, and I have ingested all of them into Splunk.

Documentation links:
https://docs.pingidentity.com/bundle/pingfederate-102/page/qst1564002981075.html
https://docs.pingidentity.com/bundle/pingaccess-62/page/gyx1564006725145.html

But when I navigate to the apps, I cannot see any data in the dashboards; they show no results, and the searches are waiting for inputs. Do I need to configure anything else for the data to show up in the dashboards? Since neither app shows any data, kindly help with this.

FYI: I ingested all the log files into the index "main". For the PingAccess logs I used the sourcetype "pingaccess", for the PingFederate logs the sourcetype "pingfederate", and for the Ping console logs the sourcetype "pingconsole". Am I missing anything in the configuration? Please let me know, since I want the dashboards to work as expected.
I have 3 different indexes, and I have been asked to search them by document number. The structure of the logs is different in each index, including the name of the field that contains the document number:

index=index1 OR index=index2 OR index=index3 1234567

As you know, that query is limited to looking for the number 1234567; I can't tell it to show certain fields in a table using table or stats count. Any suggestions?
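One approach (a sketch; docnum_a, docnum_b and docnum_c stand in for the real per-index field names) is to normalize the differently named fields into one with coalesce, then table or aggregate on the unified field:

```spl
(index=index1 OR index=index2 OR index=index3) 1234567
| eval docnum=coalesce(docnum_a, docnum_b, docnum_c)
| where docnum="1234567"
| table index, docnum, _time, _raw
```

Keeping the bare `1234567` term in the base search is still useful, since it lets the indexes prune events cheaply before the eval runs.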
Hi all, I'm new to Splunk and have the following problem. We need data from a table depending on the value of a variable; for this, lookup is the right function, I think. The special thing about that CSV file is that the value can be in different columns, while the returned data is always the same. The CSV table has the following structure:

column1  column2  column3  column4  column5
441      F205E    77889             22558
441      F204E    77998    44556    33669
442      G2071    88992    66557
442               11223    11559

It's possible there is no value in a field of the table. If we don't find a match on column2, we search in column4, and last in column5. I tried it as follows:

| lookup Inventar_SBB column2 as Test_field output "column1" as "value1", "column3" as "value2", "column4" as "value3", "column5" as "value4"
| lookup Inventar_SBB "column4" as Test_field outputnew "column1" as "value1", "column3" as "value2", "column4" as "value3", "column5" as "value4"
| lookup Inventar_SBB "column5" as Test_field outputnew "column1" as "value1", "column3" as "value2", "column4" as "value3", "column5" as "value4"

It looks like the lookup function returns NULL when there is no match. With outputnew it won't overwrite the NULL value, and with output it always overwrites my values with NULL. I also tried different variable names for "value" in each lookup and comparing them to NULL, but that was not working either. Any idea what a workaround for this problem could be? Thanks in advance for any help.
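One workaround (a sketch using the lookup and column names from the question) is to let each lookup write into its own temporary fields and only afterwards merge them with coalesce, so a NULL from one lookup can never overwrite a real match from another:

```spl
| lookup Inventar_SBB column2 as Test_field output column1 as v1_a, column3 as v2_a, column4 as v3_a, column5 as v4_a
| lookup Inventar_SBB column4 as Test_field output column1 as v1_b, column3 as v2_b, column4 as v3_b, column5 as v4_b
| lookup Inventar_SBB column5 as Test_field output column1 as v1_c, column3 as v2_c, column4 as v3_c, column5 as v4_c
| eval value1=coalesce(v1_a, v1_b, v1_c), value2=coalesce(v2_a, v2_b, v2_c),
       value3=coalesce(v3_a, v3_b, v3_c), value4=coalesce(v4_a, v4_b, v4_c)
| fields - v1_*, v2_*, v3_*, v4_*
```

coalesce takes the first non-null argument, which implements exactly the column2-then-column4-then-column5 precedence.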
I have learned how to make a list of all apps, including built-in apps, in Splunk Enterprise and ES, but now I need to make a list of hidden apps. Thank you in advance for your answer.
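One way to list apps that are hidden from the launcher (a sketch, assuming "visible" is the flag you mean by hidden) is to query the apps REST endpoint and filter on its visible attribute:

```spl
| rest /services/apps/local
| search visible=0
| table title, label, version, disabled
```

Dropping the `visible=0` filter gives the full app list, so the same search covers both views.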