All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello all, I appreciate this question has been asked several times, but I am struggling to understand how to link searches together. I have search A, which shows which src_ip communicated with which dest_ip on a specific port:

(index=netfw OR index=netproxy) AND ("192.168.*.*") AND (dest_port="23") | table src_ip, dest_ip, dest_port, _time

I then take the src_ip values from the results and feed them into a second search in a new tab, to find the computer hostname for each src_ip via DHCP logs:

index=oswinsec sourcetype=dhcp ip=192.168.*.* | table ip, dest, date, time

Is it at all possible to combine these, so that when I search on a specific destination IP, the src_ip values in the results are converted into the hostnames found in the DHCP sourcetype? Thank you.
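One possible way to combine them (a sketch, assuming the DHCP ip field lines up with src_ip and that dest holds the hostname) is to enrich the firewall results with a left join on src_ip; for large DHCP volumes a lookup table would scale better:

(index=netfw OR index=netproxy) src_ip="192.168.*" dest_port="23"
| table src_ip, dest_ip, dest_port, _time
| join type=left src_ip
    [ search index=oswinsec sourcetype=dhcp
      | rename ip AS src_ip, dest AS hostname
      | stats latest(hostname) AS hostname BY src_ip ]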
Hi all, I am trying to get Windows event logs from a Splunk forwarder to the indexer via a HF. The logs are getting indexed, but it seems they are not being parsed by the TA, as I am getting the sourcetype XMLWinEventLog instead of WinEventLog. Splunk_TA_Windows is installed on the forwarder, the HF, and the indexers. Any help is appreciated. Regards,
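If the goal is the classic WinEventLog sourcetype, one thing worth checking (an assumption, not a confirmed diagnosis) is the renderXml setting in inputs.conf on the forwarder; renderXml = true produces XMLWinEventLog events:

# inputs.conf on the forwarder (Security shown as an example channel)
[WinEventLog://Security]
disabled = 0
renderXml = false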
Hi, I have a question: can you assign values to multiple variables in Splunk with the case command? Based on a filter chosen in the dashboard, I need to run a different search depending on what has been selected. I have a filter with the options: red, green, yellow, blue, black.

If you choose red, the search must be: search field1=A AND field2=B
If you choose green: search field1=C AND field2=D AND field3=E
If you choose yellow: search field1=X AND field2=Y
...

I wanted to use a case like:

eval KK, HH, JJ = case(color="red", KK=A, HH=B, JJ="", color="green", KK=C, HH=D, JJ=E, color="yellow", KK=X, HH=Y, JJ="", 1=1, "INV")

Can it be done? Or do I have to use as many case calls as there are variables I need in the search? Thanks, bye. Antonio
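For reference, eval can only assign one field per expression, so a minimal sketch under that constraint uses one case() per field (the field names KK/HH/JJ and the values are taken from the question above, treated as string literals):

| eval KK = case(color="red", "A", color="green", "C", color="yellow", "X", 1=1, "INV")
| eval HH = case(color="red", "B", color="green", "D", color="yellow", "Y", 1=1, "INV")
| eval JJ = case(color="green", "E", 1=1, "")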
Hi all, I need to send Windows event logs from a Splunk forwarder to the indexers via a heavy forwarder. I have done some configuration, but something seems to be incorrect, as I am getting cooked data in Splunk instead of readable logs. All help is appreciated. Regards,
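A common cause of garbled "cooked" events (an assumption here, since the configs are not shown) is receiving forwarder traffic on a plain [tcp://] input instead of [splunktcp://]. A minimal sketch of both sides, with a hypothetical hostname:

# outputs.conf on the universal forwarder
[tcpout:hf_group]
server = hf.example.com:9997

# inputs.conf on the heavy forwarder
[splunktcp://9997]
disabled = 0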
Hello Splunkers! I have upgraded my HF from 8.0.0 to 8.1.2, and the upgrade itself went fine. The issue is that I am not able to open the server UI on port 8000:

tcp 0 0 0.0.0.0:8000 0.0.0.0:* LISTEN 2170/splunkd

Port 8000 is listening, but whenever I try to open the server from the UI it throws a bad request error.

What I have done so far:
1. Copied web.conf from system/default to system/local
2. Changed the Splunk user: chown splunk:splunk /opt/splunk

But the issue still persists. Please help me resolve it.
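To see which web settings are actually in effect after the upgrade, Splunk's btool (a standard CLI utility) can dump the merged web.conf, which helps rule out a stale copy in system/local overriding the new defaults:

$SPLUNK_HOME/bin/splunk btool web list --debug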
Hi all, I need to get a list of all the saved searches that are created in a Splunk Cloud environment. I tried to execute the regular rest command from the search view, but an error message appears. I have been checking the capabilities available on the platform, but the "dispatch to indexers" one doesn't appear in the list (as it does in the on-prem version). Do you know the proper way to get this info in Splunk Cloud? Many thanks in advance. Best regards.
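A sketch worth trying on Splunk Cloud: restrict the rest command to the search head with splunk_server=local, which avoids the dispatch-to-indexers capability entirely:

| rest /servicesNS/-/-/saved/searches splunk_server=local
| table title, eai:acl.app, eai:acl.owner, search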
I want to make the panels and their fonts smaller, so that I can fit more panels on one line while keeping the text readable. Does anyone know how to do this?
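For a Simple XML dashboard, one hedged sketch is to embed CSS in a hidden HTML panel; the class names below target common dashboard elements but may need adjusting for your Splunk version:

<row depends="$alwaysHideCSS$">
  <panel>
    <html>
      <style>
        .dashboard-cell { min-width: 180px !important; }
        .dashboard-panel { font-size: 11px !important; }
      </style>
    </html>
  </panel>
</row>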
Hi, my query is like below:

index=s sourcetype=Fire
| fillnull value=""
| eval OS=case(like(OS,"%Windows%"),"Windows", like(OS,"%Linux%"),"Linux", 1=1,"Others")
| eval group = case(OS="Windows","Windows Host Intrusion Detection Prevention Agents Not Reporting", OS="Linux","Linux Host Intrusion Detection Prevention Agents Not Reporting")
| search Environment="Production" OR Environment="PSE"
| rename Reporting_Status as Compliance_Status
| replace Reporting with Compliant "Not Reporting" with Noncompliant "Not Reporting (possibly due to ITAM FQDN field not populated)" with NotReporting "Not Reporting (ITAM FQDN field not populated)" with NotReporting in Compliance_Status
| eval Compliance_Status=case(Compliance_Status="Compliant" OR Compliance_Status="Excluded from reporting, yet is reporting","Compliant", Compliance_Status="Noncompliant" OR Compliance_Status="Not Reporting" OR Compliance_Status="Error","NonCompliant")
| append
    [| search index=s sourcetype=Work
     | fillnull value=""
     | eval group = case(Environment="Production" OR Environment="PSE","Workstations Host Intrusion Detection Prevention Agents Not Reporting")
     | rename Reporting_Status as Compliance_Status
     | replace Reporting with Compliant "Not Reporting" with Noncompliant "Not Reporting (possibly due to ITAM FQDN field not populated)" with NotReporting "Not Reporting (ITAM FQDN field not populated)" with NotReporting in Compliance_Status
     | eval Compliance_Status=case(Compliance_Status="Compliant" OR Compliance_Status="Excluded from reporting, yet is reporting","Compliant", Compliance_Status="Noncompliant" OR Compliance_Status="Not Reporting" OR Compliance_Status="Error","NonCompliant")]
| append
    [| search index=c sourcetype=cloud
     | fillnull value=""
     | eval group = case(Cloud_Platform="Azure","Azure Baseline Noncompliance", Cloud_Platform="Aws","AWS Baseline Noncompliance")
     | search Account_Environment="PROD" OR Account_Environment="PRD" OR Account_Environment="PSE"
     | stats sum(CountOf_Compliant_AssetsTested) as Compliant sum(CountOf_Noncompliant_AssetsTested) as NonCompliant]
| stats count by group

The last append is not taken into account, since Compliant and NonCompliant are not in the Compliance_Status field:

| stats sum(CountOf_Compliant_AssetsTested) as Compliant sum(CountOf_Noncompliant_AssetsTested) as NonCompliant

Is there a way to bring them into a new field Compliance_Status, which would let them show up after the stats by group?
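One sketch (assuming the goal is to turn the Compliant/NonCompliant sums into rows carrying a Compliance_Status value): aggregate BY group inside the last subsearch so there is a row key, then untable to transpose the two columns into Compliance_Status/count pairs:

| stats sum(CountOf_Compliant_AssetsTested) as Compliant sum(CountOf_Noncompliant_AssetsTested) as NonCompliant by group
| untable group Compliance_Status count

After this, each cloud row has group, Compliance_Status (Compliant or NonCompliant), and count, so it survives the final aggregation by group.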
There is a dashboard which uses a scheduled search via the |loadjob command. I recently changed the query for that scheduled search; basically I added an index=indexname constraint to make it a little more efficient. However, after doing that, the dashboard started showing the error below:

Cannot find artifacts for savedsearch_ident...

When I click on the "View Recent" option of the saved search, it still loads the results. Any ideas?
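If the artifact lookup keeps failing after an edit, one fallback sketch is to reference the report by name with the savedsearch command (it re-runs the search instead of loading cached artifacts, so it costs more; the report name below is a placeholder):

| savedsearch "My Scheduled Search"

It may also be worth letting the edited report complete at least one scheduled run before expecting loadjob to find fresh artifacts.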
Hi, I'm looking to search a dataset and return entries from yesterday's date, based off a date field which was converted as follows (in another job):

| eval event_time = now() | convert ctime(event_time)

The value is stored as 11/24/2021 22:28. Please could you assist with how to search for and return this value using a "yesterday" variable? I hope that makes sense; forgive me, I'm still learning.

To illustrate, manually entering eventDate="11/24" works, but I am not sure how to get a "yesterday" to work with the dataset:

| inputlookup thisDataset.csv | search eventDate="11/24*" | sort Brand, eventDate | iplocation clientip | table _time Brand clientip City Region count eventDate
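A sketch using relative_time to build yesterday's date in the same month/day prefix format (this assumes eventDate always starts with an %m/%d-style prefix, as in the example):

| inputlookup thisDataset.csv
| eval yesterday=strftime(relative_time(now(), "-1d@d"), "%m/%d")
| where like(eventDate, yesterday . "%")
| sort Brand, eventDate
| iplocation clientip
| table _time Brand clientip City Region count eventDate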
I have a query like:

index="default_index" source="source1.csv" *calculations*
| appendcols [search index="default_index" source="source1.csv" | *different calculations*]
| appendcols [search index="default_index" source="source1.csv" | *different calculations*]

Is there any way I can optimize this search? The same index is scanned multiple times just to run different calculations.
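If the calculations are aggregations, one common optimization sketch is a single pass over the source, pushing each calculation into its own eval inside one stats call (the status, bytes, and duration fields are hypothetical stand-ins for the real calculations):

index="default_index" source="source1.csv"
| stats sum(eval(if(status="ok", bytes, 0))) AS ok_bytes
        count(eval(status="error")) AS error_count
        avg(duration) AS avg_duration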
Hi, is it possible to do a left outer join after using two |mstats commands, as below? I have Process_Name common to both, but I want only the ones that are not in the second |mstats command.

| mstats prestats=t min("mx.replica.status") min("mx.process.resources.status") WHERE "index"="metrics_test" AND mx.env=http://mx20267vm:15000 span=10s BY service.name replica.name service.type
| eval threshold="", pid="", cmd="", "host.name"="", "component.name"=""
| mstats append=t prestats=t min("mx.process.threads") WHERE "index"="metrics_test" AND mx.env=http://mx20267vm:15000 span=10s BY pid cmd service.type host.name service.name replica.name component.name threshold
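One sketch for an "in the first set but not the second" filter without join: drop prestats, tag each leg with a marker, and filter after a combining stats. It is keyed here on service.name and replica.name, with the mx.env filter omitted for brevity; swap in Process_Name if that is the shared key:

| mstats min("mx.replica.status") AS replica_status WHERE "index"="metrics_test" BY service.name replica.name
| eval in_second=0
| append
    [| mstats min("mx.process.threads") AS threads WHERE "index"="metrics_test" BY service.name replica.name
     | eval in_second=1]
| stats max(in_second) AS in_second values(replica_status) AS replica_status BY service.name replica.name
| where in_second=0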
I have four dashboards: Level 1, Level 2, Level 3, and Level 4. Level 1 is driven by a saved search and has a field called months. I want to drill down using the month value to the Level 2 dashboard, which is also driven by a saved search, using tok_mon=$click.name2$. How do I pass a token from a dashboard with a saved search to another dashboard with a saved search, and likewise from Level 2 to Level 3 and Level 3 to Level 4?
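A sketch of the Simple XML drilldown that passes the clicked month to the next dashboard as a form token (the app and dashboard names are placeholders; the same pattern repeats for each level):

<drilldown>
  <link target="_blank">/app/my_app/level2_dashboard?form.tok_mon=$click.name2$</link>
</drilldown>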
I have a saved report that is scheduled to run every hour, and I have used that saved search as a reference in a dashboard panel's search. My question: whenever that dashboard is loaded, will it run the saved search again, or will the results be loaded from the last scheduled run of the saved search?
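For reference, a panel wired up with ref, as below, loads results from the report's last scheduled run rather than re-running it, provided the report is scheduled and the viewer has permission to see it (report name is a placeholder):

<search ref="My Hourly Report"></search>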
We need to delete three files from the index. I have used the |delete command to clean the indexed data, and the events are deleted, but the files still show under the source field:

source='/var/log/splunk/syslog/******/********' | delete
source='/var/log/splunk/syslog/******/********' | delete
source='/var/log/splunk/syslog/******/********' | delete
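Worth noting: delete only masks events from search results, it does not rewrite buckets, so a source value can keep appearing in metadata (the fields sidebar, | metadata) until the buckets holding it roll off. A quick sketch to confirm the events themselves are gone (index and path are placeholders):

index=your_index source="/var/log/splunk/syslog/your/file" | stats count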
Hello, I have a distributed environment with a cluster of indexers, 3 heavy forwarders, and 3 search heads. How do you manage high availability of add-ons/TAs on the HF tier? For example, I would like to install the AWS Add-on to pull data via the REST API:
- If I configure the TA on all 3 HFs, I will get the same data 3 times (indexed 3 times, and my license usage will explode).
- If I configure the TA on only one HF, I will have a high-availability problem if that node fails.
Thanks
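One common pattern (a sketch, not the only option) is to configure the modular input on a second HF but keep it disabled, enabling it manually or via orchestration on failover, so only one node polls at a time:

# inputs.conf on the standby HF (stanza name is illustrative)
[aws_cloudwatch://my_aws_input]
disabled = 1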
Hi, I have to create a 30-day trending chart using the search below, but I am not getting a trend with either timechart or chart:

index=s sourcetype=Fire
| fillnull value=""
| eval trmsc = case(Environment="Production" OR Environment="PSE","Workstations Host Intrusion Detection Prevention Agents Not Reporting")
| rename Reporting_Status as Compliance_Status
| replace Reporting with Compliant "Not Reporting" with Noncompliant "Not Reporting (possibly due to ITAM FQDN field not populated)" with NotReporting "Not Reporting (ITAM FQDN field not populated)" with NotReporting in Compliance_Status
| stats count(eval(Compliance_Status=="Compliant" OR Compliance_Status=="Excluded from reporting, yet is reporting")) as Compliant count(eval(Compliance_Status=="Noncompliant" OR Compliance_Status=="NotReporting" OR Compliance_Status=="Error")) as NonCompliant by trmsc
| append
    [| search index=c sourcetype=Asset
     | fillnull value=""
     | eval trmsc = case(Cloud_Platform="Azure","Azure Baseline Noncompliance", Cloud_Platform="Aws","AWS Baseline Noncompliance")
     | search Account_Environment="PROD" OR Account_Environment="PRD" OR Account_Environment="PSE"
     | stats sum(CountOf_Compliant_AssetsTested) as Compliant sum(CountOf_Noncompliant_AssetsTested) as NonCompliant by trmsc]
| eval date_wday=strftime(_time,"%A")
| search date_wday="Monday"
| bin _time span=1d
| eventstats count by trmsc
| chart count(trmsc) over _time by Compliance_Status

Please let me know how to get a trending chart from the above search.
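One sketch of why the trend comes out empty: both stats calls aggregate by trmsc only, so _time is dropped before the final chart ever sees it. Trending needs _time to survive the aggregation, for example by binning first and carrying _time through (shown for the first leg; the appended leg would need the same change):

| bin _time span=1d
| stats count(eval(Compliance_Status=="Compliant")) as Compliant count(eval(Compliance_Status=="Noncompliant")) as NonCompliant by _time trmsc
| timechart span=1d sum(NonCompliant) as NonCompliant by trmsc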
We would like to integrate Auth0 data into Splunk Enterprise. What would be the best way? Are there any apps or add-ons available that can be used?
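One avenue worth checking is Auth0's log streaming, which can post tenant logs to an HTTP endpoint; on the Splunk side the usual landing spot is the HTTP Event Collector. A sketch for testing the HEC side (host, token, and sourcetype are placeholders):

curl -k https://splunk.example.com:8088/services/collector/event \
  -H "Authorization: Splunk <hec-token>" \
  -d '{"event": {"description": "auth0 test"}, "sourcetype": "auth0:logs"}'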
I am trying to remove some unwanted characters before the backslash, but my expression misses some machines because they have a different naming standard. I want to remove the domain name and machine name from the Local Administrator group. My data comes in one string like this:

labmachine000r\administrator
labmachine000d\support
labdomain\admingroup
labdomain\helpdesk

I managed to remove the characters before the backslash using this:

| eval adminlocal=replace(adminlocal, "\w+(\\\\)+","")

and my result is like below:

administrator
support
admingroup
helpdesk

That works fine for the machines above, but if I have a machine name like "L-02labmachine000r", the replace command gives results like this:

L-administrator
L-support
admingroup
helpdesk

Is there any way to adjust my replace command to cover that machine name?
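A sketch of the adjusted call: match everything that is not a backslash up to the first backslash, instead of \w+ (which stops at the hyphen in L-02labmachine000r and leaves the prefix behind):

| eval adminlocal=replace(adminlocal, "^[^\\\\]+\\\\", "")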
Hi, I have JSON data being written to a log file, and the log file is being forwarded to a single Splunk index, 'ti-l_asl'. The problem is that the JSON data contains a field called 'index' which I want to transform into 'sourcetype' so it can be searched on in Splunk. Is there a way I can do this without changing the system which writes the JSON to the log file, i.e. transform the field name from 'index' to 'sourcetype' as part of the forwarder processing, or some kind of pre-processing in Splunk before it is assigned to index 'ti-l_asl'?
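A sketch of an index-time sourcetype override on the parsing tier (a heavy forwarder or the indexers; a universal forwarder cannot apply transforms), assuming the raw JSON carries something like "index":"value" and the incoming sourcetype name below is a placeholder:

# props.conf
[your_json_sourcetype]
TRANSFORMS-set_st = set_sourcetype_from_index_field

# transforms.conf
[set_sourcetype_from_index_field]
REGEX = "index"\s*:\s*"([^"]+)"
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::$1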