All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, I am new to Splunk and using CentOS 8. localhost:8000 is not reachable after a restart. Please help.
Hi, I need to create a report that summarizes the countries our users connected to our network from. Using the "iplocation" command I got results, but I know for certain that one employee connected from a country in Europe, yet the logs show him connecting from a country in the Middle East. Is there a more accurate option so I can be sure I present the correct information? Thanks.
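A common suggestion for questions like this is to cross-check `iplocation` (whose bundled GeoIP database can be stale) against a manually maintained lookup of known exceptions. A minimal sketch, assuming a hypothetical lookup definition `ip_country_overrides` with `ip` and `country` columns and a `src_ip` field in the events:

```spl
index=network sourcetype=vpn
| iplocation src_ip
| lookup ip_country_overrides ip AS src_ip OUTPUT country AS override_country
| eval Country=coalesce(override_country, Country)
| stats count BY Country
```

The `coalesce` prefers the manually verified country when the override lookup matches, and falls back to the `iplocation` result otherwise.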
Hello, I have values for a particular application as below, and I am looking to get the maximum version value, or to sort them in order so that I can pick the last/first value in my search.

Values I have for applicationA:
1.17.120, 2.4.153, 2.6.186, 2.7.241, 2.7.242, 2.8.207, 2.9.369, 2.12.600, 2.14.377, 2.14.378, 2.15.121, 2.16.298, 2.17.176, 2.18.117

Value I am looking for: 2.18.117

I know we can use case() to map versions to numbers and sort them in order, but I have more than 20 applications with similar data, and I need to get the maximum value for each application's data.

Regards.
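Version strings sort wrong lexically ("2.4" after "2.18"), so one approach is to split each version into numeric parts and build a numeric sort key, then keep the row with the maximum key per application. A sketch, assuming hypothetical `application` and `version` fields and segment values below 1000:

```spl
| rex field=version "(?<maj>\d+)\.(?<min>\d+)\.(?<rev>\d+)"
| eval sortkey = tonumber(maj)*1000000 + tonumber(min)*1000 + tonumber(rev)
| eventstats max(sortkey) AS maxkey BY application
| where sortkey = maxkey
| table application version
```

`eventstats` attaches the per-application maximum to every row, so the `where` clause retains one row per application without a `case()` per version.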
I want to remove an index, but I have already created some services and entities related to that index in ITSI. After removing the index, I will create a new index with the same name. Will this have any influence on ITSI?
I have created a dashboard with the help of join that extracts data in the format below. Each incoming event is mapped to an outgoing event and the time difference (Ack_time) is displayed.

IncEvntIDTime          IncomingEventID   OutEvntIDTime          OutgoingEvntID   Ack_time
13 Dec 2020 14:55:52   X12356565         13 Dec 2020 14:55:54   X12356565        2
13 Dec 2020 14:55:53   X12356567         13 Dec 2020 14:55:54   X12356567        2

Problem statement: I want the average Ack_time for each day of the week, in the format below.

Day            Avg of Ack_time
Monday         2
Tuesday        3
Wednesday      3
and so on...

Thanks in advance!! @woodcock
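One way to answer this is to derive the weekday name from `_time` with `strftime` and average over it; `%u` (Monday-based day number) gives a correct sort order. A sketch, assuming `Ack_time` is already computed as in the question:

```spl
| eval day_num=strftime(_time, "%u"), Day=strftime(_time, "%A")
| stats avg(Ack_time) AS "Avg of Ack_time" BY day_num Day
| sort day_num
| fields Day "Avg of Ack_time"
```

Grouping by both `day_num` and `Day` keeps the readable name in the output while sorting on the numeric field.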
Hi, any thoughts off-hand as to what I'm not accounting for? I'm looking to extract values from a field in unstructured logs. Example event:

... { X-Request-Id:[<36_characters_of_interest>] .....

I was attempting to pull it with a named capture group (whose regex itself matches the correct characters), but no luck with any data showing up in the table.

index="k8s_events" real-estate-app X-Request-Id
| regex (?<x_request_id>(?<=X\-Request\-Id\:\[).............................................)
| table x_request_id
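The usual cause of this symptom is that the `regex` command only filters events and never creates fields; extraction is done by `rex`. A sketch of the same extraction with `rex`, assuming the value is exactly 36 characters inside the brackets:

```spl
index="k8s_events" real-estate-app X-Request-Id
| rex field=_raw "X-Request-Id:\[(?<x_request_id>[^\]]{36})\]"
| table x_request_id
```

Matching `[^\]]{36}` instead of 36 dots also stops the capture at the closing bracket if the length ever varies.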
We have the data below, from which I want to extract a particular field and value from the JSON format.

PLATFORMINSTRUMENTS {"timestamp":"1607797705","instrumentList":[{"name":"dbcp.numIdle","value":"100"},{"name":"entity.versions.total","value":"66137"},{"name":"http.session.objects","value":"2443"},{"name":"dbcp.numActive","value":"0"},{"name":"http.sessions","value":"544"},{"name":"dbcp.maxActive","value":"-1"}]}

Expected output: dbcp.numActive : 0
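A common pattern for this shape is to isolate the JSON after the literal prefix, expand the `instrumentList` array, and filter on the wanted name. A sketch, assuming the JSON follows the literal text PLATFORMINSTRUMENTS in `_raw` (the `rex mode=sed` line is only needed if the raw data really contains smart quotes rather than straight quotes):

```spl
| rex mode=sed field=_raw "s/[“”]/\"/g"
| rex field=_raw "PLATFORMINSTRUMENTS\s+(?<json>\{.*\})"
| spath input=json path=instrumentList{} output=inst
| mvexpand inst
| spath input=inst
| search name="dbcp.numActive"
| table name value
```

`mvexpand` turns each array element into its own event so the second `spath` can extract `name` and `value` per instrument.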
I built a dashboard to view the stats count of applications with the query below:

index="bw6_stg" ErrorReport
| rex field=_raw "ApplicationName:\s+\[(?P<Applname>.*)];"
| rex field=_raw "jobId: (?<jId>\w+);"
| dedup jId
| stats count by Applname

For the above query, these are the results:

Applname    Error count
abcd        5
abcd.app    6
efgh        4
efgh.app    3

Now my requirement is to add a per-error breakdown to this stats table. When I click on the first application, it should show:

Applname    Error count
abcd        5
  Error1    2
  Error2    1
  Error3    2

When I click on the second application, it should show:

Applname     Error count
abcd.app     6
  Error1     2
  Error2     2
  Error3     2

Can anyone please help me with this? Thanks in advance.
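A typical answer is a second dashboard panel driven by a drilldown token: the table panel sets a token (say `appl_token`) on click, and the detail panel filters on it. A sketch of the detail search, where the `ErrorCode` extraction and the token name are assumptions for illustration:

```spl
index="bw6_stg" ErrorReport
| rex field=_raw "ApplicationName:\s+\[(?P<Applname>.*)];"
| rex field=_raw "jobId: (?<jId>\w+);"
| rex field=_raw "ErrorCode:\s+(?<ErrorName>\w+)"
| dedup jId
| search Applname="$appl_token$"
| stats count AS "Error count" BY ErrorName
```

In Simple XML, the first table's drilldown would contain something like `<set token="appl_token">$click.value$</set>` so the detail panel refreshes per click.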
Is it possible to change the Splunk logo shown below? Can anyone please guide me on how to change this logo? Thanks.
Respected Sir/Madam, how do I create a power user account when I have an admin login in the trial version?
Hello Splunk Community, I have not mastered complex if statements. I would like to know how I can get the following results:

1. If cookies >= 1, then the output should be "Failure".
2. If cookies < 0, then the output should also be "Failure".
3. If cookies = 0 AND the stage for this scenario is not ready, then the output should be "In Progress".

Below is my logic for this query, but it is malformed and not providing any results; any help is appreciated.

| eval Bakes=if(cookies>="1", "Failure"), if (cookies<"0" "Failure"), if (cookies="0", "In Progress" AND Stages!="NotReady", "In Progress")
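Multi-branch conditions like this are usually written with `case()` rather than chained `if()` calls, and numeric comparisons should not quote the numbers. A sketch, assuming the `cookies` and `Stages` fields as described in the question (the final `true()` branch is a catch-all added for illustration):

```spl
| eval Bakes=case(
    cookies>=1, "Failure",
    cookies<0, "Failure",
    cookies=0 AND Stages!="NotReady", "In Progress",
    true(), "Unknown")
```

`case()` evaluates its condition/value pairs in order and returns the value of the first condition that matches.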
Hello, we have a website hosted in Splunk. We are detecting these vulnerabilities: Server header detected, incorrect X-XSS-Protection, and incorrect Set-Cookie, all of which fall under the Insecure HTTP Header issue category. We would like to know how to resolve these vulnerabilities. Thanks! Regards, Joshua
I saw this in one of the monitoring console dashboards and I have been wracking my brain trying to figure out how it gets done. By "it" I mean the border around the table rows and the bar inside. I looked at the MC dashboard for it, but I don't see anything special about the search or dashboard (no JS or other formatting tricks) that I could discern.
Hi, I want to fetch some information from my logs. Here is the scenario:

index=xyz host=xxx.com source="/as/df/gh/*.log" "[error]"
| rex field=_raw "LoadPlanName:\s(?P<LP_Name>[^\]]*)"
| table LP_Name
| dedup LP_Name

The above query gives me the result below:

LP_Name
LP_abc
LP_abc1
LP_abc2

Now, from the same source, I want to fetch other details for the LP_Name values extracted above (LP_abc, LP_abc1, LP_abc2). For that I tried the query below, which is not working:

index=xyz host=xxx.com source="/dir1/dir2/*.log" "[error]"
| rex field=_raw "LoadPlanName:\s(?P<LP_Name>[^\]]*)"
| table LP_Name
| dedup LP_Name
| map search = "search index=xyz host=xxx.com source="/dir1/dir2/*.log" "[completed]" | rex field=_raw "LoadPlanName:\s(?P<LPN>[^\]]*)" LPN=$LP_Name"

For the above query I have been getting this error:

Error in 'SearchParser': Missing a search command before '^'. Error at position '417' of search query 'search index=oitp host=ITCNCHN-LX4* source="/opt/o...{snipped} {errorcontext = s(?P<LPN>[^\]]*)" L}'.

I have been struggling with this for a long time now and need help to get the data I want. Thanks in advance.
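The parser error typically comes from the unescaped inner double quotes, which terminate the `map` search string early, and from the token missing its closing dollar sign. A sketch with the inner quotes escaped and the token written as `$LP_Name$` (the `maxsearches` value is an illustrative choice):

```spl
index=xyz host=xxx.com source="/dir1/dir2/*.log" "[error]"
| rex field=_raw "LoadPlanName:\s(?P<LP_Name>[^\]]*)"
| dedup LP_Name
| map maxsearches=100 search="search index=xyz host=xxx.com source=\"/dir1/dir2/*.log\" \"[completed]\" | rex field=_raw \"LoadPlanName:\s(?P<LPN>[^\]]*)\" | where LPN=\"$LP_Name$\""
```

`map` runs the inner search once per row of the outer results, substituting each row's `LP_Name` into the `$LP_Name$` token.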
Hi Splunkers, I am trying to access my Splunk instance from another computer on my network. I have already added allowRemoteLogin = always to server.conf, but that didn't work. Any suggestions? Thank you.
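Note that `allowRemoteLogin` governs logins over the management port, not whether Splunk Web is reachable; unreachable Splunk Web is more often a firewall or binding issue. A sketch of the usual configuration to check, assuming a default install under `$SPLUNK_HOME`:

```
# $SPLUNK_HOME/etc/system/local/server.conf
[general]
allowRemoteLogin = always

# $SPLUNK_HOME/etc/system/local/web.conf
[settings]
httpport = 8000
```

Beyond the config, confirm the OS firewall on the Splunk host permits inbound TCP 8000 and that you are browsing to the host's LAN address (http://<host-ip>:8000), not localhost.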
Here is the test_lookup.csv I'm using:

     c1   c2   c3   c4   c5
     r1   1    2    3    4
     r2   5    6    7    8
     r3   9    10   11   12
     r4   13   14   15   16

This works:

| inputlookup test_lookup.csv
| eval input="r1,r2"
| makemv delim="," input
| eval input_rule=if(c1=input,"1","0")
| where input_rule=1
| format
| eval search="\"".search."\""

Returns:

"( ( c1="r1" AND c2="1" AND c3="2" AND c4="3" AND c5="4" AND ( input="r1" OR input="r2" ) AND input_rule="1" ) OR ( c1="r2" AND c2="5" AND c3="6" AND c4="7" AND c5="8" AND ( input="r1" OR input="r2" ) AND input_rule="1" ) )"

So I created test_macro(1):

inputlookup test_lookup.csv
| eval input="$rows$"
| makemv delim="," input
| eval input_rule=if(c1=input,"1","0")
| where input_rule=1
| format
| eval search="\"".search."\""

Run this:

| makeresults
| eval rows="r1,r3"
| eval score= [|`test_macro(rows)`]

Using the macro, the results are: NOT (). I have tried everything I can think of! Pulling my hair out at this point. Thanks.
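A likely explanation for results like this: macro arguments are substituted as literal text at parse time, so `test_macro(rows)` passes the string "rows" rather than the value built by the outer eval (and the subsearch runs before that eval anyway). A sketch passing the values directly, quoted so the comma is not read as an argument separator:

```spl
| makeresults
| eval score=[| `test_macro("r1,r3")`]
```

If the list must come from search results, `map` or a subsearch that emits the list is typically needed instead of an eval-populated field.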
I'm attempting to add some new fields to leverage Asset Extraction for our Notables. As of today, we have what appear to be the default values: src, dest, dvc, orig_host. In my experience, when src/dest are present in a search, the priority value is automatically assigned to the notable, and I believe that functionality happens via this setting. I want to add the src_ip/dest_ip fields that are used in most of our searches so they obtain the priority value from our asset inventory. However, after running a test adding dest_ip to the entries, with a search where dest_ip was populated, it didn't pull the priority value as expected. Is there a piece I'm missing that I should verify, or replication time I needed to account for?
All, I know there are a lot of postings with answers on lookup tables, but I am still stuck. I have not Splunked in a few years and I hit a wall even when looking back at some of my old saved strings.

I have a CSV file with two columns: one containing IPAddress and the other SubnetMask. I am searching my logs for IP addresses that I want to compare with the IP addresses in the lookup CSV file; if an IP address is not found, display it in a table. My query is as follows:

index=blah field3="*"
| fields field3 field4
| dedup field3
| rename field3 as Source_IP
| lookup ip_whitelist IPAddress AS Source_IP
| eval InWhitelist="Yes"
| table Source_IP IPAddress field4 InWhitelist
| where InWhitelist="Yes"
| sort -Source_IP

where field3 is the field with the IP addresses (extracted from delimited extractions) and field4 is the field with the hostname. This spits out a nice table, but I notice IPs that are not in my whitelist are showing up. What is wrong here? Your help is greatly appreciated! Thanks, P
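The query's eval sets InWhitelist="Yes" on every row unconditionally, so the where clause filters nothing. The usual fix is to have the lookup output a field only when it matches, then test for its absence. A sketch that keeps only IPs absent from the whitelist, assuming the same `ip_whitelist` lookup and field names:

```spl
index=blah field3="*"
| dedup field3
| rename field3 AS Source_IP
| lookup ip_whitelist IPAddress AS Source_IP OUTPUT IPAddress AS matched_ip
| eval InWhitelist=if(isnull(matched_ip), "No", "Yes")
| where InWhitelist="No"
| table Source_IP field4 InWhitelist
| sort - Source_IP
```

`matched_ip` is null exactly when the lookup found no row for that IP, which is the "not in whitelist" condition the question asks for.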
Hi crew, I have a JSON file from a vulnerability service generated once per hour, and I just need to get the last _raw event. How is that possible? I want to show the data from the last 7 or 15 days using this condition.

| rename Issues{}.details AS details Issues{}.file AS file Issues{}.severity AS severity Issues{}.confidence AS confidence Issues{}.line AS line
| eval tempField=mvzip(mvzip(mvzip(mvzip(details, file), severity), confidence), line)
| stats count by _time, service, source, tempField
| eval details=mvindex(split(tempField,","),0), file=mvindex(split(tempField,","),1), severity=mvindex(split(tempField,","),2), confidence=mvindex(split(tempField,","),3), line=mvindex(split(tempField, ","),4)
| stats max(_time) AS latest, count AS Issues by _time, severity
| sort - _time
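If each hourly file lands as a single event, one approach is to sort by time and keep only the first event before expanding it, run over a 7- or 15-day window. A minimal sketch (the index and sourcetype names are assumptions, and only two of the multivalue fields are shown for brevity):

```spl
index=vuln_scans sourcetype=vuln_json earliest=-7d
| sort - _time
| head 1
| rename Issues{}.details AS details Issues{}.severity AS severity
| eval tempField=mvzip(details, severity)
| mvexpand tempField
| eval details=mvindex(split(tempField,","),0), severity=mvindex(split(tempField,","),1)
| stats count AS Issues BY severity
```

If instead the latest event per file is wanted, `| dedup source` after the sort keeps one newest event per `source` rather than a single event overall.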
We just loaded an AIX server with a forwarder and we're getting:

can't write file "/.splunk/authToken_#####_8089": Permission denied

We have loaded a number of servers with the Universal Forwarder but just started getting this message. We have a particular user to run the process, and it has worked on other servers, but not these new ones. This is Universal Forwarder 8.0.5. The process actually starts, but when you run ./splunk list monitor you get the "can't write file" error. We have several more servers to load and would really like an assist here.