All Topics

Hello, I would like to change the SPLUNK_DB value of the indexers inside my cluster. I want to write new logs to another directory, but I don't want to bring the old logs along. Will this cause problems? Could you please provide detailed steps on how to do it? I have already tried changing the path in splunk-launch.conf (and the environment variable value on the OS), but the servers keep writing to the old directory. Thanks a lot.
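For context, the sequence that usually makes a new SPLUNK_DB path take effect is: stop Splunk, edit splunk-launch.conf, then restart, on each indexer in turn. A sketch only (the path below is an example, and any index whose homePath is hard-coded in indexes.conf will ignore SPLUNK_DB):

```
# $SPLUNK_HOME/etc/splunk-launch.conf  (on each indexer)
SPLUNK_DB=/new/path/splunkdb

# then, per indexer:
#   $SPLUNK_HOME/bin/splunk stop
#   mkdir -p /new/path/splunkdb && chown -R splunk:splunk /new/path/splunkdb
#   $SPLUNK_HOME/bin/splunk start
```

The setting is only read at startup, which would explain Splunk continuing to write to the old directory after an edit without a restart.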
|inputlookup lookup1.csv | fields IP Host_Auth | lookup lookup2.csv IP OUTPUT Host_Auth as Host_Auth.1

Some of the field values in each version of Host_Auth match and some don't. How can I find the events that do not match? I've tried where Host_Auth != Host_Auth.1 and eval, but nothing works.
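A likely cause: in eval and where, an unquoted field name containing a dot is not parsed as a single field reference, so `Host_Auth != Host_Auth.1` never compares what you intend. Either single-quote it (`| where Host_Auth != 'Host_Auth.1'`) or rename the output field to avoid the dot. A sketch using the same lookups, with fillnull so rows that found no match in lookup2.csv are kept rather than silently dropped:

```
| inputlookup lookup1.csv
| fields IP Host_Auth
| lookup lookup2.csv IP OUTPUT Host_Auth as Host_Auth_1
| fillnull value="NO_MATCH" Host_Auth_1
| where Host_Auth != Host_Auth_1
```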
I am looking to display only one statistics row, named Total, with the counts of all of the hosts added up, which should equal around 450, give or take. I know how to add it up using addcoltotals, but I only want the Total row, excluding the rest of the stats. I have attached the base search and the current output.

index=os source=ps host=deml* OR host=sefs* OR host=ingg* OR host=us* OR host=gblc* NOT user=dcv NOT user=root NOT user=chrony NOT user=dbus NOT user=gdm NOT user=libstor+ NOT user=nslcd NOT user=polkitd NOT user=postfix NOT user=rpc NOT user=rpcuser NOT user=rtkit NOT user=colord NOT user=nobody NOT user=sgeadmin NOT user=splunk NOT user=setroub+ NOT user=lp NOT user=68 NOT user=ntp NOT user=smmsp NOT user=dcvsmagent NOT user=libstoragemgmt | dedup user | stats count by host
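One sketch: instead of addcoltotals, collapse the per-host rows with a second stats, which leaves exactly one row (the `...` stands for the base search above):

```
... | dedup user
| stats count by host
| stats sum(count) as Total
```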
Hello, I am helping a client of mine monitor their network for malware and hackers, and we are looking to build SPL to monitor command-and-control (C2) beacon traffic. I searched the forum but didn't find much info. Any help would be great.
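Beacons tend to call home at near-constant intervals, so one common approach is to measure the gap between consecutive connections per source/destination pair and flag pairs with many connections and low jitter. A sketch only; the index name (proxy) and field names (src, dest) are assumptions you would adapt to your own proxy or firewall data, and the thresholds are arbitrary starting points:

```
index=proxy earliest=-24h
| sort 0 src dest _time
| streamstats current=f last(_time) as prev_time by src dest
| eval delta = _time - prev_time
| stats count avg(delta) as avg_interval stdev(delta) as jitter by src dest
| where count > 20 AND jitter < 5
| sort - count
```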
Hi all, I need your help validating my query.

In indexA the fields are: user, login (user = firstname, login = login_id).
In indexB the fields are: userName, city (city is the location of the employee; userName is "firstname, lastname").

The userName also appears in indexA events, but it is not extracted under any field name, so I am extracting it with rex, and based on that userName I need to get the location of the employee. I am trying the query below, but it is not returning the location detail; city is empty for all rows.

(index=indexA sourcetype="A" user=*) OR (index=indexB sourcetype="B" userName=*) | rex field=_raw "user=(?<userName>[^.]*)\s+cat" | fields userName city login | stats count as events values(city) as city by userName login

E.g., for user=aaa, login=aabb in indexA and city=xyz, userName="aaa, bbb" in indexB, the result I need is:

userName login events city
aaa, bbb aabb 1 xyz

But I am getting an empty city. Please help. Thanks.
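One likely culprit in the query above: `stats ... by userName login` drops every event that is missing either by-field, and the indexB events that carry city have no login field, so their city values never survive the stats. A sketch that groups by userName only and gathers login with values() instead (this assumes the rex extracts exactly the same userName string that indexB uses):

```
(index=indexA sourcetype="A" user=*) OR (index=indexB sourcetype="B" userName=*)
| rex field=_raw "user=(?<userName>[^.]*)\s+cat"
| stats count as events values(login) as login values(city) as city by userName
```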
Hi Team,

We are getting the below error while installing the Enterprise Security app:

failed to extract app from /tmp/ to /opt/splunk/var/run/splunk/bundle_tmp/: no such file or directory
Hi there,

I am currently trying to install a Splunk Universal Forwarder on a Linux server (Ubuntu 18.04). I have installed the forwarder but am receiving the following error when trying to install the credentials package:

Error during app install: failed to extract app from /tmp/splunkcloud.spl to /opt/splunkforwarder/splunkforwarder/var/run/splunk/bundle_tmp/08fff82e60ae81e9: No such file or directory

I transferred the file to the server using WinSCP and have confirmed that the splunkcloud.spl file exists in the /tmp folder. I have also made sure that the permissions on the directory are correct. Any help would be appreciated.

Jamie
Hi there, I have about 20 dashboards with many common features and code fragments. They all have the same checkboxes, and so on. How can I upload some files to Splunk so that I can extract these repeated bits of code, store them in one place, call that from all the dashboards, and only code the unique parts of each dashboard in its own source XML? Thanks.
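Simple XML has no true include mechanism, but the duplicated SPL at least can be factored out: save the shared search as a report and reference it by name from each dashboard. A sketch (the report name `shared_host_count` is hypothetical):

```
<dashboard>
  <row>
    <panel>
      <table>
        <search ref="shared_host_count"/>
      </table>
    </panel>
  </row>
</dashboard>
```

Changing the saved report then updates every dashboard that references it; for shared behavior beyond searches, a common JavaScript file can be attached via the `script` attribute on the `<dashboard>`/`<form>` element.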
I have created a tag for a key-value pair (dvc=IP_Address) and shared it with all the apps. While doing a search for the logs related to the above device, I can see the tag appearing for that key-value pair (dvc=IP_Address(Tag_Name)). However, this tag is not working for notable events: it does not appear for the notables under Incident Review in Splunk ES. These tags were working prior to the new ES update. Looking for a solution to this.
I'm a newbie and I'm trying to configure Security Essentials to detect "net user /DOMAIN" discovery on my AD server. I've installed a Universal Forwarder on the AD server with Sysmon and configured inputs.conf with the following entries:

[WinEventLog://Security]
checkpointInterval = 5
current_only = 0
disabled = 0
start_from = oldest

[WinEventLog://Microsoft-Windows-Sysmon/Operational]
checkpointInterval = 5
current_only = 0
disabled = 0
start_from = oldest

If I run a simple search using index=* "net.exe" AND " user*" AND "*/do*" I get results from source WinEventLog:Microsoft-Windows-Sysmon/Operational, while if I use the Analytic Story "Domain Account Discovery With Net App", which uses the Endpoint data model, no events are returned. It seems that the events in the data model come only from source WinEventLog:Security. What am I missing?
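The CIM Endpoint data model is populated by tagged/normalized events, not by raw searches, so Sysmon events only appear in Endpoint.Processes if an add-on (e.g. the Splunk Add-on for Sysmon) is installed where the search runs and its tags apply. A quick check of which sourcetypes actually feed the model (a sketch):

```
| tstats count from datamodel=Endpoint.Processes by sourcetype
```

If the Sysmon sourcetype is absent from the output, the add-on, its tags, or its index scope is the thing to fix before the Analytic Story can return events.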
Hi there,

I am trying to get a rolling number whenever proposals get activated. I can execute the following SQL, which gives me a figure for proposals activated today:

select * from cr_managementinformation where activation_date >= '06-JUNE-23' order by proposal_status

When I change the input type, I use this:

SELECT * FROM cr_managementinformation WHERE PROPOSAL_NUMBER > ? AND activation_date = TO_DATE(current_date) ORDER BY PROPOSAL_NUMBER ASC

The rising number does not match the batch number, i.e. the batch count of activations today is 302 but the rising number is only 5. Can you help?
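In DB Connect, a rising-column input only returns rows whose checkpoint column (the `?` placeholder) exceeds the last stored checkpoint, so combining `PROPOSAL_NUMBER > ?` with a same-day date filter skips any proposal whose number is at or below the checkpoint, even if it was activated today. One sketch is to make the activation date itself the rising column, so each newly activated row qualifies exactly once (column and table names as in the question; the rising column must match the ascending ORDER BY):

```
SELECT * FROM cr_managementinformation
WHERE activation_date > ?
ORDER BY activation_date ASC
```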
I have a table in Splunk with these columns:

|table _time idx Event_count IsOutlier Actual_outlier atf_hour_of_day atf_day_of_week lowerBound upperBound Email_Alert X X1 outlier_high_index outlier_low_index

I need to check how many times an index appears in the idx column. I can use |stats count by idx, but that gives only the idx and count columns. I need to keep all the other columns as well.
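eventstats computes the same aggregate as stats but attaches it to every row instead of collapsing the table, so all the other columns survive. A sketch using the table from the question:

```
| table _time idx Event_count IsOutlier Actual_outlier atf_hour_of_day atf_day_of_week lowerBound upperBound Email_Alert X X1 outlier_high_index outlier_low_index
| eventstats count as idx_count by idx
```

Each row keeps its original columns and gains an idx_count column holding how often its idx value appears.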
Hi, I'm trying to combine values from two different fields in two different indexes, but the combined field comes up blank. Are there other options, like join, to combine the values and then sort by the combined field?

| multisearch [search index=ABC UserID=* CheckEvent Alias=* ] [search index=CDE UserID=* classifications=SuperUser AliasTest=true ] | eval Combi = AliasTest." - ".Alias | stats values(UserID) as UserID, list(Combi) as Combined, list(AliasTest) as AliasTest, list(classifications) as classifications, list(Alias) as Alias, dc(UserID) as users by Combi

It works if I combine fields from the same index, but not if I try to combine field values from the ABC and CDE indexes. Thank you.
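multisearch simply interleaves events from the two searches, and eval runs per event, so no single event ever carries both Alias (from ABC) and AliasTest (from CDE); that is why Combi is always null. Aggregating on a shared key first, then combining, avoids this. A sketch assuming UserID is the common key and each user has a single Alias/AliasTest value:

```
| multisearch
    [ search index=ABC UserID=* CheckEvent Alias=* ]
    [ search index=CDE UserID=* classifications=SuperUser AliasTest=true ]
| stats values(Alias) as Alias values(AliasTest) as AliasTest values(classifications) as classifications by UserID
| eval Combi = AliasTest." - ".Alias
| sort Combi
```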
We are planning to migrate to a two-tiered Splunk deployment server architecture. Are there guidelines we can check so we can plan accordingly? Also, with regard to app deployment in a two-tiered architecture, do we need to run splunk reload deploy-server twice (once on the master DS and once on the slave DS) whenever we update the apps in deployment-apps? Looking forward to your insights. Thank you.
Hello network, I need help understanding how to increase the number of lines shown in the UI field extractor.

For example, I have an event containing 38 lines, and when sampling events to apply a regex during field extraction, the UI shows me only 20 lines, which prevents me from seeing what I actually want to extract as a field. I checked ui-prefs.conf, but I'm not entirely sure that's the right place to expand the window so I can see and work with all the lines. Thank you.
Hi, how can we effectively search for fields containing null values in an index, in order to limit license consumption? What approach can be taken to accomplish this? Thanks.
Hello Splunk Experts, I've tried the query below to read yesterday's lookup file (its name built from the 'previous_day' value) and to save results into today's lookup file, appending if today's file already exists. But it's not working. Could anyone please point out what I'm doing wrong? Thanks in advance!

index=test *exception | [ | eval previous_day="lookup" .strftime(relative_time(now(), "-1d"), "%m%d"). ".csv" | inputlookup $previous_day$ ] | eval today="lookup" .strftime(now(), "%m%d"). ".csv" | [| inputlookup $today$ | eval is_append= if(isnull(Exception), "append=1", "")] | stats count by first(_raw) as Exception | outputlookup $today$ $is_append$
I'm attempting to use Splunk's REST API to extract some data. My configuration includes a max_count of 1, a search string with an index, oneshot execution (due to the nature of the design), earliest and latest times spanning 2 minutes, and json as the output mode. The minimum response time is 12 seconds. Is there anything I could do differently to improve it? I experimented with shrinking the search window to one minute and adding extra filters, and I've also tried blocking and normal as alternative exec_modes.
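One option worth trying is the /services/search/jobs/export endpoint, which streams results back on the same HTTP response instead of creating a job that must be polled, and often reduces round-trip latency for small one-shot pulls. A sketch that only builds the request (the host and query are placeholders; sending it with your HTTP client and credentials is up to you):

```python
from urllib.parse import urlencode

def build_export_request(base_url, query, earliest, latest):
    """Build the URL and POST body for Splunk's streaming export endpoint.

    Unlike /services/search/jobs (create + poll), /export streams
    results as they are found, which usually shortens the wall-clock
    time for small searches.
    """
    url = f"{base_url}/services/search/jobs/export"
    body = urlencode({
        "search": f"search {query}",   # export requires the leading 'search'
        "earliest_time": earliest,
        "latest_time": latest,
        "output_mode": "json",
        "max_count": 1,                # mirrors the config in the question
    })
    return url, body

url, body = build_export_request(
    "https://splunk.example.com:8089",  # placeholder management host/port
    "index=main error", "-2m", "now",
)
```

Narrowing the SPL itself (fields needed, tight index/sourcetype filters) still matters; the endpoint change only removes the job-management overhead.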
Why are there no results when I search index=_internal?