All Topics

Greetings, I want to report on any Linux system that hasn't had an event in /var* for 30 minutes. I was going to use Source="/var/log/messages" but our admins told me that they want anything below /var to be reported on. I tried using the metadata command but that didn't get me anywhere. Does anyone have any suggestions? Thanks.
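A sketch of one way to approach this with tstats, assuming the events of interest are indexed with source paths under /var; note that tstats only sees hosts that have logged at least once in the search window, so catching completely silent hosts would additionally require a lookup of expected hosts:

```
| tstats latest(_time) as last_seen where index=* source=/var/* by host
| eval minutes_silent = round((now() - last_seen) / 60, 0)
| where minutes_silent >= 30
```

Run this over a window longer than 30 minutes (e.g. the last 24 hours) so that hosts which went quiet recently still appear in the result set.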
I want to reformat the numbers in my search results to show a kWh unit; for example (as you see in the pictures below), 15 should be displayed as 15 kWh.
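If the unit is only for display, fieldformat keeps the underlying field numeric (so sorting and math still work); a minimal sketch, assuming a hypothetical field name reading:

```
... | fieldformat reading = tostring(reading) . " kWh"
```

Using `| eval reading = reading . " kWh"` instead would bake the unit into the stored value, turning the field into a string.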
Hi all, I have a general question on saving some space and grouping hosts in serverclass.conf. I have reviewed this answer from 2012, but it doesn't exactly cover what I'm looking for, and I am aware of the answer supplied there. I'm wondering whether the same applies to the grouping below.
Example:
whitelist.0 = host10
whitelist.1 = host11
Question:
whitelist.0 = host1[0-1]
Thanks in advance,
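The serverclass.conf spec documents `*` and `?` wildcard matching for whitelist entries; whether a character class like host1[0-1] is honored can vary by version, so it is worth testing against your own serverclass.conf.spec before relying on it. A wildcard-only sketch:

```
# serverclass.conf -- a sketch; verify matching rules in your version's
# serverclass.conf.spec before deploying
[serverClass:linux_hosts]
# matches host10 and host11, but ALSO host12, host100, ... --
# broader than the explicit pair whitelist.0 = host10 / whitelist.1 = host11
whitelist.0 = host1*
```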
Hi all, we receive the warning: "The current bundle directory contains a large lookup file that might cause bundle replication fail." The path to the directory is /opt/splunk/var/run/68DAE477-6A5A-40C6-A7CE-3778404DF158-1589383658-1589383891.delta. Looking into this .delta file, it contains a few lookup files, but they are only a few MB each. The full bundle files also do not contain large lookups; we have already blacklisted some large lookup files.
Hi, on a server with the Splunk Universal Forwarder installed, we are monitoring a CSV log with a header and lines in the following format:
"Status","Device Name","IP Address","Site","Last Backup Date"
"Success","Active Directory","10.123.456.78","Global","30-04-2020 20:11:05"
"Failure","Active Directory","10.123.456.89","Global","30-04-2020 20:11:06"
Splunk ingests the header line even with the following header-related parameters in props.conf:
[cvs_custom_sourcetype]
DATETIME_CONFIG = CURRENT
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
HEADER_FIELD_DELIMITER = ,
HEADER_FIELD_QUOTE = "
FIELD_QUOTE = "
KV_MODE = none
Any suggestions why this is happening?
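One frequent cause worth checking: for INDEXED_EXTRACTIONS sourcetypes, structured-data parsing (including header handling) happens on the universal forwarder itself, so the stanza must exist in props.conf on the forwarder, not only on the indexers. A sketch, reusing the stanza name above:

```
# props.conf on the Universal Forwarder (e.g. $SPLUNK_HOME/etc/system/local/),
# not only on the indexer -- INDEXED_EXTRACTIONS is applied at the forwarder
[cvs_custom_sourcetype]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
```

Note that events already indexed keep their header lines; only newly forwarded data is affected.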
Hello, I have this SPL which returns about 40,000 records when run alone. However, when it is appended to another SPL which is similar except for a different Report ID and monitored commands, the record count of this SPL shrinks from 40,000 down to 16k. What causes this weird problem? `comment(Standard Users)` (index=* source=/var/log/secure* AND TERM(sudo) AND (TERM(bin) OR TERM(sbin)) AND ((TERM(yum) AND (TERM(install) OR TERM(remove))) OR *bin/rpm*-*i* OR *bin/rpm*-*e* OR *bin/*tar*x*) AND COMMAND!="*egrep*") OR (index="*" source=/var/log/audit/audit.log* addr!=? acct=* res=success* [search index=* source=/var/log/secure* AND TERM(sudo) AND (TERM(bin) OR TERM(sbin)) AND ((TERM(yum) AND (TERM(install) OR TERM(remove))) OR *bin/rpm*-*i* OR *bin/rpm*-*e* OR *bin/*tar*x*) AND COMMAND!="*egrep*" | regex _raw != ".*bin\/grep|.*bin\/man|.*bin\/which" | regex _raw!= ".*user NOT in sudoers.*" | regex _raw = ".*rpm -[ivhe]+|.*yum.*|.*tar\s+(-|)[xvzjf]+" | stats latest(_time) as latest earliest(_time) as mod_time | eval earliest= relative_time(mod_time, "-24h@s") | fields earliest latest]) | regex _raw != ".*bin\/grep|.*bin\/man|.*bin\/which" | regex _raw!= ".*user NOT in sudoers.*" | regex _raw = ".*rpm -[ivhe]+|.*yum.*|.*tar\s+(-|)[xvzjf]|type.*addr.*success" | rename acct as Users | rex field=_raw "(?<=sudo:)\s*(?P<Users>[[:alnum:]]\S*[[:alnum:]])\s*(?=\:).*(?<=COMMAND\=)(?P<command>.*)" | eval "Command/Events" = replace(command,"^(\/bin\/|\/sbin\/)","") | eval Users = if(match(Users,"(?<=[[:alnum:]])\@[[:alnum:]]\S*[[:alnum:]]"), replace(Users,"(?<=[[:alnum:]])\@[[:alnum:]]\S*[[:alnum:]]",""), if(match(Users,"[[:alnum:]]+\\\(?=[[:alnum:]]\S*[[:alnum:]])"), replace(Users,"[[:alnum:]]+\\\(?=[[:alnum:]]\S*[[:alnum:]])","") ,Users)) | eval Time = if(match(_raw,"(?<=sudo:)\s*[[:alnum:]]\S*[[:alnum:]]\s*(?=\:).*(?<=COMMAND\=)*") ,strftime(_time, "%Y-%d-%m %H:%M:%S"),null()), Date = strftime(_time, "%Y-%d-%m") | eval "Report ID" = "ABLR-008" | eval "Agency HF" = 
if(isnull(agencyhf),"",agencyhf) | stats list(Time) as Time list("Command/Events") as "Command/Events" latest(addr) as "IP Address" by Users Date host index "Report ID" "Agency HF" | where 'Command/Events' !="" | eval counter=mvrange(0,mvcount(Time)) | streamstats count as sessions | stats list(*) as * by sessions counter | foreach Time "Command/Events" [ eval <<FIELD>> = mvindex('<<FIELD>>', counter)] | fields - counter sessions | rename index as Agency, host as Hostname | where isnotnull('IP Address') OR Users!="root" | fields "Report ID" Time Agency Command/Events Hostname Users "IP Address" "Agency HF" `comment(root doing sudo)` | union [search index=* source=/var/log/secure* AND TERM(sudo) AND (TERM(bin) OR TERM(sbin)) AND ((TERM(yum) AND (TERM(install) OR TERM(remove))) OR *bin/rpm*-*i* OR *bin/rpm*-*e* OR *bin/*tar*x*) AND COMMAND!="*egrep*" AND " root" | regex _raw = ".*rpm -[ivhe]+|.*yum.*|.*tar\s+(-|)[xvzjf]+" | rex field=_raw "(?<=sudo:)\s*(?P<Users>[[:alnum:]]\S*[[:alnum:]])\s*(?=\:).*(?<=COMMAND\=)(?P<command>.*)" | eval "Command/Events" = replace(command,"^(\/bin\/|\/sbin\/)","") | eval Users = "root" | eval Time = strftime(_time, "%Y-%d-%m %H:%M:%S"), Date = strftime(_time, "%Y-%d-%m") | eval "Agency HF" = if(isnull(agencyhf),"",agencyhf) | eval "Report ID" = "ABLR-008" | eval "Command/Events" = replace(command,"^(\/bin\/|\/sbin\/)","") | rename host as Hostname, index as Agency | table "Report ID" Time Date Users "Command/Events" Hostname Agency "Agency HF" | join type=left Date Hostname [search (index=* source=/var/log/secure* AND TERM(sudo) AND "session opened for user root" [ search index=* source=/var/log/secure* AND TERM(sudo) AND (TERM(bin) OR TERM(sbin)) AND ((TERM(yum) AND (TERM(install) OR TERM(remove))) OR *bin/rpm*-*i* OR *bin/rpm*-*e* OR *bin/*tar*x*) AND COMMAND!="*egrep*" AND " root" | regex _raw = ".*rpm -[ivhe]+|.*yum.*|.*tar\s+(-|)[xvzjf]+" | stats latest(_time) as latest earliest(_time) as mod_time | eval earliest= 
relative_time(mod_time, "-24h@s") | fields earliest latest]) OR (index="*" source=/var/log/audit/audit.log* addr!=? res=success* acct=* [ search index=* source=/var/log/secure* AND TERM(sudo) AND (TERM(bin) OR TERM(sbin)) AND ((TERM(yum) AND (TERM(install) OR TERM(remove))) OR *bin/rpm*-*i* OR *bin/rpm*-*e* OR *bin/*tar*x*) AND COMMAND!="*egrep*" AND " root" | regex _raw = ".*rpm -[ivhe]+|.*yum.*|.*tar\s+(-|)[xvzjf]+" | stats latest(_time) as latest earliest(_time) as mod_time | eval earliest= relative_time(mod_time, "-24h@s") | fields earliest latest]) | regex _raw != ".*LOGIN.*" | eval Time = if(match(_raw,"(?<=user)\s*[[:alnum:]]\S*[[:alnum:]].*(?<=by)\s*[[:alnum:]]\S*[[:alnum:]](?=\Suid)") ,strftime(_time, "%Y-%d-%m %H:%M:%S"),null()), Date = strftime(_time, "%Y-%d-%m") | stats list(Time) as Time values(addr) as ip by Date host | where Time !="" | rename host as Hostname | stats values(ip) as "IP Address" by Date Hostname] | fillnull "IP Address" value="localhost" | fillnull Hostname value="N.A" | fields "Report ID" Time Agency Command/Events Hostname Users "IP Address" "Agency HF"] | union [ search index=* source="/root/.bash_history" AND ((TERM(yum) AND (TERM(install) OR TERM(remove))) OR *rpm*-*i* OR *rpm*-*e* OR *tar*x*) | regex _raw = ".*rpm -[ivhe]+|.*yum.*|.*tar\s+(-|)[xvzjf]+" | rex field=_raw "(?P<command>.*)" | eval Users = "root" | eval Time = strftime(_time, "%Y-%d-%m %H:%M:%S"), Date = strftime(_time, "%Y-%d-%m") | eval "Agency HF" = if(isnull(agencyhf),"",agencyhf) | eval "Report ID" = "ABLR-008" | rename command as "Command/Events", host as Hostname, index as Agency | fields "Report ID" Time Date Users "Command/Events" Hostname Agency "Agency HF" | join type=left Date Hostname [search (index=* source=/var/log/secure* AND TERM(sudo) AND "session opened for user root" [ search index=* source="/root/.bash_history" AND ((TERM(yum) AND (TERM(install) OR TERM(remove))) OR *rpm*-*i* OR *rpm*-*e* OR *tar*x*) | regex _raw = ".*rpm 
-[ivhe]+|.*yum.*|.*tar\s+(-|)[xvzjf]+" | stats latest(_time) as latest earliest(_time) as mod_time | eval earliest= relative_time(mod_time, "-24h@s") | fields earliest latest]) OR (index="*" source=/var/log/audit/audit.log* addr!=? res=success* acct=* [ search index=* source="/root/.bash_history" AND ((TERM(yum) AND (TERM(install) OR TERM(remove))) OR *rpm*-*i* OR *rpm*-*e* OR *tar*x*) | regex _raw = ".*rpm -[ivhe]+|.*yum.*|.*tar\s+(-|)[xvzjf]+" | stats latest(_time) as latest earliest(_time) as mod_time | eval earliest= relative_time(mod_time, "-24h@s") | fields earliest latest]) | regex _raw != ".*LOGIN.*" | eval Time = if(match(_raw,"(?<=user)\s*[[:alnum:]]\S*[[:alnum:]].*(?<=by)\s*[[:alnum:]]\S*[[:alnum:]](?=\Suid)") ,strftime(_time, "%Y-%d-%m %H:%M:%S"),null()), Date = strftime(_time, "%Y-%d-%m") | stats list(Time) as Time values(addr) as ip by Date host | where Time !="" | rename host as Hostname | stats values(ip) as "IP Address" by Date Hostname] | fillnull "IP Address" value="localhost" | fillnull Hostname value="N.A" | fields "Report ID" Time Agency Command/Events Hostname Users "IP Address" "Agency HF" ] | dedup Time Command/Events | table "Report ID" Time Agency Command/Events Hostname Users "IP Address" "Agency HF" | fillnull "IP Address" value="localhost" | sort 0 -Time
I need to run the query below for a month. If I run it as-is, I get results for yesterday's average count:
sourcetype="X" | bin _time span=1s | stats count by logtime | stats avg(count) | eval date=strftime(now()-86400, "%d-%m-%Y") | table date avg(count) | eval 'avg(count)'=round('avg(count)',0) | fields - avg(count)
Output: date 'avg(count)' 12-05-2020 11
But I want the average count for a month, or for 6 months. Please suggest how to update this query so I get the average count for each day over 6 months. Thanks, Pradeep
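A sketch of one approach, grouping on _time instead of the original logtime field (assuming logtime mirrors the event time): compute per-second counts, then average them per day, and run the search over the last 6 months:

```
sourcetype="X"
| bin _time span=1s
| stats count by _time
| bin _time span=1d
| stats avg(count) as avg_count by _time
| eval date = strftime(_time, "%d-%m-%Y"), avg_count = round(avg_count, 0)
| table date avg_count
```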
DB Connect was installed following the instructions in the Splunk Docs install guide: https://docs.splunk.com/Documentation/DBX/3.3.0/DeployDBX/Prerequisites Now my vulnerability scans keep coming up with findings on old Java versions, although Java seems up to date when I run java -version (which sees the JDK installed from our Red Hat repositories). I have been having to update Java manually in the /opt/java folder that was being flagged by the scans. I would rather automate this, perhaps by modifying $JAVA_HOME to point at /etc/alternatives/java/jdk/ and just letting Red Hat handle the patches?
With Private App Installation, the Continue button does not work for installation, update, or delete in languages other than en-GB or en-US. It fails with fr-FR and de-DE, and succeeds with en-GB and en-US. Regards
Hi dear Splunkers, I have spent hours trying to find an answer in the docs or here, but have not found anything satisfying so far. Problem statement: we have users whose role: - allows access to all non-internal indexes on the cluster - allows them to schedule searches However, they still cannot add a custom summary index to a scheduled report. The indexes are defined on both the SH and IDX tiers, and an admin is able to select the indexes. The nearest I was able to find was "Custom summary index not showing up in 'select the summary index' dropdown" (https://answers.splunk.com/answers/47310/custom-summary-index-not-showing-up-in-select-the-summary-index-dropdown.html?utm_source=typeahead&utm_medium=newquestion&utm_campaign=no_votes_sort_relev) but it is from 2013, and I do not want to grant the indexes_edit capability to users just so they can write to a custom summary index. Any ideas? We use Splunk 7.1.3 at the moment and are about to go to 7.3.3 (so a solution for 7.3.3 will be good enough).
In an indexer cluster environment, one of the indexers stopped and I am unable to start/restart it:

C:\Windows\system32>d:
D:>cd spluk\bin
The system cannot find the path specified.
D:>cd splunk\bin
D:\Splunk\bin>.\splunk restart
Splunkd: Stopped
Splunk> All batbelt. No tights.
Checking prerequisites...
Checking http port [8000]: open
Checking mgmt port [8089]: open
Checking appserver port [127.0.0.1:8065]: open
Checking kvstore port [8191]: open
Checking configuration... Done.
Checking critical directories... Done
Checking indexes... (skipping validation of index paths because not running as LocalSystem)
Validated: _audit _internal _introspection _telemetry _thefishbucket aws_anomaly_detection aws_topology_daily_snapshot aws_topology_history aws_topology_monthly_snapshot aws_topology_playback aws_vpc_flow_logs history main summary
Done
Bypassing local license checks since this instance is configured with a remote license master.
Checking filesystem compatibility... Done
Checking conf files for problems... Done
Checking default conf files for edits...
Validating installed files against hashes from 'D:\Splunk\splunk-7.2.1-be11b2c46e23-windows-64-manifest'
All installed files intact.
Done
Checking replication_port port [7778]: open
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
Splunkd: Starting (pid 6420)
Timed out waiting for splunkd to start.

Splunkd.log:
05-18-2020 07:31:58.157 +0000 INFO ServerRoles - Declared role=cluster_slave.
05-18-2020 07:31:58.157 +0000 INFO ServerRoles - Declared role=indexer.
05-18-2020 07:31:58.157 +0000 INFO ClusteringMgr - initing clustering with: ht=60.000 rf=3 sf=2 ct=60.000 st=60.000 rt=60.000 rct=60.000 rst=60.000 rrt=60.000 rmst=180.000 rmrt=180.000 icps=-1 sfrt=600.000 pe=1 im=0 is=1 mob=5 mor=5 mosr=5 pb=5 rep_port=port=7778 isSsl=0 ipv6=0 cipherSuite= ecdhCurveNames= sslVersions=SSL3,TLS1.0,TLS1.1,TLS1.2 compressed=1 allowSslRenegotiation=1 dhFile= reqCliCert=0 serverCert= rootCA= commonNames= alternateNames= pptr=10 fznb=10 Empty/Default cluster pass4symmkey=true allow Empty/Default cluster pass4symmkey=true rrt=restart dft=180 abt=600 sbs=1
05-18-2020 07:31:58.172 +0000 INFO ClusteringMgr - Initializing node as slave
05-18-2020 07:31:58.172 +0000 INFO BucketReplicator - Initializing BucketReplicatorMgr
05-18-2020 07:31:58.219 +0000 INFO CMServiceThread - CMHealthManager starting eloop
05-18-2020 07:31:58.235 +0000 INFO CMBundleMgr - bundle=D:\Splunk\var\run\splunk\cluster\remote-bundle\2df598296706d9846433003de4c7a927-1589221919.bundle, checksum=5F5C9F53A58CD618B69209EBC5D92286 found on the slave
05-18-2020 07:31:58.235 +0000 INFO CMBundleMgr - setting active bundle= to latest bundle=6F0874F9DA123EA345D25A77F6D3CAFA
05-18-2020 07:31:58.235 +0000 INFO CMSlave - event=getActiveBundle status=success path=D:\Splunk\var\run\splunk\cluster\remote-bundle\83209f7543173582062b08f2b77fcde0-1589259155.bundle cksum=6F0874F9DA123EA345D25A77F6D3CAFA alreadyin=0
05-18-2020 07:31:58.235 +0000 ERROR CMSlave - event=move downloaded bundle to slave-apps failed with err="failed to remove dir=D:\Splunk\etc\slave-apps.old (There are no more files.)" even after multiple attempts, Exiting..
05-18-2020 07:31:58.235 +0000 ERROR loader - Failed to download bundle from master, err="failed to remove dir=D:\Splunk\etc\slave-apps.old (There are no more files.)", Won't start splunkd.

Please provide a solution if anyone knows one.
From reading the docs on this app, it appears you need to send HEC traffic over the Internet, since Dome9 is SaaS? Is anyone actually doing this?
Hi, in a dashboard we have a form with two inputs (email & subject). We are looking to generate text in this dashboard that includes results from a search. For example, a simple search: index=myindex src=$email$ message_subject=$msg_sub$ | stats count(recipients) by src This search would be used to generate text in the dashboard: "Sed et eros bibendum, fermentum nibh volutpat, convallis lorem. Nunc in dignissim lacus. Integer sodales tristique ultricies. In porta condimentum neque eget gravida. Sed magna dolor, laoreet non tortor sed, feugiat varius lacus. Donec semper hendrerit orci ac sodales. Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. Etiam mollis id augue non laoreet. Etiam porttitor magna $email$ suscipit tortor luctus dignissim $msg_sub$ Morbi sit amet neque ipsum. Nam rhoncus dui nec neque bibendum commodo. Maecenas consequat imperdiet nisl a accumsan. Aenean pellentesque, justo sed elementum porta, nisl sem suscipit leo, quis consequat sapien velit et mi. Vivamus varius auctor risus, elementum pharetra nisl malesuada ut. Duis malesuada sollicitudin dignissim. In lacinia sagittis urna quis sollicitudin. Pellentesque a enim ultricies, blandit dui sit amet, tincidunt est." Does a function/method exist to do that? Thanks for your help.
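One common pattern is to capture a search result into a token via a done handler and render it inside an html panel, where form tokens like $email$ interpolate directly; a Simple XML sketch, assuming the count is renamed to a hypothetical cnt field:

```xml
<form>
  <fieldset>
    <input type="text" token="email"></input>
    <input type="text" token="msg_sub"></input>
  </fieldset>
  <!-- global search: its result is copied into the $cnt$ token when it completes -->
  <search>
    <query>index=myindex src=$email$ message_subject=$msg_sub$ | stats count(recipients) as cnt by src</query>
    <done>
      <set token="cnt">$result.cnt$</set>
    </done>
  </search>
  <row>
    <panel>
      <html>
        <p>Etiam porttitor magna $email$ suscipit tortor luctus dignissim $msg_sub$ ($cnt$ messages).</p>
      </html>
    </panel>
  </row>
</form>
```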
Hi there. Could someone please explain why this transform (in the pretrained sourcetype from Splunk, not the TA) exists for this sourcetype? It has the consequence (in many cases) of creating divergent host values for a single host, and we're wondering why Splunk has chosen to "bake it in". Thanks for any insight.
I have 2 sites with more than 10 peers per site, and each peer has different hardware specs. What is the best practice for choosing the site_replication_factor and site_search_factor?
Logs:
source=/api/docker/docker-snapshot-demo/v2/pdap/pdap-validator-router/manifests/1.0.aws
source=/api/docker/docker-snapshot/v2/mode-date/mod-validator-router/manifests/1.0.aws
We want to extract the first occurrence of a string that has a minimum of 1 hyphen and a maximum of 2 hyphens into a separate field. In the example above, only "docker-snapshot-demo" and "docker-snapshot" should be extracted into the new field. I tried "\w*[-]\w*" and "\b\w*[-']\w*\b" but had trouble limiting the match to the first occurrence and applying the range specifier.
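A rex sketch that relies on the hyphenated segment sitting immediately after the /api/docker/ prefix, as in both samples above; the {1,2} quantifier enforces the one-to-two-hyphen range, and anchoring to the prefix pins the match to the first such segment:

```
... | rex field=source "^/api/docker/(?<repo>\w+(?:-\w+){1,2})/"
```

Without the anchor, (?<repo>\b\w+(?:-\w+){1,2}) would also work on these samples, because rex keeps only the first match and the earlier path segments (api, docker) contain no hyphen.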
Hi All, can someone help me change the color based on the state? I tried charting.fieldColors in the option field, but it's not working. The chart shows only one color; it does not change based on the condition.
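charting.fieldColors maps series (field) names to colors, not cell values, which is a common reason it appears to have no effect: the state values must first become series, e.g. via a split-by or xyseries. A sketch with hypothetical index/field names and UP/DOWN states:

```xml
<chart>
  <search>
    <!-- turn each state value into its own series so fieldColors can match it -->
    <query>index=main sourcetype=node_status | stats count by host, state | xyseries host state count</query>
  </search>
  <option name="charting.fieldColors">{"UP": 0x65A637, "DOWN": 0xD93F3C}</option>
</chart>
```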
Hi, I have this log line:
May 13 08:01:56 192.168.10.10 system_service: 192.168.10.10 05/13/2020:07:01:56 GMT : GUI CMD_EXECUTED : User test_user - Remote_ip 10.10.10.10 - Command "login login tenant_name=Owner,password=********,Secret=*****,challenge_response=*****,token=80410000cb49a9,client_port=-1,cert_verified=false,sessionid=********,session_timeout=0,permission=superuser" - Status "Done"
and I already have the fields:
user: test_user
remote_ip: 10.10.10.10
command: "login login tenant_name=Owner,password=********,Secret=*****,challenge_response=*****,token=*****,client_port=-1,cert_verified=false,sessionid=********,session_timeout=0,permission=user"
status: "Done"
But I need to extract new fields from the existing field "command". For now, what I need is to create the field "event" with the first word (login or logout). Is there any way to extract a field from an existing one, or do I have to use rex in the search? I have this search, but the event field has no values:
index=my_index (command=login* OR command=logout*) | rex field=command "event:^(.*.Command)\s+\"(?P<event>\w+)" | table user,event, command,remote_ip, status, _time | sort -_time
I've tested this regex expression and it returns the value "login" from the log line above. Any idea what I'm doing wrong? Regards,
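A sketch that pulls the first word out of the existing command field into event; it assumes the stored value may begin with a literal double quote, which the original pattern did not account for:

```
index=my_index (command=login* OR command=logout*)
| rex field=command "^\"?(?<event>\w+)"
| table user, event, command, remote_ip, status, _time
| sort - _time
```

For a permanent extraction from an existing field, props.conf also supports the `EXTRACT-<class> = <regex> in <source_field>` form, e.g. `EXTRACT-event = ^\"?(?<event>\w+) in command`.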
Hi, I am trying to build a lookup file dynamically based on the date chosen by the user, and then use that lookup file in queries in other panels. It is all working, but with a glitch: I get the error "Could not construct lookup file..." for some 3-4 minutes until the file gets generated, and after clicking the refresh button on the panel it works fine. How can I avoid that error? How can I explicitly load the panels only after the output lookup file has been created?
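One way is to gate the dependent panels on a token that is only set when the lookup-generating search completes, using a done handler and the panel depends attribute; a Simple XML sketch with hypothetical search and lookup names:

```xml
<!-- global search that builds the lookup; the token fires only when it finishes -->
<search>
  <query>index=main earliest=$date_tok$ | stats count by user | outputlookup daily_users.csv</query>
  <done>
    <set token="lookup_ready">true</set>
  </done>
</search>
<row>
  <!-- panel stays hidden until $lookup_ready$ is set, so inputlookup never
       runs against a lookup file that does not exist yet -->
  <panel depends="$lookup_ready$">
    <table>
      <search>
        <query>| inputlookup daily_users.csv</query>
      </search>
    </table>
  </panel>
</row>
```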
Hello, I have a row with 5 columns of the same type and I want to compare the values of the cells across these 5 columns. How can I do it? Thanks.
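A sketch with eval, assuming five hypothetical field names col1..col5; eval expressions compare the cells within each row:

```
... | eval all_equal = if(col1 = col2 AND col2 = col3 AND col3 = col4 AND col4 = col5, "yes", "no")
| eval max_value = max(col1, col2, col3, col4, col5)
```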