All Topics

Our search head cluster captain's /opt/splunk/var/run/file.bundle still contains the csv even though the file was added to [replicationBlacklist] in /opt/splunk/etc/system/local/distsearch.conf. $SPLUNK_HOME/bin/splunk btool distsearch list --debug shows the csv file in the [replicationBlacklist] stanza, but the csv file is still in the latest bundle on the SH captain. Could this be a bug in Splunk 8.2.4 (build 87e2dda940d1) when the number of entries in [replicationBlacklist] exceeds some limit? In this case there are entries from blacklist_lookups_1 to blacklist_lookups_79. Thanks in advance for any input.
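For reference, a [replicationBlacklist] stanza of the kind described might look like this (the stanza keys and csv paths here are illustrative placeholders, not taken from the actual config):

```ini
# /opt/splunk/etc/system/local/distsearch.conf (illustrative sketch)
[replicationBlacklist]
blacklist_lookups_1 = apps/search/lookups/big_lookup_1.csv
blacklist_lookups_2 = apps/search/lookups/big_lookup_2.csv
# ... entries continuing up to blacklist_lookups_79
```

Note that each key name in the stanza must be unique, and a new bundle is only produced on the next bundle replication, so an old bundle on disk may still contain the file for a while after the config change.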
Are there currently supported methods for ingesting and monitoring Suricata events in Splunk?
Is there any existing parser for Samba smbd_audit records? Or other ways to collect file access with Samba?
Hi Splunkers, I want to create a macro that will look inside a lookup file, but in a way that will not break the search if the lookup no longer exists at some point. Is there any Splunk equivalent of, for example, the Linux `test -f filename`?
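One pattern that might come close (a sketch, not a tested macro; the lookup file name is a placeholder) is checking for the lookup's existence via the REST endpoint for lookup table files:

```
| rest /servicesNS/-/-/data/lookup-table-files splunk_server=local
| search title="my_lookup.csv"
| stats count as lookup_exists
```

A search could then branch on lookup_exists, for example as a subsearch guard; whether that fits depends on how the macro consumes the lookup.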
I'm trying to finally make my Bareos logs "work" properly. Parsing the fields out of the events is one thing, but I was wondering if there is any good schema for backup-related fields. There is no such datamodel in CIM, so I can't rely on that, but maybe there is a well-known and widely used TA to take a pattern from?
I see that there is a journald_input app in the Splunk forwarder install, but I can't seem to find any information on how to use it. I ran: /opt/splunkforwarder/bin/splunk enable app journald_input but it doesn't appear to be ingesting any entries from the journal.
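If the app ships without an enabled input, a journald stanza may also be needed in inputs.conf. This is a sketch based on the journald input introduced in Splunk 9.x; the stanza name and settings should be checked against the inputs.conf spec for your forwarder version:

```ini
# inputs.conf in the journald_input app's local directory (illustrative)
[journald://system]
interval = 60
index = main
```

After adding the stanza, the forwarder needs a restart before the input starts reading the journal.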
I want the sort order to be the order I list below:
Very High
High
Medium
Low
Very Low
Info
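Assuming those values live in a field (called severity here purely as an illustration), a common way to force a custom order in SPL is a helper field:

```
| eval sort_order=case(severity="Very High",1, severity="High",2, severity="Medium",3,
                       severity="Low",4, severity="Very Low",5, severity="Info",6)
| sort sort_order
| fields - sort_order
```

The numeric helper field sorts reliably, and dropping it at the end keeps the output clean.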
Hi, I use a search that transposes events with a span of 30m. The end of the search is this:

| where _time <= now() AND _time >= now()-14400
| eval time=strftime(_time,"%H:%M")
| sort time
| fields - _time _span _origtime _events
| fillnull value=0
| transpose 0 header_field=time column_name=KPI include_empty=true
| sort + KPI

As you can see, I only display events which exist in a specific time range:

| where _time <= now() AND _time >= now()-14400

It works fine, but only when the timepicker choice is "today". I would like to do the same thing with other timepicker choices like "last 7 days" or "last 30 days". Could you help please?
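One way this kind of thing is often handled (a sketch, untested against the search above) is to let the timepicker itself drive the boundaries via addinfo instead of hard-coding now()-14400:

```
| addinfo
| where _time >= info_min_time AND (_time <= info_max_time OR info_max_time="+Infinity")
```

addinfo exposes the search's own earliest/latest as info_min_time and info_max_time, so the where clause follows whatever range the timepicker selects.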
As of today, data models like the Network Traffic data model have fields for src, src_ip, dest and dest_ip, but not src_dns and dest_dns. The way I understand it, DNS names should then be used in the src and dest fields, and IPs in the fields src_ip and dest_ip.

Some logs don't have DNS names available in the log itself. However, if you have Splunk ES with a populated asset framework, it will automatically add the fields src_dns and dest_dns to the events if the fields src and dest are already available.

If I want the src_dns and dest_dns fields from the events to be added to the src and dest fields in the data model, I would normally solve this by adding a coalesce for src in props.conf for the source type. But since lookups are applied after evals in search-time parsing, this is not possible when src_dns and dest_dns come from a lookup, as is the case with Splunk ES.

Therefore I propose the following change to all data models that use the src and dest fields. Change the eval for src from

if(isnull(src) OR src="","unknown",src)

to

case((isnull(src_dns) OR src_dns="") AND (isnull(src) OR src=""),"unknown",NOT (isnull(src_dns) OR src_dns=""),src_dns,true(),src)

and likewise change the eval for dest from

if(isnull(dest) OR dest="","unknown",dest)

to

case((isnull(dest_dns) OR dest_dns="") AND (isnull(dest) OR dest=""),"unknown",NOT (isnull(dest_dns) OR dest_dns=""),dest_dns,true(),dest)
Hi there, I'm currently building a dashboard and need to display two dates: today's date and the previous working day. I use the query below for today's date:

<query> index=main | head 1 | eval Date = strftime(_time,"%d/%m/%Y") | fields Date</query>

Could you help me with the query to display the previous working day? Many thanks! Janine
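For what it's worth, here is a sketch of one way to compute the previous working day, assuming "working day" means Monday to Friday and ignoring public holidays:

```
| makeresults
| eval dow=strftime(now(),"%u")
| eval offset=case(dow="1",-3, dow="7",-2, true(),-1)
| eval PrevWorkingDay=strftime(relative_time(now(), offset."d@d"), "%d/%m/%Y")
| fields PrevWorkingDay
```

%u yields 1 (Monday) through 7 (Sunday), so Monday steps back three days to Friday, Sunday two days, and every other day one day; the offset is concatenated into a relative-time modifier like "-3d@d".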
I have a simple search, which is satisfaction_date=0 OR close_date=0 AND status=8, over the previous month. I now have a requirement where users want to see (over the last 30 days) which of those records are now tagged with a different status. The unique identifier for each record is proposal_id.

For example: in October, proposal vdutta1 had a satisfaction date of 0 and status 8. Proposal vdutta1 now has a satisfaction date of 0 and status 6, so this record should be shown.

Can you help?
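As a sketch of the comparison being asked for (field names are taken from the question, but the time window and the exact status logic are assumptions to be adjusted):

```
earliest=-60d (satisfaction_date=0 OR close_date=0)
| stats earliest(status) as old_status latest(status) as new_status by proposal_id
| where old_status=8 AND new_status!=8
```

Grouping by proposal_id and comparing the oldest and newest status per proposal surfaces the records whose status has changed away from 8.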
Hello, I am facing an issue while using SingleView with JS. In Splunk I use JS to create a few single-value views; the ones with only one trellis work, but the ones with multiple trellises in a panel do not display the panels at all. They are either queued or just waiting for data. Could anyone please help?
Hi, I was looking for the datasets mentioned in DSDL, but I don't find them in the container. Is there a place to download the sample data for the notebooks so I can run the DSDL examples? The docs mention that the data is available in the data path in the Jupyter notebook, but I don't find any sample data there. If anyone has the data, please let me know. Thanks.
Hi, I have a question regarding the 0 MB license. As stated in another thread here, it is mandatory in segmented networks where a DS cannot reach the LM. This is the case for my customer in a critical infrastructure environment. The customer has several zones, running multiple DS/HF (filtering) instances inside different zones. The customer does not want to connect all of the DS/HF instances to the LM, but data is forwarded to a central indexer cluster.

Lately we saw that Splunk does not allow installing the same 0 MB license on multiple servers. Does this mean I have to request a separate 0 MB license for each of the DS/HFs? The error reads:

Peer xxx has the same license installed as peer yyy. Peer xxx is using license master self, and peer yyy is using license master self. Please fix this issue in 72 hours, otherwise peer will be disabled.

What is your opinion on this; how can we fix the issue? BR Markus
Hello Splunkers, when a user clicks an app icon, instead of going to a default dashboard, I would like to show a page with a list of the available dashboards (with links to open them) and information about the app and/or the dashboards. I have searched around and have not found anything. I would like something similar to an "index.html" landing page. Thanks for a great source of info on Splunk. eholz1
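One low-tech approach (a sketch; the dashboard names and descriptions are placeholders) is to build a simple Simple XML dashboard of links and make it the app's default view:

```xml
<dashboard>
  <label>App Landing Page</label>
  <row>
    <panel>
      <html>
        <h2>Available dashboards</h2>
        <ul>
          <li><a href="dashboard_one">Dashboard One - short description</a></li>
          <li><a href="dashboard_two">Dashboard Two - short description</a></li>
        </ul>
      </html>
    </panel>
  </row>
</dashboard>
```

Saved as, say, landing_page.xml, it can then be marked as the default view in the app's navigation file (default/data/ui/nav/default.xml) with default="true" on that view entry.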
I keep getting a "Client is not authenticated" error. I wondered if an error was occurring internally, and I confirmed the following authentication error while searching:

startup:116 - Unable to read in product version information; isSessionKeyDefined=False error=[HTTP 401] Client is not authenticated Host:

It seems to be occurring on both IDX and SH, and the error is confirmed to have occurred on 11/12. Is there any way to check which app is causing it?
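As a starting point (a sketch; which grouping fields discriminate best will depend on the environment), the internal logs can be grouped to see where the 401s originate:

```
index=_internal "Client is not authenticated"
| stats count by host, sourcetype, source, component
| sort - count
```

The source field in particular often points at the script or app directory emitting the error.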
Hi, as you can see, I use a relative time in my search in order to filter events on today between 7h and 19h:

earliest=@d+7h latest=@d+19h

Now I would like to link this relative time to my timepicker so I can change the period, for example to display events over the last 7 days between 7h and 19h, or over the last 24h between 7h and 19h. Is it possible to do that? Thanks.

<form>
  <label>CAP</label>
  <fieldset submitButton="false">
    <input type="time" token="field1">
      <label></label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <chart>
        <search>
          <query>`index_mes` sourcetype=web_request earliest=@d+7h latest=@d+19h</query>
        </search>
      </chart>
    </panel>
  </row>
</form>
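One sketch of decoupling the hour window from the date range (untested; note that date_hour reflects the event's time in the indexer's local time zone, which may not match expectations everywhere):

```
`index_mes` sourcetype=web_request date_hour>=7 date_hour<19
```

With earliest/latest removed from the query, the timepicker token's range applies on its own, while the date_hour filter keeps only events between 7h and 19h on each day of the selected period.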
Hello, we tried to enable TLS validation with Splunk 9.0.2 as described in the Splunk documentation. Unfortunately, this caused our distributed Splunk system, consisting of an indexer cluster and a search head cluster, to stop working; specifically, searches could no longer be performed. We discovered the following messages in the search head log:

11-06-2022 23:59:59.839 +0100 ERROR DistributedPeerManagerHeartbeat [95344 DistributedPeerMonitorThread] - Send failure while pushing public key to search peer = https://10.10.10.10:8089 , error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed - please check the output of the `openssl verify` command for the certificates involved; note that if certificate verification is enabled (requireClientCert or sslVerifyServerCert set to "true"), the CA certificate and the server certificate should not have the same Common Name.

This makes us wonder, because the TLS connection should actually be established over the FQDN and not over an IP address. When the connection is made over the IP, it is logical that the TLS name check against the host name fails, since only the FQDN and no IP address is stored in the TLS certificate. We therefore wondered why the search head addresses our index peer 10.10.10.10 via the IP and not the FQDN. The Splunk documentation states that the search head gets the list of index peers from the cluster manager. However, on the cluster manager our index peers report by name, at least that is what the output suggests:

>> splunk/bin/splunk show cluster-status
WARNING: Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details.
Replication factor met
Search factor met
All data is searchable
Indexing Ready YES
HA Mode: Disabled

 indexpeer01.bla.fasel 132BC25B-7774-40D9-AAED-22F9795C8E3F site2
   Searchable YES
   Status Up
   Bucket Count=2412
 indexpeer02.bla.fasel 9B5E23F6-3D53-4AAF-805F-DEDF3ACF9D87 site2
   Searchable YES
   Status Up
   Bucket Count=2467
 indexpeer03.bla.fasel CCBC4D24-025E-45FB-A68D-9C1A14219D3F site1
   Searchable YES
   Status Up
   Bucket Count=2384
 indexpeer04.bla.fasel D649A1C7-9E86-4E48-9B47-144C70029C15 site1
   Searchable YES
   Status Up
   Bucket Count=2475

On our search heads, the search peers are listed by IP address in the PeerURI attribute instead of by FQDN. How can we change it to the FQDN? Do other users have this problem as well? Or does TLS validation work even though IP addresses are listed there? Thanks!
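If it helps the discussion: one setting that may control which address a clustered peer advertises to search heads is register_search_address in server.conf on each peer. This is a sketch from memory and should be verified against the server.conf spec for your version:

```ini
# server.conf on each index peer (hostname is illustrative)
[clustering]
register_search_address = indexpeer01.bla.fasel:8089
```

If the peer advertises an FQDN here, the cluster manager can hand that FQDN to the search heads instead of the peer's IP, which would let the TLS hostname check succeed.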
Hi team, we are planning to monitor Microsoft 365 and Microsoft Teams using AppDynamics. Could you please let us know how to set up the monitoring? Thanks, Kamal Rath
Hi all, I would like to include the start and end date of my search in the email subject, for example: 'The results from 2022-11-01 to 2022-11-11'. I tried the email tokens $job.earliestTime$ and $job.latestTime$, but they also include the time of day, which clutters the subject. Is there any way to retrieve just the dates? Any help much appreciated. Cheers, Tom
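One workaround sketch (untested; it assumes the alert's search can be extended and that a $result.*$ token is acceptable in the subject) is to compute a pre-formatted string inside the search itself:

```
... | addinfo
| eval date_range=strftime(info_min_time,"%Y-%m-%d")." to ".strftime(info_max_time,"%Y-%m-%d")
```

The subject line could then reference $result.date_range$, which takes its value from the first result row.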