All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi, can anyone please help me with this requirement? I need to build an overview dashboard that displays how many change requests, incidents, and tasks are submitted and completed each day. FYI: these changes, incidents, and tasks come from ServiceNow. I work on ServiceNow but want to expand my knowledge of Splunk. Regards, Suman P.
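A minimal SPL sketch for this kind of daily overview, assuming the Splunk Add-on for ServiceNow is already collecting the data; the index name, the sourcetype names, and the state values used to decide "completed" are assumptions to adjust to your environment:

index=snow sourcetype IN ("snow:change_request", "snow:incident", "snow:sc_task")
| eval record_type=case(sourcetype=="snow:change_request", "Change Request",
                        sourcetype=="snow:incident", "Incident",
                        sourcetype=="snow:sc_task", "Task")
| eval outcome=if(state=="Closed" OR state=="Closed Complete", "completed", "submitted")
| eval series=record_type." - ".outcome
| timechart span=1d count by series

One timechart panel like this gives submitted versus completed counts per day and per record type; alternatively, run one panel per record type and split only by outcome.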
Hi, I need help with writing the [http] stanza in inputs.conf for HEC token configuration. Please assist. Thank you.
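A minimal sketch of what that typically looks like, assuming the token value, index, and sourcetype below are placeholders to replace with your own (the stanzas usually live in the splunk_httpinput app or your own app's local/inputs.conf):

[http]
disabled = 0
port = 8088

[http://my_hec_input]
token = 11111111-2222-3333-4444-555555555555
index = main
sourcetype = my:sourcetype
disabled = 0

The [http] stanza enables the HTTP Event Collector endpoint itself, and each [http://<input_name>] stanza defines one token and its default index/sourcetype.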
Hello Splunk Lovers! I have the date format 202211131614220000 and I want to convert it into a format that is readable for Splunk. I understand I should use strptime and strftime, but I am having some problems. Please give me a hint.
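A minimal sketch, assuming the first 14 digits are %Y%m%d%H%M%S and the trailing four digits are sub-seconds, and that your_field is a placeholder for the field holding the value:

| eval epoch=strptime(substr(your_field, 1, 14), "%Y%m%d%H%M%S")
| eval readable=strftime(epoch, "%Y-%m-%d %H:%M:%S")

strptime turns the string into epoch seconds, and strftime formats that epoch into whatever display format you need.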
I'm trying to get the following into a table with a count of the successful attempts. I have tried a few ways but am still lost. Below are my three attempts:

Ex1:
(index="sfdc" sourcetype="sfdc:loginhistory" eventtype="sfdc_login_history" LoginType="SAML Sfdc Initiated SSO" app="sfdc" action=success) OR (index="microsoft" (sourcetype="azure:aad:signin" eventtype="azure_aad_signin" app="windows:sign:in" action=success) OR (sourcetype="azure:aad:user" jobTitle!=null))
| eval login=case((sourcetype=="azure:aad:signin" AND eventtype=="azure_aad_signin"), "windows", (sourcetype=="sfdc:loginhistory" AND app="sfdc"), "salesforce")
| table displayName mail jobTitle officeLocation eventtype

Ex2:
index=microsoft (sourcetype="azure:aad:user" givenName="***" surname="***" jobTitle!="null" officeLocation!="null") OR (sourcetype="azure:aad:signin" eventtype="azure_aad_signin" app="windows:sign:in" action=success)
| stats count by displayName mail jobTitle officeLocation
| rename displayName AS "Display Name" mail AS Email department AS Department jobTitle AS "Job Title" officeLocation AS Branch
| fields - count
| sort + Display Name

Ex3:
index=microsoft (sourcetype="azure:aad:signin" eventtype="azure_aad_signin" app="windows:sign:in" action=success) OR (sourcetype="azure:aad:user" givenName="***" surname="***" jobTitle!="null" officeLocation!="null")
| eval joiner=if(sourcetype="azure:aad:signin", action, displayName)
| stats values(action) as action by displayName mail jobTitle officeLocation
| rename displayName AS "Display Name" mail AS Email department AS Department jobTitle AS "Job Title" officeLocation AS Branch
| sort + Display Name

What I'm trying to achieve is a table listing the following:

Display Name | Email | Job Title | Branch | Windows Logon Attempt | Salesforce Login Attempt

The Windows Logon Attempt and Salesforce Login Attempt columns are where I get stuck; I can't seem to populate them from the indexes and sourcetypes above. Ex2 and Ex3 are without Salesforce (which I can live with). If you can help me with Ex3 that would be great! Any ideas from the Splunkers in here? Thanks, S
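One possible direction, sketched under the assumption that both the Azure AD sign-in events and the Salesforce login events carry a field that can be normalized to the user's email or userPrincipalName; the field names UserName and userPrincipalName below are assumptions to adjust:

index=microsoft sourcetype="azure:aad:user" jobTitle!="null" officeLocation!="null"
| eval user=lower(mail)
| append
    [ search (index="sfdc" sourcetype="sfdc:loginhistory" action=success)
          OR (index="microsoft" sourcetype="azure:aad:signin" action=success)
      | eval user=lower(coalesce(UserName, userPrincipalName))
      | eval login_source=if(sourcetype=="sfdc:loginhistory", "salesforce", "windows") ]
| stats values(displayName) as "Display Name" values(mail) as Email
        values(jobTitle) as "Job Title" values(officeLocation) as Branch
        count(eval(login_source=="windows")) as "Windows Logon Attempt"
        count(eval(login_source=="salesforce")) as "Salesforce Login Attempt"
        by user

The idea is to build one common join key (user), tag each login event with its source, and let a single stats produce both attempt counts per user.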
Hello, greetings! I have data in the following form:

Device | Processor | Status
01     | Splunkd   | Running
01     | Sql       | Stopped
01     | Python    | Stopped
02     | Splunkd   | Stopped

For a device, if the status is Running for at least one processor, it should be considered Online; otherwise Offline. Can you please help me with the query? Thank you in advance. Happy Splunking!
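A minimal sketch, assuming Device and status are already extracted fields in your base search:

... your base search ...
| eval is_running=if(status=="Running", 1, 0)
| stats max(is_running) as any_running by Device
| eval DeviceState=if(any_running==1, "Online", "Offline")
| table Device DeviceState

Each device gets any_running=1 as soon as one of its processors reports Running, which maps directly onto the Online/Offline rule.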
Hi, we have configured an input to connect to a Redshift database from Splunk DB Connect. It was working fine, but suddenly the input shows "invalid database connection" on the inputs page and we are unable to create a new connection. Telnet to the DB connects. What could be the possible causes of this?
Hi, my customer has configured Splunk to get data in from the GitHub audit log stream via an HTTP Event Collector installed on their DMZ server (with port 8088 open to the outside internet), which forwards the data to another Splunk server inside their secure zone with only ports 9997, 8000, and 8088 open. But in order to open port 8088 on the DMZ server, they have to pass their security vulnerability check. The problem is that the check returned various security vulnerabilities, and that prevents them from opening the port. The vulnerabilities returned are as below:

phpPgAdmin redirect.php URL redirection
Spring Boot Actuator endpoint exposed
Missing "Content-Security-Policy" header
Sensitive Authentication (Basic) Information Leakage
Missing HttpOnly attribute in session cookie
Cookies with insecure, incorrect or missing SameSite attributes
Discover compressed directories
Unnecessary HTTP response headers were found in the application
Include sensitive session information in persistent cookies
Discovery of web application source code exposure patterns
Host header injection

Are there any security vulnerability check reports published by Splunk, or some way to resolve these vulnerabilities? Thank you in advance.
In the raw event there is a line that goes Brand\="xyz". What's the rex command I can use to extract this in my search? If possible, I'd like to remove the \ and "" from the extraction itself.
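A minimal sketch, assuming the value is always wrapped in double quotes after the =; matching the backslash with [^=]* sidesteps the usual backslash-escaping headaches in rex, and the captured Brand field then contains only the bare value:

... your base search ...
| rex field=_raw "Brand[^=]*=\"(?<Brand>[^\"]+)\""

If Brand can appear more than once per event, adding max_match=0 to the rex would capture all occurrences as a multivalue field.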
Hi, does anyone have experience with the Website Monitoring app? I am facing an issue with adding inputs, especially if the input (check) requires HTTP authentication. The error is: "401 Splunk cannot authenticate the request. CSRF validation failed"

Request URL: https://xxxx:8443/en-US/splunkd/__raw/services/storage/passwords?output_mode=json
Request Method: POST
Status Code: 401 Splunk cannot authenticate the request. CSRF validation failed.
Remote Address: 10.217.11.78:8443
Referrer Policy: no-referrer

I found out that the request is missing one header parameter, X-Splunk-Form-Key.

Request URL: en-US/splunkd/__raw/services/storage/passwords?output_mode=json

Request headers:
Accept: text/javascript, text/html, application/xml, text/xml, */*
Accept-Encoding: gzip, deflate, br
Accept-Language: en-GB,en-US;q=0.9,en;q=0.8,sk;q=0.7
Connection: keep-alive
Content-Length: 61
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
Cookie: mintjs%3Auuid=02ced06b-7ec3-40e2-8e0b-91040e343001; built_by_tabuilder=yes; ta_builder_current_ta_name=TA-splunk-backup; ta_builder_current_ta_display_name=Splunk%20backup; splunkweb_csrf_token_8443=1505950XXXXXXXXXXX; session_id_8443=6e995a2d52b3a34ade550aafff50XXXXXXXXXXX; splunkd_8443=OUucWpZKKsQtgnedQ98lJ5VRCosW7HAdUh6fia3B^Q^D9HofK5tn11AwTAEiKXhzUL_HPsAiG91v8evtXcVri9MYUmXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX0fCIm84az_izL
Host: xxxx:8443
Origin: https://xxxx:8443
sec-ch-ua: "Not?A_Brand";v="8", "Chromium";v="108", "Google Chrome";v="108"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36
X-Requested-With: XMLHttpRequest

Response headers:
Connection: Keep-Alive
Content-Length: 104
Content-Type: application/json; charset=UTF-8
Date: Thu, 08 Dec 2022 23:06:45 GMT
Server: Splunkd
Vary: Cookie
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN

Any idea why this parameter is missing? Splunk runs on Linux. I have already tried clearing the cache and using an incognito window.
I want to strip certain results by time from my search. I eventually plan to place a dedup command between the first and second searches; however, I am running into issues with the earliest and latest modifiers in the second search. The following three searches work fine and return results throughout the week:

host=x
host=x earliest=-7d
host=x earliest=-7d | search *

But these searches return no results, even when there are events in the listed time frame:

host=x | search host=x earliest=-7d
host=x | search host=x earliest=-4d

Does anyone have any idea why? I would like to strip off search results based on time in the second search, but it doesn't seem to work.
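For what it's worth, a workaround sketch under the assumption that the goal is to drop older events after the dedup rather than re-scope the whole search: compare _time directly instead of using earliest in a downstream search command (some_field is a placeholder):

host=x earliest=-7d
| dedup some_field
| where _time >= relative_time(now(), "-4d")

relative_time(now(), "-4d") returns the epoch time four days ago, so the where clause keeps only events newer than that, which is roughly what earliest=-4d would have done at the start of the pipeline.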
We have a dashboard built in our Splunk Cloud instance using Dashboard Studio, with multiple panels using different data sources. There is a panel that has several calculated timestamps displayed in UTC. When I open it in search, the timestamps are displayed in the user's preferred timezone. All other panels in the dashboard display timestamps in the user's preferred timezone. Any ideas why this might be happening?
My Splunk forwarder inputs.conf looks like this:

[batch://C:\Splunk\MyApi\Local\Api\*.json]
index = myapi_local
move_policy = sinkhole
disabled = 0
source = myapi
sourcetype = Api

My logging files are generated every second. Is that perhaps a little bit too excessive? What's the best practice for using the forwarder? File name examples:

MyAPI_2022-12-08 23-06-28.json
MyAPI_2022-12-08 23-06-29.json
...

Thanks!
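If the files do not need to be deleted after indexing, a monitor input is the usual alternative to batch/sinkhole; a minimal sketch, where the path, index, and sourcetype come from the question above and the ignoreOlderThan value is an assumption:

[monitor://C:\Splunk\MyApi\Local\Api\*.json]
index = myapi_local
disabled = 0
source = myapi
sourcetype = Api
ignoreOlderThan = 7d

With monitor, the forwarder tracks each file's position and the original files stay on disk, while ignoreOlderThan keeps the forwarder from re-scanning an ever-growing backlog of old files.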
Hi, I need to use a number of regression models on some index data. This index data is in an app called "XY". However, so far I have only been able to use the regression modelling inside the Splunk MLTK app. Is it possible to call that functionality from the Splunk MLTK app while inside another app which holds the data you wish to model on? Otherwise, what alternatives do I have in this situation? Many thanks,
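For reference, if the MLTK's search commands are shared globally (the app's permissions set so its knowledge objects are available to all apps), the fit and apply commands can usually be run from another app's search bar; a minimal sketch with placeholder index, field, and model names:

index=xy_data
| fit LinearRegression target_field from feature1 feature2 into my_regression_model

index=xy_data
| apply my_regression_model

fit trains and saves the model, and apply reuses the saved model against new data from whichever app context you are searching in, provided the model object itself is also shared appropriately.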
Hello, I've spent probably 8+ hours now trying to debug how to get SSL certificates working with Splunk Web and finally got it working, so I'm posting this here to hopefully help someone in the future.

Using these links as a reference:
https://docs.splunk.com/Documentation/Splunk/9.0.2/Security/Turnonbasicencryptionusingweb.conf
https://docs.splunk.com/Documentation/Splunk/9.0.2/Security/HowtoprepareyoursignedcertificatesforSplunk

The hardest part was figuring out how to convert the certificates provided by certbot into a format that Splunk recognizes. The following steps ended up working:

1) Create /opt/splunk/etc/system/local/web.conf by copying /opt/splunk/etc/system/default/web.conf and change the line "enableSplunkWebSSL = false" to "enableSplunkWebSSL = true".

2) Install and configure certbot to obtain certificates as needed. They'll be in /etc/letsencrypt/live/$my_domain/ instead of /opt/splunk/etc/auth/splunkweb/ and they're not in a format that Splunk can use.

3) The second link above gives some guidance on how to prepare the certbot certificates in the format Splunk needs, which should be: server certificate, private key, CA certificate. To do this, I'm creating the following certbot post-renewal hook script, /etc/letsencrypt/renewal-hooks/post/splunk.sh:

#!/bin/bash
# change this my_domain variable to match the domain you are using
my_domain=XXXX
src_path=/etc/letsencrypt/live/$my_domain
dst_path=/opt/splunk/etc/auth/splunkweb
cat $src_path/cert.pem $src_path/privkey.pem $src_path/fullchain.pem > $dst_path/cert.pem
cat $src_path/privkey.pem > $dst_path/privkey.pem
chown splunk:splunk $dst_path/cert.pem $dst_path/privkey.pem
chmod 600 $dst_path/cert.pem $dst_path/privkey.pem
/opt/splunk/bin/splunk restart
#EOF

And make the script executable:

chmod +x /etc/letsencrypt/renewal-hooks/post/splunk.sh

4) Since you've already renewed the certificate with certbot, you can run the script directly:

/etc/letsencrypt/renewal-hooks/post/splunk.sh

The script should run automatically whenever certbot renews your certificate.
Would someone know how to find out who is logged into a specific computer? Thanks in advance!
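A minimal sketch, assuming Windows Security event logs are being collected and that the index name, sourcetype, and host value below are placeholders (EventCode 4624 is the successful-logon event):

index=wineventlog sourcetype="WinEventLog:Security" EventCode=4624 host="TARGET-PC"
| stats latest(_time) as last_logon by user
| convert ctime(last_logon)

The field carrying the account name may be user, Account_Name, or similar depending on how the Windows add-on extracts it, so adjust the by clause accordingly.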
Hello! I recently installed security patches on the Linux server where my heavy forwarder is installed, but when I update the JRE path to the new one, the add-on does not work. I updated the JRE path, upgraded the Splunk DB Connect add-on to the latest version, and upgraded my heavy forwarder to version 9.0.2, and it still does not work. Does anyone know what else I can do? Thank you in advance.
What are the benefits of creating health rules on baseline and browser metrics?
Hello Splunkers, I'm looking for a Splunk search to list all indexes that were not used by users in the last 30 days. I've tried the below query against the audit logs, but it's not giving me accurate results. It only returns a few of the indexes that were used, not all of them.

index=_audit sourcetype=audittrail action=search info=granted (NOT TERM(index=_*))
| where match(search, "index\s*(?:=|IN)")
| rex max_match=0 field=search "'search index=(?<used_index>\w+)'"
| stats count by used_index

I'd appreciate it if anyone could share some thoughts on this.
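One possible direction, sketched as an assumption rather than a tested answer: pull the full index list from the REST endpoint and subtract the indexes that appear in audit searches over the last 30 days (subsearch result limits may matter on busy systems):

| rest /services/data/indexes
| fields title
| rename title as index
| search NOT
    [ search index=_audit sourcetype=audittrail action=search info=granted earliest=-30d
      | rex max_match=0 field=search "index\s*=\s*\"?(?<index>[\w-]+)\"?"
      | mvexpand index
      | dedup index
      | fields index ]
| table index

The subsearch returns the set of index names that were actually searched, and the outer NOT leaves only the indexes from the REST list that never appeared, which is the "not used in 30 days" list.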
Splunk Enterprise on-prem, 8.2.6, Linux. I understand Dashboard Studio, at least on 8.2.6, is very limited in the drilldown area, but there must be a way to pass the earliest and latest time from one Dashboard Studio dashboard to another. I have my first-level dashboard passing the earliest/latest in the URL (I have tried with and without the quotes):

https://10.178.1.121:8000/en-US/app/filetracker/ds_begin_clone/edit?earliest="2022-12-07T18:00:00.000"&latest="2022-12-07T18:05:00.000"

I cannot seem to figure out how to set the global time picker on the dashboard. Is this possible? Up to now I have been using Classic because I can generate links in my alerts that take staff directly to the data they need to troubleshoot, but so far only into another Classic dashboard. I can provide a much better explanation of the problem in Dashboard Studio. How can I pass parameters from one Dashboard Studio dashboard and use them in another?