What is the path to the etc folder on Windows or Unix hosts? How do I copy the etc folder for backup purposes? Please show the steps. Is there anything more to back up on Splunk Enterprise or ES for daily/weekly backups?
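A minimal sketch of one common approach on a Unix host, assuming the default install path /opt/splunk (on Windows the default is C:\Program Files\Splunk, where a zip tool can be used instead):

```shell
# Sketch: back up Splunk's etc folder on a Unix host.
# Assumption: SPLUNK_HOME defaults to /opt/splunk; adjust for your install.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunk}
BACKUP_DIR=${BACKUP_DIR:-/tmp/splunk-backups}
STAMP=$(date +%F)
mkdir -p "$BACKUP_DIR"
if [ -d "$SPLUNK_HOME/etc" ]; then
  # -C keeps archive paths relative to the install dir.
  tar -czf "$BACKUP_DIR/splunk-etc-$STAMP.tar.gz" -C "$SPLUNK_HOME" etc
  echo "backed up to $BACKUP_DIR/splunk-etc-$STAMP.tar.gz"
else
  echo "no etc folder at $SPLUNK_HOME - adjust SPLUNK_HOME"
fi
```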
Hello everyone, I'm looking for a way to monitor the status of my forwarders. I found the app "Unified Forwarder Monitoring App for Splunk" on Splunkbase: https://splunkbase.splunk.com/app/3805/#/details When I try to use it on my Splunk system, only the Deployment Summary dashboard works. According to the Details tab on Splunkbase, this app requires manually running Rebuild Asset Table to build/create "ufma_asset_list". I can't get that step to run, so the other dashboards, such as Forwarder Summary and Forwarder Resource Usage, don't show any results. Has anyone installed and used this app before? Please let me know how I can install and use it to monitor my forwarders/deployment server. Thank you in advance!
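Independent of that app, forwarder check-ins can be inspected directly from the _internal index; a rough sketch (the group/field names are an assumption based on splunkd metrics events, and the 15-minute threshold is arbitrary):

```spl
index=_internal sourcetype=splunkd group=tcpin_connections
| stats latest(_time) AS last_seen BY hostname
| eval minutes_silent = round((now() - last_seen) / 60)
| where minutes_silent > 15
```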
When I use Splunk to search for events logged by my application, I get duplicated results with the `stats` and `table` commands. Other posts have advised editing the configuration file. I am on Splunk Cloud and not using a Universal Forwarder. After editing the source type configuration (the only configuration I could find within my admin panel; image included below), I am still getting duplicated results. I am new to managing my own Splunk instance but willing to learn! I chose `log2metrics_json` because I understood it would let me auto-convert values to their types (numbers, booleans, etc.). Thank you,
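As a quick check of whether the duplicates are identical events indexed twice (rather than a search-time artifact), a sketch like this can help; the index and sourcetype names are placeholders:

```spl
index=your_index sourcetype=your_sourcetype
| stats count BY _raw
| where count > 1
```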
Hello, I want to get an active-user count per region for my mobile application. I used the following query: select georegion, distinctcount(cguid) from mobile_snapshots where (geocountry='xxx' and mobileappversion='xxxx' and appkey='xxxxx') But it returned numbers that make no sense; the returned number does not match the Active Users per Region widget in the Sessions tab. I need to know the query behind the Active Users per Region widget in the Sessions tab so I can use it in dashboards. Is the query I used wrong, or am I missing something? Thanks
Hi, I have a CSV file uploaded to /opt/splunk/etc/apps/search/lookups/. My transforms file is in /opt/splunk/etc/apps/search/local with this configuration: [error_codes] filename=error_codes.csv I am trying to run a search that uses the lookup command to map the error code from the event to the error codes in the CSV file, but Splunk keeps saying "Error in 'lookup' command: Could not construct lookup". This is my search query: base_search | lookup error_codes error_code_spl OUTPUT error_code_desc, error_code_sol I even tried to make the lookup file and transforms global by moving them to /opt/splunk/etc/system/lookups and /opt/splunk/etc/system/local/transforms.conf, but nothing works. Am I missing something here? Please help.
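This error is often a sharing/permissions issue rather than a path issue; one thing worth checking is whether the lookup definition is shared beyond the user who created it. A sketch of a metadata stanza that would export it globally (assuming the transforms stanza is named error_codes, as in the question):

```conf
# /opt/splunk/etc/apps/search/metadata/local.meta (sketch)
[transforms/error_codes]
export = system
```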
Greetings! I need your help. I need to implement new UEBA and SOAR solutions that work with Splunk. How can this be implemented, and what are the requirements? I really need your advice. Thank you in advance.
I'm working on an executive dashboard for my applications in AppDynamics Pro. I have everything on the dashboard except a percentage for server HardwareResource/Machine/Availability. I can see the type of chart and value I want to include if you go to Servers -> Select a Server -> View Details. Is there a way to add percent values of server availability as a dashboard widget? Thanks, Shawn
I copied this from a great post: which server do I run this on, and how do I execute this script to back up the configuration files? Thank you.

"You can compress $SPLUNK_HOME/etc/ and keep a backup of the compressed file. A simple shell script can do this, and you can schedule it every 24 hours using cron.

/backup.sh
#!/bin/bash
TIME=`date +%b-%d-%y`                        # This command will read the date.
FILENAME=splunk-configs-backup-$TIME.tar.gz  # The filename including the date.
SRCDIR=/opt/splunk/etc                       # Source backup folder.
DESDIR=/backup                               # Destination of backup file.
tar -cpzf $DESDIR/$FILENAME $SRCDIR

Cron expression: 00 04 * * * /bin/bash /backup.sh
This will run the script every day at 04:00:00."
I am a beginner at Splunk and I am stuck on a case. How can I get the User-Agent from the Request Header field in Splunk? What query should I write for this?
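If the User-Agent is not already an extracted field, a search-time extraction with rex is one approach; a sketch (index name and the exact header layout in your events are assumptions):

```spl
index=your_web_index
| rex field=_raw "User-Agent:\s*(?<user_agent>[^\r\n]+)"
| stats count BY user_agent
```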
Hi. We're running the free trial and want to test the Kubernetes integrations, but it looks like they require a license entitlement that's showing 0. I'm not sure how to trial this, seeing that the Kubernetes features are what really interest us. Can you help?
Hi, We are setting up the "Splunk Add-on for Microsoft Office 365" to pull event logs from our global tenant. But this is a shared tenant with data from multiple domains, and we would like to pull only the data relevant to our domain and exclude the other domains. Please let us know if we can achieve this. https://www.ciraltos.com/use-splunk-to-collect-logs-from-office-365-and-azure-ad/ Thanks, Sai
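If the add-on cannot filter at collection time, one fallback is filtering at search time on the user's UPN domain; a sketch (the index name and domain are placeholders, and the UserId field layout should be verified against your events):

```spl
index=o365 sourcetype="o365:management:activity" UserId="*@example.com"
| stats count BY Operation
```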
We are using version 3.4 of the Splunk DB Connect app and I am trying to connect to a Snowflake DB, which requires a JDBC driver. I am using the latest version of the JDBC driver. The driver is loading correctly, but when I created a connection I saw the error below: Error: Database connection Splunk_Snowflake is invalid. No suitable driver found for jdbc <host>:443/ I am using the connection settings below: db_connection_types.conf [snowflake] serviceClass = com.splunk.dbx2.DefaultDBX2JDBC jdbcDriverClass = net.snowflake.client.jdbc.SnowflakeDriver jdbcUrlFormat = jdbc snowflakecomputing.com:443/ port=443 displayName = Snowflake useConnectionPool = true Can someone help with this issue? Thanks in advance, Splunkers!
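"No suitable driver found" usually means the JDBC URL does not match the scheme the driver registers. Snowflake's documented scheme is jdbc:snowflake://<account>.snowflakecomputing.com:443/, so the stanza may need a URL format along these lines (a sketch; verify against your DB Connect and driver versions):

```conf
# db_connection_types.conf (sketch)
[snowflake]
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = net.snowflake.client.jdbc.SnowflakeDriver
jdbcUrlFormat = jdbc:snowflake://<host>:<port>/
displayName = Snowflake
port = 443
useConnectionPool = true
```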
Hi all, I want to develop an application where users can select an index, time range, and sourcetype, and all the logs found by that search will be collected and sent to a third-party system via their API for processing. The way I was planning to do it is as follows: 1. Use the Splunk SDK to create a custom search command that takes three parameters: index, time range, and sourcetype. 2. The custom search command runs an export search in the background to find all the logs and then dumps them to a file. 3. It then uploads the file to the third party using their upload API. All done at this point. Please note that the log file size can range from 5-10 GB, so is the above a good way to achieve what I want? Kindly share your suggestions if there is a better way; I would ideally like to provide the user with a seamless experience. Thanks and regards!
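Step 2 can also be done without a custom command: Splunk's /services/search/jobs/export REST endpoint streams results as they are found, which matters for 5-10 GB result sets. A sketch (the endpoint is real; host, credentials, and the search itself are placeholders):

```shell
# Sketch: stream a search's results to a file via the export endpoint.
# SPLUNK_MGMT, the credentials, and SEARCH below are placeholders.
SPLUNK_MGMT=${SPLUNK_MGMT:-https://localhost:8089}
SEARCH='search index=main sourcetype=my_sourcetype earliest=-24h'
OUTFILE=$(mktemp)
# --data-urlencode keeps the SPL intact; output_mode=json emits one
# JSON object per result, suitable for streaming to the third party.
curl -sk -u admin:changeme \
  "$SPLUNK_MGMT/services/search/jobs/export" \
  --data-urlencode "search=$SEARCH" \
  -d output_mode=json > "$OUTFILE" || true
echo "results written to $OUTFILE"
```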
I have a problem I'm trying to solve with a subsearch query: I want to monitor when two separate parties generate an event with a common assignmentId, and measure the time in between. The main search is the first party; the subsearch pulls the second party's event if it exists. This search *almost* gets me there: index=my_index sourceServiceId=MyService ruleId=dbdc2f48-1b2c-4869-92ea-10a12f03e3ce | sort -_time | dedup assignmentId | join type=left assignmentId [ search index=my_index sourceServiceId=MyService ruleId=1caf58a6-d4b9-4a1a-a0d7-43b590a374f5 | dedup assignmentId | sort -_time ] However, it is not exactly right, as it pulls the *last* event from the second party. What I need is the *first* event from the second party AFTER the first party's event. So I basically want the subsearch ordered in ascending time order, limited to results that come AFTER the record found in the main search. I have tried variations of the following search, but it returns no data at all from the subsearch. Is it not valid to use a where clause like this in a subsearch? If not, is there another strategy I can use? index=my_index sourceServiceId=MyService ruleId=dbdc2f48-1b2c-4869-92ea-10a12f03e3ce | sort -_time | dedup assignmentId | join type=left left=mainResults right=subResults assignmentId [ search index=my_index sourceServiceId=MyService ruleId=1caf58a6-d4b9-4a1a-a0d7-43b590a374f5 | sort 1 +_time | where subResults._time > mainResults._time ]
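One join-free pattern for "first second-party event after the first-party event" is to pull both rule IDs in a single search, compute the two timestamps, and filter after stats; a sketch using the ruleIds from the question (how you pick the first-party event, min vs. max, is an assumption):

```spl
index=my_index sourceServiceId=MyService
    (ruleId=dbdc2f48-1b2c-4869-92ea-10a12f03e3ce OR ruleId=1caf58a6-d4b9-4a1a-a0d7-43b590a374f5)
| eval t_first  = if(ruleId=="dbdc2f48-1b2c-4869-92ea-10a12f03e3ce", _time, null())
| eval t_second = if(ruleId=="1caf58a6-d4b9-4a1a-a0d7-43b590a374f5", _time, null())
| stats min(t_first) AS first_time values(t_second) AS second_times BY assignmentId
| mvexpand second_times
| eval second_times = tonumber(second_times)
| where second_times > first_time
| stats min(second_times) AS second_time BY assignmentId, first_time
| eval delta_secs = second_time - first_time
```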
I am working on turning firewall data into sparklines with stats. However, when I run the search, each sparkline caps out at 1800 results. This is an issue because some sparklines end up looking like a flat line, which is not very useful. Is there a workaround or fix for this? | tstats count from datamodel=firewall where firewall.signature!="(9999)" by firewall.signature, _time, span=1ms | stats sparkline sum(count) by log.threat_name Sparkline limit @ 1800
Hi, How can I use pan and zoom on my chart? I have this chart: avg(CALCMIPS) over _time span=1mon by Application limit=0 I would like to be able to zoom in to day or week granularity if possible. Thanks and Regards,
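Chart pan-and-zoom can only reveal data points that exist in the results, so one approach is to compute at the finest granularity you might zoom to and let the chart aggregate visually; a sketch assuming the same field as the question, with a daily span:

```spl
... | timechart span=1d avg(CALCMIPS) BY Application limit=0
```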
Hi Everyone, I have created a hyperlink like the one below: <html depends="$show_html$"> <div style="font-weight:bold;font-size:200%;text-align:center;color:red"> If you considered it as Error. Please Contact <a href="your_link" target="_blank"> O2-Site-Reliability-Engineering Team.</a> </div> </html> I want clicking this hyperlink to open an email addressed to O2-Site-Reliability-Engineering <o2-site-reliability-engineering@sg.com>. Can someone guide me on this?
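A mailto: link is the standard way to make a click open the user's mail client; a sketch reusing the anchor from the question (the address comes from the question, the subject is a hypothetical example):

```html
<a href="mailto:o2-site-reliability-engineering@sg.com?subject=Dashboard%20Error">
  O2-Site-Reliability-Engineering Team.</a>
```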
Hi, guys! I have an event table with a field called "COD SERIE CEI". I need to find the "COD SERIE CEI" values that have had no events between now and 2 hours ago. index="raw_arq_cei" Titulo="NCEI Informativas" | eval eventHour=strftime(_time,"%H") | eval eventMin=strftime(_time,"%M") | eval curHour=strftime(now(),"%H") | eval curMin=strftime(now(),"%M") | table Dados.COD_SERIE_CEI | sort _time
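One way to express "seen historically but silent for the last 2 hours" is to look back further and keep only the series whose latest event is older than 2 hours; a sketch (the 30-day lookback window is an assumption):

```spl
index="raw_arq_cei" Titulo="NCEI Informativas" earliest=-30d
| stats max(_time) AS last_seen BY Dados.COD_SERIE_CEI
| where last_seen < relative_time(now(), "-2h")
| convert ctime(last_seen)
```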
We are using Splunk DB Connect 3.4.0 and have set up a database input with a timestamp column as the rising column. I can see that Splunk uses millisecond precision in the rising-column checkpoint for timestamps, but our DBA is complaining that querying the DB with millisecond precision is causing issues on their end. The database is Oracle. I have tried changing the rising column to a DateTime field, but that also has millisecond precision, although in the database I can verify it only goes down to seconds. Is there any way I can change the rising-column precision to seconds, and would that be the right thing to do for rising timestamp/DateTime fields?
As the title states, is it possible to color a table row based on a value in a cell/row? I'm able to do this in regular dashboards, but the beta dashboard is new to me.