All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello, please, I have a problem with a search: it shows inconsistent ingestion. Here is the search I ran: index=compare_items. If I use a time range of 60 minutes, or even 7 days, I see no results. But if I use 30 days, something like a million events are populated. Here is the error message I got from Splunk: "configuration for xyz/123/xxx/ took longer time than expected. This usually indicate problem with underlying storage performance." Can someone help if you have had a similar experience? Thanks
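A first diagnostic step (a sketch; only the index name comes from the post) is to compare event time with index time over the last 30 days. If the recent buckets are empty but a large indexing lag shows up, the data is arriving with old timestamps, which would explain why a 30-day range finds events a 7-day range does not:

```
index=compare_items earliest=-30d
| eval index_lag_sec=_indextime-_time
| timechart span=1d count, avg(index_lag_sec) as avg_lag_sec
```

A consistently large avg_lag_sec (days rather than seconds) points at timestamp extraction or delayed forwarding rather than at storage.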
I cannot manage to install Splunk in my Ubuntu virtual machine.
Hi, I am new to Splunk and looking to use it for analytics in place of Matomo. I have it gathering my logs and I can query them. However, I am trying to understand what benefits I would get from this add-on. Does it enrich the data or provide prebuilt queries/dashboards? Thanks
Hi Community, I have these alerts in my EDR and I want to create a correlation search to surface them in Splunk: "Found alert GnDump.exe was returned as Malware from the Fidelis Sandbox Submission on endpoint HQ0S-IT-NAS.Jmcc2.local" and "Found alert GnScript.exe was returned as Malware from the Fidelis Sandbox Submission on endpoint HQ0S-IT-NAS.Jmcc2.local".
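Since both alerts follow the same sentence pattern, a correlation search can extract the filename, verdict, and endpoint with a single rex (a sketch; the index and sourcetype are placeholders, only the message format comes from the post):

```
index=edr "returned as Malware"
| rex "Found alert (?<file_name>\S+) was returned as (?<verdict>\S+) from the Fidelis Sandbox Submission on endpoint (?<endpoint>\S+)"
| table _time, file_name, verdict, endpoint
```

Saved as a scheduled search with an alert action (or as a correlation search in Enterprise Security), this would fire whenever a matching Fidelis verdict arrives.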
Hi, I'm doing prep work for my upgrade from 8.2.6 to 9.0.1 and I have a couple of apps which are not listed as compatible with 9.0 on Splunkbase. These are: Splunk Datasets Add-on | Splunkbase, and Splunk Secure Gateway - Get started with Splunk Secure Gateway - Splunk Documentation. I note that the Splunk docs for both of these apps indicate that they are built into Splunk. My question is: should I delete these two from the etc/apps folder BEFORE I do the upgrade?
Not sure if anyone is using this script to pull logs from Salesforce ecommerce; hoping to get some input from similar cases. URL: https://github.com/Pier1/sfcc-splunk-connector This script is installed on a server with a UF installed. I know the UF is pushing logs because I have other inputs.conf stanzas pushing logs to Splunk Cloud. In this case, however, sfcc runs off a Python script. That script runs okay on the server, but I'm not sure why the UF isn't forwarding its output into Splunk.
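One common gap here: a UF only forwards what an input stanza tells it to, so running the script by hand is not enough. A sketch of the two usual options in inputs.conf (the script path, interval, sourcetype, and index below are placeholders, not taken from the repo):

```
# If the connector emits events on stdout, run it as a scripted input:
[script:///opt/sfcc-splunk-connector/sfcc.py]
interval = 300
sourcetype = sfcc:logs
index = main
disabled = 0

# If instead the connector writes log files to disk, monitor those files:
[monitor:///var/log/sfcc/*.log]
sourcetype = sfcc:logs
index = main
disabled = 0
```

Which stanza applies depends on whether the script prints events or writes them to files; splunkd.log on the UF will show whether the input was picked up.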
Is the Splunk Add-on for Cisco ESA not working when installed on Splunk Cloud? I get an error message ("Oops. Page Not Found") when I try to open the app.
I need to calculate the count of the good 15-minute intervals, where a good interval has (status code = 200 AND average response time < 300 milliseconds AND 99.99th-percentile response time < 1500 milliseconds), divided by the total count of intervals, times 100. Could someone help? I already have the status code and response time in two separate fields.
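One way to sketch this (field names status_code and response_time are assumptions; adjust to the actual fields): bin into 15-minute buckets, compute the per-bucket aggregates, flag each bucket good or bad, then reduce to a single percentage:

```
index=myapp
| bin _time span=15m
| stats avg(response_time) as avg_rt, perc99.99(response_time) as p9999_rt, count(eval(status_code!=200)) as non_200 by _time
| eval good=if(non_200=0 AND avg_rt<300 AND p9999_rt<1500, 1, 0)
| stats sum(good) as good_intervals, count as total_intervals
| eval pct_good=round(good_intervals/total_intervals*100, 2)
```

This treats "status code = 200" as "no non-200 events in the interval"; if the intent is different (e.g. only the majority status), the non_200 test is the line to change.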
My Query: index=test sourcetype=true AND private AND beta |rex field=_raw "\[private]\s(?<category>\S+\s+\S+\s+\S+)" |dedup category, source |eval category=upper(category) |stats count by category |rename count as count1 |appendcols [search index=test sourcetype=true AND private AND alpha |rex field=_raw "\[private]\s(?<category>\S+\s+\S+\s+\S+)" |dedup category, source |eval category=upper(category) |stats count by category |rename count as count2] |eval Total=(count1-count2)
So when the 2nd query doesn't have any events, I am not getting the Total column. Current output if the 2nd search doesn't have any events:
category  count1
xxxx      5
Desired output:
category  count1  count2  Total
xxxx      5       0       5
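When the appendcols subsearch returns nothing, count2 never exists, so count1-count2 evaluates to null and Total disappears. A sketch of the fix against the query above: fill the missing columns with 0 before computing Total:

```
... | appendcols [search index=test sourcetype=true AND private AND alpha ... | stats count by category | rename count as count2]
| fillnull value=0 count1 count2
| eval Total=count1-count2
```

fillnull creates count2 with value 0 whenever the subsearch contributed no column, which yields the desired "xxxx 5 0 5" row.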
This is the original link. Does anyone know where this has been moved to? http://wiki.splunk.com/Where_do_I_configure_my_Splunk_settings%3F It describes all of the props.conf attributes and which tier they are applicable to.
How do I list multiple sources in a query such as: sourcetype=xml source="/wealthsuite/tti/current/*"?
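Multiple sources can go straight into the base search with the IN operator or an OR group (a sketch; the second path is a placeholder, only the first comes from the question):

```
sourcetype=xml source IN ("/wealthsuite/tti/current/*", "/wealthsuite/other/current/*")
```

The equivalent OR form is sourcetype=xml (source="/wealthsuite/tti/current/*" OR source="/wealthsuite/other/current/*"); both accept wildcards in the base search.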
status=Auto, Manual; car=BMW, Honda, Audi. index=* | stats count(status) as Total by car — is there any way I can get the results as shown in the attached picture?
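The attached picture isn't visible here, but if the goal is one row per car with a column per status value plus a row total, chart with addtotals is one way to sketch it (the index name is a placeholder):

```
index=cars status IN (Auto, Manual) car IN (BMW, Honda, Audi)
| chart count over car by status
| addtotals fieldname=Total
```

chart pivots status into columns (Auto, Manual) per car, and addtotals appends the per-row sum.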
Hello fellow Splunkers, I've recently run into a bit of an issue while working on an automation process. For context, I have already reviewed the following without success: Solved: Re: Generate PDF from View in REST API - Splunk Community, and Can I Export PDF Via Rest? - Splunk Community. In short, when I do not ship the modified XML in my GET request I get the following response: PDF endpoint must be called with one of the following args: 'input-dashboard=<dashboard-id>' or 'input-report=<report-id>' or 'input-dashboard-xml=<dashboard-xml>' — which is more or less expected. However, when I do send the modified XML in my GET request, this is what comes back: I know the endpoint is functioning, as I'm able to manually export the dashboard results via the web interface without issue. However, the manual process ties up half my day and is not scalable moving forward. Any advice from those who have been able to solve this would be greatly appreciated. Thanks in advance.
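For reference, the endpoint that emits that error is /services/pdfgen/render. A hedged sketch of calling it for an existing dashboard (host, credentials, app namespace, and dashboard name below are all placeholders):

```
curl -ks -u admin:changeme -G \
  "https://splunk.example.com:8089/services/pdfgen/render" \
  --data-urlencode "input-dashboard=my_dashboard" \
  --data-urlencode "namespace=search" \
  -o dashboard.pdf
```

If modified XML must be shipped, note that input-dashboard-xml in a GET URL can exceed URL-length limits for large dashboards, so URL-encoding the XML carefully, or sending it in the request body, may be worth trying.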
Upgraded to Splunk 9.0.1 from Splunk 8.2.1. MS-Windows AD Objects showed the dashboard error, so I upgraded to MS-Windows AD Objects 4.1.1, which claims to be compatible with 9.0. But even after upgrading, the same error persists. Does MS-Windows AD Objects use jQuery 3.5? Does 9.0.1 not work with it, or am I spinning my wheels trying to make this thing work? I have the same issue with other apps but figured I would start here. I looked through the boards and found material on jQuery 3.5, but nothing specific to AD Objects 4.1.1. Most things in the app seem to work; it just always throws the error. Also tried the "clone the dashboard in the new studio" option; no joy there. Any help is appreciated.
I want to use the map command to add up the total event times for each day during the interval from 6am-6pm. For each day:
- the "earliest" token in my map command = start of each day + 6 hours (Start1)
- the "latest" token in my map command = start of each day + 18 hours (End1)
Using the tokens, I use the map command to search over my set Splunk search timeframe. In my map command:
1. For each day, I subtract each event's end time from its start time = diff.
2. To get the total event time for each day, I sum the time differences (sum(diff)) to get total_time_of_events.
3. Next I take info_max_time - info_min_time for each search (one per earliest/latest token pair) to get the time span of each 12-hour day.
4. Finally I divide total_time_of_events by the search time span and multiply by 100 to get the percentage of each day covered by events being pulled into Splunk.
Yet it is not working: my search returns "No results found". May I please have help? What am I doing wrong? CODE:
|table BLANK hour date_mday date_month date_year
|bin span=1d _time
|eval Month=case(date_month="august","8")
|eval Start=Month+"/"+date_mday+"/"+date_year
|eval start=strptime(Start,"%m/%d/%Y")
|eval Start1=start+21600
|eval End1=start+64800
|map search="search (index...) earliest=$Start1$ latest=$End1$ |bin span=1d _time |dedup _time |eval timeend=strptime(DateEnd,\"%m/%d/%Y %I:%M:%S %p\") |eval timestart=strptime(DateStart,\"%m/%d/%Y %I:%M:%S %p\") |eval diff=round(timeend-timestart) |stats sum(diff) as total_time_of_events by BLANK |addinfo |eval IntTime=info_max_time-info_min_time |eval prcntUsed=round((total_time_of_events/IntTime)*100) |rename prcntUsed as Percent_of_event_time"
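As an alternative worth considering: map runs one subsearch per input row and silently returns nothing when a token fails to parse, which makes it hard to debug. The 6am-6pm window per day can be computed in a single search without map (a sketch using only the fields mentioned above; the index and the BLANK grouping field remain placeholders):

```
index=...
| eval hour=tonumber(strftime(_time, "%H"))
| where hour>=6 AND hour<18
| eval timestart=strptime(DateStart, "%m/%d/%Y %I:%M:%S %p")
| eval timeend=strptime(DateEnd, "%m/%d/%Y %I:%M:%S %p")
| eval diff=round(timeend-timestart)
| bin span=1d _time
| stats sum(diff) as total_time_of_events by _time, BLANK
| eval Percent_of_event_time=round(total_time_of_events/(12*3600)*100)
```

Here the 12-hour denominator (12*3600 seconds) replaces the addinfo/IntTime calculation, since each day's window is a fixed 12 hours.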
Please let me know if anyone has experience bringing Guardicore data in by any means other than a Heavy Forwarder. Thank you!
Hello, data from CyberArk comes through the syslog server, and per the Splunk docs (https://docs.splunk.com/Documentation/AddOns/released/CyberArk/Installation) the CyberArk TA needs to be installed on the search head (or search head cluster). I installed this TA directly on the syslog server, but it is not working as expected. How should I configure syslog, the SHC, and CyberArk? Any help would be highly appreciated. Thank you!
I was searching for a simple way to convert all kinds of MAC-address notations to a "more" standard format. I found various solutions, but not a single one-liner that I liked, so I made one. This will convert any MAC format to XX:XX:XX:XX:XX:XX. (The output can be modified to the format of your choice.)   | rex mode=sed field=mac "s/[^0-9a-fA-F]//g s/(..)(..)(..)(..)(..)(..)/\1:\2:\3:\4:\5:\6/g y/abcdef/ABCDEF/"   s/[^0-9a-fA-F]//g removes every character that is not a hex digit (0-9, a-f, A-F), so all separators are gone. s/(..)(..)(..)(..)(..)(..)/\1:\2:\3:\4:\5:\6/g sets the output format to xx:xx:xx:xx:xx:xx. y/abcdef/ABCDEF/ changes to upper case.
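For anyone who needs the same normalization outside SPL, the three sed steps translate into a short Python sketch (the function name is mine, not from any library):

```python
import re

def normalize_mac(mac: str) -> str:
    """Normalize any common MAC notation to XX:XX:XX:XX:XX:XX."""
    # Same as s/[^0-9a-fA-F]//g: keep only hex digits, dropping all separators.
    digits = re.sub(r"[^0-9a-fA-F]", "", mac)
    if len(digits) != 12:
        raise ValueError(f"not a MAC address: {mac!r}")
    # Same as the regroup step: six two-character octets joined by ':',
    # then upper-cased like y/abcdef/ABCDEF/.
    return ":".join(digits[i:i + 2] for i in range(0, 12, 2)).upper()

print(normalize_mac("00-1a-2b-3c-4d-5e"))  # → 00:1A:2B:3C:4D:5E
print(normalize_mac("001a.2b3c.4d5e"))     # → 00:1A:2B:3C:4D:5E
```

The length check is the one behavior the sed version lacks: sed would silently emit garbage for a non-MAC input, while this raises instead.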
I have 2 roles, A and B; they both inherit only from the "user" role. If they create a dashboard in Search, they cannot edit the permissions to share the dashboard at the "App" level so the other role, or users in the same role, can see their dashboard. By default it is created and remains "private". If I add all capabilities under the "power" role (that aren't in "user") to roles A and B, they still cannot edit permissions on their own dashboard to share it at the "app" level so the dashboard can be shared in the Search app. If I add "power" to the inheritance of roles A and B, then they can edit the permissions. What am I missing?
How do I get a count of Low, Medium, High, Critical in a Splunk Search?   This is the current search I am using: `get_tenable_index` sourcetype="tenable:sc:vuln" severity=Low OR severity=Medium OR severity=High OR severity=Critical | dedup plugin_id, port, protocol, sc_uniqueness, source | eval key=plugin_id."_".port."_".protocol."_".sc_uniqueness."_".source | table severity, synopsis, solution, port, protocol, ip | outputlookup append=true key_field=key sc_vuln_data_lookup
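If the goal is simply one count per severity after the dedup, replacing the table/outputlookup tail of the search above with a stats gives it directly (a sketch built from that same search):

```
`get_tenable_index` sourcetype="tenable:sc:vuln" severity IN (Low, Medium, High, Critical)
| dedup plugin_id, port, protocol, sc_uniqueness, source
| stats count by severity
```

Keeping the dedup before the stats means each unique vulnerability (by plugin/port/protocol/source key) is counted once per severity rather than once per event.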