All Topics



Hi Splunkers, I have a question about a JSON file I'm trying to parse. I want to remove the first part of it, up to {"kind" (a sample is below). I tried using FIELD_HEADER_REGEX in props.conf, which I think is supposed to do that. So far I've tried and failed with the following:

FIELD_HEADER_REGEX={"activities":\s\[(.)
FIELD_HEADER_REGEX={"activities":\s\[
FIELD_HEADER_REGEX={"activities":
FIELD_HEADER_REGEX=\{\"activities\"\:

Some of the above work on regexr.com with the sample data.

{"activities": [{"kind": "admin#reports#activity", "id": {"time": "2022-07-18T14:04:19.866Z", "uniqueQualifier": "-2451221827967636314", "applicationName": "redacted", "customerId": "redacted"}, "etag": "\"dng2uCItaXPqmMj2MG4RUqVkRjnE_4kf0VvQ0_WkiTg/6j3Reg7FneLgLDfjE-lZuZUOrdc\"", "actor": {"callerType": "USER", "email": "redacted", "profileId": "redacted"}, "ipAddress": "redacted", "events": [{"type": "SECURITY_INVESTIGATION", "name": "SECURITY_INVESTIGATION_QUERY", "parameters": [{"name": "INVESTIGATION_DATA_SOURCE", "value": "USER LOG EVENTS"}, {"name": "INVESTIGATION_QUERY", "value": "(empty)"}]}]},

Any help is appreciated, thank you!
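
One thing worth checking (a sketch, not verified against this setup): FIELD_HEADER_REGEX only applies to structured-data sourcetypes that use INDEXED_EXTRACTIONS. If these are ordinary JSON events, a SEDCMD in props.conf may be a better fit for stripping the prefix; the stanza name here is a placeholder:

[your_json_sourcetype]
SEDCMD-strip_activities_prefix = s/^\{"activities":\s*\[//
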
Hi all, I have a use case where I need to check for duplicate JIRA content. Basically, we are ingesting our JIRA issues into SOAR as containers. In a particular playbook, I would like to query the other SOAR containers with similar labels and, based on that information, do some processing. Is there any way I can achieve this in SOAR?
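
One possible approach (a sketch; the base URL, token, and label value are placeholders) is to query the SOAR REST API's container endpoint with a filter parameter from within the playbook:

import requests

# Sketch: list other containers that carry a given label via the SOAR REST API.
resp = requests.get(
    "https://soar.example.com/rest/container",
    params={"_filter_label": '"jira"', "page_size": 50},
    headers={"ph-auth-token": "YOUR_AUTOMATION_TOKEN"},
    verify=False,  # lab setting; use proper certificates in production
)
for container in resp.json().get("data", []):
    print(container["id"], container["name"])
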
Hello Experts, I am stuck with a timechart percentage query: I want to sort by a field's count rather than the default alphabetical order. There are two queries; it would be best if I could get help or a workaround for either one.

Query 1:

index=xyz catcode="*" (prodid="1") (prodcat="*") success="*"
| eval TheError=if(success="false" AND Error like "%%",count,0)
| timechart span="15m" eval(round(sum(TheError)*100/sum(count),2)) by catcode useother=f

In the query above I want an option to sort by catcode's count, not the default alphabetical order.

OR

Query 2:

index=xyz (prodid="1") (prodcat=*) (catcode=*) success=*
| timechart span=1w sum(count) by catcode limit=10 useother=f usenull=f
| untable _time catcode count
| eventstats sum(count) as Total by _time
| eval Fail_Percent=round(count*100/Total,2)
| table _time, catcode, Fail_Percent
| xyseries _time catcode Fail_Percent
| sort -catcode

In the query above everything is fine except "eventstats sum(count) as Total by _time", which counts all events. I want the Total computed by catcode and then to calculate the percentage from that. Can you help please? Thanks in advance, Nishant
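
For Query 2, one way to compute Total per catcode instead of per time bucket (a sketch, assuming "Total" should be each catcode's own sum across the selected window):

index=xyz (prodid="1") (prodcat=*) (catcode=*) success=*
| timechart span=1w sum(count) by catcode limit=10 useother=f usenull=f
| untable _time catcode count
| eventstats sum(count) as Total by catcode
| eval Fail_Percent=round(count*100/Total,2)
| xyseries _time catcode Fail_Percent
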
Hi, I have a table after using stats: | stats values(durationSum) as duration by Fauf Station. How can I convert it to a table with only one row per Fauf, in a format like: Fauf, duration_Station1, duration_Station2, duration_Station7, duration_Station10. Thanks in advance for your help!
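
A sketch of one way to pivot this (the eval builds the duration_StationN column names; adjust if durationSum is multivalued):

| stats values(durationSum) as duration by Fauf Station
| eval Station="duration_".Station
| xyseries Fauf Station duration
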
    index="main" source="all_digikala1.csv" | table title price | map search="search index=main source=all_sites1.csv | eval title_m=$title$,price_m=$price$ | table title_m price_m title price stor... See more...
    index="main" source="all_digikala1.csv" | table title price | map search="search index=main source=all_sites1.csv | eval title_m=$title$,price_m=$price$ | table title_m price_m title price store " maxsearches=99999999 | similarity textfield=title_m comparefield=title algo=MASI limit=200000000 | sort limit=0 -similarity | where similarity > 0.2 | table title_m title similarity store        I run the above code in Splunk   but it returns the following error Error in 'similarity' command: External search command exited unexpectedly with non-zero error code 9   The map part works correctly and returns a result of about 15 million, but the similarity part has a problem with this number, because when I reduce the output number of the map below 14 million, the code works correctly and the result is correct. Of course, this has nothing to do with the similarity command, because for example, when I use the jellyfisher command instead of similarity, the same error occurs again.   similarity command is related to "nlp text analytics", which has been added to Splunk
Hi Splunkers, I'm working on a dashboard panel where I have to show the average count of events per user. This should be dynamic, meaning it should compute the average for whatever time range I select.

Example:

users    count
a        9
b        3
c        5
d        2

If I select a 7-day time frame and user "a" from the inputs, the panel should show the average event count over those 7 days for user a. Please help me achieve this. TIA.
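
A sketch of one approach, assuming a per-day average over whatever range the time picker selects (index, field, and token names are placeholders):

index=your_index user=$user_token$
| bin _time span=1d
| stats count by _time, user
| stats avg(count) as avg_events_per_day by user
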
Hello everyone, I have a CSV file which shows me the power status of each server, i.e., whether the server is powered on or off. I want to make a table with "powered on" as one row and "powered off" as another row, showing the total number of powered-on and powered-off servers as counts.
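
A minimal sketch, assuming the CSV was ingested with a field such as power_status holding the on/off value; this yields one row per state with its count:

index=your_index source="your_power_status.csv"
| stats count by power_status
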
Hi Team, I have time values in the two formats below and I want to convert them to minutes. How can I do this?

Format 1:
1 Hour
10 Hours 47 Minutes
1 Day 5 Hours 15 Minutes
45 Minutes

Format 2:
00:00:00
00:09:00
22:30:00
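
One sketch that handles both formats in a single field (called duration here; rename to match your data). The first three rex calls pick apart Format 1, the fourth matches Format 2, and coalesce takes whichever matched:

| rex field=duration "(?<d>\d+)\s+Day"
| rex field=duration "(?<h>\d+)\s+Hour"
| rex field=duration "(?<m>\d+)\s+Minute"
| rex field=duration "^(?<hh>\d{1,2}):(?<mm>\d{2}):(?<ss>\d{2})$"
| eval minutes=coalesce(hh*60 + mm + ss/60, coalesce(d,0)*1440 + coalesce(h,0)*60 + coalesce(m,0))
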
Hello Members, I have a basic question: I am not sure how to get data into Splunk, into a custom index, with a source type, and then extract fields. I have the add-on installed for Cisco network devices, but I'm not sure it is the correct app for my case. I have a remote syslog server (running rsyslog) that builds log files for Cisco switches and routers. I have a universal forwarder installed on the syslog server; it forwards data to Splunk if I configure it correctly.

I have tried configuring the Splunk receiver two ways. The first uses the "Forwarding and receiving" option in the "Data" area. This works, but it only shows data from the host sending the log info and uses only one port (I am using 9997). I have not seen how to set a data source or source type for the incoming data.

The second way seems to be the "Data inputs" part of the "Data" area. This seems not to be possible, as the data is coming from a universal forwarder, not a Splunk Enterprise instance configured as a forwarder.

How can I assign a source type and index to the data that comes in from the host, via the receiver configured on port 9997? Sorry for such a confusing question. Regards, eholz1
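
The usual place to assign the index and sourcetype is inputs.conf on the universal forwarder itself, per monitored file or directory, rather than on the receiving indexer. A sketch (the path, index, and sourcetype are examples; the Cisco add-on's documentation lists the sourcetypes it expects):

# inputs.conf on the universal forwarder
[monitor:///var/log/remote/cisco]
disabled = false
recursive = true
index = network
sourcetype = cisco:ios
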
Is there an SPL query to find the last date and time UFs phoned home to a specific deployment server? We have many deployment servers in our company.
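
One approach (a sketch; replace the placeholder with your deployment server's name, and verify the field names on your version) is the deployment server's REST endpoint:

| rest /services/deployment/server/clients splunk_server=your_deployment_server
| eval last_phone_home=strftime(lastPhoneHomeTime, "%Y-%m-%d %H:%M:%S")
| table hostname, clientName, ip, last_phone_home
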
I have Splunk installed on a VM in VMware. The block size is currently configured for 4k blocks. We are considering moving the storage to different drives, and I want to make sure the drives we are moving to meet the 800 IOPS Splunk requires. I will be following the documentation in this post to test the storage using FIO (https://community.splunk.com/t5/Monitoring-Splunk/Calculating-IOPS-using-FIO-testing/m-p/455055). Will the 4k block size affect the results of the IOPS test? Is there a recommended block size to configure the drive with for Splunk?
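
For reference, a fio invocation along the lines of the linked post (a sketch; adjust the target directory, size, and job counts to your environment) makes it easy to compare results at different block sizes by varying --bs:

fio --name=splunk-iops --directory=/opt/splunk/var/lib/splunk \
    --rw=randrw --rwmixread=60 --bs=4k --direct=1 --size=1g \
    --numjobs=4 --iodepth=32 --runtime=60 --time_based --group_reporting
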
I have data that looks like the following:

Week         Employee   Project#
6/3/2022     A          001
6/3/2022     A          002
6/10/2022    A          002
6/10/2022    B          002
6/17/2022    A          003
6/17/2022    B          001
6/17/2022    B          002
6/24/2022    B          001

I would like a count of the total number of distinct weeks that employees appear in the data, regardless of how many projects they have entries for. So, for the above, the count should be 6, as below:

6/3/2022 > Employee A > Count=1
6/10/2022 > Employees A and B > Count=2
6/17/2022 > Employees A and B > Count=2
6/24/2022 > Employee B > Count=1

Is there some way I can use multiple fields in distinct count to accomplish this?
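
One sketch that reproduces the per-week counts and the overall total of 6 (dc counts each employee once per week regardless of projects, and addcoltotals appends the grand total):

| stats dc(Employee) as Count by Week
| addcoltotals Count

If only the total is needed: | stats count by Week, Employee | stats count as Total
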
Hey all, I'm trying to pull the syslog of our Meraki MX into our on-premises Splunk Enterprise in order to monitor internal port scanning. Right now I have the syslogs coming in via Data inputs > UDP (514). I see all the data being pulled in correctly; however, when I search internal traffic communication it shows everything going to the broadcast IP. I'm not sure if I should be using a different method, but I would appreciate some guidance on best practices for monitoring internal traffic. Thanks!
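
For reference, a minimal UDP input sketch (index and sourcetype are examples); connection_host = ip records each sending device's IP as the host field:

# inputs.conf sketch
[udp://514]
connection_host = ip
index = network
sourcetype = meraki:syslog
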
How do I create a 14-day search restricted to a specific time range (02:00-06:00) each day?
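
A sketch of one way, deriving the hour from _time (the index name is a placeholder; adjust the boundaries if 06:00 should be inclusive):

index=your_index earliest=-14d@d latest=@d
| eval hour=tonumber(strftime(_time, "%H"))
| where hour >= 2 AND hour < 6
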
I have a lookup table with only one field, named host. The table contains a list of hostnames. I'm trying to find a way to get a count of events by host, using this lookup table as the input (i.e., the hosts I want a count for). I've tried a variety of approaches. For example:

|inputlookup file.csv | stats count by host

Every host returns a count of 1.

|inputlookup file.csv | join type=left host [|tstats count by host]

About a dozen hosts return counts; the rest return null values.

Complicating this problem seems to be case. If I force all the hosts to upper or lower case, I get different results, but neither returns a complete result set. That seems odd, given that field values aren't supposed to be case sensitive. I've tried normalizing case with eval as well as in the lookup table itself, to no avail. We're stumped. What is the best approach to use a lookup table of hostnames to get an event count by host?
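
One common pattern (a sketch): drive the search from the lookup, normalize case before counting, then append the lookup back so hosts with zero events still appear:

index=* [ | inputlookup file.csv | fields host ]
| eval host=lower(host)
| stats count by host
| append [ | inputlookup file.csv | eval host=lower(host), count=0 | fields host count ]
| stats sum(count) as count by host

The subsearch match itself is case-insensitive; the lower() calls keep stats from splitting "HostA" and "hosta" into separate rows.
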
I have the following row in a CSV file that I am ingesting into a Splunk index:

"field1","field2","field3\","field4"

Excel and the default Python CSV reader both correctly parse that as 4 separate fields. Splunk does not. It seems to be treating the backslash as an escape character and interpreting field3","field4 as a single mangled field. It is my understanding that the standard escape for double quotes inside a quoted CSV field is another double quote, per RFC 4180: "If double-quotes are used to enclose fields, then a double-quote appearing inside a field must be escaped by preceding it with another double quote." Why is Splunk treating the backslash as an escape character, and is there any way to change that configuration via props.conf or any other way? I have set:

INDEXED_EXTRACTIONS = csv
KV_MODE = none

for this sourcetype in props.conf, and it works fine for rows without backslashes in them.
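
I'm not aware of a props.conf setting that changes the escape character for INDEXED_EXTRACTIONS. If none exists, one workaround sketch is to pre-process the file before ingestion so literal backslashes are doubled and quotes follow RFC 4180 doubling (filenames are examples):

import csv

# Sketch: rewrite a CSV so literal backslashes are doubled, which stops a
# downstream parser that treats backslash as an escape from eating the
# field's closing quote.
with open("input.csv", newline="") as src, open("clean.csv", "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst, quoting=csv.QUOTE_ALL, doublequote=True)
    for row in reader:
        writer.writerow([cell.replace("\\", "\\\\") for cell in row])
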
Can the DGA App for Splunk be installed in Splunk Cloud?

I've imported a .csv that has many fields, but the only one I care about has multiple values in it.

pluginText:
<plugin_output>
Computer Manufacturer : VMware, Inc.
Computer Model : VMware Virtual Platform
Computer SerialNumber : This is what I REALLY need
Computer Type : Other
Computer etc.
</plugin_output>

I've tried extracting and filtering. I believe regex may work, but that is where I'm stuck.
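
A regex extraction should do it. A sketch, with the label spelling taken from the sample above (adjust if the real data differs):

| rex field=pluginText "Computer SerialNumber\s*:\s*(?<serial_number>[^\r\n]+)"
| table serial_number
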
I'm trying to run a query to figure out the top 10 src_ips along with their top 10 URLs visited. When I try the query below, it gives me every src_ip instead of just the top 10. Any suggestions on how to limit the search to the top 10 URLs for just the top 10 src_ips? I've been running something like this:

index=firewall
| stats count by src_ip, url
| sort 0 src_ip -count
| streamstats count as standings by src_ip
| where standings < 11
| eventstats sum(count) as total by category
| sort 0 -total src_ip -count
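
One sketch: rank URLs within each src_ip, rank src_ips by their overall total, then keep the top 10 of each:

index=firewall
| stats count by src_ip, url
| eventstats sum(count) as ip_total by src_ip
| sort 0 -ip_total src_ip -count
| streamstats count as url_rank by src_ip
| streamstats dc(src_ip) as ip_rank
| where ip_rank <= 10 AND url_rank <= 10
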
Is there a way to send all matching notable events to a custom index with very vague fields (for confidentiality reasons)? I would like to send event data to a new index that basically says "You have a new alert", so that I can integrate it with an XSOAR solution without disclosing any confidential information. This is due to the way the ingestion script is written: anyone can modify the query to pull information from the logs. The intention is to notify analysts that an alert is present without (potentially) exposing the information to unauthorized individuals.
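
A sketch of one approach: a scheduled search over the notable index that writes only a generic marker event into a separate summary index via collect (the target index name is a placeholder; create it first and restrict the XSOAR account's role to it):

index=notable
| eval message="You have a new alert"
| table _time, message
| collect index=soar_notifications
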