All Topics

Hello, I have upgraded from the old Defender app to the new Microsoft 365 Defender Add-on for Splunk. I finally got it working after renewing secrets, etc., but there seem to be a lot of duplicate events for each incident triggered. How can we get this to work properly? Can Splunk provide proper support for this? These small input apps are vital to the proper working of our SOC and Splunk ES environment.
I have a few questions about archiving in Splunk Cloud, specifically DDAA, which needs to be purchased.
1. How long can data be archived? I believe DDAA has a maximum of ten years?
2. How much data can be restored at a time? I believe it's something like a maximum of two days' worth at a time, and restored data will remain available for 30 days?
3. Is it possible to offboard from DDAA, for example if we log a support ticket to have the data moved to an S3 bucket?
Thanks, Blake
I am trying to calculate the difference between two dates, including seconds, but I am unable to find any results. Please help. My query:
index=main source="https://test.ticketing-tool.com/" dv_state=* dv_priority="4 - Low" number=SIR0010241 | dedup number | eval startTime=strptime(dv_opened_at,"%Y-%m-%d %H:%M:%S:%3N") | eval endTime=strptime(dv_sys_updated_on,"%Y-%m-%d %H:%M:%S:%3N") | eval TimeDiff=tostring((endTime-startTime),"duration") | table dv_opened_at dv_sys_updated_on TimeDiff, number
@soutamo @ITWhisperer @gcusello @thambisetty @bowesmana @DalJeanis
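One common cause of an empty result here is the strptime format not matching the raw field values exactly (a mismatch makes strptime return null, so the table comes out empty). The arithmetic itself can be sanity-checked outside Splunk; a minimal Python sketch, with made-up timestamps laid out the way the "%Y-%m-%d %H:%M:%S:%3N" format expects (colon before the milliseconds):

```python
from datetime import datetime

# Hypothetical timestamps in the layout the SPL strptime format expects.
opened_at = "2023-04-01 10:15:30:250"
updated_on = "2023-04-01 10:20:45:750"

# Python has no %3N; %f parses the fractional part (right-padded, so "250"
# reads as .25 s). If the separator or digit count in the real data differs,
# the parse raises -- the analogous mismatch makes SPL strptime() return null.
fmt = "%Y-%m-%d %H:%M:%S:%f"
start = datetime.strptime(opened_at, fmt)
end = datetime.strptime(updated_on, fmt)

diff_seconds = (end - start).total_seconds()
print(diff_seconds)  # 315.5
```

The field names dv_opened_at/dv_sys_updated_on and the sample values are placeholders; the point is only that the format string must mirror the raw text character for character.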
I have derived data from Splunk in the following format (Actual Format), but I want to format it further so that I can see which items are present in which categories and which are missing (Expected Format). I am trying to chart it based on categoryID, but it's not working for me, as I do not think the max function is appropriate for this. Can anyone please help me understand how I can achieve this? Tried using: | chart max(itemId) over itemId by categoryID (screenshots: Expected Format, Actual Format)
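The reshaping being asked for is essentially a presence/absence pivot. A small Python sketch of that logic, using made-up (itemId, categoryID) pairs rather than the poster's data:

```python
# Given (itemId, categoryID) pairs, build one row per item and one column per
# category, marking "missing" wherever no pair exists. The pairs below are
# placeholders, not the poster's actual data.
pairs = [("item1", "catA"), ("item1", "catB"), ("item2", "catA"), ("item3", "catB")]

items = sorted({i for i, _ in pairs})
categories = sorted({c for _, c in pairs})
present = set(pairs)

table = {
    item: {cat: (item if (item, cat) in present else "missing") for cat in categories}
    for item in items
}
print(table["item2"]["catB"])  # -> missing
```

In SPL the analogous shape usually comes from charting a count (rather than max of the id) by the two fields, then treating zero cells as missing; the sketch above just makes the target layout concrete.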
Hi All, a user has configured his system to send logs to Splunk via syslog, and he has enabled debug logging at his end. We are able to see the logs on the Splunk search head (log in/log out and some other logs), but when the user runs a show command at his end, those logs are not seen in Splunk. May I know what is missing here? Thanks, Vijay Sri S
I am using Splunk 8.1.0 with the Sankey Diagram 1.5.0 app. I have a Sankey diagram viz in a dashboard. The source and target fields of the Sankey correspond to dashboard form inputs that I use to filter queries. When the dashboard user clicks on the left side of the Sankey diagram (a source field value), I want to set the corresponding form input to that value. Similarly, when the user clicks on the right side of the Sankey diagram (a target field value), I want to set the corresponding form input to that value. How do I do that? I've tried:

<drilldown>
  <condition field="source">
    <set token="form.applid">$click.name2$</set>
    <set token="applid">APPLID="$click.name2$"</set>
  </condition>
  <condition field="target">
    <set token="form.poolid">$click.name2$</set>
    <set token="applid">"ENQ Pool ID"="$click.name2$"</set>
  </condition>
</drilldown>

(where "applid" and "poolid" are the tokens set by the form inputs, and also the corresponding form input id attribute values), but that navigates to a new tab with a new search that has the "set" values appended: ... APPLID=TXCQAIC "ENQ Pool ID"=EXECSTRN. I don't want to navigate to a new window. I want to remain on the same dashboard and just update the form inputs and their corresponding tokens.
While getting Netflow data using Streams, I aggregate a variable "bytes_in" as a sum of the bytes_in received in a flow. This works well, and I get a new variable called "sum(bytes_in)". The problem I am experiencing is this: the previous version of Streams used to index the data for the aggregate "sum(bytes_in)" as "bytes_in". Since I upgraded to Streams 7.2, Streams indexes the data as "sum(bytes_in)", and this is causing me a lot of issues when trying to get this data into a data model; every time I try to use this data, the SPL fails. This works in a normal search (I use stats to get the max value of "sum(bytes_in)" and distinguish events by src_ip, dest_ip, etc.):
index=streams | stats max(sum(bytes_in)) AS bytes_in by src_ip,dest_ip,dest_port,src_port,timestamp | where isnum(bytes_in)
But when I try to import the Streams data into a data model, the name of the variable "sum(bytes_in)" seems to be causing a lot of issues. Error message: "Error in 'eval' command: The 'sum' function is unsupported or undefined." So I need to either rename the variable "sum(bytes_in)" to "bytes_in", for example, in Streams (but I cannot find how to do this in Streams), or rename it in the data model (but I cannot seem to find how to do that either).
Hi community, need your help! Is there any possibility that we can create a dashboard for AV-related issues or notables? I was using the below query but could not get the exact results. Requesting your help to create a dashboard for AV-related alerts for the servers.
| tstats summariesonly=true max(_time) AS time values(Malware_Attacks.file_name) AS fileName values(Malware_Attacks.signature) AS signature from datamodel=Malware.Malware_Attacks by Malware_Attacks.event_description, Malware_Attacks.dest Malware_Attacks.action | makemv delim="|" fileName | makemv delim="|" signature | rename Malware_Attacks.event_description AS event_description | rename Malware_Attacks.dest AS dest | rename Malware_Attacks.action as action | regex event_description!="blocked" | regex event_description!="deleted" | regex event_description!="Cleaned" | regex event_description!="handled" | where event_description!="Exploit Prevention Files/Process/Registry violation detected" OR threat_handled!=1 | where event_description!="Infected file found, access denied" OR threat_handled!=1 | search action!=handled event_description!=DLL* event_description!="Script security violation detected, AMSI would block" | table time event_description dest fileName signature
Thanks, Kishore
Hi! Need help with this, please. I have to extract the IP address from this: src=45.141.87.33:53402:X19 The values 53402 and X19 could be anything. Help please!
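Since the port and the trailing tag vary, anchoring on the "src=" prefix and matching the dotted quad is enough. A minimal Python regex sketch (the same pattern would work in an SPL rex); the sample line is taken from the post:

```python
import re

# The port (53402) and the trailing tag (X19) can vary, so match only the
# four dot-separated octets immediately after "src=".
line = "src=45.141.87.33:53402:X19"

match = re.search(r"src=(\d{1,3}(?:\.\d{1,3}){3})", line)
ip = match.group(1) if match else None
print(ip)  # 45.141.87.33
```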
Hi, I am fetching data into Splunk from a SQL database, and I found that some of the rows are missing. I am checking a complete day with the below Splunk query:
index="myavista_events" sourcetype="myavista:sitecore:sqldb" | stats count
For the same period I am checking SQL with a SQL query, and there is a large difference in the counts; the SQL count is higher than the Splunk count, so some data is missing in Splunk. I am fetching the data from the DB at a 5-minute interval, and I tried to check the count of each fetch with the below Splunk query:
index=_internal ServerName "format_hec_success_count"
This gives a count like format_hec_success_count=3365, but this number also does not match the SQL query for the same timespan. Please suggest how I can get the complete SQL data into Splunk.
My data is like below, where customerNumber can appear as CustomerNumber, customernumber, or CUSTOMERNUMBER, and isoCountryCode can appear as IsoCountryCode, ISOCountryCode, or any other combination. During field extraction Splunk considers all of these fields as separate, though when writing a query I want to treat them as one.
Environment = prod-dmz-usch01 | API = testapi| RequestURI = /test/v5/tesdt/10-12345?customerNumber=01-12345&isoCountryCode=US | ProxyRequestFlowName = testDetails-OpenAPIv3GetVerb.
My query is as below:
index=test sourcetype="testsamples" testapi "ProxyRequestFlowName = testDetails-OpenAPIv3GetVerb" | search isocountrycode=US OR isoCountryCode=US -- this seems to take care of multiple values, but it is not a good idea to write each field variant here; how do I handle all scenarios? | bucket _time span="24h" | chart count by customerNumber where count in top100 -- I am able to give only one value of customerNumber here; how can I handle all use cases?
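The underlying fix is to collapse the case variants into one canonical field before querying (in SPL this is typically done by coalescing or renaming the variants). A Python sketch of the same idea, with invented sample events:

```python
# Field names arrive with inconsistent casing (CustomerNumber, customernumber,
# CUSTOMERNUMBER, ...). Lower-casing every key once yields a single canonical
# field to filter and chart on.
raw_events = [
    {"CustomerNumber": "01-12345", "IsoCountryCode": "US"},
    {"customernumber": "01-67890", "ISOCountryCode": "DE"},
]

normalized = [{k.lower(): v for k, v in event.items()} for event in raw_events]

us_customers = [e["customernumber"] for e in normalized if e["isocountrycode"] == "US"]
print(us_customers)  # ['01-12345']
```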
Hello all... I have events with a timestamp that starts with:
2014-05-07 13:12:27 2910 ...
The trailing number (2910) is variable in length, and this seems to mess things up. I have been setting this in my props.conf for this sourcetype:
TIME_FORMAT = %Y-%m-%d %H:%M:%S
This is not working. What am I missing?
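Worth noting that the timestamp portion is fixed-width (19 characters), so the variable-length trailing number never needs to be part of the parse; in Splunk terms that suggests stopping TIME_FORMAT at %S and capping MAX_TIMESTAMP_LOOKAHEAD at 19. A quick Python sanity check of that observation (the event text after the number is invented):

```python
from datetime import datetime

# "YYYY-MM-DD HH:MM:SS" is always exactly 19 characters, so the variable
# trailing number (2910 here) can simply be ignored by the parse.
event = "2014-05-07 13:12:27 2910 something happened"

timestamp = datetime.strptime(event[:19], "%Y-%m-%d %H:%M:%S")
print(timestamp.isoformat())  # 2014-05-07T13:12:27
```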
I've been attempting to set a URL for our single instance of Splunk. Right now we've been accessing it at http://servername:8000, but what I'm looking for is something like http://company.splunk.com. I've been able to set a CNAME DNS record, but it only seems to accept the server name. I'm wondering what I'm missing to be able to redirect both the server name and port number.
I'm using the splunk-logging library found here to set up event collection from a web app. I have followed the Basic Example in the repo's README and also the how-to page. The error I'm seeing is ERROR: TypeError: Failed to fetch CONTEXT from line 32 in splunklogger.js. The code that sets up the logger looks like the following.

const config = {
  token: splunkToken,
  url: splunkUrl
};

const SplunkLogger = require('splunk-logging').Logger;
const logger = new SplunkLogger(config);

const payload = {
  message: 'hello world',
  metadata: {
    index: 'hms',
    sourcetype: 'ui'
  }
};

logger.send(payload, (error, resp, body) => {
  console.log(error);
});

As I'm writing this out, I'm wondering if the error I'm getting is because I'm using the logger from the browser, so there is a cross-origin issue. Is this the issue, or is there an issue in the code?
All, does anyone have an app or search that can help a non-technical user review retention of data by source? They'd like to see what the index config is, what the current oldest log is, and what the projected ETA to fill the volume is. Thanks in advance, -Daniel
I have created a workflow action to send a Notable Event to ServiceNow to create an incident. I am unable to figure out how to resolve nested tokens. For example, if the rule title for the correlation rule is "Host With A Recurring Malware Infection ($signature$ On $dest$)"  and I use: `notable` | search event_hash=$event_hash$ | eval comments="$rule_title$" | snowincidentalert what ends up in ServiceNow is "Host With A Recurring Malware Infection ($signature$ On $dest$)". The signature and dest tokens do not get expanded.  How can I tell it to recursively expand any tokens nested inside other tokens?  
Hello, I am trying to create a table output of events in a logfile. Here is the query:
index=myindex <my search> | rex ".*source=(?<source>[^,]+).*col1=(?<Col1>[^,}]+).col2=(?<Col2>[^,}]+).col3=(?<Col3>[^,}]+)" | kv | table source Col1 Col2 Col3 | sort - source Col1 Col2
In my source column, values can be like Email, Scan, or Fax, but in the results I get the value of source as the source file ("D:\App\tomcat\logs\applog.log") instead of values like Email, Scan, or Fax. How can I get the values of "source" from the logfile event? Thanks!
Hey all! I've seen Splunk answers similar to mine, but I'm having some issues with getting it to work exactly how I want. Essentially I am trying to link together multiple events in one source and then correlate that with another source. So I have two sources, samples of which I've given at the bottom of this post. For each ID listed there is data in TestPOAM.csv and possibly data for it in TestRem.csv. For each ID there can be multiple remediation actions listed in TestRem.csv, or none at all. My current issue is that when there is no action identified in TestRem.csv for an app, I want it to fill with "N/A". Below is my current search, which is getting me very close to the results I want.
index="testdata" (POAMApps="*Test1*" OR RemApps="*Test1*") | stats values("POAMApps") AS "POAMApps" values("Description") AS "Description" values("ActionID") AS "ActionID" values("RemApps") AS "RemApps" values("RemAction") AS "RemAction" BY "ID" | sort ID
My issue with the above is that when there is nothing identified in ActionID, RemApps, or RemAction, I need it to fill that with "N/A". When I use fillnull, as in the following search:
index="testdata" (POAMApps="*Test1*" OR RemApps="*Test1*") | fillnull ActionID, RemApps, RemAction value="N/A" | stats values("POAMApps") AS "POAMApps" values("Description") AS "Description" values("ActionID") AS "ActionID" values("RemApps") AS "RemApps" values("RemAction") AS "RemAction" BY "ID" | sort ID
it fills the source data for TestPOAM.csv with "N/A", meaning that it shows up in columns that already have actions in them. If someone knows a better way to correlate these events, or how to do a fillnull only for one source, that help would be greatly appreciated. If anything is confusing, please just let me know and I can clarify.
Data Sources

Source1: TestPOAM.csv
ID | POAMApps | Description
1 | Test1 | Description1
2 | Test2 | Description2
3 | Test3 | Description3
4 | Test4 | Description4
5 | Test5 | Description5
6 | Test6 | Description6
7 | Test1, Test6 | Description7
8 | Test3, Test5 | Description8
9 | Test2, Test3 | Description9
10 | Test1, Test5 | Description10
11 | Test1, Test2, Test3 | Description11
12 | Test2, Test3, Test4 | Description12
13 | Test4, Test5, Test6 | Description13
14 | Test1, Test4, Test6 | Description14
15 | Test2, Test3, Test6 | Description15

Source2: TestRem.csv
ID | ActionID | RemApps | RemAction
1 | 1 | Test1 | Action1
1 | 2 | Test1 | Action2
2 | 3 | Test2 | Action3
2 | 4 | Test2 | Action4
3 | 5 | Test3 | Action5
5 | 6 | Test5 | Action6
6 | 7 | Test6 | Action7
7 | 8 | Test1, Test6 | Action8
7 | 9 | Test1 | Action9
7 | 10 | Test6 | Action10
8 | 11 | Test3, Test5 | Action11
11 | 12 | Test1, Test2, Test3 | Action12
11 | 13 | Test1, Test2 | Action13
11 | 14 | Test2, Test3 | Action14
11 | 15 | Test1 | Action15
11 | 16 | Test3 | Action16
12 | 17 | Test2, Test3, Test4 | Action17
12 | 18 | Test3, Test4 | Action18
12 | 19 | Test2 | Action19
12 | 20 | Test3 | Action20
13 | 21 | Test4, Test5, Test6 | Action21
14 | 22 | Test4 | Action22
15 | 23 | Test2, Test3, Test6 | Action23
15 | 24 | Test2 | Action24
15 | 25 | Test3 | Action25
15 | 26 | Test6 | Action26
15 | 27 | Test2, Test6 | Action27
15 | 28 | Test3, Test6 | Action28
15 | 29 | Test2, Test3 | Action29
15 | 30 | Test2 | Action30
15 | 31 | Test6 | Action31
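The correlation being described is a left join with a one-sided default: every TestPOAM.csv ID keeps its own fields, and the remediation columns fall back to "N/A" only when TestRem.csv has no rows for that ID. A Python sketch of that logic over a tiny made-up subset of the CSVs above:

```python
# Left-join POAM rows to remediation rows; only IDs with no remediation
# entries get the "N/A" filler, so populated action columns are untouched.
poam = {1: {"POAMApps": "Test1", "Description": "Description1"},
        4: {"POAMApps": "Test4", "Description": "Description4"}}
rem = {1: [{"ActionID": "1", "RemApps": "Test1", "RemAction": "Action1"},
           {"ActionID": "2", "RemApps": "Test1", "RemAction": "Action2"}]}

rows = []
for poam_id, fields in sorted(poam.items()):
    actions = rem.get(poam_id, [{"ActionID": "N/A", "RemApps": "N/A", "RemAction": "N/A"}])
    for action in actions:
        rows.append({"ID": poam_id, **fields, **action})

print(rows[-1]["RemAction"])  # N/A
```

In SPL, one possible equivalent (unverified against the poster's data) is to move the fillnull after the stats, so it fills only the aggregated cells that ended up empty rather than the raw TestPOAM.csv events.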
Hi, I want to make a chart that shows the real-time packet loss percentage of our gateways, but there are two problems:
1. The firewall sends logs only when packet loss is occurring, so in a line chart there is no correct value for zero packet loss, since the line connects two non-zero points.
2. I want to show all five gateways in a single chart with different colors.
Here is what I search and get... TNX
Hi, we have a forwarder that is sending partial data. We can identify the files that it is not sending (image below). However, when we copy the forwarder and change only the host name, it sends the remaining files that were missing. We don't delete fishbuckets; we just restart it and give it a new host name. Any ideas?

inputs.conf:
[monitor:///net/dell552srv.fr.murex.com/dell552srv1/apps/AMBER_PSC47_SEC1.../*.log]
disabled = false
host = TEST_CLUSTER1
index = mxtiming_live
whitelist=mxtiming.*\.log$
blacklist=logs_|fixing_|tps-archives|mxtiming_crv_nr.*|mxtiming_437_dell552srv.fr.murex.com_215699.log
crcSalt = <SOURCE>
sourcetype = MX_TIMING2

props.conf:
[MX_TIMING2]
FIELD_DELIMITER = |
DATETIME_CONFIG =
NO_BINARY_CHECK = true
category = Custom
description = MX_TIMING
disabled = false
pulldown_type = true
REPORT-MX-TIMING = REPORT-MX-TIMING2
EXTRACT-MX-TIMING = ^(?:[^\|\n]*\|){6} *-*(?P<Elapsed>\d+\.\d+)\w+\| *-*(?P<CPU>\d+\.\d+)s\| *-*(?P<CPU_PER>\d+)%\|
EXTRACT-MX-TIMING2 = ^(?:[^\|\n]*\|){11} *-*(?P<Elapsed_C>\d+\.\d+)\w+\|
EXTRACT-MX-TIMING3 = ^(?:[^\|\n]*\|){9} *-*(?P<RDB_COM1>\d+\.\d+)s\| *-*(?P<RDB_COM_PER1>\d+)%\s+\|
EXTRACT-MX-TIMING-Memory = \| *(?P<Memory>\d+\.\d+)Mb(\|\s?(?P<VmHWM>\d+\.\d+)Mb)?(\|\s?(?P<Malloc>\d+\.\d+)Mb)?$
TRANSFORMS-set = setnull, setparsing_mxtiming

transforms.conf:
[setparsing_mxtiming]
REGEX = (Deal insertion|contract insertion|Realtime Shutdown|SessionCreate|SessionKill|Read SHM|Read_SHM|Updated keys|Portfolio_Load|Viewer|Publishing Config|simulation|BOS|MPC|MXWAREHOUSE|RequestDocument|LOGIN|event|Bulkportfoliomodification|Bulkunwind|unwind|Event_insertion|Deal_input)
DEST_KEY = queue
FORMAT = indexQueue