All Topics

Hi, we have a microservices-based system with several services running. The developers put the complete search string into a lookup table. I am able to retrieve the string from the lookup but not able to execute it:

| inputlookup searchstring.csv
| streamstats count as Rowcount
| where Rowcount = 1
| search Search_String

A sample of what is in Search_String — this one is simple, but sometimes there are complex queries:

index=abc* AND source=xyz* AND host=* AND ERROR=50* | stats count as 5xx_Errors

How do I make the search string in the lookup execute?
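One approach, sketched under the assumption that Search_String always holds a complete search: the map command runs a new search per input row, substituting field values into a search template.

| inputlookup searchstring.csv
| head 1
| map search="search $Search_String$"

map executes the whole stored string, pipes included. One caveat: if the stored string contains double quotes, they need escaping before substitution or the template breaks.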
Here's my json example file, log.json:

{"ts":"2022-01-01 01:22:34","message":"test4"}
{"ts":"2022-01-01 01:22:35","message":"test5"}
{"ts":"2022-01-01 01:22:36","message":"test6"}

And here's a props.conf that at least parses the json:

[ json_test ]
DATETIME_CONFIG=CURRENT
INDEXED_EXTRACTIONS=json
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=false

But when I try to get "ts" to be parsed as the timestamp, it fails completely:

[ json_test ]
CHARSET=UTF-8
DATETIME_CONFIG=None
INDEXED_EXTRACTIONS=json
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=false
TIME_FORMAT=%Y-%m-%d %H:%M:%S
TIMESTAMP_FIELDS=ts
TZ = America/Los_Angeles

I've also tried adding TIME_PREFIX = "ts":", but again, it does nothing (and turns blue in the Add Data dialog, which I think means there's something wrong). Any idea what I'm missing?
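A sketch of a stanza that should let TIMESTAMP_FIELDS take effect, on the assumption that DATETIME_CONFIG=None is the culprit — it disables timestamp extraction outright, so the structured-data timestamp settings are never consulted:

[json_test]
INDEXED_EXTRACTIONS=json
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=false
TIMESTAMP_FIELDS=ts
TIME_FORMAT=%Y-%m-%d %H:%M:%S
TZ=America/Los_Angeles

TIME_PREFIX belongs to regex-based timestamp extraction and is not used for INDEXED_EXTRACTIONS sources, which would explain why adding it did nothing.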
Hi, how do I craft a search to match two fields from my raw events with two fields from a CSV file, and output results if one of the fields is different? The requirement is to match the country_name and email from raw events against what is in the csv file. Basically, if the country_name in the raw events is DIFFERENT — i.e. it does not match the "Country" field in the lookup (based on the user's email) — then display those results only.

Lookup file structure:

Email_id, Country

The raw events have fields called email and country_name. Below is what I am trying, but it's not working:

index=xxx | search [inputlookup file.csv ] where (country_name != Country) AND (Email!=Email_id) table displayName, Email_id country_name
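A sketch using the lookup command instead of a subsearch, assuming file.csv is uploaded as a lookup table file with Email_id and Country columns as described:

index=xxx
| lookup file.csv Email_id AS email OUTPUT Country
| where isnotnull(Country) AND country_name != Country
| table email, country_name, Country

The lookup joins each event's email to the CSV's Email_id and returns the expected Country; the where clause then keeps only the rows where the event's country_name disagrees.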
Each event has been ingested twice with the same uuid. I want to keep only one event for each uuid. How do I delete one event only for each uuid?

When searching with

index="okta*" | dedup uuid

it shows only events with unique uuids — half of the total events, which is what I want.

But when I run

index="okta*" | dedup uuid | delete

the operation is not allowed; it shows "this command cannot be invoked after the command simpleresultcombiner". Does anyone have a suggestion?
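For context: delete only accepts a plain streaming search in front of it and marks every event that search returns, so it cannot keep one copy per uuid — dedup before delete is rejected for exactly the reason in the error. A sketch for confirming the duplication first:

index="okta*"
| stats count by uuid
| where count > 1

The usual workarounds are to dedup at search time, or to clean and re-ingest the affected index, since selective one-of-two deletion is not something delete supports.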
I'm having some trouble parsing data prepended to json logs. I can do it via search, but I'd like to do it at ingest within Splunk so I can search the parsed data. Can you point me in the right direction, and can I do this via the UI or do I need to go into props.conf manually?

This is working via search:

sourcetype="Untangle" | rex "(?<json>\{.+)" | spath input=json

What I've tried in props.conf:

[untangle]
EXTRACT-untangle=(?<json>\{.+)

Example log:

Mar 29 01:45:04 _gateway Mar 28 20:45:04 INFO uvm[0]: {"timeStamp":"2022-03-28 20:45:04.762","s2pBytes":160,"p2sBytes":65,"sessionId":107845676257000,"endTime":0,"class":"class com.untangle.uvm.app.SessionStatsEvent","c2pBytes":65,"p2cBytes":160}
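One thing worth checking, offered as a sketch rather than a confirmed fix: props.conf stanza names are case-sensitive, and the search uses sourcetype="Untangle" while the stanza is [untangle]. A matching stanza would look like:

[Untangle]
# EXTRACT-* is a search-time extraction; it creates the json field automatically
EXTRACT-untangle = (?<json>\{.+)

Note that EXTRACT-* only yields the raw json field; the individual keys still come from spath (or a further extraction), so a spath step is still needed on top of this.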
Hello,

I need to create date/time tokens for a dbxquery. The default value for start needs to be Thursday of the previous week, and the end needs to be Monday of the current week. Here is the dbxquery command:

| dbxquery connection="SPLUNK" maxrows=0 query="select * from VW_SPLUNK where to_date (TO_CHAR (create_date_time, 'yyyy-mm-dd hh24:mi:ss'), 'yyyy-mm-dd hh24:mi:ss') between to_date ('$tokFromDate1$ $tok_startTime$','yyyy-mm-dd hh24:mi:ss') AND to_date ('$tokToDate1$ $tok_EndTime$','yyyy-mm-dd hh24:mi:ss') "

Here is the XML I have for the tokens:

<input type="radio" token="tok_toggleTime">
  <label>Select date: Thu-Mon vs Mon-Thu</label>
  <choice value="ThM">Last Thursday to Monday</choice>
  <choice value="MTh">Last Monday to Thursday</choice>
  <change>
    <condition value="ThM">
      <set token="tok_startTime1">06:45:00</set>
      <set token="tok_endTime1">09:30:00</set>
      <set token="tok_textStartTime">1</set>
      <set token="tokFromDate1">@w0-3d</set>
      <set token="tokToDate1">@w0+1d</set>
    </condition>
    <condition value="MTh">
      <set token="tok_startTime1">06:45:00</set>
      <set token="tok_endTime1">09:30:00</set>
      <set token="tok_textEndTime">1</set>
      <set token="tokFromDate1">@w0+1d</set>
      <set token="tokToDate1">@w0-3d</set>
    </condition>
  </change>
</input>
<input type="text" token="tok_startTime" searchWhenChanged="true">
  <label>Start Time - Can enter new time</label>
  <default>$tok_startTime1$</default>
  <suffix/>
</input>
<input type="text" token="tok_endTime" searchWhenChanged="true">
  <label>End Time - Can enter new time</label>
  <default>$tok_endTime1$</default>
  <suffix/>
</input>

Thanks and God bless,
Genesius
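A possible snag, with a sketch under the assumption that the SQL side needs literal date strings: values like @w0-3d are Splunk relative-time modifiers, and dbxquery passes token text to the database verbatim, so Oracle's to_date() would receive the modifier itself rather than a date. <eval> tokens in the change block can render actual dates instead:

<condition value="ThM">
  <eval token="tokFromDate1">strftime(relative_time(now(), "@w1-4d"), "%Y-%m-%d")</eval>
  <eval token="tokToDate1">strftime(relative_time(now(), "@w1"), "%Y-%m-%d")</eval>
</condition>

Here @w1 snaps to the most recent Monday and @w1-4d to the Thursday before it; adjust the snap units if the intended week boundaries differ.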
Hi,

Let's say I have a company directory lookup (e.g. Company_Directory) and I want to look up the entire hierarchy of supervisors for a specific employee. For instance: Alice reports to Bob, then take Bob as the new lookup criterion; Bob reports to Cathy, and so on. Then append all of this into a chain of command: Alice, Bob, Cathy, Donna, Eric, Fred, etc.

Does Splunk have a command/capability to feed the results back into a lookup loop?

Thank you
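SPL has no recursive lookup, but a fixed-depth chain of lookup calls approximates one. A sketch, with employee and supervisor as assumed column names in Company_Directory:

| makeresults
| eval employee="Alice"
| lookup Company_Directory employee OUTPUT supervisor AS s1
| lookup Company_Directory employee AS s1 OUTPUT supervisor AS s2
| lookup Company_Directory employee AS s2 OUTPUT supervisor AS s3
| eval chain=mvappend(employee, s1, s2, s3)

Each lookup climbs one level, so add stages up to the deepest hierarchy expected. Beyond that, the map command or a scripted/external lookup is the usual escape hatch.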
I'm trying to create a multi-series line chart in a Splunk dashboard that shows the availability percentages of a given service across multiple individual days for a set of hosts. In other words, date is my x-axis, availability is my y-axis, and my legend contains the various hosts. Since a picture is worth a thousand words, here's an example of what I'm trying to create: [image omitted]

And here is my attempt at the query that's currently not working (it's using the rhnsd service as an example and assumes ps.sh is being run every 1800 seconds, so it calculates availability based off ps.sh running 86400/1800=48 times per day):

index=os host="my-db-*" sourcetype=ps rhnsd
| timechart span=1d count by host
| eval availability=if(count>=86400/1800,100,count/floor(86400/1800)*100)
| rename _time as Date host as Host availability as Availability
| fieldformat Date = strftime(Date, "%m/%d/%Y")
| chart first(Availability) over Host by Date

Any assistance with getting this query working so I can visualize it in a multi-series line chart in a dashboard panel would be greatly appreciated.
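A sketch of an alternative, assuming 48 expected samples per day: after timechart ... by host, the host values become column names, so no count field remains for the eval to read, which is likely where the query dies. Computing availability before pivoting avoids that:

index=os host="my-db-*" sourcetype=ps rhnsd
| bin _time span=1d
| stats count by _time, host
| eval Availability=round(min(count, 48) / 48 * 100, 1)
| xyseries _time host Availability

With _time as the first column and one column per host, the panel renders as a multi-series line chart directly.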
When I navigate to https://<splunk-server>:8089/ServiceNS, I run into an error. When I go to other pages, "/services" and so on, they work fine. Could I get some guidance on what to do to bring this up?
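For what it's worth, the endpoint is spelled servicesNS (lowercase s, capital NS) and, unlike /services, it is namespaced: it expects a user and an app in the path rather than being browsable at the root. A hypothetical example, with nobody/search as placeholder namespace values:

https://<splunk-server>:8089/servicesNS/nobody/search/saved/searches

That pattern should respond wherever /services already works.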
Due to some issues, we have to discontinue our existing heavy forwarder and move all the sources, data inputs, and Splunk TA apps/add-ons to a new server where we have already installed a heavy forwarder. Both heavy forwarders are on the same version, and both are in a Linux environment. I just need to understand how we can easily move all the existing conf files, Splunk TA apps/add-ons, and data inputs, so we can switch HF servers without any data loss. Please add the relevant Splunk doc link for Linux as well, if possible.
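A rough sketch of the usual file-level move, assuming default paths and that both HFs run the same version (stop Splunk on both first; paths and the new-hf hostname are placeholders):

# on the old HF
tar czf /tmp/hf-config.tgz -C $SPLUNK_HOME/etc apps system/local
scp /tmp/hf-config.tgz new-hf:/tmp/

# on the new HF
tar xzf /tmp/hf-config.tgz -C $SPLUNK_HOME/etc
$SPLUNK_HOME/bin/splunk restart

One caveat: file-monitor read positions live in the fishbucket, which this does not carry over, so monitored files may re-ingest from the beginning on the new host unless the cutover is timed around log rotation.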
I am trying to bring in some .txt logfiles using a Splunk forwarder. There are several logs in the directory, such as Log.txt, 10Log.txt, 20Log.txt, etc. These files are changed daily, and the 10Log.txt, 20Log.txt, etc. files are written to daily. So far, I can only get Splunk to ingest the Log.txt file and nothing else. My inputs.conf file is currently as below. I have tried monitoring just *.txt with the same results: only Log.txt is read/ingested.

[monitor://E:\Logs\CIR_Remote\*Log.txt]
disabled = false
sourcetype = LOG4NET
index = log4net
initCrcLength = 1024

Any input would be appreciated!
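One hypothesis worth testing: if the files all begin with the same bytes (a common log4net header), the forwarder's CRC check decides the "new" files are ones it has already read. A hedged sketch, salting the CRC with the file path:

[monitor://E:\Logs\CIR_Remote\*Log.txt]
disabled = false
sourcetype = LOG4NET
index = log4net
# distinguish files by their full path, not just their leading bytes
crcSalt = <SOURCE>

The tradeoff: with crcSalt = <SOURCE>, a renamed or rolled file is treated as brand new and re-ingested, so it fits best when files keep stable names, as these appear to. Raising initCrcLength past the shared header length is the gentler alternative.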
Hello,

I know we can use the Splunk GUI to create sourcetypes, but how would I create a new sourcetype from the CLI or using a props.conf file? Any help will be highly appreciated, thank you. Is this props.conf going to create the new sourcetype test:audit if it doesn't exist?

[test:audit]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]*)<MODTRANSAUDTRL>
TIME_PREFIX= <TIMESTAMP>
TIME_FORMAT=%Y-%m-%dT%H:%M:%S.%2N
MAX_TIMESTAMP_LOOKAHEAD=24
TRUNCATE=1000
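In effect, yes: a sourcetype exists as soon as a props.conf stanza names it and some input assigns it to data; there is no separate "create" step outside the GUI. A sketch, with the monitor path purely hypothetical:

# props.conf on the parsing tier (indexer or heavy forwarder)
[test:audit]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]*)<MODTRANSAUDTRL>

# inputs.conf on the forwarder, assigning the sourcetype to an input
[monitor:///var/log/app/audit.log]
sourcetype = test:audit

The GUI's "new sourcetype" button just writes a stanza like this into an app's local/props.conf.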
#Splunk t-shirt idea
I've read all the articles and past questions, but I must be missing something. Our requirement is simple: 6 months searchable, 6 months frozen, then delete. But there doesn't seem to be an easy time-based setting for the tiers below cold to say "roll after 6 months"; they only seem to go by data sizes? Currently our hot/warm/cold disk space is full and frozen is empty.

[ns-switches]
homePath = volume:primary/ns-switches/db
coldPath = volume:primary/ns-switches/colddb
thawedPath = $SPLUNK_DB/ns-switches/thaweddb
maxTotalDataSizeMB = 512000
maxDataSize = auto_high_volume
coldToFrozenDir = /splunkfrozen/idx1/ns-switches/frozendb
frozenTimePeriodInSecs = 4320000
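Two observations, offered as a sketch: 4320000 seconds is only about 50 days, and buckets freeze when either frozenTimePeriodInSecs or maxTotalDataSizeMB is hit, whichever comes first. Splunk also never deletes anything from coldToFrozenDir on its own, so the second six months has to be enforced outside Splunk:

[ns-switches]
# ~180 days searchable before buckets roll to frozen
frozenTimePeriodInSecs = 15552000
coldToFrozenDir = /splunkfrozen/idx1/ns-switches/frozendb

# example cron entry (hypothetical) deleting frozen buckets older than ~6 months
0 3 * * * find /splunkfrozen/idx1/ns-switches/frozendb -mindepth 1 -maxdepth 1 -type d -mtime +180 -exec rm -rf {} +

If hot/warm/cold is full while frozen stays empty, the size pressure plus the short 50-day window would be the first things to check.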
Hi Team,

I have multiple monitors on multiple forwarders and multiple tcpout groups. I need to use the forwarder hostname to route each monitor to its respective tcpout. Is there a configuration which can provide this forwarder-host-based routing?

Thanks in advance.
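One hedged sketch: routing is decided per input via _TCP_ROUTING, and since inputs.conf lives on each forwarder, deploying a per-host (or per-server-class) app gives host-based routing without any central switch. All group, host, and path names below are placeholders:

# inputs.conf on forwarder A
[monitor:///var/log/app.log]
_TCP_ROUTING = site_a_indexers

# outputs.conf on forwarder A
[tcpout:site_a_indexers]
server = indexer-a1:9997,indexer-a2:9997

With a deployment server, a server class matched on hostname can push the right app to the right forwarders.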
payload: Message {
  channel=EMAIL,
  type=security_event_postinfection_admin,
  locale=it_IT,
  recipientAddress=LIOUDMILA@ME.COM,
  data=[
    MESSAGEDATA { key=domain, value=https://okt.to/, type=null },
    MessageData { key=date_time, value=2022-03-24T22:22:48.809, type=null },
    MessageData { key=policy, value=botnet, type=null },
    MessageData { key=content_categories, value=[malware], type=null },
    MessageData { key=manfacturer, value=Intel, type=null }
  ]
}
Hello,

I've been trying to find and download the Splunk Exchange app from the archive, but I couldn't find it. I know the app has already hit EOL, but I still need it. Does anybody know where I can download it, or could you share a link?

Thank you,
Christian
Hi folks,

I've been suffering from a creative crisis the last few days and am looking for some brainstorming ideas: I'm trying to come up with use-case ideas for SSO (particularly from Okta logs). Does anyone have examples of how to leverage the logs and what I can do with them in terms of reports and alerts? All suggestions are more than welcome!
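One starter idea, sketched with index, sourcetype, and field names assumed from the Okta add-on's usual schema (verify against your own events): alert on repeated failed sign-ins per user.

index=okta sourcetype="OktaIM2:log" eventType="user.session.start" outcome.result=FAILURE
| stats count AS failures, dc(client.ipAddress) AS distinct_ips BY actor.alternateId
| where failures > 5

Variations on the same data: impossible-travel checks on the geographic context fields, MFA push-fatigue detection on repeated MFA events, and sign-ins from long-dormant accounts.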
So I am looking for the number of a specific event (sign-ins) deduped by user, which is simple. The challenge I am having is that I also need the results deduped by date. So if I am looking at a week's worth of data, I would like to see how many sign-ins happened each day, deduped by user. Each user would appear only once per day but could appear multiple times over the course of the week. Does this make sense? Please let me know if I can clarify anything, and thanks in advance for any/all help.
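A sketch of the usual shape, with the base search and the user field as placeholders for whatever the sign-in events actually use:

index=auth eventType=signin
| bin _time span=1d
| dedup _time, user
| timechart span=1d count AS daily_unique_users

Binning _time to the day first makes dedup operate per user per day; the equivalent one-liner is timechart span=1d dc(user).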
Hello Experts,

I am facing difficulty with index-time field extraction. My sample log file format:

Time stamp: Fri Mar 18 00:00:49 2022
File: File_name_1
Renamed to: Rename_1
Time stamp: Fri Mar 18 00:00:50 2022
File: File_name_1
Renamed to: Rename_1
Time stamp: Fri Mar 18 00:00:51 2022
File: File_name_1
Renamed to: Rename_1
Time stamp: Fri Mar 18 00:00:52 2022
File: File_name_1
Renamed to: Rename_1
Time stamp: Fri Mar 18 00:00:53 2022
File: File_name_1
Renamed to: Rename_1

props.conf:

[ demo ]
CHARSET=AUTO
LINE_BREAKER=([\r\n]+)
MAX_TIMESTAMP_LOOKAHEAD=24
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=true
TIME_FORMAT=%a %b %d %H:%M:%S %Y
TIME_PREFIX=^Time stamp:\s+
TRANSFORMS-extractfield=extract_demo_field
TRUNCATE=100000

transforms.conf:

[extract_demo_field]
REGEX =^Time stamp:\s*(?<timeStamp>.*)$\s*^File:\s*(?<file>.*)$\s*^Renamed to:\s+(?<renameFile>.*)$
FORMAT = time_stamp::$1 file::$2 renamed_to::$3
WRITE_META = true
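A hedged guess at the failure: in the transform's REGEX, ^ and $ only match interior line boundaries when multiline mode is on, so against a merged multi-line event the pattern never matches past the first line. A sketch with the (?m) flag added:

[extract_demo_field]
# (?m) lets ^/$ match at each newline inside the merged event
REGEX = (?m)^Time stamp:\s*(.*?)$\s*^File:\s*(.*?)$\s*^Renamed to:\s+(.*?)$
FORMAT = time_stamp::$1 file::$2 renamed_to::$3
WRITE_META = true

FORMAT with the field::$n form is the documented mechanism for index-time extractions, so this sketch drops the named groups to keep a single mechanism in play.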