
All Posts

In order to bin the Event time, you need to keep it as a number (after parsing with strptime). You can format it as a string later, or use fieldformat for display purposes.

index=test1 sourcetype=test2
| eval Event_Time=strptime(SUBMIT_TIME,"%Y%m%d%H%M%S")
| table Event_Time
``` This next line is redundant since you only have Event_Time to the nearest second anyway ```
| bin Event_Time span=1s
| sort 0 Event_Time
| fieldformat Event_Time=strftime(Event_Time, "%m/%d/%y %H:%M:%S")
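For grouping per second, the binned numeric field can then feed a stats. Here is a minimal sketch, reusing the hypothetical index, sourcetype, and SUBMIT_TIME field from the question:

index=test1 sourcetype=test2
| eval Event_Time=strptime(SUBMIT_TIME,"%Y%m%d%H%M%S")
| bin Event_Time span=1s
| stats count by Event_Time
| fieldformat Event_Time=strftime(Event_Time, "%m/%d/%y %H:%M:%S")

The fieldformat at the end keeps Event_Time numeric for any further processing while rendering it as a readable timestamp in the results.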
Ideally you'd be able to chunk the JSON log event into smaller subunits, but this depends on what your JSON log event looks like. If your JSON log events are over 10k characters long, they may be getting truncated. If this is the case, you can override the truncation by putting the following setting in a props.conf file on the indexing machines:

[<yoursourcetype>]
TRUNCATE = <some number above the size of your JSON logs, or 0 for no truncation>

If your broken JSON logs in Splunk are less than 10k characters long, then it could be that Splunk is splitting the logs part-way through the JSON object, so you would need to set LINE_BREAKER so that it properly splits whole JSON objects.
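As a rough sketch only, assuming one JSON object per line (the sourcetype name and TRUNCATE value below are placeholders, not settings from this thread):

[my_json_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TRUNCATE = 50000

If your objects are pretty-printed or span multiple lines, the LINE_BREAKER regex would instead need to match whatever actually separates complete objects in your feed.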
Can someone assist with this request, please? Thank you.
Since you already have applicationName=" as your prefix, this line

index=mulesoft environment=$env$ applicationName=$BankApp$ InterfaceName=$interface$

will expand to

index=mulesoft environment=$env$ applicationName=applicationName="*" InterfaceName=InterfaceName="*"

Either remove applicationName= from your prefix or from your search:

index=mulesoft environment=$env$ $BankApp$ $interface$
Hi, I'm currently ingesting CSV files into Splunk. One of the fields records the actual event timestamp in the format YYYYmmddHHMMSS (e.g. 20240418142025). I need to format this field's value so that Splunk understands the data (date, hour, minute, second, etc.). Once this formatting is complete, I need to group these timestamps/events per second (e.g. bucket span=1s Event_Time). Note that Event_Time here is the formatted value derived from the original event timestamp field. So far, I've tried this:

index=test1 sourcetype=test2
| eval Event_Time=strftime(strptime(SUBMIT_TIME,"%Y%m%d%H%M%S"), "%m/%d/%y %H:%M:%S")
| table Event_Time

The above gives me decent output such as 04/18/24 14:20:25. But when I try to group values of Event_Time using "bucket span=1s Event_Time", it does not do anything. Note that "bucket span=1s _time" works, since that uses Splunk's default time field. I'd appreciate any help to make this formatting work for post-processing Event_Time. Thank you in advance.
I am struggling to find a post that answers this because the naming for Splunk Enterprise and Enterprise Security is so similar, and I am only seeing results for ES. I want to find a way to add Threat Intelligence feeds to my Splunk Enterprise environment so my organization can eventually move off the other SIEM we have been using in tandem with Splunk. Is this possible with Splunk Enterprise? I know ES has the capability, but we are strictly on-prem at the moment and I do not see us moving to it anytime soon. Any suggestions? Has anyone set these up on-prem?
@richgalloway: Sorry, I did not get what rule you are mentioning. Could you please be more clear on this?

434531263412:us-west-2:lambda_functions -> lambda_functions
434531263412:us-west-2:nat_gateways -> gateways
434531263412:us-west-2:application_load_balancers -> load_balancers

Yes, this is the requirement. In the above, the right-side values are the values from the source field. I want to extract the service name from this field value.
Any luck with support? I tried the outputs.conf solution in this thread, but it doesn't seem to have worked. Pre-upgrade from 9.0.x to 9.2.1 I had 300ish clients in my DS; right now only 14 are showing up. Thanks, Dave
OK. Time to dig into the gory details of Splunk licensing. When you have an enforcing license (either a trial, dev, or "full" license not big enough to be non-enforcing), each day you exceed your daily ingestion allowance generates a warning. If you exceed a given number of warnings during a given time period (with a trial version it's 5 warnings in a 30-day rolling window; with a "full" Splunk Enterprise license it's 45 warnings in a 60-day window), your environment goes into "violation mode". Most importantly, it stops allowing you to search any data other than the internal indexes. And the tricky part is that even if you add a new/bigger/whatever license at this point, it will not automatically "unlock" your environment. You need to either wait for the violations to clear (for some license types) or request a special unlock license from the Splunk sales team. So, tl;dr - if you let your Splunk run out of license, it's not as easy as "I add my freshly bought license" and it starts working again.
Yes, but it's still showing the same error:

Error in 'search' command: Unable to parse the search: Comparator '=' has an invalid term on the left hand side: applicationName=APPLICATION_NAME.

This is the query I am using:

index=mulesoft environment=$env$ applicationName=$BankApp$ InterfaceName=$interface$ (priority="ERROR" OR priority="WARN")
| stats values(*) as * by correlationId
| rename content.InterfaceName as InterfaceName content.FileList{} as FileList content.Filename as FileName content.ErrorMsg as ErrorMsg
| eval Status=case(priority="ERROR","ERROR",priority="WARN","WARN",priority!="ERROR","SUCCESS")
| fields Status InterfaceName applicationName FileList FileName correlationId ErrorMsg message
| where FileList!=" "
Try changing applicationName to APPLICATION_NAME in the prefix:

<input type="dropdown" token="BankApp" searchWhenChanged="true">
  <label>ApplicationName</label>
  <choice value="*">All</choice>
  <search>
    <query> | inputlookup BankIntegration.csv | dedup APPLICATION_NAME | sort APPLICATION_NAME | table APPLICATION_NAME </query>
  </search>
  <fieldForLabel>ApplicationName</fieldForLabel>
  <fieldForValue>APPLICATION_NAME</fieldForValue>
  <default>*</default>
  <prefix>APPLICATION_NAME="</prefix>
  <suffix>"</suffix>
</input>

In the second lookup, you are trying to filter with applicationName="", whereas the lookup file seems to have APPLICATION_NAME as the header.
Your fieldForLabel has to be a field returned by the search query, which it isn't in either instance.
Hi, I have installed the Cisco Networks app and add-on. I have a labdata file with many events loaded into Splunk. All the data can be seen from search, but the app shows no results. Is it possible to use the labdata information in Cisco Networks? Should I add some configuration in order for it to work?
To summarize:

434531263412:us-west-2:lambda_functions -> lambda_functions
434531263412:us-west-2:nat_gateways -> gateways
434531263412:us-west-2:application_load_balancers -> load_balancers

If this is correct, then more information is needed. What is the rule that determines how much of the service name is to be used?
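For example, if the rule were simply "take everything after the last colon", a rex like this would extract it (a sketch only; the field is assumed to be named source, and the rule itself is still an assumption):

| rex field=source "(?<service_name>[^:]+)$"

That would, however, produce nat_gateways rather than gateways for the second example, which is exactly why the precise rule matters.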
04-18-2024 13:36:06.590 ERROR EvalCommand [102993 searchOrchestrator] - The 'bit_shift_left' function is unsupported or undefined.
I believe the function requires 9.2.0+

Thanks for noticing! I always assumed that bitwise operations had been part of SPL from day one, but no. The document has this footer: "This documentation applies to the following versions of Splunk® Enterprise: 9.2.0, 9.2.1." (Searching in previous versions results in the same pointers to 9.2.)

For the above, should the second set have been given a different value for the field?

Those are really bad copy-and-paste errors. Corrected.
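On versions earlier than 9.2, a left shift can be approximated by multiplying by a power of two. A minimal sketch, assuming small, non-negative integer inputs where pow() stays exact:

| makeresults
| eval x=1
| eval shifted=bit_shift_left(x, 4) ``` requires Splunk 9.2.0+ ```
| eval shifted_compat=x*pow(2,4) ``` equivalent result on older versions for this range of inputs ```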
Thanks in advance. I am trying to fetch the application name and interface details from an input lookup and match them with the Splunk query, but I am getting the error below.

Error in 'search' command: Unable to parse the search: Comparator '=' has an invalid term on the left hand side: applicationName=applicationName.

<input type="dropdown" token="BankApp" searchWhenChanged="true" depends="$BankDropDown$">
  <label>ApplicationName</label>
  <choice value="*">All</choice>
  <search>
    <query> | inputlookup BankIntegration.csv | dedup APPLICATION_NAME | sort APPLICATION_NAME | table APPLICATION_NAME </query>
  </search>
  <fieldForLabel>ApplicationName</fieldForLabel>
  <fieldForValue>APPLICATION_NAME</fieldForValue>
  <default>*</default>
  <prefix>applicationName="</prefix>
  <suffix>"</suffix>
</input>
<input type="dropdown" token="interface" searchWhenChanged="true" depends="$BankDropDown$">
  <label>InterfaceName</label>
  <choice value="*">All</choice>
  <search>
    <query> | inputlookup BankIntegration.csv | search $BankApp$ | sort INTERFACE_NAME | table INTERFACE_NAME </query>
  </search>
  <fieldForLabel>InterfaceName</fieldForLabel>
  <fieldForValue>INTERFACE_NAME</fieldForValue>
  <default>*</default>
  <prefix>InterfaceName="</prefix>
  <suffix>"</suffix>
</input>
Hi @Jerg.Weick, Thanks for your patience. Eng has confirmed it's a bug, and it is expected to be fixed in 24.4, which should hopefully be by mid-May.
You should just replace this with splunk_server=* and then it sends that to all search peers. I cannot recall what those endpoints are, but it's something under config or configurations.
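As an illustration of the splunk_server=* pattern only (using /services/server/info as a stand-in, since the exact configuration endpoint isn't named here):

| rest /services/server/info splunk_server=*
| table splunk_server serverName version

A rest search of that form is dispatched to every search peer rather than just the search head.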
It's okay. I was able to figure out how to install this. It's a bit odd that dependencies like this are not automatically managed.
Hi @yew, I’m a Community Moderator in the Splunk Community. This question was posted 8 years ago, so it might not get the attention you need for your question to be answered. We recommend that you post a new question so that your issue can get the visibility it deserves. To increase your chances of getting help from the community, follow these guidelines in the Splunk Answers User Manual when creating your post. Thank you!