All Posts

Hi @hieuba, I assume you created a custom visualization in a Splunk Classic dashboard and would now like to recreate it in Splunk Dashboard Studio. Is that correct? Could you please tell us more about the custom dashboard you are looking to create in Dashboard Studio? Thanks.
Does anyone know if version 7.x of Threat Defense Manager (f.k.a. Firepower Management Center) is compatible with the latest version of Cisco's eStreamer add-on? https://splunkbase.splunk.com/app/3662
We had the same problem initially and found more details about usage of the code command under \TA-code\default\searchbnf.conf. We are able to decode the URL or process using:

    | code method=base64 field=encodedcommand action=decode destfield=decoded_command key=abc123

but when we run stats on decoded_command, the result comes back as "p". I tried the base64 conversion matrix macro as well, and it does the same "p" thing. Can anyone help?
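One likely cause, if encodedcommand holds a PowerShell -EncodedCommand value, is that the base64 wraps UTF-16LE text: decoding it as single-byte text yields interleaved NUL bytes (p, NUL, o, NUL, w, ...), and the NUL after the first character can truncate what downstream commands display, which would match the lone "p". A minimal sketch under that assumption, reusing the field names and key from the post above, with a NUL-stripping eval added:

    | code method=base64 field=encodedcommand action=decode destfield=decoded_command key=abc123
    ``` assumption: the decoded NUL bytes survive into the field and can be matched with \x00 ```
    | eval decoded_command=replace(decoded_command, "\x00", "")
    | stats count by decoded_command

If the NULs do not survive into the field at all, the decode step itself would need to handle UTF-16, which may or may not be something the TA-code command supports.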
First off, I would suggest doing what @sshelly_splunk said, if possible. If not, you can try this method with SPL. I see this question come up a lot, and people usually respond with "it's complicated", and it is. That said, I have been working on standardizing a solution using macros and think I have a good first iteration worked out, though I'm sure it still needs more regression testing.

Here is how it applies to your sample timestamp, which is assumed to be GMT: because the querying user's timezone preference is set to something else, the epoch conversion isn't working as expected. The inputs of the first macro, `convert_timestamp_to_epoch(3)`, are:

    $timestamp_field$ ----> REPORTED_DATE
    $timestamp_format$ ----> %Y-%m-%d %H:%M:%S.%1N
    $assumed_timezone$ ----> GMT

This first macro converts a timestamp to a standardized epoch time, using either a timezone found in the timestamp itself or, if none is found there, the third argument, "assumed_timezone". You can also leave the third argument blank, in which case the catch-all timezone is the user's configured timezone preference.

The second macro, `convert_epoch_to_specific_timezone(3)`, has the input args:

    $epoch$ ----> standardized_epoch (the default field name output by the previous macro)
    $timestamp_format$ ----> %Y-%m-%d %H:%M:%S.%1N
    $output_timezone$ ----> EST

This macro takes in an epoch value and returns a human-readable timestamp set to whatever timezone is requested in the third argument (that's the idea, at least). Used together, the two macros should be able to convert any timestamp to another with the desired timezone association. If you are interested in the macros, shoot me a message and I can get them packaged up for you and share.

In the meantime, why don't you try appending "+0000" to your REPORTED_DATE and converting to epoch including the timezone specifier. Example:

    | eval REPORTED_DATE2=strptime('REPORTED_DATE'."+0000", "%Y-%m-%d %H:%M:%S.%1N%z")
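While waiting for the packaged macros, here is a minimal hypothetical macros.conf sketch of the simplest case, a variant that always assumes GMT; the macro name, args, and output field standardized_epoch are illustrative, not the poster's actual definitions:

    # macros.conf (hypothetical, simplified GMT-only variant)
    [convert_gmt_timestamp_to_epoch(2)]
    args = timestamp_field, timestamp_format
    # Append an explicit UTC offset so strptime reads the timestamp as GMT,
    # regardless of the searching user's timezone preference
    definition = eval standardized_epoch=strptime('$timestamp_field$'."+0000", "$timestamp_format$%z")
    iseval = 0

It would be invoked as `convert_gmt_timestamp_to_epoch(REPORTED_DATE, %Y-%m-%d %H:%M:%S.%1N)`; a fully general version like the one described above would additionally need to branch on whether the timestamp already carries its own timezone.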
I had tried this, but without the "as _time". Now it works perfectly. Thank you very much!
Greetings @xxkenta, were you ever able to find a viable solution for this issue? I'm facing a similar situation.
Presuming your firewall is logging allow and deny events to Splunk, that those events are stored in the 'network' index, and that they have an 'action' field saying whether the traffic was allowed or blocked, then this may get you started:

    index=network | stats count by action
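If a trend over time is more useful than a single total, the same assumptions (the 'network' index and 'action' field above) also support a timechart variant:

    index=network | timechart span=1h count by action

This plots allowed versus blocked counts per hour instead of one aggregate row.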
Oops, it did work; my eval was not written correctly. I think I was missing a space after the commas in the syntax below:

    <eval token="input_tok">replace($form.input_tok$, "(\\\\)", "\\\\\1")</eval>
Recently we upgraded FMC from 6.x to 7.x and noticed no data was being streamed into the /opt/splunk/etc/apps/TA-eStreamer/bin/encore/data/splunk directory. We then started getting a firewall error when testing the connection. Does anyone know if FMC 7.x is compatible with the TA-eStreamer add-on?

    ./splencore.sh test
    Diagnostics ERROR [no message or attrs]: Could not connect to eStreamer Server at all. Are you sure the host and port are correct? If so then perhaps it is a firewall issue.
I would suggest pinging the Splunk admins, as the data is coming in with an issue and will remain an issue until they modify the input or sourcetype. You can add or remove whatever number of hours you need for a particular _time field, but if the sourcetype gets corrected in the future, all of your searches will break. I'm also not sure how things would behave if you were to drill down from a dashboard into raw data.

It really is as simple as adding that TZ key/value pair to the sourcetype. That makes the display of data from different timezones seamless to end users. For example, when searching over the last 60 minutes, data sets configured in GMT and CST will both display correctly to the end user if TZ is configured for the sourcetypes.
I do not have access to update that, so I was trying to figure out how to do it with SPL.
Do you have the ability to modify the sourcetype for the ticketing system data? You can add a single config to the input/sourcetype:

    # The following props.conf entry sets the Eastern time zone if host matches nyc*.
    [host::nyc*]
    TZ = US/Eastern

Is your Splunk environment Splunk Cloud, or self-hosted? If Cloud, you should be able to go to "Settings" -> "Source Types", click on the specific sourcetype, and add a key/value pair in the advanced section: key="TZ", value="US/Eastern".
The conversion of each backslash \ to a double backslash \\ inside the token is set up in the dashboard XML. I tried the following, but it did not work:

    <eval token="input_tok">replace($form.input_tok$, "(\\\\)", "\\\\\1")</eval>
How do I change the backslashes in the text input of a dashboard so the value can be used in a subsequent search?
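For anyone landing on this later, here is a minimal Simple XML sketch of where that eval lives; the surrounding text input element is illustrative, while the eval line is the one this thread converged on:

    <input type="text" token="input_tok">
      <label>Search input</label>
      <change>
        <!-- Double every backslash the user typed before the token reaches SPL -->
        <eval token="input_tok">replace($form.input_tok$, "(\\\\)", "\\\\\1")</eval>
      </change>
    </input>

After the change handler runs, $input_tok$ should carry the escaped value for the subsequent search, while $form.input_tok$ keeps the raw user input.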
The field _time needs to be available at the time the "| timechart" command runs. Your example:

    index=prueba source="*blablabla*"
    | eval Date=strftime(_time,"%Y/%m/%d")
    | eval Time=strftime(_time,"%H:%M:%S")
    | eval Fecha=strftime(_time,"%Y/%m/%d %H:%M:%S")
    | rex "^.+transactType:\s(?P<transactType>(.\w+)+)"
    | stats values(Fecha) as Fecha, values(transactType) as transactType by ID
    | timechart span=5m count by transactType

is not carrying the _time field over from the raw events. The stats transformation needs some way of carrying _time forward, so I would recommend either:

    | stats earliest(_time) as _time, values(Fecha) as Fecha, values(transactType) as transactType by ID
    | timechart span=5m count as count by transactType

OR:

    | stats latest(_time) as _time, values(Fecha) as Fecha, values(transactType) as transactType by ID
    | timechart span=5m count as count by transactType

(depending on what makes more sense for your scenario). So your full SPL would look something like this:

    index=prueba source="*blablabla*"
    ``` The field ID is assumed to already be extracted ```
    ``` regex extraction of transactType field ```
    | rex "^.+transactType:\s(?P<transactType>(.\w+)+)"
    ``` transform raw events into singular events, each representing a unique ID with its own set of transactType values and a _time value ```
    | stats latest(_time) as _time, values(transactType) as transactType by ID
    ``` make a time series tallying up the unique IDs belonging to each transactType value in 5 minute buckets ```
    | timechart span=5m count as count by transactType
Hello, I need some help. Manipulating time is something I have struggled with. Below is the code I have:

    ((index="desktop_os") (sourcetype="itsm_remedy")) earliest=-1d@d
    | search ASSIGNED_GROUP IN ("Desktop_Support_1", "Remote_Support")
    ``` Convert REPORTED_DATE to epoch form ```
    | eval REPORTED_DATE2=strptime(REPORTED_DATE, "%Y-%m-%d %H:%M:%S")
    ``` Keep events reported more than 12 hours ago so are due in < 12 hours ```
    | where REPORTED_DATE2 <= relative_time(now(), "-12h")
    | eval MTTRSET = round((now()-REPORTED_DATE2)/3600)
    | dedup INCIDENT_NUMBER
    | stats values(REPORTED_DATE) AS Reported, values(DESCRIPTION) AS Title, values(ASSIGNED_GROUP) AS Group, values(ASSIGNEE) AS Assignee, LAST(STATUS_TXT) as Status, values(MTTRSET) as MTTRHours, values(STATUS_REASON_TXT) as PendStatus by INCIDENT_NUMBER
    | search Status IN ("ASSIGNED", "IN PROGRESS", "PENDING")
    | sort Assignee
    | table Assignee MTTRHours INCIDENT_NUMBER Reported Title Status PendStatus

This code runs and gives us the results we need, but the REPORTED_DATE field is off by 5 hours due to a timezone issue. It is a custom field from our ticketing system that is stuck on GMT, and the output looks like 2024-01-08 09:22:49.0. I need that field to produce the correct time for the EST timezone, and I am struggling to make it work. I looked at this thread, but it is not working for us: Solved: How to convert date and time in UTC to EST? - Splunk Community. Any help is appreciated. Thanks.
Hi @michaelteck, good for you, see you next time! Ciao and happy splunking, Giuseppe. P.S.: Karma points are appreciated by all the contributors.
It works, even though I have to manage the time range. Thanks a lot!
Hi, I have a log with several transactions, each one having some events. All events in one transaction share the same ID. The events each contain some information, for example execution time, transaction type, URL, login URL, etc. These fields can be in one or several of the transaction's events. I want to obtain the total transactions of each type within a time span, for example every 5m, so I need to group the events of each transaction to extract its info:

    index=prueba source="*blablabla*"
    | eval Date=strftime(_time,"%Y/%m/%d")
    | eval Time=strftime(_time,"%H:%M:%S")
    | eval Fecha=strftime(_time,"%Y/%m/%d %H:%M:%S")
    | rex "^.+transactType:\s(?P<transactType>(.\w+)+)"
    | stats values(Fecha) as Fecha, values(transactType) as transactType by ID

This is OK. If I want to count transactType, then I do:

    index=prueba source="*blablabla*"
    | eval Date=strftime(_time,"%Y/%m/%d")
    | eval Time=strftime(_time,"%H:%M:%S")
    | eval Fecha=strftime(_time,"%Y/%m/%d %H:%M:%S")
    | rex "^.+transactType:\s(?P<transactType>(.\w+)+)"
    | stats values(Fecha) as Fecha, values(transactType) as transactType by ID
    | stats count by transactType

The problem is when I want to obtain that within a time span. I can't do this, because within one transaction only some of the events have the transactType field:

    index=prueba source="*blablabla*"
    | eval Date=strftime(_time,"%Y/%m/%d")
    | eval Time=strftime(_time,"%H:%M:%S")
    | eval Fecha=strftime(_time,"%Y/%m/%d %H:%M:%S")
    | rex "^.+transactType:\s(?P<transactType>(.\w+)+)"
    | timechart span=5m count by transactType

And the following query doesn't give me any results:

    index=prueba source="*blablabla*"
    | eval Date=strftime(_time,"%Y/%m/%d")
    | eval Time=strftime(_time,"%H:%M:%S")
    | eval Fecha=strftime(_time,"%Y/%m/%d %H:%M:%S")
    | rex "^.+transactType:\s(?P<transactType>(.\w+)+)"
    | stats values(Fecha) as Fecha, values(transactType) as transactType by ID
    | timechart span=5m count by transactType

I also tried this (but I don't get results):

    index=prueba source="*blablabla*"
    | eval Date=strftime(_time,"%Y/%m/%d")
    | eval Time=strftime(_time,"%H:%M:%S")
    | eval Fecha=strftime(_time,"%Y/%m/%d %H:%M:%S")
    | rex "^.+transactType:\s(?P<transactType>(.\w+)+)"
    | bucket Fecha span=5m
    | stats values(Fecha) as Fecha, values(transactType) as transactType by ID
    | stats count by transactType

Or:

    index=prueba source="*blablabla*"
    | eval Date=strftime(_time,"%Y/%m/%d")
    | eval Time=strftime(_time,"%H:%M:%S")
    | eval Fecha=strftime(_time,"%Y/%m/%d %H:%M:%S")
    | rex "^.+transactType:\s(?P<transactType>(.\w+)+)"
    | stats values(Fecha) as Fecha, values(transactType) as transactType by ID
    | bucket Fecha span=5m
    | stats count by transactType

How can I obtain what I want?
I configured the Splunk Add-on for JMX, added the JMX server, and was able to get JMX server data. Then I deleted and reinstalled a new Splunk Enterprise and copied the Splunk Add-on for JMX app from the previous Splunk to the /etc/apps folder. Now I am getting an "internal server cannot be reached" error on the configuration page, even though the input configuration looks clear. Is there any option to add a JMX server other than the web interface? When I copy the app, why is the same JMX server configuration not applying?