All Posts
The linked article by @dmacintosh_splu shows you how to create the relative comparable time for the same period in the previous year using a dummy search. To make the 1-year calculation, I would do

<search>
  <query>
    | makeresults
    | addinfo
    | eval prev_year_earliest=relative_time(info_min_time, "-1y")
    | eval prev_year_latest=relative_time(info_max_time, "-1y")
    | fields prev_*
  </query>
  <done>
    <set token="prev_year_earliest">$result.prev_year_earliest$</set>
    <set token="prev_year_latest">$result.prev_year_latest$</set>
  </done>
</search>

What is it, specifically, that you can't do? Do you want a single panel to show both years on a timechart? When you say trend, do you mean a straight line indicating direction, or comparative data points for the previous year? If you want a single panel showing both years, then you still need the above search, and your main search to populate the data will be something like this, including both token sets and then using timewrap to wrap the previous year onto the current year:

search (earliest=$time.earliest$ latest=$time.latest$) OR (earliest=$prev_year_earliest$ latest=$prev_year_latest$) ... | timechart ... | timewrap 1y
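Outside of Splunk, the effect of relative_time(epoch, "-1y") can be sketched in plain Python. This is only an illustration of the one-year shift under the assumption that "-1y" means "same date, previous year"; Splunk's own handling of edge dates such as Feb 29 may differ:

```python
from datetime import datetime, timezone

def shift_back_one_year(epoch: float) -> float:
    """Shift an epoch timestamp back one calendar year,
    roughly mimicking Splunk's relative_time(epoch, "-1y")."""
    dt = datetime.fromtimestamp(epoch, tz=timezone.utc)
    try:
        return dt.replace(year=dt.year - 1).timestamp()
    except ValueError:
        # Feb 29 has no twin in the previous year; clamp to Feb 28
        return dt.replace(year=dt.year - 1, day=28).timestamp()

jan1_2023 = datetime(2023, 1, 1, tzinfo=timezone.utc).timestamp()
prev = shift_back_one_year(jan1_2023)  # epoch for Jan 1, 2022
```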
Yeah, it still doesn't mark the URLs in my list that don't exist in the index as 0. I'll just accept your solution though. Thanks for your help!
Thanks @dmacintosh_splu for the response, but it doesn't really help me. When I select the duration in the time picker, say from Jan 1, 2023 to May 1, 2023, then my dashboard has to show the trend for the number of tickets in the first panel, and the trend for the number of tickets in the second panel for the same duration in the previous year (Jan 1, 2022 to May 1, 2022). I am not sure how to frame the search query for extracting the ticket trend for the previous year.
What ticketing system are you using? Are you trying to avoid modifying the saved search for the alert?
This Answer may be what you need.  https://community.splunk.com/t5/Dashboards-Visualizations/Panel-not-updating-when-changing-the-time-range-picker/m-p/332407/highlight/true#M21545
I had already tried that as well, but with no luck. It has to be something else that I'm missing. Thanks for replying though. If I figure it out, I'll post an update here.
We currently have an alert set up that generates a ticket in our ticketing platform. We are moving to a new ticketing platform and have used collect to copy the event into a new index for that ticketing platform to pull data from. Is there a way to rename fields of the collected event, but not change the field names for the current alert? We have to have different field names for the new ticketing system to map correctly. My only ideas right now are to either duplicate the alert and have them run in parallel, or, when the ticketing system queries Splunk for new events, have that query contain a search macro that does the renaming before the events are ingested.
Hi @Balaji.M, When did AppD set this up? Are you able to reach out to the person/team who helped set it up to ask them for additional help?
I am trying to create a dashboard that holds multiple tables of WebSphere App Server configuration data. The data I have looks like this:

{"ObjectType ":"AppServer","Object":"HJn6server1","Order":"147","Env":"UAT","SectionName":"Transport chain: WCInboundDefaultSecure:Channel HTTP","Attributes":{"discriminationWeight": "10","enableLogging": "FALSE","keepAlive": "TRUE","maxFieldSize": "32768","maxHeaders": "500","maxRequestMessageBodySize": "-1","maximumPersistentRequests": "100","name": "HTTP_4","persistentTimeout": "30","readTimeout": "60","useChannelAccessLoggingSettings": "FALSE","useChannelErrorLoggingSettings": "FALSE","useChannelFRCALoggingSettings": "FALSE","writeTimeout": "60"}}

Every event is a configuration section within an app server, where:

ObjectType - AppServer
Object - name of the app server (e.g. "HJn6server1")
Env - environment (e.g. Test, UAT, PROD)
SectionName - name within the app server configuration that holds attributes
Attributes - configuration attributes for a SectionName

I have been able to create one table per SectionName, but can't extend that to multiple sections.
I used the following code to make one table:

index=websphere_cct (Object="HJn5server1" Env="Prod") OR (Object="HJn7server3" Env="UAT") SectionName="Process Definition" Order
    [ search index=websphere_cct SectionName | dedup Order | table Order ]
| fields - _*
| fields Object Attributes.* SectionName
| eval Object = ltrim(Object, " ")
| rename Attributes.* AS *
| table SectionName Object *
| fillnull value=""
| transpose column_name=Attribute header_field=Object
| eval match = if('HJn5server1' == 'HJn7server3', "y", "n")

Output:

Attribute                 | HJn7server3                 | HJn5server1                 | match
SectionName               | Process Definition          | Process Definition          | y
IBM_HEAPDUMP_OUTOFMEMORY  |                             |                             | y
executableArguments       | []                          | []                          | y
executableTarget          | com.ibm.ws.runtime.WsServer | com.ibm.ws.runtime.WsServer | y
executableTargetKind      | JAVA_CLASS                  | JAVA_CLASS                  | y
startCommandArgs          | []                          | []                          | y
stopCommandArgs           | []                          | []                          | y
terminateCommandArgs      | []                          | []                          | y
workingDirectory          | ${USER_INSTALL_ROOT}        | ${USER_INSTALL_ROOT}        | y

What I would like to do is create as many tables as there are SectionNames for a given comparison between two Objects. But I cannot figure out how to modify the code to allow several tables in one dashboard, for multiple SectionNames with their associated Attributes, for the two app servers being compared. Please help.
I have 2 lookup files and I want to join them through a common field, MonthYear. I need to calculate transmission per dept = Total transmission * (size of dept / total size of all depts). In lookup1, I need to calculate the proportion of size by dept, e.g. transmission for the Eng dept = 119 * ((100 + 23) / 170).

| inputlookup lookup1.csv
| stats sum(size) as DeptMem by dept
| eventstats sum(DeptMem) as TotalSize
| append [| inputlookup lookup2.csv | stats sum(Transmission) as TotalTransmission]
| eventstats values(TotalTransmission) as TotalTransmission
| eval "transmission per dept" = round(TotalTransmission * DeptMem / TotalSize, 2)
| stats values('transmission per dept') as "transmission per dept" by dept

Note: Based on your description, I believe that a breakdown by dept is the goal. You cannot get the total as you illustrated with "by MonthYear", and you cannot get a pie chart with "by MonthYear" if you want to break down by dept. Because you need a breakdown by dept, the only useful data in lookup2.csv is the total of Transmission. A single append is a lot more efficient than doing joins.
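As a sanity check on the arithmetic in the example above (Eng dept sizes 100 and 23, a total size of 170, and a total transmission of 119), the eval with round() can be reproduced outside Splunk:

```python
# Worked example of:
#   transmission per dept = TotalTransmission * DeptMem / TotalSize
# rounded to 2 decimal places, using the numbers from the post.
total_transmission = 119
dept_mem = 100 + 23     # sum(size) for the Eng dept
total_size = 170        # sum of all dept sizes

eng_transmission = round(total_transmission * dept_mem / total_size, 2)
print(eng_transmission)  # 86.1
```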
Hi, I have a dashboard that shows service ticket counts based on different parameters. Now I need to show a trend for the current year and the previous year for the duration selected by the user in the time picker. For example, if the user selects Jan 1, 2023 to Apr 1, 2023 in the time picker, then I need to form a query that selects the same duration of the previous year (Jan 1, 2022 to Apr 1, 2022) and shows the trend. How do I create the previous-year duration based on the duration selected in the time picker? Please advise.
Pagination in a table only appears in the Edit mode of a Splunk dashboard, not in View mode. Can this be corrected?
I have a table in a database that I need to check every 30 minutes, starting from 7:00 AM. The first alert, i.e. at 7:00 AM, should send the entire table without checking any conditions. Next, I have a field in the table named ACTUAL_END_TIME. This column can hold only one of three values: first, a timestamp in HH:MM:SS format; second, the string In-Progress; and third, the string NotYetStarted. I need to check this table every 30 minutes, and only trigger the alert when all the rows of the column ACTUAL_END_TIME contain only timestamps. NOTE: The alert should trigger only once per day. How do I set up this alert?
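The trigger condition itself can be pinned down precisely. Here is a minimal Python sketch of the check; the field name ACTUAL_END_TIME and its three possible values are taken from the description above, and the function name is just for illustration:

```python
import re

# A row is "done" only when ACTUAL_END_TIME holds an HH:MM:SS timestamp,
# i.e. it is neither "In-Progress" nor "NotYetStarted".
TIMESTAMP_RE = re.compile(r"^\d{2}:\d{2}:\d{2}$")

def all_rows_finished(actual_end_times):
    """True only when every value in the column is a timestamp."""
    return all(TIMESTAMP_RE.match(v) for v in actual_end_times)

# The alert should fire only in the first case:
all_rows_finished(["07:30:00", "08:15:42"])       # True
all_rows_finished(["07:30:00", "In-Progress"])    # False
all_rows_finished(["NotYetStarted", "08:15:42"])  # False
```

In SPL the equivalent test would be something along the lines of counting rows where match(ACTUAL_END_TIME, "^\d{2}:\d{2}:\d{2}$") holds and comparing that against the total row count, with the alert condition requiring the two to be equal.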
Hi, I'm using the Splunk Docker image with HEC to send a log. I got the Success message as in the guideline. How can I query the log to see "hello world", which is what I just sent? I tried a few search-related curl commands, but all of them just return a very long XML, and "hello world" is not in the response. For example:

curl -k -u admin:1234567Aa! https://localhost:8089/services/search/jobs -d "search *"

Could anyone share a search curl command that can return the "hello world" that I sent? I only have one record, so I don't need complicated filtering.
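One shape that tends to work here is the REST export endpoint, which runs the search and streams the results back in a single call instead of creating a job and returning job-management XML. This sketch assumes the credentials from the post and that the event landed in index=main; adjust the index and time range to match the HEC input:

```shell
# One-shot search via the export endpoint; results come back as JSON
# rather than the job XML that /services/search/jobs returns.
curl -k -u admin:1234567Aa! \
     https://localhost:8089/services/search/jobs/export \
     --data-urlencode search='search index=main "hello world"' \
     -d output_mode=json \
     -d earliest_time=-24h
```

If HEC was configured with a different target index, replace index=main accordingly (searching index=* is a quick way to find where the event went).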
The prefix is the part that comes *before* the timestamp string and must not match the timestamp string itself. The prefix for the sample event would be ^[
Glad to hear it!  Nice to know that the solution works on other systems as well.
If the file doesn't change then Splunk won't re-index it. You'll have to delete the fishbucket entry to force Splunk to re-index the file:

splunk cmd btprobe -d /opt/splunkforwarder/var/lib/splunk/fishbucket/splunk_private_db --file <small file> --reset
I can't figure out how to index my data from zigbee2mqtt. The logs are exported from Home Assistant via syslog, as JSON. I have tried various settings in props.conf on the forwarder.

Current setting:

[zigbee2mqtt]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = JSON
category = structured
NO_BINARY_CHECK = true
TIMESTAMP_FIELDS = timestamp
LINE_BREAKER = ([\r\n]+)
disabled = false
pulldown_type = true

And on the search head:

[zigbee2mqtt]
KV_MODE = JSON

And this is how the data appears in the log. To me it looks like some kind of mix, not just JSON data:

Sep 20 19:13:19 linsrv 1 2023-09-20T17:13:19.941+02:00 localhost Zigbee2MQTT - - - MQTT publish: topic 'zigbee2mqtt/P001', payload '{"auto_off":null,"button_lock":null,"consumer_connected":true,"consumption":7.82,"current":0,"device_temperature":25,"energy":7.82,"led_disabled_night":null,"linkquality":255,"overload_protection":null,"power":0,"power_outage_count":3,"power_outage_memory":null,"state":"OFF","update":{"installed_version":41,"latest_version":32,"state":"idle"},"update_available":false,"voltage":234}'/n
host = linsrv   index = zigbee   source = /disk1/syslog/in/linsrv/2023-09-20/messages.log   sourcetype = zigbee2mqtt

Sep 20 19:08:13 linsrv06.hemdata.hemdata.se 1 2023-09-20T17:08:13.988+02:00 localhost Zigbee2MQTT - - - MQTT publish: topic 'zigbee2mqtt/P002', payload '{"auto_off":null,"button_lock":null,"consumer_connected":true,"consumption":2.58,"current":0,"device_temperature":23,"energy":2.58,"led_disabled_night":null,"linkquality":255,"overload_protection":null,"power":0,"power_outage_count":0,"power_outage_memory":null,"state":"OFF","update":{"installed_version":41,"latest_version":32,"state":"idle"},"update_available":false,"voltage":229}'/n
host = linsrv   index = zigbee   source = /disk1/syslog/in/linsrv/2023-09-20/messages.log   sourcetype = zigbee2mqtt

(The remaining events follow the same pattern.)
Thanks ITWHisperer, much appreciated!
I tried your string in the data preview screen. I placed it in the timestamp format field. I used \d{8}\:\d{6}\.\d{3} as the prefix, but I'm still getting timestamp=none.