All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


  I am able to get the event output in table format, but I am looking for an eval condition to: 1. Remove the T from the timestamp and convert the UTC/GMT time below to EST, in YYYY-MM-DD HH:MM:SS format. 2. Compute the time difference between c_timestamp and c_mod and add that difference in the Timetaknen column.
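A possible starting point (untested sketch; it assumes c_timestamp and c_mod are ISO-8601 strings such as 2024-06-27T14:30:00, and uses a fixed UTC-to-EST offset of -5 hours, which does not account for daylight saving):

```spl
| eval start_epoch = strptime(c_timestamp, "%Y-%m-%dT%H:%M:%S")
| eval end_epoch   = strptime(c_mod, "%Y-%m-%dT%H:%M:%S")
| eval c_timestamp_est = strftime(start_epoch - 5*3600, "%Y-%m-%d %H:%M:%S")
| eval Timetaknen  = tostring(end_epoch - start_epoch, "duration")
```

The output format string drops the T naturally. For a DST-aware conversion it is usually cleaner to set the viewing user's timezone in their Splunk account preferences than to hard-code an offset.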
  I am working on a bug. In the TAV dashboard, graphs are not visible in the CFF IT/Business KPIs. After my initial analysis I found that the data comes from the "get_cff_trends" macro, and this macro is not returning any values. So I started validating the "get_cff_trends" macro code.   Query:   | mstats latest(avg.alert_*) as latest.alert_* avg(avg.alert_*) as avg.alert_* sum(sum.alert_*) as sum.alert_* WHERE source="iobserve_v5" AND index="em_metrics" AND ( service="TA:CFF:Business:Sweden" AND kpi="ServiceHealthScore" ) OR ( service="TA:CFF:Business Orders Created" AND kpi="Orders count - Total" ) OR ( service="TA:CFF:Business Work Orders Fulfilled" AND kpi="Orders fulfilled in last 1 hr" ) OR ( service="TA:CFF:Business Work Orders Delivered" AND kpi="Orders Delivered*" ) OR ( service="TA:CFF:Business Work Orders Released" AND kpi="Released Orders - Nr Orders In Latest Release" ) earliest="1718179949.136" latest="1718179949.136" span="10m" BY kpi service | eval alert_value='avg.alert_value', alert_level=round('avg.alert_level',0) | eval value = if(kpi like "%Order%" , 'sum.alert_value', alert_value) | stats avg(value) as avgValue by _time service,kpi | eval avgValue=round(avgValue,0), minValue=round(minValue,2), maxValue=round(maxValue,2), dday=strftime('_time',"%Y-%m-%d") | eval avgValue = if( isnull(mvfind(_time, all_times)), 0, mvindex(avgValue,mvfind(_time, all_times))) | fillnull value="N/A" | stats list(avgValue) as avgValue values(all_times) as _time by service kpi | eval avgValue=mvjoin(avgValue,",") | eval unit=case(like(lower(kpi),"%percent%"),"%", like(lower(kpi),"%conversion%"),"%", like(lower(kpi),"%syncronisation%"),"%", like(lower(kpi),"%availability%"),"%", like(lower(kpi),"%order%"),"#", like(kpiid,"SHKPI%"),"%", like(lower(kpi),"%lead time%"),"days", like(lower(kpi),"%size%"),"#", like(lower(kpi),"%price%"),"#", like(lower(kpi),"%cff%"),"%", like(lower(kpi),"%sample%"),"#", like(lower(kpi),"%calls%"),"#", like(lower(kpi),"%transactions%"),"#", like(lower(kpi),"%sessions%"),"#", like(lower(kpi),"%error%"),"#", like(lower(kpi),"%checkouts%"),"#", like(lower(kpi),"%response time%"),"ms", like(lower(service),"%data quality%"),"%", true(),"%") | eval display_name=case(kpi like "ServiceHealthScore", "Fulfillment Flow Health", kpi like "Orders count - Total%", "Orders created", kpi like "Orders Delivered*%", "Orders delivered*", kpi like "Orders fulfilled in last 1 hr%", "Orders fulfilled*", kpi like "Released Orders - Nr Orders In Latest Release", "Orders released", true(),kpi) | appendcols     [| inputlookup slack_incidents.csv]        While validating this query we found that when we use "_time", it returns no values; if we remove "_time", the query returns values up to the 9th line, but the whole query returns nothing either with or without "_time". Can you please help me resolve this issue?
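Two things stand out in the macro. First, earliest and latest are both set to the same epoch (1718179949.136), which gives an effectively zero-width search window. A minimal, untested sketch for checking whether the metrics return anything over a wider window (index, source, and metric names taken from the macro above):

```spl
| mstats avg(avg.alert_value) AS avg_value
    WHERE index="em_metrics" AND source="iobserve_v5"
    earliest=-24h latest=now span="10m" BY kpi service
| stats count BY kpi service
```

Second, all_times is never assigned anywhere earlier in the pipeline, so mvfind(_time, all_times) evaluates against a null field and returns null for every row; that alone could zero out avgValue downstream.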
  Hi, I am new to Splunk. While trying to integrate Splunk with SentinelOne, I found it frustrating to work out which API key/token I should use (the SDL one or the Management Console one). Also, I cannot find what the URL and name should be under the Application Configuration page in Splunk. Hope you can help... Many thanks
As the title suggests, I have a dashboard with various panels and am wondering whether it's possible to export a single panel and all its contents (including token values) from XML to JavaScript.
Hello, I'm fairly new to Splunk, trying to search using a where clause and filter the results. The query is running long, and I'm wondering if I'm not doing this right. A toned-down version of the search: index=provisioning_index cf_org_name=abcd cf_app_name=xyz "ReconCount:" |where jobNumber IN ([search index=provisioning_index cf_org_name=abcd cf_app_name=xyz operation="operation1" status=SUCCESS |search NOT jobType="Canc"|table jobNumber ]) |stats count by deliveryInd | addcoltotals
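One common restructuring (untested sketch): place the subsearch directly in the base search instead of in a post-pipeline where, so the jobNumber filter is applied during event retrieval; Splunk expands the subsearch into a (jobNumber=... OR jobNumber=...) filter automatically:

```spl
index=provisioning_index cf_org_name=abcd cf_app_name=xyz "ReconCount:"
    [ search index=provisioning_index cf_org_name=abcd cf_app_name=xyz
        operation="operation1" status=SUCCESS NOT jobType="Canc"
      | dedup jobNumber
      | fields jobNumber ]
| stats count by deliveryInd
| addcoltotals
```

Note that subsearches are capped (10,000 results by default), so if the job list can be larger than that, a single-pass approach with eventstats or a lookup would be safer.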
In Splunk, I added the AWS add-on and tried to get data from AWS S3. While creating the input, it took the sourcetype aws:s3:csv by default, and I was receiving the data properly. However, I accidentally changed the configuration for the aws:s3:csv sourcetype, and now the logs are not being received correctly. Can anyone help me by providing the default configuration for this sourcetype?
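The shipped defaults depend on the add-on version, so the most reliable source is an untouched installation: on such a host, `splunk btool props list aws:s3:csv --debug` prints the effective settings and the file each one comes from. Also, since local settings only layer on top of defaults, removing the accidentally edited stanza from the add-on's local/props.conf should restore the shipped behavior. As a rough, unverified sketch only, a CSV sourcetype stanza typically looks like:

```
[aws:s3:csv]
INDEXED_EXTRACTIONS = csv
SHOULD_LINEMERGE = false
```

Treat this as a placeholder, not the add-on's actual defaults.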
Splunk Enterprise ver 9.1.2, upgrading ES from 7.2 to 7.3.1. Ran the install (expands the SPL out to the respective apps), restarted splunkd, went into the UI and started the ES app, where it states that ES has not been fully configured and to hit the green button to do so (continue to app setup page). When I do, I get this error popup: ParsingError: Source contains parsing errors: '<string>' [line 2]: '[]\n' Hit OK and we don't go anywhere... I can get to other apps, so Splunk Enterprise is OK... ES... not so OK... Help
When using regex, how can I take a field formatted as "0012-4250" and only show the first and last 3 digits? I tried the following, which maintains the original output: | eval AcctCode = replace(AcctCode,"(\d{4}-)(\d{4})","\1\2")
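The pattern above captures the entire value in two groups and then puts both groups back, so nothing changes; the fix is to capture only the characters to keep. Assuming "first and last 3 digits" means the first digit plus the final three, an untested sketch:

```spl
| eval AcctCode = replace(AcctCode, "^(\d)\d*-\d(\d{3})$", "\1\2")
```

For "0012-4250" this yields "0250"; adjust the quantifiers if the intent is instead, say, the first three and last three digits.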
Config validation failure reported in peer=usxzvrspidx1.usaccess.gsa.gov guid=62899FCC-C4E8-4A86-903D-C72234AE7F38. In index '_audit': Failed to create directory '/opt/splunk/var/lib/splunk/cold/audit/colddb' (File exists). I made a change to my indexes:

[wineventlog]
homePath = volume:hotwarm/wineventlog/db
coldPath = volume:cold/wineventlog/colddb
thawedPath = $SPLUNK_DB/wineventlog/thaweddb
maxDataSize = auto_high_volume
coldPath.maxDataSizeMB = 0
maxWarmDBCount = 300
frozenTimePeriodInSecs = 33696000
repFactor = auto

[syslog]
homePath = volume:hotwarm/syslog/db
coldPath = volume:cold/syslog/colddb
thawedPath = $SPLUNK_DB/syslog/thaweddb
repFactor = auto
maxDataSize = auto_high_volume
coldPath.maxDataSizeMB = 11059200
maxWarmDBCount = 4294967295
frozenTimePeriodInSecs = 33696000

Since this change, the indexers have quit receiving data from their forwarders. So I want to put the values back, but I'm getting this error when I apply the bundle change. Need help on how to fix this.
Recently I installed a new Splunk Enterprise 9.2.1 (on-prem) on a RHEL8 server, installed Universal Forwarders on a bunch of Linux (RHEL and Ubuntu) and Windows clients, and logs are being ingested fine. However, after waiting a few days, I installed the Universal Forwarder on a few more Linux machines (following the same process as before); the installation was successful, but logs are not showing up on the indexers. I have checked and compared inputs.conf, outputs.conf, and server.conf under $SPLUNK_HOME/etc/system/local (like the other hosts) and they look good. I have run tcpdump on the clients and the indexers, and the client is sending logs to the indexer. I searched for the new hosts in $SPLUNK_HOME/var/log/splunk and they do show up in metrics.log, but when I search for index="my-index-name", I only see logs from the hosts I installed last week; nothing from the new UFs I installed/configured yesterday. What's the best way to troubleshoot further? *This is an air-gapped environment, so I can't provide any logs.
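Two untested searches that often narrow this down ("new-uf-host" is a placeholder for one of the new hostnames). The first checks whether the new UF's own internal logs reach the indexers (internal log forwarding is on by default); the second checks whether the indexer records an inbound forwarder connection:

```spl
index=_internal host="new-uf-host" source=*splunkd.log* (ERROR OR WARN)

index=_internal source=*metrics.log* group=tcpin_connections hostname="new-uf-host"
```

If internal logs arrive but the application data does not, connectivity is fine and the problem is usually in inputs.conf on the new hosts (monitored path, index name, or an index that doesn't exist on the indexers).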
Hi, We are currently using the Azure extension to monitor metrics for Azure Application Gateway (AAG), specifically the Total Time metric. However, we have observed that the data displayed on our dashboard is not continuous, and it appears that some data points are missing. Could you please review the attached dashboard and provide your comments and guidance on this issue? Thank you for your assistance. Regards, Jeff 
Below is my raw text in Splunk, and I want to extract the JSON array from it. After extracting, I want to group by ARUNAME and calculate the sum of the SKIPPED and PROCESSED values.  {"id":"0","severity":"Information","message":"[{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":36,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":8,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00410\",\"TOTAL\":0,\"PROCESSED\":4,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":29,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00410\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_H\",\"TOTAL\":0,\"PROCESSED\":139,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23}]"}
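An untested sketch, assuming the event is parsed so that a message field holds the embedded JSON array as a string: expand the array into one row per element, parse each element, then aggregate.

```spl
| spath input=message path="{}" output=row
| mvexpand row
| spath input=row
| stats sum(SKIPPED) AS total_skipped sum(PROCESSED) AS total_processed BY ARUNAME
```

The empty path="{}" pulls the array elements out as a multivalue field; mvexpand then gives each element its own row for the per-ARUNAME stats.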
Hi All, We have an application that gets events in from an external party, but occasionally we see out-of-sequence events due to underlying issues with the MQ interface [guaranteed delivery, but not necessarily in the correct order]. Identifying out-of-sequence events would then point to an issue with the underlying MQ. Given this set of data..
| makeresults format=csv data="timelogged, formDataId, eventOrder
00:02,AA,2
00:03,AA,3
00:04,AA,3
00:05,AA,4
00:06,AA,5
00:07,AA,9
01:02,BB,2
01:03,BB,3
01:04,BB,3
01:05,BB,4
01:07,BB,9
01:08,BB,5
02:02,CC,2
02:03,CC,3
02:04,CC,3
02:05,CC,4
02:06,CC,5
02:07,CC,9
03:01,DD,1
04:02,EE,2
04:03,EE,4
04:04,EE,3
04:05,EE,9"
| table timelogged, formDataId, eventOrder
...how could the highlighted transactions be identified? Note: we do not get all types of events, and the 'first' event is not usually seen [as it indicates an error on the vendor side].
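One untested approach: compare each event's eventOrder with the previous event for the same formDataId and flag any decrease (equal values are left alone, since MQ redelivery can legitimately repeat a message):

```spl
| sort 0 formDataId timelogged
| streamstats current=f last(eventOrder) AS prevOrder BY formDataId
| where isnotnull(prevOrder) AND eventOrder < prevOrder
```

On the sample data this flags BB's 5 (arriving after 9) and EE's 3 (arriving after 4); in live data, sort by _time (or drop the sort if events are already in arrival order) before the streamstats.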
I am trying to get DeviceName and DeviceToken into variables from a 365 log. First I use eval Device = mvindex('ModifiedProperties{}.NewValue', 0), which returns another MV with the data I want, but I can't seem to get to the field. Below is what Device shows in the editor. Any help? I want something like eval DeviceName = ModifiedProperties{}.NewValue{0}.DeviceName, but nothing I try works. I tried saving it as a string and extracting, but even that I can't figure out. It's the MV inside an MV that I think is throwing me.  [ { "DeviceName": "iPhone 13 mini", "DeviceToken": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "DeviceTag": "SoftwareTokenActivated", "PhoneAppVersion": "6.8.11", "OathTokenTimeDrift": 0, "DeviceId": "00000000-0000-0000-0000-000000000000", "Id": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "TimeInterval": 0, "AuthenticationType": 3, "NotificationType": 2, "LastAuthenticatedTimestamp": "2024-06-27T15:00:42.8784693Z", "AuthenticatorFlavor": null, "HashFunction": null, "TenantDeviceId": null, "SecuredPartitionId": 0, "SecuredKeyId": 0 } ]
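Since the extracted value is itself a JSON string (an array with one object), a second parse with spath usually works (untested sketch):

```spl
| eval device_json = mvindex('ModifiedProperties{}.NewValue', 0)
| spath input=device_json path="{0}.DeviceName" output=DeviceName
| spath input=device_json path="{0}.DeviceToken" output=DeviceToken
```

The {0} addresses the first element of the embedded array; auto-extraction stops at the string boundary, which is why the dotted field path alone never resolves.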
I have a search that returns two results per day (a job's log entries for when it started and when it ended). I want to see the time difference between the two entries, grouped by day. I'm a newbie to Splunk advanced searching, so hopefully you can help. My query is: index=main ExportConfigInfo AND ("Message=Job started" OR "Message=Job completed")
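With exactly one start and one completion per day, the gap can be computed per day with stats (untested sketch):

```spl
index=main ExportConfigInfo ("Message=Job started" OR "Message=Job completed")
| eval day = strftime(_time, "%Y-%m-%d")
| stats earliest(_time) AS started latest(_time) AS completed BY day
| eval duration = tostring(completed - started, "duration")
```

If a job can span midnight or run more than once per day, a transaction or streamstats approach keyed on the start/complete pair would be more robust.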
Hello Splunkers, My clients are experiencing an issue because of the difference between the formatting of the results shown in Splunk vs. what is sent as part of the attachment. This is how it shows in Splunk:   jacquetta@evie.com LOU - HONG hong-lou-victorina-sid-001k1.active.zenobia hong-lou-victorina-sid-000r1.active.zenobia hong-lou-victorina-sid-001e1.active.zenobia hong-lou-victorina-sid-003f1.active.zenobia hong-lou-victorina-sid-004i0.active.zenobia hong-lou-victorina-sid-002d0.active.zenobia hong-lou-dvpqlqwpy005-001k1.active.zenobia hong-lou-dvpqlqwpy005-000r1.active.zenobia hong-lou-dvpqlqwpy005-001e1.active.zenobia hong-lou-dvpqlqwpy005-003f1.active.zenobia hong-lou-dvpqlqwpy005-004i0.active.zenobia hong-lou-dvpqlqwpy005-002d0.active.zenobia hong-lou-dvpqlqwpy005-004r1.active.zenobia hong-lou-dvpqlqwpy005-006z0.active.zenobia hong-lou-stephany-001k1.ae.active.zenobia hong-lou-stephany-000r1.ae.active.zenobia hong-lou-uvyycdyjewys-001k1.ae.active.zenobia hong-lou-uvyycdyjewys-000r1.ae.active.zenobia hong-lou-uvyycdyjewys-001e1.ae.active.zenobia hong-lou-uvyycdyjewys-003f1.ae.active.zenobia hong-lou-jackeline-001k1.ae.active.zenobia hong-lou-jackeline-000r1.ae.active.zenobia hong-lou-jackeline-001e1.ae.active.zenobia hong-lou-jackeline-003f1.ae.active.zenobia hong-lou-proxy-001k1.active.zenobia hong-lou-proxy-000r1.active.zenobia Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs jacquetta@evie.com AE Member Services - HONG hong-member-001k1.ae.active.zenobia hong-member-000r1.ae.active.zenobia hong-member-001e1.ae.active.zenobia 
hong-member-003f1.ae.active.zenobia hong-jackeline-001k1.ae.active.zenobia hong-jackeline-000r1.ae.active.zenobia hong-jackeline-001e1.ae.active.zenobia hong-jackeline-003f1.ae.active.zenobia hong-ymefvuphccrj-001k1.ae.active.zenobia hong-ymefvuphccrj-000r1.ae.active.zenobia hong-ymefvuphccrj-001e1.ae.active.zenobia hong-ymefvuphccrj-003f1.ae.active.zenobia hong-raymonde-001k1.ae.active.zenobia hong-raymonde-000r1.ae.active.zenobia hong-raymonde-001e1.ae.active.zenobia hong-raymonde-003f1.ae.active.zenobia Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs jacquetta@evie.com AE Member Services - HUI hui-member-001k1.ae.active.zenobia hui-member-000r1.ae.active.zenobia hui-jackeline-001k1.ae.active.zenobia hui-jackeline-000r1.ae.active.zenobia hui-ymefvuphccrj-001k1.ae.active.zenobia hui-ymefvuphccrj-000r1.ae.active.zenobia hui-raymonde-001k1.ae.active.zenobia hui-raymonde-000r1 Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs Insufficient Logs MgDzvf   And this is how it goes in the attachment, as a single line, which disturbs my output a lot. Can anyone please advise how to correct it? Thanks in advance.
Greetings all, I'm trying to search inside a lookup table, and I need to use a search command followed by an OR and a regex. I need the regex to match anything in the lookup table, not just the two fields before it. Below is some sample SPL; I know it won't work this way, but I'm including it to give an idea of what I'm trying to accomplish.     | inputlookup data_source.csv | fillnull value=MISSING | search (count=MISSING AND percent=MISSING) OR regex "[^0-9a-zA-Z\-\._,]"     Thanks in advance for the help; I really appreciate it.
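Since regex (and match) only test named fields, one untested workaround is to concatenate every field into a scratch field with foreach and test that with match in a where clause:

```spl
| inputlookup data_source.csv
| fillnull value=MISSING
| eval all_fields=""
| foreach * [ eval all_fields=if("<<FIELD>>"=="all_fields", all_fields, all_fields.",".'<<FIELD>>') ]
| where (count="MISSING" AND percent="MISSING") OR match(all_fields, "[^0-9a-zA-Z\-\._,]")
```

The comma separator is safe here only because "," is excluded by the character class; pick a different separator if the pattern changes.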
Hello, I have a dashboard with a multiselect + text input field. I'd like to use a checkbox instead of the multiselect, but if I modify it and click the 'Any field' option, the dashboard crashes.    <form version="1.1" theme="light"> <label>Multiselect Text</label> <init> <set token="toktext">*</set> </init> <fieldset submitButton="false"> <input type="multiselect" token="tokselect"> <label>Field</label> <choice value="Any field">Any field</choice> <choice value="category">Group</choice> <choice value="severity">Severity</choice> <default>category</default> <valueSuffix>=REPLACE</valueSuffix> <delimiter> OR </delimiter> <prefix>(</prefix> <suffix>)</suffix> <change> <eval token="form.tokselect">case(mvcount('form.tokselect')=0,"category",mvcount('form.tokselect')&gt;1 AND mvfind('form.tokselect',"Any field")&gt;0,"Any field",mvcount('form.tokselect')&gt;1 AND mvfind('form.tokselect',"Any field")=0,mvfilter('form.select'!="Any field"),1==1,'form.tokselect')</eval> <eval token="tokselect">if('form.tokselect'="Any field","REPLACE",'tokselect')</eval> <eval token="tokfilter">replace($tokselect$,"REPLACE","\"".$toktext$."\"")</eval> </change> </input> <input type="text" token="toktext"> <label>Value</label> <default>*</default> <change> <eval token="tokfilter">replace($tokselect$,"REPLACE","\"".$toktext$."\"")</eval> </change> </input> </fieldset> <row> <panel> <event> <title>$tokfilter$</title> <search> <query>| makeresults</query> <earliest>-24h@h</earliest> <latest>now</latest> </search> </event> </panel> </row> </form>    Could you please help me modify my dashboard from the multiselect option to a checkbox?   Thank you very much in advance!
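A Simple XML checkbox input accepts the same choice, default, delimiter, prefix/suffix, and valueSuffix options as a multiselect, so a minimal swap might look like the untested sketch below. One thing worth double-checking first: the multiselect's change handler calls mvfilter('form.select'!="Any field"), which references form.select rather than form.tokselect, and that null token alone could be what breaks the dashboard when 'Any field' is toggled.

```
<input type="checkbox" token="tokselect">
  <label>Field</label>
  <choice value="Any field">Any field</choice>
  <choice value="category">Group</choice>
  <choice value="severity">Severity</choice>
  <default>category</default>
  <valueSuffix>=REPLACE</valueSuffix>
  <delimiter> OR </delimiter>
  <prefix>(</prefix>
  <suffix>)</suffix>
</input>
```

The existing change-handler evals would carry over unchanged, since they only reference the token names.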
I am using Dashboard Studio and I want to compare two values and, if they are different, highlight them in red. What is the best visualization type for this, and how do I color it based on the comparison of the two values?
I want to customize the payload for a webhook, but in the webhook UI there is only an input box for the URL; I don't know where I can configure the payload parameter. Thanks