All Topics

I am creating an alert where the time range should be from 7:00 to 18:00, and the cron schedule runs every 5 minutes. If I set earliest=@d+7h and latest=@d+18h in my alert, will this work? I also don't want to receive alerts outside this time range. How can I do this?
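A minimal sketch of one common approach: restrict the cron expression itself to the 07:00–18:00 window, so the alert simply never runs outside it, and keep the search window matched to the schedule interval rather than rescanning the whole day on every run:

```
# Cron schedule: every 5 minutes, from 07:00 through 17:55
# (add a separate 18:00 run if the boundary matters)
*/5 7-17 * * *

# Search time range: just the last scheduled interval
earliest=-5m@m latest=@m
```

With earliest=@d+7h and latest=@d+18h the search itself would work, but each 5-minute run would re-search the full 7:00–18:00 window and could re-alert on the same events.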
So I have a trendline like below. I don't know why there is no line between the two data points between April and May, and between May and June. Here is the query I have:

| inputlookup category
| eval _time=strptime(_time, "%m/%d/%Y")
| search envcor="$env$"
| eval group=envcor."_"
| timechart sum(Total_Number_Unique_Names) sum(Total_Number_Shops) by group

Please help. Thanks.
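A sketch of one likely cause, assuming the lookup holds roughly one row per month: without an explicit span, timechart can create empty buckets between the monthly points, and empty buckets break the line. Forcing a monthly span is one way to test this:

```
| inputlookup category
| eval _time=strptime(_time, "%m/%d/%Y")
| search envcor="$env$"
| eval group=envcor."_"
| timechart span=1mon sum(Total_Number_Unique_Names) sum(Total_Number_Shops) by group
```

If gaps remain, the chart formatting option "Null value" can also be set to "Connect" so the line is drawn across missing buckets.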
Hi, I'm trying to have spaces in my overlay field names in a column chart. For example, field1 works but "field 1" does not.

Not working:

"options": {
    "chart.stackMode": "stacked",
    "axisY2.enabled": true,
    "overlayFields": "field 1, field 2, field 3",
    "fieldColors": {
        "field4": "#E7DA78",
        "field5": "#82BACF"
    }
},

Working:

"options": {
    "chart.stackMode": "stacked",
    "axisY2.enabled": true,
    "overlayFields": "field1, field2, field3",
    "fieldColors": {
        "field4": "#E7DA78",
        "field5": "#82BACF"
    }
},
Platform: Splunk Cloud

Problem statement illustration: we have 4 intermediate forwarders (IFWs), and more than 2500 universal forwarders are routing data to these four IFWs.

UF (700) ----> IFW1 ------> Splunk Cloud
UF (600) ----> IFW2 ------> Splunk Cloud
UF (700) ----> IFW3 ------> Splunk Cloud
UF (500) ----> IFW4 ------> Splunk Cloud

What is needed: how can a Splunk admin/power user create a dashboard, or fetch information from the search head, showing which sources are being routed to Splunk Cloud through each IFW? The query should list each universal forwarder hostname and the respective IFW through which it is routed to Splunk Cloud. Any lead on this?
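A sketch of one common starting point, assuming the IFWs forward their own _internal logs to Splunk Cloud: each receiver logs incoming TCP connections in splunkd metrics, so the forwarder-to-IFW mapping can often be pulled from those events (field names as seen in a typical deployment):

```
index=_internal sourcetype=splunkd group=tcpin_connections
| stats latest(_time) AS last_seen by hostname host
| rename hostname AS universal_forwarder, host AS intermediate_forwarder
| convert ctime(last_seen)
```

Here hostname is the sending forwarder and host is the instance that logged the connection (the IFW). This only works if the IFWs' _internal index is reaching the search head.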
I have sanitized the index names. I have users who have propagated a lookup command in dashboards that is now a major issue: the lookup file has grown so large that it is causing bundle replication errors (the lookup table is a whopping 512 MB). They do not have append=true in their dashboards, and in my opinion it is bad practice to build a lookup table from a dashboard; a scheduled search should build it instead, with the dashboard only referencing it. I'd like a few sets of eyes on this as a sanity check, or am I looking at this totally wrong? BTW, do I need commas to separate the field values?

The users want just certain values updated when run every 15 minutes: "pcenter office externalIP". Their lookup command within the dashboard:

| lookup agentsessions.csv sessionId OUTPUTNEW pcenter office externalIP

Now, my theory is they wanted to update just the data in the fields pcenter, office, and externalIP. IIRC, OUTPUTNEW only fills a field that previously had no data (was blank), while OUTPUT replaces the specified data with the new data, so the dashboard command should look like this:

| lookup agentsessions.csv sessionId OUTPUT pcenter office externalIP
I created a scheduled search which should update the whole table (renamed it for testing):

index IN (one, two, three, four) source="wineventlog:custom" SourceName=DesktopAgentService action timestamp sessionId Heartbeat
| table Message
| spath input=Message
| dedup sessionId sortby +_time
| lookup agentsessions2.csv sessionId OUTPUT sessionId as existingSessionId
| where isnull(existingSessionId)
| fields - action existingSessionId Message
| outputlookup agentsessions2.csv append=true

Or I could modify the scheduled search like this:

index IN (one, two, three, four) source="wineventlog:custom" SourceName=DesktopAgentService action timestamp sessionId Heartbeat
| table Message
| spath input=Message
| dedup sessionId sortby +_time
| lookup agentsessions2.csv sessionId OUTPUT sessionId as existingSessionId pcenter office externalIP
| where isnull(existingSessionId)
| fields - action existingSessionId Message
| outputlookup agentsessions2.csv append=true

Also, with append=true, won't that duplicate entries each time it is run, or will it just update the table with fresh data? I ran both scheduled searches and they do seem to work; I just want to verify I am doing this correctly and getting the updated data, instead of them trying to do all this in a dashboard that runs every 15 minutes. Or should I have them create a dataset table to do all this more efficiently? https://docs.splunk.com/Documentation/Splunk/9.0.4/Knowledge/Aboutdatasets
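On the append=true question: outputlookup append=true only ever adds rows, so without deduplication the file keeps growing. A sketch of one common pattern to keep the lookup bounded, assuming sessionId is the key (new rows win because they come first):

```
... base search producing the new rows ...
| fields sessionId pcenter office externalIP
| inputlookup append=true agentsessions2.csv
| dedup sessionId
| outputlookup agentsessions2.csv
```

The inputlookup append=true pulls in the existing table after the fresh results, dedup keeps the first (fresh) row per sessionId, and the final outputlookup rewrites the whole file without append, so it never accumulates duplicates.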
How can I include a quote within the LIKE keyword in dbxquery? For example: there are 10k people in the DB and I would like to query only those whose content contains number:123. The problem is that the content has a double quote ("). Note I cannot filter on 123 alone because it would also match other records.

name = person1
content = {"description:"test",number":123,"status":"good"}

| dbxquery connection = visibility query = "select name, content from TestDB where content like '%number":123%'

Thanks
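A hedged sketch of one possible workaround: rather than fighting the double quote inside the SPL-quoted query string, build the quote character on the SQL side with CHAR(34), assuming the backing database supports CONCAT and CHAR (e.g. MySQL or SQL Server):

```
| dbxquery connection=visibility query="select name, content from TestDB where content like CONCAT('%number', CHAR(34), ':123%')"
```

This keeps the SPL string free of embedded double quotes entirely; the exact function names depend on the database behind the connection.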
How can I parse field data with a delimiter from a dbxquery result? For example, the dbxquery result is:

FW Rule name: DNS
FW Rule: "protocol":udp","port:53","dest_IP:10.10.10.1","direction:ingress"

I would like to display the FW rule in a separate table in a dashboard.

Dropdown menu: FW Rule: DNS

Protocol | Port | Dest IP    | Direction
UDP      | 53   | 10.10.10.1 | Ingress

Thanks
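A sketch of one approach, assuming the field is literally named "FW Rule" and each entry is comma-separated with a colon between key and value: strip the stray quotes, then let the kv extract command split on the delimiters:

```
| eval _raw=replace('FW Rule', "\"", "")
| extract pairdelim="," kvdelim=":"
| table protocol port dest_IP direction
```

Note this overwrites _raw for the purpose of the extraction, so it belongs at the end of the search, and the resulting field names (protocol, port, dest_IP, direction) can then be renamed for display.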
I was wondering why the Microsoft 365 App Teams Activity Report dashboard did not show any data in the dropdown on the page, so I took it apart and looked at the query:

`m365_default_index` sourcetype="o365:graph:api" source=TeamsUserActivityUserDetail
| stats latest(_time) AS _time by "Report Refresh Date"
| rename "Report Refresh Date" AS ReportRefreshDate
| sort - _time

So I ran this in a search in the app's context and got nothing. I pared the search down to just:

`m365_default_index` sourcetype="o365:graph:api" source=TeamsUserActivityUserDetail

and looked at the fields. The field the search is looking for, Report Refresh Date, is in the field list in Smart Mode and in the syntax-highlighted record. So I tried just returning a table with that field and got nothing but the field name, no data.

I took the first result of the simple query and clicked Show as raw text. The field I am looking for is the very first field, but it is prefaced with \ufeff, making it "\ufeffReport Refresh Date". This is why searching on the field name is not working. I drilled into one of the Report Refresh Date values, and the resulting search shows the field name with a character at the front of it, ".Report Refresh Date", with the period highlighted in pink. That search returned correct results. I tried copying that and pasting it into another search, and THAT one worked.

Has anyone else seen this in this report, and is there a fix for it? I am currently going through the query and replacing the field name with the one copied from the working query, but this is a band-aid. And unfortunately, when I try to fix the dashboard it gets hung up on the Field for value input (it won't let me paste that special character in there). I am no Splunk expert. Is there any way to filter what looks to be a UTF-8 BOM character out of this field name in a search?
The issue is coming from Microsoft in the ingested logs. Thoughts?
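If the BOM really is arriving in the raw events from Microsoft, one possible index-time fix is to strip it with a SEDCMD in props.conf before field extraction, sketched here under the assumption that the BOM appears as literal UTF-8 bytes at the start of the event:

```
# props.conf, on the instance that parses this sourcetype
[o365:graph:api]
SEDCMD-strip_bom = s/^\xEF\xBB\xBF//
```

This only affects newly indexed events; data already indexed would still need the copied-field-name workaround at search time.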
Hi, I have an issue similar to: https://community.splunk.com/t5/Getting-Data-In/how-to-split-the-json-array-into-multiple-new-events/m-p/122265 but my case is a bit different.

JSON structure:

{
  "MetaData": {
    "Host Name": "....",
    "Wi-Fi Driver Version": "..."
  },
  "Payloads": [
    {
      "Header": { "Type": "Event", "Name": "...", "TimeStamp": ... },
      "Payload": { "MAC Address": "00:00:00:00:00:00", "Network Adapter Type": ... }
    },
    ...
  ]
}

I need to:
1. Extract a table containing the following columns: MetaData.Host Name, MetaData.Wi-Fi Driver Version, Header.Type, Header.Name, Payload.MAC Address, Payload.Network Adapter Type.
2. I expect to see 2 rows in this case.
3. The field names under MetaData, Header and Payload can change, so it should be generic.

I have started to write something like this, but it's not generic (Type, Name, ...) and it doesn't extract the metadata:

| spath input=json output=payloadsObjects path=Payloads{}
| mvexpand payloadsObjects
| spath input=payloadsObjects output=Type path=Header.Type
| spath input=payloadsObjects output=Name path=Header.Name
| table *

JSON as an example to use:

| makeresults
| eval _raw="{ \"MetaData\": { \"Host Name\": \"maya-MOBL\", \"Driver Version\": \"99.0.100.4\" }, \"Payloads\": [ { \"Header\": { \"Type\": \"Event\", \"Name\": \"IP Disconnection\", \"TimeStamp\": 133265876804261336 }, \"Payload\": { \"MAC Address\": \"00:00:00:00:00:00\", \"Adapter Type\": 140 } }, { \"Header\": { \"Type\": \"Event\", \"Name\": \"Connection success\", \"TimeStamp\": 133265877087374706 }, \"Payload\": { \"MAC Address\": \"00:00:00:00:00:00\", \"Network Adapter Type\": 131, \"Address\": \"0000:0:0000:000:000:df:0000:0000\", \"Prefix Length\": 64, \"Is Local Address\": false, \"Gateway IP Address\": \"::\", \"DNS Server\": [], \"DHCP Server\": null, \"DHCP Lease Duration\": 000000000, \"DHCP Retrieval Time\": 0 } } ] }"

May I get your help please?
*Note: it would be nice to also have a solution that doesn't use makeresults, because I had problems finding the Payloads{} field when I used a real JSON file in my report instead of makeresults.
*Note 2: please take time performance into consideration.
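A sketch of a more generic extraction, assuming the structure shown above: spath without a path argument flattens whatever keys are present, so the Header/Payload field names don't have to be hard-coded:

```
| spath path=Payloads{} output=payload
| mvexpand payload
| spath input=payload
| spath path=MetaData output=meta
| spath input=meta
| fields - payload meta
| table *
```

The first spath/mvexpand produces one row per Payloads entry; the pathless spath input=payload then extracts every Header.* and Payload.* key generically, and the same trick on MetaData repeats the metadata fields onto each row.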
Hi, I have a field which has 3 values: 0, 1, and 2 — 0 for Green, 1 for Blue, and 2 for Red. I'm using these values to show colour differences in a dashboard. But the search fetches data only at a particular time of day, so for the remaining time there is no data, and when there is no data the dashboard does not load. I want to indicate "no data" with a black colour in the dashboard. How can I do that — in other words, if there is no data, how can I create and map a field value in the dashboard?
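A sketch of one common pattern: appendpipe can inject a synthetic row only when the search returned nothing, and that sentinel value (3 here, assumed unused) can then be mapped to black in the dashboard's colour settings:

```
... base search ...
| appendpipe [ stats count | where count=0 | eval status=3 | fields - count ]
```

The subpipeline's stats count yields 0 when there are no results, so the eval row is appended only in the no-data case and the panel always has something to render.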
Dear All, I have a question regarding Exchange monitoring by AppDynamics: does it support monitoring the Exchange message queue? Any example dashboard or documents would help. Thanks
I have a requirement where I want to send the above query result from Splunk to Slack as an FYI alert. But somehow I am not able to achieve it, and I am unable to find the right documents to proceed. I did try to print using $result.req$ $result.method$ and $result.percentile99$, but the Slack message shows only the 1st row. I need the complete query result to be shown in the Slack message. How do I achieve this? @ITWhisperer or anyone, please help me out here.
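A sketch of one workaround, assuming the Slack alert action only expands $result.*$ tokens from the first result row: collapse all rows into a single field, then reference $result.message$ in the alert message:

```
... existing query ...
| eval row=req." | ".method." | ".percentile99
| stats list(row) AS rows
| eval message=mvjoin(rows, " ; ")
| fields message
```

The search now returns one row whose message field carries every original row, so the single-token limitation no longer hides data; the " ; " separator is just an assumption and can be swapped for whatever reads best in Slack.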
My dashboards look fine when viewed by logging into the portal. But when opened using the shareable link, the data does not load completely (white blocks appear in place of the widgets). Can anyone help me out with this?
Hello,

The default format of my subsearch result looks like:

(( Host_Name="srv1" AND icid="va1_icid1" AND mid="val_mid1" ) OR ( Host_Name="srv2" AND icid="va1_icid2" AND mid="val_mid2" ))

I would like to modify the subsearch result format to:

(( Host_Name="srv1" AND ( icid="va1_icid1" OR mid="val_mid1" )) OR ( Host_Name="srv2" AND ( icid="va1_icid2" OR mid="val_mid2" )))

Do you think it is possible?

Regards,
Emile
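A sketch of one way to do this, assuming the subsearch yields one row per Host_Name with icid and mid fields: build the clause yourself with eval and return it in a field literally named search, which bypasses the default format behaviour:

```
... subsearch ...
| eval search="( Host_Name=\"".Host_Name."\" AND ( icid=\"".icid."\" OR mid=\"".mid."\" ))"
| stats values(search) AS search
| eval search="(".mvjoin(search, " OR ").")"
| fields search
```

When a subsearch returns a field named search, its value is substituted verbatim into the outer search, so the parenthesisation is entirely under your control.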
Copying warm buckets to SmartStore hosted on GCP Cloud Storage is failing due to the error below. We are using the remote.gs.service_account_email setting in indexes.conf. This issue started happening after migrating to Splunk v9.0.4.

ERROR ProcessTracker [43654 MainThread] - (child_150__Fsck) ProcessTracker - (subchild_43__RollFixMetadata) GCPCredentials - credentials not found for gcs volume=remote_store
I want to add a "Latest" keyword to my dropdown filter, instead of a date, which displays the latest data. Below is my source code:

<input type="dropdown" token="reportingDayA">
  <label>Reporting Day Misra apu</label>
  <fieldForLabel>readableDateA</fieldForLabel>
  <fieldForValue>epochA</fieldForValue>
  <search>
    <done>
      <set token="reportingDayDefA">$result.epochA$</set>
    </done>
    <query>index=`std_getAttrIndexBaseName`_qac$oem$ sourcetype=qac Variant="apu $Variant$" Release=$release$ | dedup _time | eval readableDateA=strftime(_time,"%F") | eval epochA=strftime(_time,"%s") | dedup readableDateA | sort - _time</query>
    <earliest>0</earliest>
    <latest></latest>
  </search>
  <default>$reportingDayDefA$</default>
</input>

And the "Latest" keyword should be the default value to be selected.
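A sketch of one approach, assuming the populating search keeps its final sort - _time: relabel the first (most recent) row as "Latest", so the existing done handler, which already sets the default token to the first row's epoch, selects it by default:

```
... existing query ...
| sort - _time
| streamstats count
| eval readableDateA=if(count=1, "Latest", readableDateA)
| fields - count
```

The dropdown value is still the real epochA, so nothing downstream changes; only the label of the newest entry becomes "Latest".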
I have a set of records where the data has a time column in it. E.g.: here I will have an input from the user, who will enter a date in an input box in a given format (yyyy/mm/dd). I want to find all records that are later than the time entered by the user.
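A minimal sketch, assuming a dashboard text input with token $user_date$ in yyyy/mm/dd form and that the records' time is in _time: convert the entered date to epoch and compare:

```
... base search ...
| eval cutoff=strptime("$user_date$", "%Y/%m/%d")
| where _time > cutoff
```

If the time lives in a string field instead of _time, run the same strptime conversion on that field first so both sides of the comparison are epoch numbers.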
I'm trying to figure out how to add an external link to documentation in checklist.conf. This is for a health check that will be run in the Splunk Monitoring Console. Here is an example from the System hardware provisioning assessment health check:

[system_hardware_provisioning_assessment]
title = System hardware provisioning assessment
category = System and Environment
tags = best_practices, capacity, scalability
description = This evaluates the hardware specifications of Splunk instances tasked with indexing and/or searching data, using reference hardware.
failure_text = One or more hosts has returned CPU or memory specifications that fall below reference hardware recommendations. This might adversely affect indexing or search performance.
suggested_action = If you are experiencing performance issues, consider upgrading hosts to meet or exceed the recommended hardware specs.
doc_link = healthcheck.hardware.reference
doc_title = recommended minimum hardware
applicable_to_groups = dmc_group_search_head,dmc_group_indexer
search = | rest $rest_scope$ /services/server/info \
| eval cpu_core_count = if(isnotnull(numberOfVirtualCores), numberOfVirtualCores, numberOfCores) \
| eval physical_memory_GB = round(physicalMemoryMB / 1024, 0) \
| fields splunk_server cpu_core_count physical_memory_GB \
| eval severity_level = case(cpu_core_count <= 4 OR physical_memory_GB <= 4, 2, cpu_core_count < 12 OR physical_memory_GB < 12, 1, cpu_core_count >= 12 AND physical_memory_GB >= 12, 0, true(), -1) \
| rename splunk_server AS instance cpu_core_count AS "cpu_core_count (current / recommended)" physical_memory_GB AS "physical_memory_GB (current / recommended)" \
| fieldformat cpu_core_count (current / recommended) = 'cpu_core_count (current / recommended)'." / 12" \
| fieldformat physical_memory_GB (current / recommended) = 'physical_memory_GB (current / recommended)'." / 12"

The doc_link isn't a URL, but it does link to: https://docs.splunk.com/Documentation/Splunk/9.0.4/Capacity/Referencehardware?ref=hk

How can I link to an external doc?
When I run the following query over the last 15 minutes, it returns results:

"com.server"
| table id uri statusCode _time
| join type=inner saga_id [search "SecondServer" path="/myPath/*" | table path, id]
| where statusCode >= 400
| stats count by uri, statusCode, path
| sort -count

When I run it over a longer time range, like 60 minutes or the last 24 hours, it does not. I am puzzled by this and I am not sure what I am doing wrong. Could you please help?
Hello Splunkers,

I have the following source file path, which has the date/time in it. How do I write the props and transforms to use the source date/time as _time?

Below is the sample file:

/project/admin/sv/re/sniff/pre/logs/2022-12-16T11-57-36/status

I want the _time (or indexed time) to be 2022-12-16 11-57-36.

Thanks in advance
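A sketch of one index-time approach using INGEST_EVAL (available since Splunk 7.2), assuming the timestamp always appears as a yyyy-mm-ddTHH-MM-SS directory component in the source path; stanza and sourcetype names here are placeholders:

```
# props.conf
[your_sourcetype]
TRANSFORMS-time_from_source = set_time_from_source

# transforms.conf
[set_time_from_source]
INGEST_EVAL = _time=strptime(replace(source, ".*/(\d{4}-\d{2}-\d{2}T\d{2}-\d{2}-\d{2})/.*", "\1"), "%Y-%m-%dT%H-%M-%S")
```

The replace() pulls the timestamp directory out of the source path and strptime converts it to epoch, which INGEST_EVAL writes into _time as each event is ingested.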