All Posts

Hi @sainag_splunk, I entered your search command in my Splunk Search app, but no results were shown; your command returns nothing from my sourcetype, "my_json". I am not sure how to resolve this issue, and it may cause critical errors for analysing our data. Is there anything else I can try?
What I have tried: the data has a line break after ':', which I think is what causes the parsing error. I tried changing the value to LINE_BREAKER=[}|,]+[\r\n]+, meaning that if a line ends with ":\r\n" the UF should not break the event there. But even after changing LINE_BREAKER, the parsing errors are still raised.
24/10/23 12:02:22.193   10-23-2024 12:02:22.193 +0900 ERROR JsonLineBreaker [7804 structuredparsing] - JSON StreamId:15916142412051242565 had parsing error:Unexpected character: ':' - data_source="C:\splunk\<my_path>.bin", data_host="<my_host>", data_sourcetype="my_json"
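For context on where that setting would live, here is a minimal props.conf sketch of the kind of change being described, assuming the sourcetype stanza is [my_json] and that INDEXED_EXTRACTIONS = json is in use on the forwarder (both are assumptions). Note that LINE_BREAKER requires at least one capturing group, which the value quoted above does not have, and with indexed JSON extractions the structured parser (the JsonLineBreaker in the error) does the breaking, so the change generally needs to be deployed to the UF.

# props.conf on the universal forwarder (sketch; stanza name assumed)
[my_json]
INDEXED_EXTRACTIONS = json
SHOULD_LINEMERGE = false
# The capturing group marks the text discarded between events;
# "[}|,]+[\r\n]+" has no capturing group, so Splunk cannot use it as intended.
LINE_BREAKER = ([\r\n]+)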
Hi @gcusello, Thank you for sharing the formula for the storage, but is it applicable to Splunk Cloud? Also, regarding the average license consumption, are we talking about the data ingestion or the storage entitlement? Thanks in advance!
Hi @afeng Do you want to extract this for logs that are already ingested/existing on the Splunk indexer (search time), or for new logs yet to be ingested into Splunk? (Are you using any add-ons/TAs, and are you using a UF and/or HF?)
I allowed port 8000 through Windows Firewall and enabled firewall logging. Then, from a browser on the local server, I accessed https://192.168.0.8:8000. The browser access timed out, and no entry for it appeared in the firewall log. I think the browser access is being blocked by something before Windows Firewall gets to allow or block it, but I don't know what would deny local access other than Windows Firewall. I use Windows Defender and no other firewall application. What could be stopping browser access on the local server? Does anyone have an idea? Thank you.
Over a decade later, but here is my RPi info and which forwarder worked on it:
@raspberrypi:/opt# uname -a
Linux raspberrypi 6.1.53-v8+ #1680 SMP PREEMPT Wed Sep 13 18:09:06 BST 2023 aarch64 GNU/Linux
From the previous releases page: Splunk Universal Forwarder 8.1.9 / ARMv6 / 2.6+, 3.x+, 4.x+, or 5.x+ kernel Linux distributions / 32-bit
Hi, I am building a dashboard for UPS monitoring and I would like to convert a specific metric, battery age, which gives us some information about when the battery was last changed. I would like to see the result in months and days, as below.
Expected outcome: 1 month 20 days. Current outcome: (see image below).
SPL query:
index="ups" indexed_is_service_aggregate=1 kpi=BatteryAge | lookup service_kpi_lookup _key as itsi_service_id OUTPUT title AS service_name | search service_name="MainUPS" | stats latest(alert_value) AS BatteryAge
Can anyone help me with this?
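A sketch of one way to turn that value into the "1 month 20 days" format, assuming alert_value/BatteryAge is a number of days (the unit is an assumption; if it is an epoch timestamp of the last battery change, derive the day count from now() first):

index="ups" indexed_is_service_aggregate=1 kpi=BatteryAge
| lookup service_kpi_lookup _key as itsi_service_id OUTPUT title AS service_name
| search service_name="MainUPS"
| stats latest(alert_value) AS BatteryAge
| eval months=floor(BatteryAge/30), days=floor(BatteryAge%30)
| eval BatteryAge=months." month ".days." days"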
I tried removing the stats and have:
| mstat min(df_metric.*) WHERE (host=myhost) span=1h index="linux_os_metric" BY MountedOn | table MountedOn
Still nothing in the dropdown. It would be useful to understand a bit more about how these results are returned, as you seem to be implying that they are not suitable for a <query> in a dropdown. Is there a way of converting the result set to be non-multivalue? If I run the search in Search & Reporting, I just get a list of filesystems. I appreciate your input - thanks!
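If MountedOn really does come back as a multivalue field, one way to flatten it into a single-value list for the dropdown is to expand it after the metrics search; a sketch appended to the existing search (assuming the multivalue diagnosis is correct):

<your existing metrics search>
| mvexpand MountedOn
| dedup MountedOn
| table MountedOn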
Thanks for the solution, which worked. When I select the data entity and the time and hit the submit button with the query below, but without selecting the env (test or prod), the search runs based on the default dropdown value applied to the query: if it is test, the index is "np-ap" and stageToken is set to test. I want the submit button to work for the env selection as well, along with data entity and date.
index="np-ap" AND source="--a-test"
<query>index=$indexToken$ AND source="-a-$stageToken$"

<form version="1.1" theme="dark">
  <label> stats</label>
  <fieldset submitButton="true">
    <input type="dropdown" token="indexToken1">
      <label>Environment</label>
      <choice value="pd-ap,prod">PROD</choice>
      <choice value="np-ap,test">TEST</choice>
      <change>
        <eval token="stageToken">mvindex(split($value$,","),1)</eval>
        <eval token="indexToken">mvindex(split($value$,","),0)</eval>
      </change>
      <default>np-ap,test</default>
    </input>
    <input type="dropdown" token="entityToken">
      <label>Data Entity</label>
      <choice value="aa">aa</choice>
      <choice value="bb">bb</choice>
      <choice value="cc">cc</choice>
      <choice value="dd">dd</choice>
      <choice value="ee">ee</choice>
      <choice value="ff">ff</choice>
      <default>aa</default>
    </input>
    <input type="time" token="timeToken" searchWhenChanged="false">
      <label>Time</label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <html id="APIStats">
        <style>
          #user{
            text-align:center;
            color:#BFFF00;
          }
        </style>
        <h2 id="user">API</h2>
      </html>
    </panel>
  </row>
  <row>
    <panel>
      <table>
        <title>Unique</title>
        <search>
          <query>index=$indexToken$ AND source="-a-$stageToken$" | stats count </query>
          <earliest>$timeToken.earliest$</earliest>
          <latest>$timeToken.latest$</latest>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </table>
    </panel>
  </row>
</form>
Oct 22 14:20:45 10.5.0.200 DNAC {"version":"1.0.0","instanceId":"20fd8163-4ca8-424b-a5a9-1e4018372abb","eventId":"AUDIT_LOG_EVENT","namespace":"AUDIT_LOG","name":"AUDIT_LOG","description":"Executing command terminal width 0\nconfig t\nFailed to fetch the preview commands.\n","type":"AUDIT_LOG","category":"INFO","domain":"Audit","subDomain":"","severity":1,"source":"NA","timestamp":1729606845043,"details":{"requestPayloadDescriptor":"terminal width 0\nconfig t\nFailed to fetch the preview commands.\n","requestPayload":"\n"},"ciscoDnaEventLink":null,"note":null,"tntId":"630db6e989269c11640abd49","context":null,"userId":"system","i18n":null,"eventHierarchy":{"hierarchy":"20fd8163-4ca8-424b-a5a9-1e4018372abb","hierarchyDelimiter":"."},"message":null,"messageParams":null,"additionalDetails":{"eventMetadata":{"auditLogMetadata":{"type":"CLI","version":"1.0.0"}}},"parentInstanceId":"9dde297d-845e-40d0-aeb0-a11e141f95b5","network":{"siteId":"","deviceId":"10.7.140.2"},"isSimulated":false,"startTime":1729606845055,"dnacIP":"10.5.0.200","tenantId":"SYS0"}
host = 10.5.0.200
sourcetype = syslog
How do I extract the colon-separated (JSON key:value) fields?
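Since everything after the syslog header is JSON, one option is to isolate the JSON payload and let spath extract the fields at search time; a sketch (the index name is a placeholder):

index=<your_index> sourcetype=syslog host=10.5.0.200
| rex field=_raw "(?<json_payload>\{.*\})"
| spath input=json_payload
| table eventId namespace severity userId network.deviceId details.requestPayloadDescriptor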
With SOAR's Splunk app (Splunk | Splunkbase), you can pull the SID of your search and append that to your Splunk instance's base URL. This is the same format as if you had clicked the share button in Splunk. Unfortunately, using the link returns "Permission Denied" because the SID hasn't actually been shared.   Does anyone know how to make the results of a search run by the Splunk app shareable?
No. It doesn't work like that. A bucket doesn't "roll to SmartStore". A bucket rolls to warm, and the cache manager uploads it to SmartStore when it can. So if you 1) didn't give Splunk a chance to upload the bucket to SmartStore, and 2) didn't have more copies of the bucket (or destroyed all instances at once), then yes, you might have experienced data loss.
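For reference, the warm-bucket-to-remote relationship described here is what the SmartStore settings in indexes.conf express; a minimal sketch (volume name, bucket path and index name are placeholders):

# indexes.conf (sketch)
[volume:remote_store]
storageType = remote
path = s3://my-smartstore-bucket/indexes

[my_index]
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb
# remotePath makes the index SmartStore-enabled; the cache manager
# uploads warm buckets to this volume when it can.
remotePath = volume:remote_store/$_index_name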
Searching for both "AuthenticationResult=passed" and "Authentication failed" at the same time seems counterintuitive. Are you sure your data matches those conditions? Also, if you can reorganize your search to avoid negation, that would be a significant performance benefit.
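To illustrate the negation point (the field, index and sourcetype names here are hypothetical, not taken from the thread): filtering on the value you actually want is generally cheaper than excluding the one you do not.

Instead of:
index=auth sourcetype=my_auth NOT AuthenticationResult=passed | stats count BY user
prefer something like:
index=auth sourcetype=my_auth AuthenticationResult=failed | stats count BY user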
My test machine is also on Splunk version 9.3.1. Could you post sanitized snippets of your JS or dashboard source code? It's hard to see where the issue lies without seeing the full picture.
Well, both yes and no. No, because the message only indicates that scheduled searches have been delayed (ad-hoc searches have the highest priority and, unless you have many concurrent users and a very low-spec environment, they usually run properly). Yes, because ad-hoc search activity influences how many scheduled searches can be spawned. And yes, all-time searches are very rarely a good idea, at least on raw data. Also, even if you have many searches that are supposed to run every 5 minutes, you can often "spread" them over those 5 minutes so that some of them start at minutes 0, 5, 10 and so on, some at 1, 6, 11..., some at 2, 7, 12... You get the drift.
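A sketch of what that staggering looks like as savedsearches.conf cron schedules (the stanza names are placeholders); each search still runs every 5 minutes, just offset by a minute:

# savedsearches.conf (sketch)
[example_search_A]
cron_schedule = 0-59/5 * * * *

[example_search_B]
cron_schedule = 1-59/5 * * * *

[example_search_C]
cron_schedule = 2-59/5 * * * *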
Hi @redmandba, as @ITWhisperer said, you surely have a multivalue MountedOn field, and that isn't acceptable here, so use the BY clause:
| mstat min(df_metric.*) WHERE (host=myhost) span=1h index="linux_os_metric" BY MountedOn | stats count BY MountedOn | sort MountedOn | table MountedOn
Ciao. Giuseppe
Hi @richgalloway, Apologies, this might be a silly question, but I am fairly new to Splunk. I want to understand: is this delayed-search message caused only by scheduled searches, or do ad-hoc searches also contribute to it? I have a few scheduled searches running over "All time"; could this be the cause of the delayed searches, and should I reduce the timeframe of those searches? Also, there are many scheduled searches all running on a cron of every 5 minutes; do I need to change them as well? Thanks in advance.
Could you try this regex: (?s)EventCode=4688.*Token Elevation Type: (%%1936|%%1938|TokenElevationTypeDefault|TokenElevationTypeLimited) And also post your (sanitized) props.conf and transforms.conf if it does not work?
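In case it helps frame the props/transforms request: if the goal is index-time filtering so that only 4688 events with those elevation types are kept (that goal is an assumption on my part), the usual pattern is a nullQueue/indexQueue pair; a sketch with placeholder stanza and sourcetype names:

# props.conf (sketch)
[WinEventLog:Security]
TRANSFORMS-filter4688 = drop_4688, keep_elevated_4688

# transforms.conf (sketch)
[drop_4688]
REGEX = EventCode=4688
DEST_KEY = queue
FORMAT = nullQueue

[keep_elevated_4688]
REGEX = (?s)EventCode=4688.*Token Elevation Type: (%%1936|%%1938|TokenElevationTypeDefault|TokenElevationTypeLimited)
DEST_KEY = queue
FORMAT = indexQueue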
@sainag_splunk I opened inspect for two recently created dashboards, one that has this problem and one that does not. The one where I cannot use the magnifying glass has data-disabled="true", whereas the earlier one has data-disabled="false".

Condition                  data-disabled   Dashboard created
Unable to Open in Search   true            2024-10-16
Able to Open in Search     false           2024-10-09

Date created is the only obvious difference between the two. Even the construction of the two dashboards is the same: I saved a search to create a dashboard, and added some inputs. So my hope is that there is a code element in the JSON that I can tweak to fix this problem; I just need to know where.
So I had the same issue on my Splunk forwarder (a 9.3.x version) and used the recommendation provided at https://www.hurricanelabs.com/splunk-tutorials/splunk-7-1-performing-a-splunk-password-reset, especially the last video, which finally granted me access.