All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi, I am trying to use btool to find an index that is referenced in an inputs.conf:

./splunk btool inputs list --debug | grep "indexname"

However, I get nothing back. Am I doing something wrong?

Thanks,
Joe
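One possible explanation, assuming the index really is set in an inputs.conf on this host: btool prints assignments as "index = indexname" with the source file as a prefix, and grep is case-sensitive, so a case mismatch or typo returns nothing. A minimal sketch of a looser search (standard btool and grep options only):

./splunk btool inputs list --debug | grep -i "index"
./splunk btool inputs list --debug | grep -i -B 5 "indexname"

The first command lists every index assignment together with the file it came from; the second adds five lines of context so the owning stanza is visible. If neither matches, the index may be assigned on a different instance (e.g. a forwarder) rather than the one where btool is being run.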
Hi, I'm investigating Windows logs in Splunk and struggling to apply the correct filter. What filter do I need to find persistence in the Windows registry, i.e. the Sysmon Event ID 13 events showing the registry key used to maintain persistence? I also need a filter for which port number is listening for an incoming connection, using Sysmon Event IDs 12 and 13. My current search is just: index=*. Any assistance will be immensely appreciated.
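A minimal SPL sketch, assuming the events were collected with the Splunk Add-on for Sysmon (the sourcetype below is that add-on's default and may differ in your environment):

index=* sourcetype="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational" EventCode=13 TargetObject="*\\CurrentVersion\\Run*"
| table _time host Image TargetObject Details

EventCode=13 is a registry value set, and the Run/RunOnce keys are the classic persistence locations; swapping in EventCode=12 shows registry key create/delete events. Listening ports are normally visible in Sysmon Event ID 3 (network connection) events rather than registry events, so the port question likely needs that EventCode instead.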
Hi, I am trying to create a dashboard with a stats summary based on error text that resides in the log message. Can someone help with how to extract the respective text and mark it as a field, so I can get counts for the final stats?

Sample event data:

2023-05-12 09:48:30,580 - abc_sdk._internal.worker - INFO - _worker_process - request_id=xyz-4fa1-b28a-0b62e4199a53 - x-gw-ims-client-id=X_webapp x-gw-ims-user-id=abc@xyz.com Total time taken: 63.58056879043579 can not stack video

I need #Total requests and #Total errors for unstacked videos.
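A minimal sketch, assuming "can not stack video" is the error text to count and every _worker_process line counts as a request (your_index is a placeholder):

index=your_index "_worker_process"
| rex "request_id=(?<request_id>\S+)"
| eval is_error=if(searchmatch("can not stack video"),1,0)
| stats count AS total_requests, sum(is_error) AS total_errors

searchmatch() tests the raw event for the phrase; the rex pulls request_id out as a field in case per-request stats are needed instead of totals.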
Hi, I need help fixing the below error. My props: Sample events:
Hello, I need some help fixing the below error:

03-14-2014 17:11:49.108 -0300 ERROR LineBreakingProcessor - Line breaking regex has no capturing groups: ^\{ - data_source=..........

My props:

[source::abc]
disabled=false
pulldown_type=true
TRUNCATE=25000
TIME_PREFIX="timestamp"\s* :\s*"
LINE_BREAKER=^\{
BREAK_ONLY_BEFORE=^{
CHARSET=UTF-8
SHOULD_LINEMERGE=true
category=Custom
pulldown=true

Sample log:

{
  "maexUniqueld": "414D51204D4532352020202020202020B3A95C64016F0040",
  "mgexEventCommon": {
    "examgr": "ME25",
    "exreason": "CHLSTPU",
    "extype": "CHANNEL",
    "evobjname": "DIRECT.TCP",
    "exobjtype": "CHANNEL",
    "evuserid": "",
    "summary": "Channel - Stopped by User - Channel:DIRECT.TCP",
    "cfbcmd": 46,
    "cfhreason": 2279,
    "extime": "2023-05-11T08:39:23Z",
    "extimesecs": 1683794363
  },
  "mgexData": {
    "channel": "DIRECT.TCP",
    "csnqual": 10
  }
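The error means exactly what it says: LINE_BREAKER must contain at least one capturing group, because Splunk treats the first group's match as the boundary between events, and ^\{ has none. A minimal sketch of a stanza that breaks one event per JSON object (a sketch, not a drop-in, since timestamp fields vary):

[source::abc]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\{
TIME_PREFIX = "extime"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%SZ
TRUNCATE = 25000
CHARSET = UTF-8

The newlines in the capturing group are discarded and the next event starts at the {. With LINE_BREAKER doing the work, SHOULD_LINEMERGE should be false and BREAK_ONLY_BEFORE becomes unnecessary; the sample has an extime field rather than a timestamp field, so the TIME_PREFIX above assumes extime is the intended timestamp.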
Hello, I need some help with the below issue:

05-11-2023 07:01:23:156 -0400 ERROR LineBreakingProcessor [1956104 parsing_3] - Line breaking regex has no capturing groups: ^\{ - data_source="D:\Apps.......TXT", .....

My props:

LINE_BREAKER=^\{
NO_BINARY_CHECK=true
BREAK_ONLY_BEFORE=^\{
CHARSET=UTF-8
disabled=false
KV_MODE=json
MAX_TIMESTAMP_LOOKAHEAD=70
TIME_PREFIX=timeStamplevtime"\: \s*"
TIME_FORMAT=%Y-%m-%dT%I:%M:%S
TRUNCATE=999999

Sample logs:

{
  "maexUniqueld": "414D51204D4532352020202020202020B3A95C64016F0040",
  "mgexEventCommon": {
    "examgr": "ME25",
    "exreason": "CHLSTPU",
    "extype": "CHANNEL",
    "evobjname": "DIRECT.TCP",
    "exobjtype": "CHANNEL",
    "evuserid": "",
    "summary": "Channel - Stopped by User - Channel:DIRECT.TCP",
    "cfbcmd": 46,
    "cfhreason": 2279,
    "extime": "2023-05-11T08:39:23Z",
    "extimesecs": 1683794363
  },
  "mgexData": {
    "channel": "DIRECT.TCP",
    "csnqual": 10
  }
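This is the same capture-group problem as in the previous post: LINE_BREAKER=^\{ has no capturing group, so a form like LINE_BREAKER = ([\r\n]+)\{ is needed. The timestamp settings also look off for the sample's extime value ("2023-05-11T08:39:23Z"); a hedged sketch of just those lines, assuming extime is the intended timestamp:

TIME_PREFIX = "extime"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%SZ
MAX_TIMESTAMP_LOOKAHEAD = 70

%H (24-hour) rather than %I (12-hour) matches ISO-8601 times, and TIME_PREFIX must match the literal text immediately before the timestamp in the raw JSON.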
Hi, in a Splunk Cloud environment, how do I determine which logs and sourcetypes are being used by a specific dashboard, use case, or other metric?
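One way to approach this, as a sketch: dashboard definitions are exposed over REST, so a dashboard's XML can be pulled and read for the indexes and sourcetypes its searches reference ("My Dashboard" below is a placeholder label, and REST access on Cloud is subject to your stack's restrictions):

| rest /servicesNS/-/-/data/ui/views splunk_server=local
| search label="My Dashboard"
| table title label eai:data

The eai:data column contains the dashboard source, including every inline search string.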
Hi Team, I am collecting metrics via API calls every 5 minutes, but all the metrics arrive as a single event, as below, for each 5-minute pull:

confluent_kafka_server_request_bytes{kafka_id="tythtyt",principal_id="sa-r29997",type="Fetch",} 2092668.0 1683872880000
confluent_kafka_server_request_bytes{kafka_id="tythtyt",principal_id="sa-9pyr8m",type="Metadata",} 1849.0 1683872880000
confluent_kafka_server_request_bytes{kafka_id="tythtyt",principal_id="sa-r29997",type="Metadata",} 66279.0 1683872880000
confluent_kafka_server_request_bytes{kafka_id="tythtyt",principal_id="u-09pr56",type="Metadata",} 0.0 1683872880000
confluent_kafka_server_response_bytes{kafka_id="rtrtt",principal_id="sa-y629ok",type="Fetch",} 5019.0 1683872880000
confluent_kafka_server_response_bytes{kafka_id="trtrt",principal_id="sa-8gg7jr",type="Metadata",} 0.0 1683872880000
confluent_kafka_server_memory{kafka_id="yyyy",topic="host002.json.cs.tt",} 1.0 1683872880000
confluent_kafka_server_memory{kafka_id="yyyy",topic="host002.json.cs.tt.enriched",} 1.0 1683872880000
confluent_kafka_server_memory{kafka_id="yyyy",topic="host002.json.cs.tt.fulfilment.auto",} 1.0 1683872880000
confluent_kafka_server_memory{kafka_id="yyyy",topic="host002.json.cs.tt.gg",} 0.0 1683872880000

I need to break these into individual events (each event starting at the text "confluent_kafka_"). I have edited my props.conf as below, but the data is still arriving as a single event. Can someone please guide me on how to do this?

[source::kafka_metrics://kafka_metrics]
LINE_BREAKER = (confluent_kafka_)(\s)
SHOULD_LINEMERGE = false
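A possible fix, as a sketch: the first capturing group in LINE_BREAKER is the text Splunk discards as the event boundary, so (confluent_kafka_)(\s) strips the confluent_kafka_ prefix from each event rather than breaking cleanly in front of it. A zero-width lookahead keeps the prefix inside the event (this assumes the metrics are newline-separated, as in the sample):

[source::kafka_metrics://kafka_metrics]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=confluent_kafka_)

Also worth checking: line-breaking settings in props.conf only take effect on the first instance that parses the data (heavy forwarder or indexer), so if the API input runs elsewhere the stanza must live on that parsing tier.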
Hi All, we have a scenario where we need to throw an alert if the system error rate exceeds 5%, i.e. (#system errors / #total volume) * 100. How do we get the count of total events and of system errors, then calculate the percentage from those counts per the formula above?

Example query for total volume:
sourcetype="sfdc:transaction_log__c" | eval message = "b2cforce-liveperson" | where like(_raw,"%".message."%")

Query for system errors:
sourcetype="sfdc:transaction_log__c" | eval message = "userId Retrieval Failure" | where like(_raw,"%".message."%")
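One way to combine the two into a single alert search, as a sketch (it assumes, per the two queries above, that an event counts toward total volume when it contains "b2cforce-liveperson" and as a system error when it contains "userId Retrieval Failure"):

sourcetype="sfdc:transaction_log__c" "b2cforce-liveperson"
| eval is_error=if(like(_raw,"%userId Retrieval Failure%"),1,0)
| stats count AS total_volume, sum(is_error) AS system_errors
| eval error_rate=round((system_errors/total_volume)*100,2)
| where error_rate > 5

Saved as an alert that triggers when the number of results is greater than zero, the final where clause implements the 5% threshold.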
I can't extract the exact text using the rex command. E.g.:

User:  This is my user Name\n  This is just some random text

I want to extract a new field whose value is "This is my user Name".
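A minimal sketch, assuming the \n in the sample is a real newline; if it is instead the literal two characters backslash-n, the second form applies (the four backslashes are what SPL needs so the regex engine sees a literal backslash, though escaping can vary by data source):

| rex field=_raw "User:\s+(?<user_name>[^\r\n]+)"
| rex field=_raw "User:\s+(?<user_name>.+?)\\\\n"

The first pattern captures everything after "User:" up to the end of the line; the second captures lazily up to the literal \n sequence.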
Hi Team, we have events arriving from CSV and JSON files in the below format:

"message": "Dataframe row : {\"_c0\":{\"0\":\"Linux\",\"1\":\"00:00:01\",\"2\":\"00:10:01\",\"3\":\"00:20:01\",\"4\":\"00:30:01\",\"5\":\"00:40:01\",\"6\":\"00:50:01\",\"7\":\"01:00:01\",\"8\":\"01:10:01\",\"9\":\"01:20:02\",\"10\":\"01:30:01\",\"11\":\"01:40:01\",\"12\":\"01:50:01\"},\"_c2\":{\"0\":\"(fraasdwhbdd1.de.db.com)\",\"1\":\"%user\",\"2\":\"1.28\",\"3\":\"1.05\",\"4\":\"1.13\",\"5\":\"1.25\",\"6\":\"0.98\",\"7\":\"1.08\",\"8\":\"1.75\",\"9\":\"1.04\",\"10\":\"1.22\",\"11\":\"1.11\",\"12\":\"1.05\"}

The event relates to CPU utilization sent to Splunk from CSV and JSON files produced by running the SAR command. The requirement is to extract the timestamp values such as 00:00:01, 00:10:01, 00:20:01 and the %utilization values such as 1.28, 1.05 from the message event, and add them as separate columns or fields. We also have to build a dashboard panel from those values. Any help would be appreciated.
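A sketch of one approach: pull the embedded JSON out of the event, then let spath turn the _c0 object (timestamps) and _c2 object (%user values) into fields. Index and field names here are assumptions based on the sample:

index=your_index "Dataframe row"
| rex field=_raw "Dataframe row : (?<df_json>\{.+\})"
| spath input=df_json
| rename _c0.* AS time_*, _c2.* AS util_*
| table time_* util_*

If the backslash escaping survives into df_json, a replace() on the field to strip the escapes may be needed before the spath; once the fields exist, a stats or chart over them can feed the dashboard panel.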
Hi All, I am trying to figure out where the logs being ingested into my Splunk Cloud environment are coming from. Where in the logs can I find this? Thanks
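A quick way to inventory what is arriving and from where, as a sketch (tstats reads only index-time metadata, so it is cheap to run across everything):

| tstats count latest(_time) AS latest_time WHERE index=* BY index sourcetype host source
| convert ctime(latest_time)

Each row shows an index/sourcetype/host/source combination, its event count, and when it last received data, which usually narrows down the sending systems.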
How do I adapt the below append-based query to my real-time environment? The query below comes from Splunk Dashboard Studio:

| makeresults count=50 | eval app="Web Server" | eval type="User"
| append [ makeresults count=28 | eval app="Web Server" | eval type="System" ]
| append [ makeresults count=22 | eval app="Web Server" | eval type="Idle" ]
| append [ makeresults count=22 | eval app="Network" | eval type="User" ]
| append [ makeresults count=48 | eval app="Network" | eval type="System" ]
| append [ makeresults count=30 | eval app="Network" | eval type="Idle" ]
| append [ makeresults count=65 | eval app="Load Balancer" | eval type="User" ]
| append [ makeresults count=17 | eval app="Load Balancer" | eval type="System" ]
| append [ makeresults count=18 | eval app="Load Balancer" | eval type="Idle" ]
| append [ makeresults count=50 | eval app="Storage" | eval type="User" ]
| append [ makeresults count=27 | eval app="Storage" | eval type="System" ]
| append [ makeresults count=23 | eval app="Storage" | eval type="Idle" ]
| append [ makeresults count=18 | eval app="Database" | eval type="User" ]
| append [ makeresults count=60 | eval app="Database" | eval type="System" ]
| append [ makeresults count=22 | eval app="Database" | eval type="Idle" ]
| append [ makeresults count=50 | eval app="Security" | eval type="User" ]
| append [ makeresults count=12 | eval app="Security" | eval type="System" ]
| append [ makeresults count=38 | eval app="Security" | eval type="Idle" ]
| append [ makeresults count=45 | eval app="Auth Server" | eval type="User" ]
| append [ makeresults count=32 | eval app="Auth Server" | eval type="System" ]
| append [ makeresults count=23 | eval app="Auth Server" | eval type="Idle" ]
| append [ makeresults count=50 | eval app="CDN" | eval type="User" ]
| append [ makeresults count=28 | eval app="CDN" | eval type="System" ]
| append [ makeresults count=22 | eval app="CDN" | eval type="Idle" ]
| chart count by app, type
| eval sort_field=case(app=="Login",1,app=="Search",2,app=="Cart",3,app=="Pricing",4,app=="Checkout",5,app=="Order Management",6,app=="Gifting",7,app=="Chat",8)
| sort sort_field
| table app, User, System, Idle

My current Splunk search:

index=sample x_host_header=www.sample.com
| eval Device = if(match(useragent,"SM-"),"Android",if(match(useragent,"Windows"),"Windows",if(match(useragent,"Mac"),"Mac",if(match(useragent,"CPH"),"Android",if(match(useragent,"Nokia"),"Android",if(match(cs_user_agent,"Pixel"),"Android",if(match(useragent,"TB-"),"Android",if(match(useragent,"VFD"),"Android",if(match(useragent,"HP%20Pro%20Slate"),"Android",if(match(cs_user_agent,"VOG-L09"),"Android",if(match(useragent,"YAL-L21"),"Android",if(match(useragent,"ATU-L22"),"Android",if(match(useragent,"MAR-LX1A"),"Android",if(match(useragent,"RNE-L22"),"Android",if(match(useragent,"INE-LX2"),"Android",if(match(useragent,"AMN-LX2"),"Android",if(match(useragent,"LYO-LO2"),"Android",if(match(useragent,"DRA-LX9"),"Android",if(match(useragent,"LYA-L29"),"Android",if(match(useragent,"ANE-LX2J"),"Android",if(match(useragent,"STK-L22"),"Android",if(match(useragent,"EML-AL00"),"Android",if(match(useragent,"BLA-L29"),"Android",if(match(useragent,"X11"),"Linux",if(match(useragent,"LDN-LX2"),"Android",if(match(useragent,"TB3-"),"Android",if(match(useragent,"5033T"),"Android",if(match(useragent,"5028D"),"Android",if(match(useragent,"5002X"),"Android",if(match(useragent,"COR-"),"Android",if(match(useragent,"MI%20MAX"),"Android",if(match(useragent,"WAS-LX2"),"Android",if(match(useragent,"vivo"),"Android",if(match(useragent,"EML-L29"),"Android",if(match(useragent,"Moto"),"Android",if(match(useragent,"MMB"),"Android","OTHER"))))))))))))))))))))))))))))))))))))
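A sketch of how the Dashboard Studio demo maps onto the real search: the makeresults/append scaffolding only fakes the data, so it can be dropped entirely and replaced by the real events plus the same chart command. A case() with regex alternation also keeps the device mapping far shorter than 36 nested ifs (the patterns below are a condensed illustration, not the full list):

index=sample x_host_header=www.sample.com
| eval Device=case(
    match(useragent,"Windows"), "Windows",
    match(useragent,"Mac"), "Mac",
    match(useragent,"X11"), "Linux",
    match(useragent,"SM-|CPH|Nokia|TB-|VFD|vivo|Moto|MMB"), "Android",
    true(), "OTHER")
| chart count by Device

chart count by Device yields the same shape of result the demo builds with append, one row per category with counts, so it can feed the same Dashboard Studio visualization.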
Hi, I have a bar chart (Fig 1) whose y-axis has three different fields: for each month we get three bars (A, B, and C). I want a drilldown for this panel, but the panel depends on all three bars. When specifying the token within the panel, i.e. <panel depends="$token$">, how can I set different tokens so that when I click any one bar, I get a drilldown for only that selected bar?

Fig 1: (bar chart screenshot)
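A Simple XML sketch of one way to do this, assuming a bar/column chart where the clicked series name (A, B, or C) arrives as $click.name2$: set the series into a token on click, and have a single drilldown panel whose search uses that token, so one panel serves all three bars:

<drilldown>
  <set token="selected_series">$click.name2$</set>
  <set token="show_drill">true</set>
</drilldown>

<panel depends="$show_drill$">
  <title>Detail for $selected_series$</title>
  <!-- search filtered by $selected_series$ goes here -->
</panel>

$click.name2$ carries the series name for chart clicks; $click.value$ carries the clicked x-axis value (the month) if that is needed in the drilldown search as well.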
Hello All, we have an extracted field (example field name "Field1") with multiple values such as YYN, YNN, NYN, etc. Based on the field value, we would like a new field "NewFieldName" that matches the results as in the sample below.

Field1    NewFieldName
YYN       "OK"
YNN       "NOT OK"
NYN       "NOT OK"

Thanks
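A minimal sketch with eval case(), mapping exactly the three sample values (extend the pairs for the remaining combinations; the final true() catch-all is an assumption):

| eval NewFieldName=case(
    Field1=="YYN", "OK",
    Field1=="YNN", "NOT OK",
    Field1=="NYN", "NOT OK",
    true(), "UNKNOWN")

If the real rule is simply "YYN is OK and everything else is not", a single if(Field1=="YYN","OK","NOT OK") does the same job.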
Hello team, since the Splunk App for AWS is no longer working and has reached end of support, which app is a suitable alternative for AWS dashboards in Splunk Enterprise? Regards, Nikhil
Hello, we need to ingest Cloudflare logs using the Cloudflare TA. Do you have any recommendations on how to proceed with this log ingestion? Any help will be highly appreciated. Thank you!
Hello, I have roll-up events: one file is created every month, and new events are appended to that file every day. How do I avoid duplicate ingestion (i.e. the same events being indexed twice) when Splunk keeps reading from the same file? Any help will be highly appreciated. Thank you.
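For context: a standard monitor input already handles this case, since Splunk tracks how far into each file it has read (the fishbucket) and only ingests newly appended bytes. A sketch of a plain monitor stanza (path, index, and sourcetype are placeholders):

[monitor:///var/log/rollup/*.log]
index = your_index
sourcetype = your_sourcetype

Duplicates usually appear only when something resets that tracking, e.g. a crcSalt setting, the file being rewritten from the top each day instead of appended, or the first 256 bytes of the file changing; those are the things to check if double ingestion is observed.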
Hello, I deploy Splunk via SCCM using a PowerShell script which runs the MSI and then copies a specific deploymentclient.conf file depending on the server type. For some reason, application deployment is failing on all of our domain controllers with an error that correlates to "invalid detection method used". I can see that when the MSI runs, the old version gets uninstalled, but then it ultimately just gets reinstalled again. The newer version supersedes an older version, so could that be part of the issue? Why would this only affect domain controllers when all of our other server installations succeed? Would the MSI detection string be different for domain controllers? This is the PowerShell install command I am using:

(start-process "msiexec.exe" -ArgumentList '/i "splunkforwarder.msi" INSTALLDIR="C:\Program Files\SplunkUniversalForwarder" AGREETOLICENSE=yes /qn /l c:\Install\Log\Splunk_Forwarder_Install.log' -Wait -NoNewWindow -PassThru).ExitCode
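One thing worth ruling out, since "invalid detection method" points at SCCM's post-install check rather than the MSI itself: a script detection method that does not depend on the MSI product code sidesteps product-code changes between superseded versions. A hypothetical PowerShell detection sketch (SplunkForwarder is the universal forwarder's default Windows service name):

# SCCM script detection: emit output only when the forwarder is present
$svc = Get-Service -Name "SplunkForwarder" -ErrorAction SilentlyContinue
if ($svc) { Write-Output "Installed" }

SCCM treats any output from the detection script as "installed" and no output as "not installed", so this check stays valid across MSI upgrades.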
Hi, does anyone know how to fix the Proofpoint TAP WebUI for inputs and configuration? When I launched a clean install of the app on a clean install of Splunk 9.0.4.1, the inputs and configuration pages were blank. However, when I edit the URL to /search, the app loads properly. Does anyone know how to fix this? Thank you