All Posts



Unfortunately, I never found a solution.  If you happen to find the fix, please reply with it.
Hi Team, we are facing an issue in our environment: episodes are not being generated after upgrading to ITSI 4.17.1. Everything is enabled (services, correlation searches, NEAP policies), and even so, episodes are not being created. ITSI_event_grouping is enabled and the Rules Engine is working fine. Please provide a solution for this.
Hi @riley_lewis, the avg function must be used in an aggregating command such as stats or timechart: index="mydata" | stats avg(floatnumbers) AS floatnumbers BY mymean Ciao. Giuseppe
When I do this search:
index="mydata" | eval mymean=avg(floatnumbers) | table floatnumbers,mymean
mymean just mimics whatever is in floatnumbers. How do I calculate the mean? I have tried the fieldsummary command, but when I did that, the results would not chart correctly.
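If the goal is to see each value of floatnumbers next to the overall mean in the same table (rather than collapsing the results the way stats does), eventstats can add the aggregate as a new field on every event. A minimal sketch, assuming the same hypothetical index="mydata" and field names as above:

index="mydata"
| eventstats avg(floatnumbers) AS mymean
| table floatnumbers, mymean

The eval avg() function works within a single event, which is why mymean simply echoes floatnumbers when each event carries only one value; computing a mean across events needs an aggregating command such as stats, eventstats, or timechart.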
Finally, this worked for me. Thanks for the advice.  | mvexpand Json | spath "Action{}" input=Json output="Action" | spath "Effect" input=Json output="Effect" | mvexpand Action | table Action Effect
The problem appears to stem from missing newlines before the timestamps. Try these props.conf settings:
[sourcetype::_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = app_context=port_data"([\r\n]*)\d{2}\/\d{2}\/\d{4}
Hello team, I am facing an issue with multiple events getting merged into a single event in tier 3. I do not have this issue with tier 1, or when I manually run the saved search. However, when the saved search runs at its scheduled time, these multiple events get merged into one single event. I even tried adding the values below in props.conf of the Data App, but it did not help:
[sourcetype::_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\d{2}\/\d{2}\/\d{4}\s\d{2}:\d{2}:\d{2}\s\+\d{4}
Below is what the event in tier 3 looks like:
03/28/2024 10:35:00 +0000,search_now=1711622100.000000000,source_host="1.1.1.1 : ip-sample_ip.ec2.internal",metric_label="Port_Connectivity : Reporting no data",instance="Port : 45",metric_value="0",metric_unit="latest",alert_value="100",tower="Port reporting no data",threshold1="-2",threshold2="-1",threshold3="0.5",threshold4="0.5",blacklist_alerts="1",add_info="Time=1711622100.000000;!@#;state=offline;!@#;message=NA;!@#;protocol=NA;!@#;responsetext=NA;!@#;responsetime=1711622100.000000;!@#;returncode=NA;!@#;roundtriptime=NULL;!@#;service_name=NA;!@#;app_context=port_data"03/28/2024 10:35:00 +0000,search_now=1711622100.000000000,source_host="1.1.1.1 : ip-sample_ip.ec2.internal",metric_label="Port_Monitoring : Port_Status",instance="Port : 45",metric_value="201",metric_unit="Status",alert_value="100",tower="Infra",threshold1="0",threshold2="0",threshold3="300",threshold4="500",blacklist_alerts="1",add_info="Time=2024-03-28T10:33:48Z;!@#;state=reachable;!@#;message=reachable;!@#;protocol=UDP;!@#;responsetext=/bin/sh: line 1: nc: command not found;!@#;responsetime=na;!@#;returncode=0;!@#;roundtriptime=NULL;!@#;service_name=IMP;!@#;app_context=port_data"03/28/2024 10:35:00 +0000,search_now=1711622100.000000000,source_host="127.0.0.1 : ip-sample_ip.ec2.internal",metric_label="Port_Connectivity : Reporting no data",instance="Port : 3389",metric_value="0",metric_unit="latest",alert_value="100",tower="Port reporting no data",threshold1="-2",threshold2="-1",threshold3="0.5",threshold4="0.5",blacklist_alerts="1",add_info="Time=1711622100.000000;!@#;state=offline;!@#;message=NA;!@#;protocol=NA;!@#;responsetext=NA;!@#;responsetime=1711622100.000000;!@#;returncode=NA;!@#;roundtriptime=NULL;!@#;service_name=NA;!@#;app_context=port_data"
Every event ends at "app_context=port_data"", to be exact. Please let me know how to resolve this.
Hi Team, The below is the event which we have received into the splunk, Dataframe row : {"_c0":{"0":"{","1":" \"0\": {","2":" \"jobname\": \"A001_GVE_ADHOC_AUDIT\"","3":" \"status\": \"ENDED NOTOK\"","4":" \"Timestamp\": \"20240317 13:25:23\"","5":" }","6":" \"1\": {","7":" \"jobname\": \"BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS\"","8":" \"status\": \"ENDED NOTOK\"","9":" \"Timestamp\": \"20240317 13:25:23\"","10":" }","11":" \"2\": {","12":" \"jobname\": \"BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS_WEEKLY\"","13":" \"status\": \"ENDED NOTOK\"","14":" \"Timestamp\": \"20240317 13:25:23\"","15":" }","16":" \"3\": {","17":" \"jobname\": \"D001_GVE_SOFT_MATCHING_GDH_CA\"","18":" \"status\": \"ENDED NOTOK\"","19":" \"Timestamp\": \"20240317 13:25:23\"","20":" }","21":" \"4\": {","22":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TRX_ORG\"","23":" \"status\": \"ENDED NOTOK\"","24":" \"Timestamp\": \"20240317 13:25:23\"","25":" }","26":" \"5\": {","27":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TYP_123\"","28":" \"status\": \"ENDED NOTOK\"","29":" \"Timestamp\": \"20240317 13:25:23\"","30":" }","31":" \"6\": {","32":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TYP_45\"","33":" \"status\": \"ENDED OK\"","34":" \"Timestamp\": \"20240317 13:25:23\"","35":" }","36":" \"7\": {","37":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TYP_ENPW\"","38":" \"status\": \"ENDED NOTOK\"","39":" \"Timestamp\": \"20240317 13:25:23\"","40":" }","41":" \"8\": {","42":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TYP_T\"","43":" \"status\": \"ENDED NOTOK\"","44":" \"Timestamp\": \"20240317 13:25:23\"","45":" }","46":" \"9\": {","47":" \"jobname\": \"DREAMPC_CALC_ML_NAMESAPCE\"","48":" \"status\": \"ENDED NOTOK\"","49":" \"Timestamp\": \"20240317 13:25:23\"","50":" }","51":" \"10\": {","52":" \"jobname\": \"DREAMPC_MEMORY_AlERT_SIT\"","53":" \"status\": \"ENDED NOTOK\"","54":" \"Timestamp\": \"20240317 13:25:23\"","55":" }","56":" \"11\": {","57":" \"jobname\": \"DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS\"","58":" \"status\": \"ENDED NOTOK\"","59":" \"Timestamp\": \"20240317 13:25:23\"","60":" }","61":" \"12\": {","62":" \"jobname\": \"DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\"","63":" \"status\": \"ENDED NOTOK\"","64":" \"Timestamp\": \"20240317 13:25:23\"","65":" }","66":" \"13\": {","67":" \"jobname\": \"DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS\"","68":" \"status\": \"ENDED OK\"","69":" \"Timestamp\": \"20240317 13:25:23\"","70":" }","71":" \"14\": {","72":" \"jobname\": \"DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\"","73":" \"status\": \"ENDED OK\"","74":" \"Timestamp\": \"20240317 13:25:23\"","75":" }","76":" \"15\": {","77":" \"jobname\": \"DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS\"","78":" \"status\": \"ENDED OK\"","79":" \"Timestamp\": \"20240317 13:25:23\"","80":" }","81":" \"16\": {","82":" \"jobname\": \"DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\"","83":" \"status\": \"ENDED OK\"","84":" \"Timestamp\": \"20240317 13:25:23\"","85":" }","86":" \"17\": {","87":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH\"","88":" \"status\": \"ENDED OK\"","89":" \"Timestamp\": \"20240317 13:25:23\"","90":" }","91":" \"18\": {","92":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH_WEEKLY\"","93":" \"status\": \"ENDED OK\"","94":" \"Timestamp\": \"20240317 13:25:23\"","95":" }","96":" \"19\": {","97":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_SAMCONTDEPOT\"","98":" \"status\": \"ENDED NOTOK\"","99":" \"Timestamp\": \"20240317 13:25:23\"","100":" }","101":" \"20\": 
{","102":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TLXLSP_TRXN\"","103":" \"status\": \"ENDED NOTOK\"","104":" \"Timestamp\": \"20240317 13:25:23\"","105":" }","106":" \"21\": {","107":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR\"","108":" \"status\": \"ENDED OK\"","109":" \"Timestamp\": \"20240317 13:25:23\"","110":" }","111":" \"22\": {","112":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR_WEEKLY\"","113":" \"status\": \"ENDED OK\"","114":" \"Timestamp\": \"20240317 13:25:23\"","115":" }","116":" \"23\": {","117":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON\"","118":" \"status\": \"ENDED NOTOK\"","119":" \"Timestamp\": \"20240317 13:25:23\"","120":" }","121":" \"24\": {","122":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON_WEEKLY\"","123":" \"status\": \"ENDED OK\"","124":" \"Timestamp\": \"20240317 13:25:23\"","125":" }","126":" \"25\": {","127":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI\"","128":" \"status\": \"ENDED NOTOK\"","129":" \"Timestamp\": \"20240317 13:25:23\"","130":" }","131":" \"26\": {","132":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI_WEEKLY\"","133":" \"status\": \"ENDED NOTOK\"","134":" \"Timestamp\": \"20240317 13:25:23\"","135":" }" we have tried to extract the required fields such as Timestamp, Jobname, Status from the above events using the below splunk query index=app_events_dwh2_de_int _raw=*jobname* | rex max_match=0 "\\\\\\\\\\\\\"jobname\\\\\\\\\\\\\":\s*\\\\\\\\\\\\\"(?<Name>[^\\\]+)" | rex max_match=0 "\\\\\\\\\\\\\"status\\\\\\\\\\\\\":\s*\\\\\\\\\\\\\"(?<State>[^\\\]+)" | rex max_match=0 "Timestamp\\\\\\\\\\\\\": \\\\\\\\\\\\\"(?<TIME>\d+\s*\d+\:\d+\:\d+)" | rex max_match=0 "execution_time_in_seconds\\\\\\\\\\\\\": \\\\\\\\\\\\\"(?<EXECUTION_TIME>[\d\.\-]+)" | table "TIME", "Name", "State", "EXECUTION_TIME" | mvexpand TIME But the issue we want to extract only those status jobs with status as " ENDED NOTOK". But we are unable to extract them. Also when we use mvexpand command for the table, it is showing multiple duplicate values.   We request you to kindly look into this and help us on this issue.
In such a small and straightforward environment I used the GUI. I get the sense there is a bug in 9.2.0.x.
I have been using this script to update many of our lookups/datasets, but it's no longer working, giving the following error when downloading the file: "status 403, reason: Forbidden". It was working last week but it suddenly stopped. The user I'm using is the owner of these lookups/datasets, and no permissions were changed. The Splunk instance currently has an issue with an expired certificate; could it be because of that? Or something else?
Hello, I have this really weird problem I've been trying to figure out for the past 2 days without success. Basically I have a Splunk architecture where I want to put the deployment server (DS) on the heavy forwarder, since I don't have a lot of clients and it's just a lab. The problem is as follows: with a fresh Splunk Enterprise instance that is going to be the heavy forwarder, when I set up the client by putting the heavy forwarder's IP address and port in deploymentclient.conf, it first works as intended and I can see the client in Forwarder Management. As soon as I enable forwarding on the heavy forwarder and put in the IP addresses of the indexers, the client no longer shows up in the heavy forwarder's Forwarder Management panel, but it does show up in every other instance's Forwarder Management panel (manager node, indexers, etc.). It's as if the heavy forwarder is forwarding the deployment client information to every instance apart from the heavy forwarder itself. Thanks in advance.
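For reference, a minimal sketch of the two configurations described above. The IP addresses, ports, and group name are placeholders, not values from the actual lab:

deploymentclient.conf on the client instance:
[deployment-client]

[target-broker:deploymentServer]
# heavy forwarder acting as deployment server (placeholder address)
targetUri = 10.0.0.10:8089

outputs.conf on the heavy forwarder, enabling forwarding to the indexers:
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
# indexer receiving ports (placeholder addresses)
server = 10.0.0.21:9997,10.0.0.22:9997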
We can't use xmlkv; the customer will just run index=indexname sourcetype=soucetypename and the data should appear with all the fields already extracted. The events are a combination of non-XML and XML formats. From the non-XML portions we have fields coming in, but from the XML portions we don't get any fields. Ultimately, we have to automate the extraction using props.conf on the back end.
Is it possible to extract those XML parts first and then apply the xmlkv command to them?
Was the issue fixed? I'm having the exact same issue, and weeks ago it was working fine. No change was made to the lookup/dataset permissions, and the user I'm using to access it is the owner of the lookup. Could this be related to an expired Splunk certificate, or something else?
I appreciate your response here, but there are many XML tags in the event, as I mentioned in the example: abc, xyz. So you do not know which tags will appear in the event; it is dynamic. My fields should be created dynamically from the tag's name and the corresponding value. For example, for <abc>Wow</abc>, the field name should not be hardcoded as "abc"; it should take "abc" dynamically and the value should be "Wow".
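One common way to create fields dynamically from tag names at search time, without hardcoding each tag, is a REPORT-based extraction whose FORMAT uses the captured tag name as the field name. This is only a sketch, assuming the XML fragments are simple, flat <tag>value</tag> pairs; the stanza and transform names (my_sourcetype, xml-dynamic-kv) are placeholders:

props.conf:
[my_sourcetype]
REPORT-xml_dynamic = xml-dynamic-kv

transforms.conf:
[xml-dynamic-kv]
# group 1 becomes the field name, group 2 the value
REGEX = <([^>\s/]+)>([^<]+)</\1>
FORMAT = $1::$2
MV_ADD = true

Note that regex-based XML parsing only handles flat, well-formed tags; nested or attribute-heavy XML may still need spath or xmlkv at search time.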
I mean that you should use double quotes (") in this search: | search test_field_name="test_field_name_1"
Hi @meetmshah, this is not working as expected. Search:
log_type=Passed_Authentications MESSAGE_TEXT="Command Authorization succeeded"
| rex field=CmdSet max_match=0 "CmdAV=(?<Command>[^\s]+)|\sCmdArgAV=(?<Command1>[^\s]+)"
| makemv delim="," allowempty=t Command1
| table _time,Command,Command1
There doesn't appear to be anything wrong with the case statement on its own. However, there are other statements which might affect your result, e.g. dedup. Could you please share some events demonstrating your issue?
| spath "Action{}" output="Action" | spath "Effect" | mvexpand Action
Hi guys, I am using multiple keywords to get a count of errors from different messages, so I am trying a case statement to achieve it.
index="mulesoft" applicationName="api" environment="*" (message="Concur Ondemand Started") OR (message="API: START: /v1/fin_Concur") OR (message="*(ERROR): concur import failed for file*") OR (tracePoint="EXCEPTION")
| dedup correlationId
| eval JobName=case(like('message',"Concur Ondemand Started") OR like('message',"API: START: /v1/fin_Concur%") AND like('tracePoint',"EXCEPTION"),"EXPENSE JOB",like('message',"%(ERROR): concur import failed for file%"),"ACCURAL JOB")
| stats count by JobName
But I am getting only the EXPENSE JOB JobName. When I split it into two queries, both JobNames have results.
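One thing worth checking (an assumption about the intent, not a confirmed diagnosis): in eval expressions AND is generally evaluated before OR, so the first case() condition above is read as A OR (B AND C). If the intent is (A OR B) AND C, explicit parentheses make that unambiguous. A sketch of the reworked eval, keeping the field names and values from the original search:

| eval JobName=case(
    (like('message',"Concur Ondemand Started") OR like('message',"API: START: /v1/fin_Concur%")) AND like('tracePoint',"EXCEPTION"), "EXPENSE JOB",
    like('message',"%(ERROR): concur import failed for file%"), "ACCURAL JOB")
| stats count by JobName

As noted in the reply above, dedup correlationId also runs before the eval, so if both message types share a correlationId, only one of them survives to be classified.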