All Posts


hi @yuanliu, when I run the below query, the Trans values are fine, but I am getting negative values and an empty row for delta_Trans and pct_delta_Trans; those field values are not correct.

_time             application  Trans        delta_Trans   pct_delta_Trans
2022-01-22 02:00  app1         3456.000000
2022-01-22 02:00  app2         5632.000000  -1839.000000  -5438.786543
2022-01-22 02:00  app3         5643.000000  36758.000000  99.76435678
2022-01-22 02:00  app4         16543.00000  -8796.908678  -8607.065438
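For what it's worth: the delta command has no by clause, so after mstats ... by application it subtracts adjacent rows that belong to different applications, which would explain the negative and empty values above. A minimal per-application sketch using streamstats instead (same field names as the query in this thread):

| mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application
``` carry the previous row's Trans within each application ```
| streamstats current=f window=1 last(Trans) as prev_Trans by application
| eval delta_Trans = Trans - prev_Trans
| eval pct_delta_Trans = delta_Trans / Trans * 100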
Hi @yuanliu, Thanks for the response. Regarding the first query, which you described as 'Select events in which list{}.name has one unique value "Hello"': is there a way to select events in which all the objects contain name == "Hello", instead of just one unique value? To clarify your comment 'given that list is an array, selecting only the first element for matching may not be what the use case demands': I understand that it sounds weird, but our use case is about selecting events where the first object in an array/list should have type == "code".
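If I read the earlier reply right, its mvcount(mvdedup(...)) == 1 clause already enforces the "all objects" condition: once the names dedup to a single value and that value is "Hello", every object's name is "Hello". As a standalone sketch:

| where mvcount(mvdedup('list{}.name')) == 1 AND mvindex('list{}.name', 0) == "Hello"
``` dedup leaves exactly one distinct name, and that name is "Hello" ```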
Something like this?

| mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application
| delta Trans as delta_Trans
| eval pct_delta_Trans = delta_Trans / Trans * 100
I don't quite get why you want a sparse corner for Total_* but it is hackable

| appendpipe
    [ eval Total_A = null() ]
| eval Total_B = if(isnull(Total_A), Total_B, null())
| eval Unixtime_AB = if(isnull(Total_B), Unixtime_A, Unixtime_B)
| fields Total_* Unixtime_AB

(Note this hack works for a small number of Unixtime_* fields but is not particularly scalable.)

Just in case you want a dense matrix, I'm offering an obvious result set:

Total_AB  Unixtime_AB
1         imaginary_unix_3
2         imaginary_unix_1
3         imaginary_unix_4
4         imaginary_unix_3
5         imaginary_unix_1
6         imaginary_unix_4

To get this, do

| appendpipe
    [ eval Total_A = null() ]
| eval Total_AB = if(isnull(Total_A), Total_B, Total_A)
| eval Unixtime_AB = if(isnull(Total_B), Unixtime_A, Unixtime_B)
| fields - *_A *_B

Here is an emulation you can play with and compare with real data.

| makeresults format=csv data="Unixtime_A, Total_A, Unixtime_B, Total_B
imaginary_unix_1, 1, imaginary_unix_3, 4
imaginary_unix_2, 2, imaginary_unix_1, 5
imaginary_unix_3, 3, imaginary_unix_4, 6"
``` data emulation above ```

Hope this helps.
I want to compare previous hour data with present hour data and get the percentage, using the below query.

| mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application
From the Subject Title, what I mean is that it will increase the row count and decrease the column count - that is my intention. After a series of mathematical computations, I ended up with the following table:

Unixtime_A        Total_A  Unixtime_B        Total_B
imaginary_unix_1  1        imaginary_unix_3  4
imaginary_unix_2  2        imaginary_unix_1  5
imaginary_unix_3  3        imaginary_unix_4  6

Notes:
Unixtime_A may not equal Unixtime_B, but they are formatted the same, that is, snapped to the month with @mon (unixtime).
Total_A and Total_B were the result of various conditional counts, so they need to be separate fields.

The desired table is:

Unixtime_AB       Total_A  Total_B
imaginary_unix_1  1
imaginary_unix_2  2
imaginary_unix_3  3
imaginary_unix_3           4
imaginary_unix_1           5
imaginary_unix_4           6

I can then use | fillnull on this and use a simple stats to sum both totals by Unixtime_AB, like so:

| stats sum(Total_A), sum(Total_B) by Unixtime_AB

I'm not 100% sure if transpose, untable, or xyseries could do this - or if I was misusing them somehow.
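For what it's worth, an alternative sketch to the appendpipe hack above, splitting each row into an A row and a B row with mvexpand; field names are taken from the tables above, and "|" is assumed not to occur in the values:

| eval pair = mvappend("A|" . Unixtime_A . "|" . Total_A, "B|" . Unixtime_B . "|" . Total_B)
| mvexpand pair
``` each original row becomes one A row and one B row ```
| eval which = mvindex(split(pair, "|"), 0)
| eval Unixtime_AB = mvindex(split(pair, "|"), 1)
| eval Total_A = if(which == "A", mvindex(split(pair, "|"), 2), null())
| eval Total_B = if(which == "B", mvindex(split(pair, "|"), 2), null())
| stats sum(Total_A) as Total_A, sum(Total_B) as Total_B by Unixtime_AB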
Hi,
What are the options to integrate AppDynamics with Zabbix, or the other way around, to send data from Zabbix to AppDynamics?
Thanks,
Akhila
I've been struggling to decide the best method to instrument a Java web app running on Azure App Service. There's plenty of documentation for AKS services, ECS services and so on. There's even documentation for .NET services running as an Azure App Service but nothing for my use case.  Is there any documentation available for this specific scenario? I've read and re-read the Java APM documentation but I still feel a bit lost.  Thank you for any help and suggestions!
"I want to search if FailureMsg field (fail_msg1 OR fail_msg2) is found in _raw of my splunk query search results and return only those matching lines. If they (fail_msg1 OR fail_msg2) are not found, return nothing"

I think this sentence is confusing everybody :-). Is it correct to say that FailureMsg already exists in the raw events, and you only want events matching one of the FailureMsg values in your lookup? If the above is true, you have a simple formula:

index="demo1" source="demo2" [inputlookup sample.csv | fields FailureMsg]

Put this back into your sample code, incorporating the correction from @isoutamo, and you get

index="demo1" source="demo2" [inputlookup timelookup.csv | fields FailureMsg]
| rex field=_raw "id_num \{ data: (?P<id_num>\d+) \}"
| rex field=_raw "test_field_name=(?P<test_field_name>.+)]:"
| search test_field_name="test_field_name_1"
| table _raw id_num
| reverse
| filldown id_num
When I try running a search in the Search and Reporting app on my Splunk Enterprise instance, I get an "insufficient permission to access this resource" message. When I click on items under Settings, I get a "500 internal server error" message, and there is a severe warning in my search scheduler that says "searches skipped in the last 24 hours". How do I troubleshoot these and get my Splunk Enterprise instance running normally again?
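For the skipped-searches warning specifically, assuming your role can still search the _internal index, the scheduler log usually shows why searches were skipped; a minimal sketch:

index=_internal sourcetype=scheduler status=skipped
``` the reason field typically points at quota, concurrency, or permission problems ```
| stats count by savedsearch_name, reason
| sort - count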
(cont.)

| eval c0_key = json_keys(c0)
| foreach c0_key mode=json_array
    [eval c0_job = mvappend(c0_job, json_object("key", <<ITEM>>, "job", json_extract(c0, <<ITEM>>)))]
| mvexpand c0_job
| fields c0_job

You now get c0_job:

{"key":0,"job":{"jobname":"A001_GVE_ADHOC_AUDIT","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":1,"job":{"jobname":"BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":2,"job":{"jobname":"BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS_WEEKLY","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":3,"job":{"jobname":"D001_GVE_SOFT_MATCHING_GDH_CA","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":4,"job":{"jobname":"D100_AKS_CDWH_SQOOP_TRX_ORG","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":5,"job":{"jobname":"D100_AKS_CDWH_SQOOP_TYP_123","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":6,"job":{"jobname":"D100_AKS_CDWH_SQOOP_TYP_45","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":7,"job":{"jobname":"D100_AKS_CDWH_SQOOP_TYP_ENPW","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":8,"job":{"jobname":"D100_AKS_CDWH_SQOOP_TYP_T","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":9,"job":{"jobname":"DREAMPC_CALC_ML_NAMESAPCE","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":10,"job":{"jobname":"DREAMPC_MEMORY_AlERT_SIT","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":11,"job":{"jobname":"DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":12,"job":{"jobname":"DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS_WEEKLY","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":13,"job":{"jobname":"DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":14,"job":{"jobname":"DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS_WEEKLY","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":15,"job":{"jobname":"DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":16,"job":{"jobname":"DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS_WEEKLY","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":17,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":18,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH_WEEKLY","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":19,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_SAMCONTDEPOT","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":20,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TLXLSP_TRXN","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":21,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":22,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR_WEEKLY","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":23,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":24,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON_WEEKLY","status":"ENDED OK","Timestamp":"20240317 13:25:23"}}
{"key":25,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}
{"key":26,"job":{"jobname":"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI_WEEKLY","status":"ENDED NOTOK","Timestamp":"20240317 13:25:23"}}

(I could have placed "jobname", etc., directly into root with more SPL magic but it is not worth it.)

Then, you just extract all data using standard spath. Put everything together:

| rex mode=sed "s/^([^_]+)_/\1row_/"
| rex "^[^:]+\s*:\s*(?<json_frame>.+)"
```| eval good = if(json_valid(json_frame), "yes", "no")```
| spath input=json_frame path=row_c0
| eval row_key = json_keys(row_c0)
```| eval r_c0 = json_extract(row_c0, "0") . json_extract(row_c0, "1")```
| eval c0 = ""
| foreach row_key mode=json_array
    [eval c0 = c0 . json_extract(row_c0, <<ITEM>>)]
| fields - _* json_frame row_*
| rex field=c0 mode=sed "s/} *\"/}, \"/g s/\" *\"/\", \"/g s/$/}/"
```| eval good = if(json_valid(c0), "yes", "no")```
| eval c0_key = json_keys(c0)
| foreach c0_key mode=json_array
    [eval c0_job = mvappend(c0_job, json_object("key", <<ITEM>>, "job", json_extract(c0, <<ITEM>>)))]
| mvexpand c0_job
| spath input=c0_job
| fields - c0*

You then get:

job.Timestamp      job.jobname                                                      job.status   key
20240317 13:25:23  A001_GVE_ADHOC_AUDIT                                             ENDED NOTOK  0
20240317 13:25:23  BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS                    ENDED NOTOK  1
20240317 13:25:23  BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS_WEEKLY             ENDED NOTOK  2
20240317 13:25:23  D001_GVE_SOFT_MATCHING_GDH_CA                                    ENDED NOTOK  3
20240317 13:25:23  D100_AKS_CDWH_SQOOP_TRX_ORG                                      ENDED NOTOK  4
20240317 13:25:23  D100_AKS_CDWH_SQOOP_TYP_123                                      ENDED NOTOK  5
20240317 13:25:23  D100_AKS_CDWH_SQOOP_TYP_45                                       ENDED OK     6
20240317 13:25:23  D100_AKS_CDWH_SQOOP_TYP_ENPW                                     ENDED NOTOK  7
20240317 13:25:23  D100_AKS_CDWH_SQOOP_TYP_T                                        ENDED NOTOK  8
20240317 13:25:23  DREAMPC_CALC_ML_NAMESAPCE                                        ENDED NOTOK  9
20240317 13:25:23  DREAMPC_MEMORY_AlERT_SIT                                         ENDED NOTOK  10
20240317 13:25:23  DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS               ENDED NOTOK  11
20240317 13:25:23  DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS_WEEKLY        ENDED NOTOK  12
20240317 13:25:23  DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS                         ENDED OK     13
20240317 13:25:23  DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS_WEEKLY                  ENDED OK     14
20240317 13:25:23  DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS                             ENDED OK     15
20240317 13:25:23  DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS_WEEKLY                      ENDED OK     16
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH               ENDED OK     17
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH_WEEKLY        ENDED OK     18
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_SAMCONTDEPOT      ENDED NOTOK  19
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TLXLSP_TRXN       ENDED NOTOK  20
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR          ENDED OK     21
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR_WEEKLY   ENDED OK     22
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON          ENDED NOTOK  23
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON_WEEKLY   ENDED OK     24
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI               ENDED NOTOK  25
20240317 13:25:23  DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI_WEEKLY        ENDED NOTOK  26

Only this way can I be confident that this is what the app/equipment/device is trying to tell me.
Here is a data emulation you can play with and compare with real data     | makeresults | eval _raw = "Dataframe row : {\"_c0\":{\"0\":\"{\",\"1\":\" \\\"0\\\": {\",\"2\":\" \\\"jobname\\\": \\\"A001_GVE_ADHOC_AUDIT\\\"\",\"3\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"4\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"5\":\" }\",\"6\":\" \\\"1\\\": {\",\"7\":\" \\\"jobname\\\": \\\"BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS\\\"\",\"8\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"9\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"10\":\" }\",\"11\":\" \\\"2\\\": {\",\"12\":\" \\\"jobname\\\": \\\"BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS_WEEKLY\\\"\",\"13\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"14\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"15\":\" }\",\"16\":\" \\\"3\\\": {\",\"17\":\" \\\"jobname\\\": \\\"D001_GVE_SOFT_MATCHING_GDH_CA\\\"\",\"18\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"19\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"20\":\" }\",\"21\":\" \\\"4\\\": {\",\"22\":\" \\\"jobname\\\": \\\"D100_AKS_CDWH_SQOOP_TRX_ORG\\\"\",\"23\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"24\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"25\":\" }\",\"26\":\" \\\"5\\\": {\",\"27\":\" \\\"jobname\\\": \\\"D100_AKS_CDWH_SQOOP_TYP_123\\\"\",\"28\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"29\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"30\":\" }\",\"31\":\" \\\"6\\\": {\",\"32\":\" \\\"jobname\\\": \\\"D100_AKS_CDWH_SQOOP_TYP_45\\\"\",\"33\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"34\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"35\":\" }\",\"36\":\" \\\"7\\\": {\",\"37\":\" \\\"jobname\\\": \\\"D100_AKS_CDWH_SQOOP_TYP_ENPW\\\"\",\"38\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"39\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"40\":\" }\",\"41\":\" \\\"8\\\": {\",\"42\":\" \\\"jobname\\\": \\\"D100_AKS_CDWH_SQOOP_TYP_T\\\"\",\"43\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"44\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"45\":\" }\",\"46\":\" \\\"9\\\": {\",\"47\":\" \\\"jobname\\\": \\\"DREAMPC_CALC_ML_NAMESAPCE\\\"\",\"48\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"49\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"50\":\" }\",\"51\":\" \\\"10\\\": {\",\"52\":\" \\\"jobname\\\": \\\"DREAMPC_MEMORY_AlERT_SIT\\\"\",\"53\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"54\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"55\":\" }\",\"56\":\" \\\"11\\\": {\",\"57\":\" \\\"jobname\\\": \\\"DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS\\\"\",\"58\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"59\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"60\":\" }\",\"61\":\" \\\"12\\\": {\",\"62\":\" \\\"jobname\\\": \\\"DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\\\"\",\"63\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"64\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"65\":\" }\",\"66\":\" \\\"13\\\": {\",\"67\":\" \\\"jobname\\\": \\\"DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS\\\"\",\"68\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"69\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"70\":\" }\",\"71\":\" \\\"14\\\": {\",\"72\":\" \\\"jobname\\\": \\\"DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\\\"\",\"73\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"74\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"75\":\" }\",\"76\":\" \\\"15\\\": {\",\"77\":\" \\\"jobname\\\": \\\"DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS\\\"\",\"78\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"79\":\" \\\"Timestamp\\\": 
\\\"20240317 13:25:23\\\"\",\"80\":\" }\",\"81\":\" \\\"16\\\": {\",\"82\":\" \\\"jobname\\\": \\\"DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\\\"\",\"83\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"84\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"85\":\" }\",\"86\":\" \\\"17\\\": {\",\"87\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH\\\"\",\"88\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"89\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"90\":\" }\",\"91\":\" \\\"18\\\": {\",\"92\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH_WEEKLY\\\"\",\"93\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"94\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"95\":\" }\",\"96\":\" \\\"19\\\": {\",\"97\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_SAMCONTDEPOT\\\"\",\"98\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"99\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"100\":\" }\",\"101\":\" \\\"20\\\": {\",\"102\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TLXLSP_TRXN\\\"\",\"103\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"104\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"105\":\" }\",\"106\":\" \\\"21\\\": {\",\"107\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR\\\"\",\"108\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"109\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"110\":\" }\",\"111\":\" \\\"22\\\": {\",\"112\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR_WEEKLY\\\"\",\"113\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"114\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"115\":\" }\",\"116\":\" \\\"23\\\": {\",\"117\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON\\\"\",\"118\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"119\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"120\":\" }\",\"121\":\" \\\"24\\\": {\",\"122\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON_WEEKLY\\\"\",\"123\":\" \\\"status\\\": \\\"ENDED OK\\\"\",\"124\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"125\":\" }\",\"126\":\" \\\"25\\\": {\",\"127\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI\\\"\",\"128\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"129\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"130\":\" }\",\"131\":\" \\\"26\\\": {\",\"132\":\" \\\"jobname\\\": \\\"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI_WEEKLY\\\"\",\"133\":\" \\\"status\\\": \\\"ENDED NOTOK\\\"\",\"134\":\" \\\"Timestamp\\\": \\\"20240317 13:25:23\\\"\",\"135\":\" }\" } }" ``` data emulation above ```     As noted above, I artificially inserted two closing curly brackets into _raw.  If the app/equipment/device willfully drops them, you can insert them back with something simple as     | eval _raw = _raw . "}}"     Hope this helps.
(The steps are a bit long so this post is split into two) Part 1. Even if you accidentally extract the data you wanted, your code will not be robust.  Instead of trying to rex the piece of info you are seeking, try to restore the underlying data structure first, i.e., try to rex and restore the compliant JSON. Is it correct that the data you illustrated is just one part in a stream of data that make up a larger frame?  Is it possible to illustrate an entire frame, however many events there may be?  If my speculation has any merit, I suspect that this data stream is formulated such that once you string together the _c0.1, and _c0.2, c0.100, etc., you would get a valid JSON object, or a fragment of a valid JSON for key _c0. Let's test this out step by step.  Note: the data you illustrated seems to be missing two closing curly brackets (}).  So I add them in.  There is another problem: Splunk treats leading underscore (_) specially.  For some reason even fromjson is not handling _c0 correctly.  So, I also add a prefix to this key name.  It doesn't change semantics; you can change back to _c0 in the end.     | rex mode=sed "s/^([^_]+)_/\1row_/" ``` prefix key name with "row" ``` | rex "^[^:]+\s*:\s*(?<json_frame>.+)" ``` extract JSON format "row_c0" ``` ```| eval good = if(json_valid(json_frame), "yes", "no")``` | spath input=json_frame path=row_c0 | fields - _* json_frame | eval row_key = json_keys(row_c0) | eval c0 = "" | foreach row_key mode=json_array [eval c0 = c0 . json_extract(row_c0, <<ITEM>>)]     Using the modified sample data (see below), I get c0 row_c0 { "0": { "jobname": "A001_GVE_ADHOC_AUDIT" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "1": { "jobname": "BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "2": { "jobname": "BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS_WEEKLY" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "3": { "jobname": "D001_GVE_SOFT_MATCHING_GDH_CA" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "4": { "jobname": "D100_AKS_CDWH_SQOOP_TRX_ORG" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "5": { "jobname": "D100_AKS_CDWH_SQOOP_TYP_123" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "6": { "jobname": "D100_AKS_CDWH_SQOOP_TYP_45" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "7": { "jobname": "D100_AKS_CDWH_SQOOP_TYP_ENPW" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "8": { "jobname": "D100_AKS_CDWH_SQOOP_TYP_T" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "9": { "jobname": "DREAMPC_CALC_ML_NAMESAPCE" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "10": { "jobname": "DREAMPC_MEMORY_AlERT_SIT" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "11": { "jobname": "DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "12": { "jobname": "DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS_WEEKLY" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "13": { "jobname": "DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "14": { "jobname": "DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS_WEEKLY" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "15": { "jobname": "DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "16": { "jobname": "DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS_WEEKLY" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "17": { 
"jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "18": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH_WEEKLY" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "19": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_SAMCONTDEPOT" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "20": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TLXLSP_TRXN" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "21": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "22": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR_WEEKLY" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "23": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "24": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON_WEEKLY" "status": "ENDED OK" "Timestamp": "20240317 13:25:23" } "25": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } "26": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI_WEEKLY" "status": "ENDED NOTOK" "Timestamp": "20240317 13:25:23" } {"0":"{","1":" \"0\": {","2":" \"jobname\": \"A001_GVE_ADHOC_AUDIT\"","3":" \"status\": \"ENDED NOTOK\"","4":" \"Timestamp\": \"20240317 13:25:23\"","5":" }","6":" \"1\": {","7":" \"jobname\": \"BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS\"","8":" \"status\": \"ENDED NOTOK\"","9":" \"Timestamp\": \"20240317 13:25:23\"","10":" }","11":" \"2\": {","12":" \"jobname\": \"BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS_WEEKLY\"","13":" \"status\": \"ENDED NOTOK\"","14":" \"Timestamp\": \"20240317 13:25:23\"","15":" }","16":" \"3\": {","17":" \"jobname\": \"D001_GVE_SOFT_MATCHING_GDH_CA\"","18":" \"status\": \"ENDED NOTOK\"","19":" \"Timestamp\": \"20240317 13:25:23\"","20":" }","21":" \"4\": {","22":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TRX_ORG\"","23":" \"status\": \"ENDED NOTOK\"","24":" \"Timestamp\": \"20240317 13:25:23\"","25":" }","26":" \"5\": {","27":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TYP_123\"","28":" \"status\": \"ENDED NOTOK\"","29":" \"Timestamp\": \"20240317 13:25:23\"","30":" }","31":" \"6\": {","32":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TYP_45\"","33":" \"status\": \"ENDED OK\"","34":" \"Timestamp\": \"20240317 13:25:23\"","35":" }","36":" \"7\": {","37":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TYP_ENPW\"","38":" \"status\": \"ENDED NOTOK\"","39":" \"Timestamp\": \"20240317 13:25:23\"","40":" }","41":" \"8\": {","42":" \"jobname\": \"D100_AKS_CDWH_SQOOP_TYP_T\"","43":" \"status\": \"ENDED NOTOK\"","44":" \"Timestamp\": \"20240317 13:25:23\"","45":" }","46":" \"9\": {","47":" \"jobname\": \"DREAMPC_CALC_ML_NAMESAPCE\"","48":" \"status\": \"ENDED NOTOK\"","49":" \"Timestamp\": \"20240317 13:25:23\"","50":" }","51":" \"10\": {","52":" \"jobname\": \"DREAMPC_MEMORY_AlERT_SIT\"","53":" \"status\": \"ENDED NOTOK\"","54":" \"Timestamp\": \"20240317 13:25:23\"","55":" }","56":" \"11\": {","57":" \"jobname\": \"DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS\"","58":" \"status\": \"ENDED NOTOK\"","59":" \"Timestamp\": \"20240317 13:25:23\"","60":" }","61":" \"12\": {","62":" \"jobname\": \"DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\"","63":" \"status\": \"ENDED NOTOK\"","64":" \"Timestamp\": \"20240317 13:25:23\"","65":" }","66":" \"13\": {","67":" 
\"jobname\": \"DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS\"","68":" \"status\": \"ENDED OK\"","69":" \"Timestamp\": \"20240317 13:25:23\"","70":" }","71":" \"14\": {","72":" \"jobname\": \"DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\"","73":" \"status\": \"ENDED OK\"","74":" \"Timestamp\": \"20240317 13:25:23\"","75":" }","76":" \"15\": {","77":" \"jobname\": \"DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS\"","78":" \"status\": \"ENDED OK\"","79":" \"Timestamp\": \"20240317 13:25:23\"","80":" }","81":" \"16\": {","82":" \"jobname\": \"DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS_WEEKLY\"","83":" \"status\": \"ENDED OK\"","84":" \"Timestamp\": \"20240317 13:25:23\"","85":" }","86":" \"17\": {","87":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH\"","88":" \"status\": \"ENDED OK\"","89":" \"Timestamp\": \"20240317 13:25:23\"","90":" }","91":" \"18\": {","92":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH_WEEKLY\"","93":" \"status\": \"ENDED OK\"","94":" \"Timestamp\": \"20240317 13:25:23\"","95":" }","96":" \"19\": {","97":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_SAMCONTDEPOT\"","98":" \"status\": \"ENDED NOTOK\"","99":" \"Timestamp\": \"20240317 13:25:23\"","100":" }","101":" \"20\": {","102":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TLXLSP_TRXN\"","103":" \"status\": \"ENDED NOTOK\"","104":" \"Timestamp\": \"20240317 13:25:23\"","105":" }","106":" \"21\": {","107":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR\"","108":" \"status\": \"ENDED OK\"","109":" \"Timestamp\": \"20240317 13:25:23\"","110":" }","111":" \"22\": {","112":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR_WEEKLY\"","113":" \"status\": \"ENDED OK\"","114":" \"Timestamp\": \"20240317 13:25:23\"","115":" }","116":" \"23\": {","117":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON\"","118":" \"status\": \"ENDED NOTOK\"","119":" \"Timestamp\": \"20240317 13:25:23\"","120":" }","121":" \"24\": {","122":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON_WEEKLY\"","123":" \"status\": \"ENDED OK\"","124":" \"Timestamp\": \"20240317 13:25:23\"","125":" }","126":" \"25\": {","127":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI\"","128":" \"status\": \"ENDED NOTOK\"","129":" \"Timestamp\": \"20240317 13:25:23\"","130":" }","131":" \"26\": {","132":" \"jobname\": \"DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI_WEEKLY\"","133":" \"status\": \"ENDED NOTOK\"","134":" \"Timestamp\": \"20240317 13:25:23\"","135":" }" } So, my hypothesis is only partially correct. Obviously c0 resembles a JSON object but without proper comma separation; it also doesn't have the closing curly bracket. The intention of c0 appears to be an order list (as opposed to array).  So, I will rectify the format to fulfill my interpretation.     
| rex field=c0 mode=sed "s/} *\"/}, \"/g s/\" *\"/\", \"/g s/$/}/"
```| eval good = if(json_valid(c0), "yes", "no")```

You now get the real c0:

{
  "0": { "jobname": "A001_GVE_ADHOC_AUDIT", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "1": { "jobname": "BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "2": { "jobname": "BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TSYS_WEEKLY", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "3": { "jobname": "D001_GVE_SOFT_MATCHING_GDH_CA", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "4": { "jobname": "D100_AKS_CDWH_SQOOP_TRX_ORG", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "5": { "jobname": "D100_AKS_CDWH_SQOOP_TYP_123", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "6": { "jobname": "D100_AKS_CDWH_SQOOP_TYP_45", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "7": { "jobname": "D100_AKS_CDWH_SQOOP_TYP_ENPW", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "8": { "jobname": "D100_AKS_CDWH_SQOOP_TYP_T", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "9": { "jobname": "DREAMPC_CALC_ML_NAMESAPCE", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "10": { "jobname": "DREAMPC_MEMORY_AlERT_SIT", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "11": { "jobname": "DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "12": { "jobname": "DREAM_BDV_NBR_PRE_REQUISITE_TLX_LSP_3RD_PARTY_TRNS_WEEKLY", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "13": { "jobname": "DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "14": { "jobname": "DREAM_BDV_NBR_STG_TLX_LSP_3RD_PARTY_TRNS_WEEKLY", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "15": { "jobname": "DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "16": { "jobname": "DREAM_BDV_NBR_TLX_LSP_3RD_PARTY_TRNS_WEEKLY", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "17": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "18": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_GDH_WEEKLY", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "19": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_SAMCONTDEPOT", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "20": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TLXLSP_TRXN", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "21": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "22": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADEABR_WEEKLY", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "23": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "24": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_TRADESON_WEEKLY", "status": "ENDED OK", "Timestamp": "20240317 13:25:23" },
  "25": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" },
  "26": { "jobname": "DREAM_BDV_NEW_BUSINESS_REPORTING_PRE_REQUISITE_ZCI_WEEKLY", "status": "ENDED NOTOK", "Timestamp": "20240317 13:25:23" }
}

From here, I will assume that the order of this list has some semantics and apply the same tricks. (You really need to talk to the developers or read the manual of the application/equipment/device that sends these data frames.) (to continue)
You could normalise the hostname using a lookup, such that the primary and secondary of a pair resolve to the same name. Then you can look at when either host of the pair last had an event.
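A minimal sketch of that idea, assuming a hypothetical lookup file host_pairs.csv with columns host and host_group (both the primary and the secondary map to the same host_group) and a 24-hour silence threshold:

| tstats latest(_time) as last_seen where index=* by host
| lookup host_pairs.csv host OUTPUT host_group
``` a group only surfaces when neither of its hosts has reported recently ```
| stats max(last_seen) as last_seen by host_group
| where last_seen < relative_time(now(), "-24h")

An index column in the lookup (and in the by clauses) would cover the per-index mapping mentioned in the question below.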
Ideally, you should avoid join if possible. It looks like the first part of your search could be replaced by this:

index=default-va6* sourcetype="myengine-stage" "API call is True for MyEngine" OR "Target language count"
| rex field=_raw "request_id=(?<reqID>.+?) - "
| rex field=_raw "Target language count (?<num_target>\d+)"
| stats first(num_target) as num_target by reqID

For the second join, x-request-id is not returned by the subsearch, so the join will fail anyway. Perhaps there is another way to approach this, but for that we would need some (anonymised) sample events from your data sources, and perhaps a non-SPL definition of what it is you are trying to achieve.
All, I am looking for a solution to identify the hosts that have stopped reporting to Splunk, using a lookup table. However, the condition is that there are Primary and Secondary hosts for some data types, and I do not want to get alerted if either of the hosts (Primary or Secondary) is reporting. At the same time, I would like to map these hosts to their respective index, so that if a host (both primary and secondary in some cases) from a particular index stops reporting, an alert triggers (I will probably have another column for the index, mapping the hosts). Any solution would be highly appreciated!!
I'm trying to calculate the data throughput for a cloud computing solution that will be charging based on outgoing data throughput. We're collecting on the link using Security Onion and forwarding those Zeek logs to our Splunk instance.

index="zeek" source="conn.log" ((id.orig_h IN `front end`) AND NOT (id.resp_h IN `backend`)) OR ((id.resp_h IN `front end`) AND NOT (id.orig_h IN `backend`))
| fields orig_bytes, resp_bytes
| eval terabytes=(((((resp_bytes+orig_bytes)/1024)/1024)/1024)/1024)
| stats sum(terabytes)

This gives me traffic throughput in and out of the network for external connections. However, what I need is to count orig_bytes only when id.orig_h is in my `frontend` and resp_bytes only when id.resp_h is in `frontend`. I can get them separately by just doing two different searches and then adding the results up by hand, but I'm sure there's a way to do what I want in one search using some sort of conditional. I've tried using where and eval if, but I'm just not skilled enough, it seems.
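A sketch of that conditional using eval case(). The base search keeps the macros as in the query above, but since eval cannot expand search-time macros, the sketch assumes the frontend addresses can be expressed as a CIDR block; 203.0.113.0/24 is a placeholder to replace with the real range:

index="zeek" source="conn.log" ((id.orig_h IN `front end`) AND NOT (id.resp_h IN `backend`)) OR ((id.resp_h IN `front end`) AND NOT (id.orig_h IN `backend`))
``` count only the bytes sent by the frontend side of each connection ```
| eval out_bytes = case(
    cidrmatch("203.0.113.0/24", 'id.orig_h'), orig_bytes,
    cidrmatch("203.0.113.0/24", 'id.resp_h'), resp_bytes,
    true(), 0)
| stats sum(out_bytes) as out_bytes
| eval terabytes = out_bytes / pow(1024, 4)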
I'm attempting to compute the total number of API calls from our backend engine. Initially, I process API identification text logs as events in the engine's index, enabling me to filter the respective request IDs. Simultaneously, I process the target_num count within the same index/source. By merging these two logs through a join operation, I filter out all relevant requests to compute the total API calls accurately, achieving the desired outcome. Subsequently, I aim to enhance this by joining the filtered request IDs with another platform's index/source. Here, I intend to determine the success or failure status of each request at the platform level and then multiply it by the original value of target_num. However, upon combining these queries, I'm experiencing discrepancies in the execution results. I'm uncertain about the missing piece causing this issue.

My final query (x-request-id is an existing field on the platform index; there is no rex I am using for it):

index=default-va6* sourcetype="myengine-stage" "API call is True for MyEngine"
| rex field=_raw "request_id=(?<reqID>.+?) - "
| dedup reqID
| join reqID
    [ search index=default-va6* sourcetype="myengine-stage" "Target language count"
    | rex field=_raw "request_id=(?<reqID>.+?) - "
    | rex field=_raw "Target language count (?<num_target>\d+)"
    | dedup reqID
    | fields reqID, num_target ]
| fields reqID, num_target
| stats count("reqID") as total_calls by num_target
| eval total_api_calls = total_calls * num_target
| stats sum(total_api_calls) as Total_Requests_Received
| rename reqID AS "x-request-id"
| join "x-request-id"
    [ search index=platform-va6 sourcetype="platform-ue*" "Marked request as"
    | eval num_succeed = if(like(message, "Marked request as succeed%"), 1, 0)
    | eval num_failed = if(like(message, "Marked request as failed%"), 1, 0)
    | fields num_succeed, num_failed ]
| fields num_succeed, num_failed
| stats sum(num_succeed) as num_succeed, sum(num_failed) as num_failed
| eval total_succeed_calls = num_succeed * num_target, total_failed_calls = num_failed * num_target
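Building on the reply above about avoiding join, a hedged sketch of one join-free shape for combining both indexes, assuming reqID in the engine logs and x-request-id in the platform logs identify the same request and that message exists as a field on the platform events:

(index=default-va6* sourcetype="myengine-stage" ("API call is True for MyEngine" OR "Target language count")) OR (index=platform-va6 sourcetype="platform-ue*" "Marked request as")
| rex field=_raw "request_id=(?<reqID>.+?) - "
``` unify the identifier across both sources ```
| eval reqID = coalesce(reqID, 'x-request-id')
| rex field=_raw "Target language count (?<num_target>\d+)"
| eval num_succeed = if(like(message, "Marked request as succeed%"), 1, 0)
| eval num_failed = if(like(message, "Marked request as failed%"), 1, 0)
| stats first(num_target) as num_target, sum(num_succeed) as num_succeed, sum(num_failed) as num_failed by reqID
| eval total_succeed_calls = num_succeed * num_target, total_failed_calls = num_failed * num_target
| stats sum(total_succeed_calls) as Total_Succeed, sum(total_failed_calls) as Total_Failed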
<search>
  <query>index="ourIndex" sourcetype=$stype$ABC AND Is_Service_Account="True" OR Is_Service_Account="False" earliest=-48h
| eval DC=upper(DC)
| eval env1=case(DC like "%Q%","QA", DC like "%DEV%","DEV", true(), "PROD")
| search env1=$envPure$ AND $domainPure$
| rename DC AS domainPure
| stats count</query>
  <earliest>0</earliest>
  <latest></latest>
</search>

If earliest=-48h is set in the query but the source code contains <earliest>0</earliest>, what would happen if we enable an admission rule that disables All Time searches?
Here is an idea: Select events in which list{}.name has one unique value "Hello", and has a value of "code" as the first element of list{}.type.

| where mvindex('list{}.type', 0) == "code" AND 'list{}.name' == "Hello" AND mvcount(mvdedup('list{}.name')) == 1

However, given that list is an array, selecting only the first element for matching may not be what the use case demands. (Work with developers to figure out what semantics array order may convey.) Here is one to select any element with value "code".

| where 'list{}.type' == "code" AND 'list{}.name' == "Hello" AND mvcount(mvdedup('list{}.name')) == 1

Here is an emulation of your mock data for you to play with and compare with real data

| makeresults
| fields - _*
| eval data = mvappend("{ \"list\": [ {\"name\": \"Hello\", \"type\": \"code\"}, {\"name\": \"Hello\", \"type\": \"document\"} ] }", "{ \"list\": [ {\"name\": \"Hello\", \"type\": \"code\"}, {\"name\": \"World\", \"type\": \"document\"} ] }", "{ \"list\": [ {\"name\": \"Hello\", \"type\": \"document\"}, {\"name\": \"Hello\", \"type\": \"document\"} ] }")
| mvexpand data
| rename data AS _raw
| spath
``` data emulation above ```

With this data, output is the same for both variants:

_raw                                                                                       list{}.name  list{}.type
{ "list": [ {"name": "Hello", "type": "code"}, {"name": "Hello", "type": "document"} ] }  Hello        code
                                                                                           Hello        document
Interesting, thanks for taking the time to reply to my queries. @PaulPanther