All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

This does not sound right. First of all, why should ThreeDSecureResult and description be one field? They are different parts of a data structure. Second, Splunk should have given you a field named ThreeDSecureResult{@description} with the value Failed.
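For example, here is a minimal mock (an untested sketch, with the raw event assumed to be exactly the XML element you posted) showing spath reading the attribute directly:

```
| makeresults
| eval _raw="<ThreeDSecureResult description=\"Failed\"/>"
| spath input=_raw path=ThreeDSecureResult{@description} output=result
| table result
```

If this gives result=Failed, the same path expression should work against your indexed events.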
How do I rename/conjoin/remove the space between the field "ThreeDSecureResult" and "description"? The value comes up as 'description' rather than 'Failed' when I try to table it in Splunk. The raw event looks like: <ThreeDSecureResult description="Failed"/>
Would like to run a scan on the backend and look for "*M5*-CLDB" or any combination of M5 and CLDB. We have a Splunk distributed environment with indexer and search head clusters. Saved searches, lookups, and dashboards need to be modified due to the cluster name change. Could someone share your thoughts on this?
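I am wondering whether something like this would cover the saved searches from the search bar (an untested sketch; a similar | rest call against data/ui/views should cover dashboard XML):

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| search search="*M5*" search="*CLDB*"
| table title, "eai:acl.app", search
```

Lookups would presumably still need a filesystem or | inputlookup check, since their contents are not exposed by this endpoint.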
Maybe you can explain the requirement further?  You said of your original SPL "it's only counting the matches, i need the URLs that don't exist to count 0."  So, I thought you would want every url without a match.  Could you mock up some data in web_index and explain why the output doesn't meet the requirements?
Here is my emulation

| makeresults
| fields - _time
| eval url = mvappend("google.com", "foo.com", "bar.com", "google.com")
| mvexpand url
``` the above emulates index="web_index" ```

url
google.com
foo.com
bar.com
google.com

Using the exact mock lookup you give, my search will give

url count
bar.com 0
foo.com 0
google.com 2

Is this not what you want, bar.com and foo.com count to 0? Below is the full example

| makeresults
| fields - _time
| eval url = mvappend("google.com", "foo.com", "bar.com", "google.com")
| mvexpand url
``` the above emulates index="web_index" ```
| lookup URLs.csv kurl as url output kurl as match
| eval match = if(isnull(match), 0, 1)
| stats sum(match) as count by url
@niketn How do I make the domain entity multiselect, with value 1: material, value 2: sm, value 3: all? My inputToken and outputToken are different for each data entity selection. How can I pass two tokens from a multiselect to two different search queries?

<row>
<panel>
<input type="dropdown" token="tokEnvironment" searchWhenChanged="true">
  <label>Domain</label>
  <choice value="goodsdevelopment">goodsdevelopment</choice>
  <choice value="materialdomain">materialdomain</choice>
  <choice value="costsummary">costsummary</choice>
  <change>
    <unset token="tokSystem"></unset>
    <unset token="form.tokSystem"></unset>
  </change>
  <default></default>
</input>
<input type="dropdown" token="tokSystem" searchWhenChanged="true">
  <label>Domain Entity</label>
  <fieldForLabel>$tokEnvironment$</fieldForLabel>
  <fieldForValue>$tokEnvironment$</fieldForValue>
  <search>
    <query>| makeresults
| eval goodsdevelopment="airbag",materialdomain="material,sm",costsummary="costing"</query>
  </search>
  <change>
    <condition match="$label$==&quot;airbag&quot;">
      <set token="inputToken">airbagSizeScheduling</set>
      <set token="outputToken">goodsdevelopment</set>
    </condition>
    <condition match="$label$==&quot;costing&quot;">
      <set token="inputToken">costSummary</set>
      <set token="outputToken">costing</set>
    </condition>
    <condition match="$label$==&quot;material&quot;">
      <set token="inputToken">material</set>
      <set token="outputToken">md</set>
    </condition>
  </change>
</input>
Hi Splunkers, I need to send 50 reports out of a Splunk query to my team members every month. Currently I'm pulling them manually and distributing them. We are planning to automate this process by sending the results to Jira, opening tickets with the necessary data, and assigning them to people. What should I do to achieve this? Is there a proper Splunk Jira add-on on Splunkbase? Any recommendations would be helpful. TIA
Hello gcusello, Sorry, I forgot to reply. I rewrote the SPL myself to complete the requirements. Thanks for your help.
Ah wonderful, thanks so much!
That also returned every URL not in my lookup. I guess I could just take the hits and do a duplicate compare in Excel, but it'd be nice to see it all in Splunk. Yeah, sorry about the tag, I wasn't paying attention.
Hello, Thank you for your suggestion. With your suggestion, when using index=vulnerability_index, doesn't the search still correlate 100k IPs with the CSV? Is it possible to create a historical index or DB to bypass the original vulnerability index? For example: index=new_vulnerability_index. The old vulnerability_index has 100k IPs in total, but this new index would only have 2 IPs and 4 rows because it's already pre-calculated:

ip_address   vulnerability         score
192.168.1.1  SQL Injection         9
192.168.1.1  OpenSSL               7
192.168.1.2  Cross-Site Scripting  8
192.168.1.2  DNS                   5
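I imagine something like this scheduled search could populate such a pre-calculated index with collect (an untested sketch; it assumes new_vulnerability_index already exists and the aggregation I want really is one score per ip_address/vulnerability pair):

```
index=vulnerability_index
| stats max(score) as score by ip_address, vulnerability
| collect index=new_vulnerability_index
```

Would that be the right way to build the summary, or is there a better approach?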
As the title says, when troubleshooting data ingestion on search heads, e.g. running an SPL search to check whether certain data is landing in an index, or searching through an add-on like AWS to check its diagnostic logs on the HF: for tests like these, which search head is best to use, the monitoring console, a regular search head, or an SHC?
Pro tip: It is great that you are using makeresults to post a simulation.  But do not mangle JSON. The command you are looking for is mvexpand.

| makeresults
| eval prediction_str_body="[{\"stringOutput\":\"Alpha\",\"doubleOutput\":0.52},{\"stringOutput\":\"Beta\",\"doubleOutput\":0.48}]"
| spath input=prediction_str_body path={}
| mvexpand {}
| spath input={}

Your sample gives

doubleOutput stringOutput {}
0.48 Beta {"stringOutput":"Beta","doubleOutput":0.48}

Hope this helps.
Something like

index=vulnerability_index
    [| inputlookup company.csv
     | stats values(ip_address) as ip_address]
| table ip_address, vulnerability, score
| lookup company.csv ip_address as ip_address OUTPUTNEW ip_address, company, location

Hope this helps
Hey I have the following query:

```
| makeresults
| eval prediction_str_body="[{'stringOutput':'Alpha','doubleOutput':0.52},{'stringOutput':'Beta','doubleOutput':0.48}]"
```

But no matter what I do, I can't seem to extract each element of the list and turn it into its own event. I'd ideally like a table afterwards of the sum of each value:
Alpha: 0.52
Beta: 0.48
For all rows. Thanks!
I assume that Splunk already gives you msg as a field.  You can then use extract on it.

index=wf_pvsi_other wf_id=swbs wf_env=prod sourcetype="wf:swbs:profiling:txt"
| rename msg as _raw
| extract
| search AppBody=SOR_RESP DestId=EIW
| table SrcApp, SubApp, RefMsgNm, DestId, MsgNm
| fillnull value=NA SubApp
| top SrcApp, SubApp, RefMsgNm, DestId, MsgNm limit=100
| rename SrcApp as Channel, SubApp as "Sub Application", RefMsgNm as Message, DestId as SOR, MsgNm as "SOR Message"
| fields Channel, "Sub Application", Message, SOR, "SOR Message", count
| sort Channel, "Sub Application", Message, SOR, "SOR Message", count

(As your new source is JSON, overriding _raw should be fine.)  Hope this helps.
Here's another way to find those transactions - replace transaction with this

| streamstats global=f reset_after="state=1" range(_time) as duration list(_raw) as events count as eventcount by zone
| where state=1
| table _time events zone state duration eventcount
Thanks a lot for the response @gcusello , it works.
So, it seems like your zones repeat themselves. Here is an example of using your data. You can paste this example into your search:

| makeresults
| eval x=split("2023-09-18 11:22:05.9145992, E7F93BB1-608A-4D2F-AF34-0ED1AB279A65, AUR MCPA Alarm 16,2, Full; Bins East; Level 1; Divert Row 057; Zone 113,1,0,192###2023-09-18 11:31:35.7205659, 2C8701D0-7B9D-4F99-8679-A4F3F98086C9, AUR MCPA Alarm 16,2, Full; Bins East; Level 1; Divert Row 057; Zone 113,0,0,192###2023-09-18 11:36:24.1803900, 0C07C755-C59B-4E9F-92A6-E60EC1790E00, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,0,0,192###2023-09-18 12:00:27.1437935, 0BE15F46-AA1E-46D2-97FF-5E8F68EC4415, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,1,0,192###2023-09-18 12:00:37.1563574, 67E5E8C7-3D36-41C9-9062-F71AF3481012, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,0,0,192###2023-09-18 12:00:47.1724708, 39C5326A-B2B6-478A-9756-8FAD049074C9, AUR MCPA Alarm 13,2, Full; Bins East; Level 1; Divert Row 227; Zone 122,1,0,192###2023-09-18 12:00:55.1835517, 7C060FE4-3441-4BEB-AFFE-97D8E0E5F324, AUR MCPA Alarm 13,2, Full; Bins East; Level 1; Divert Row 227; Zone 122,0,0,192###2023-09-18 12:03:27.3790874, B40D0D99-8E60-4AC8-8F34-2DA037945463, AUR MCPA Alarm 24,2, Full; Bins East; Level 1; Divert Row 121; Zone 117,1,0,192###2023-09-18 12:03:31.3853304, B72D54D5-B7B8-4928-83D2-DF64FAAD52BD, AUR MCPA Alarm 24,2, Full; Bins East; Level 1; Divert Row 121; Zone 117,0,0,192###2023-09-18 12:11:28.9249859, 3323D5D6-98BE-4867-86D9-7068225C44E6, AUR MCPA Alarm 19,2, Full; Bins East; Level 1; Divert Row 095; Zone 116,1,0,192###2023-09-18 12:11:32.9266932, 32C54B9A-03E1-4E70-9F6E-F34FF4D4EF8D, AUR MCPA Alarm 19,2, Full; Bins East; Level 1; Divert Row 095; Zone 116,0,0,192###2023-09-18 12:20:34.8242708, 1231E232-07F7-40F6-8CC0-23A80D9693DA, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,1,0,192###2023-09-18 12:21:01.8614482, 
D807C593-5F41-44F3-9BEA-601BCEA45A96, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,0,0,192###2023-09-18 12:41:58.6150128, 04A9F0AC-34E2-4514-9301-E607F5B90DBB, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,1,0,192###2023-09-18 12:42:16.6309373, DAF119E7-8BE5-4B14-AF98-EC34F52CF343, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,0,0,192###2023-09-18 12:45:56.3032344, CF2988F9-7354-4C6F-A320-ED50AF43F149, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,1,0,192###2023-09-18 12:48:22.3814934, F12CAAFE-8861-40A5-8763-EDF02C25722F, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,0,0,192###2023-09-18 12:49:10.4169289, C72DB2E5-A7E6-471C-8BAC-280A91E28338, AUR MCPA Alarm 14,2, Full; Bins East; Level 1; Divert Row 223; Zone 121,1,0,192###2023-09-18 12:53:18.5610031, 4C8CAF70-1A73-4318-A0DF-B42F76352277, AUR MCPA Alarm 18,2, Full; Bins East; Level 1; Divert Row 257; Zone 123,1,0,192###2023-09-18 12:53:56.5822544, 9D2E9472-7FCF-4266-A7C5-76942F4E9D71, AUR MCPA Alarm 18,2, Full; Bins East; Level 1; Divert Row 257; Zone 123,0,0,192###2023-09-18 12:57:56.9627790, CC8B059B-5A4F-46CE-9CB2-0E6F98E95A1B, AUR MCPA Alarm 13,2, Full; Bins East; Level 1; Divert Row 227; Zone 122,1,0,192###2023-09-18 13:01:11.2381480, ECC5639E-14DA-4067-9874-DAC23B56F50A, AUR MCPA Alarm 13,2, Full; Bins East; Level 1; Divert Row 227; Zone 122,0,0,192", "###")
| mvexpand x
| rename x as _raw
| eval _time=strptime(_raw, "%F %T.%Q")
| sort - _time
| fields _time _raw
``` The above creates your data set ```
``` Extract the zone and state ```
| rex "Zone (?<zone>\d+),(?<state>\d)"
``` Now look for 2 events per transaction ```
| transaction maxevents=2 zone startswith=eval(state=1) endswith=eval(state=0)

If you set up a field extraction to extract zone and state automatically, you can then search for zone=X or zone=Y in the search and then the transaction command is simple.
Note that transaction has limitations, and your transactions span quite a long time, so you should look at using some kind of stats to evaluate these.
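For example, one stats-based sketch (untested, and assuming events are sorted ascending in time and each zone strictly alternates state 1 then state 0) would be to number the open/close pairs with streamstats and then aggregate per pair:

```
| rex "Zone (?<zone>\d+),(?<state>\d)"
| sort 0 _time
| streamstats count(eval(state=1)) as session by zone
| stats range(_time) as duration, count as eventcount by zone, session
```

Unlike transaction, this approach is not subject to the memory and maxspan limits, at the cost of losing the concatenated raw events.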
You selected lookup as the label, but are using inputlookup.  You would have the answer if you stick with lookup.

index="web_index"
| lookup URLs.csv kurl as url output kurl as match
| eval match = if(isnull(match), 0, 1)
| stats sum(match) as count by url