All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

<input type="dropdown" token="application" searchWhenChanged="false">
  <label>Application</label>
  <choice value="*">All</choice>
  <fieldForLabel>application_Succ</fieldForLabel>
  <fieldForValue>application_Fail</fieldForValue>
  <search>
    <query>| inputlookup application_lists.csv | search country=$country$ | sort country application_Succ | fields application_Succ application_Fail</query>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </search>
</input>
Hi @SplunkDash, let me understand: do you have all four fields (account_id, Name, Org_Code, UPDATE_DATE) in each event? It shouldn't be possible that some fields aren't visualized unless they are missing in the lookup. In that case, to assign the values where missing you could use join (even if I hate this command!). Is this your requirement, to fill the empty cells with the values from other rows? Ciao. Giuseppe
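If filling the empty cells from other rows is indeed the goal, here is a minimal sketch of the join approach, assuming the lookup is named account_audit.csv and account_id is the key (adjust the names to your actual lookup):

```
| inputlookup account_audit.csv
| fields account_id UPDATE_DATE
| join type=left account_id
    [ | inputlookup account_audit.csv
      | search Name=* Org_Code=*
      | fields account_id Name Org_Code ]
| table account_id Name Org_Code UPDATE_DATE
```

The subsearch keeps only the rows where Name and Org_Code are populated, and the left join copies those values onto every row sharing the same account_id.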
Hi @Muthu_Vinith, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors.
Excellent question.

> or is indexing left up to the remote site indexer?

Left up to the remote site indexer. With RF 3, the remote site is indexing twice as much as the source. Overall, each node indexes 2 additional replicated slices.

> Also, does Splunk replicate raw data or compressed data?

Compressed, if SSL is enabled on the indexers and the following settings in server.conf are set to true.

Under stanza [replication_port-ssl://<port>]:

useSSLCompression = <boolean>
* If true, enables SSL compression.
* Default: false

compressed = <boolean>
* DEPRECATED; use 'useSSLCompression' instead.
* Used only if 'useSSLCompression' is not set.

Under stanza [sslConfig]:

allowSslCompression = <boolean>
* If set to "true", the server allows clients to negotiate SSL-layer data compression.
* KV Store also observes this setting.
* If set to "false", KV Store disables TLS compression.
* Default: true
CHECK_METHOD = modtime is not working as expected due to a regression in 9.x: a wrong calculation leads to unexpected re-reading of a file. Until the next patch, use the following workaround for inputs with CHECK_METHOD = modtime. In inputs.conf, set the following for the impacted stanza:

time_before_close = 0
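As a sketch, the impacted stanza in inputs.conf would then look like this (the monitored path here is hypothetical; substitute your actual input):

```
# Hypothetical monitor stanza - replace the path with your own
[monitor:///var/log/myapp/status.log]
CHECK_METHOD = modtime
# Workaround for the 9.x modtime regression
time_before_close = 0
```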
Where should I add that? Near the application dropdown? Can you please share the code?
Thank you for providing the emulation!  It is really important to illustrate data characteristics when dealing with data analytics.  I made the assumption that each session would only handle one claim.  If that is not the case, we'll have to extract the claim number for correlation.  There are many ways to do this. Because the claim number is always embedded in the file name, I will show the simplest, which applies to both INFO and ERROR. (An alternative is to simply use the file name for correlation.)  So

(index="myindex" "/app1/service/site/upload failed" AND "source=Web" AND "confirmationNumber=ND_*") OR (index="myindex" "Exception from executeScript")
| rex "\bname=(?<name>[^,]+)"
| rex "(?i) claim # *(?<claimNumber>\S+)"
| rex "(?<SessionID>\[http-nio-8080-exec-\d+\])"
| rex "Exception from executeScript: (?<Exception>[^:]+)"
| fields claimNumber confirmationNumber name Exception
| stats min(_time) as _time values(*) as * by claimNumber

Here is the full emulation and result:

| makeresults
| eval data = split("2024-02-15 09:07:47,770 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-202] The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, {comments=xxx-123, senderCompany=Company1, source=Web, title=Submitted via Site website, submitterType=Others, senderName=ROMAN , confirmationNumber=ND_50249-02152024, clmNumber=99900468430, name=ROAMN Claim # 99900468430 Invoice.pdf, contentType=Email} 2024-02-15 09:07:47,772 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-202] Exception from executeScript: 0115100898 Duplicate Child Exception - ROAMN Claim # 99900468430 Invoice.pdf already exists in the location. 
2024-02-15 09:10:47,770 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-202] The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, {comments=xxx-123, senderCompany=Company1, source=Web, title=Submitted via Site website, submitterType=Others, senderName=Bob , confirmationNumber=ND_55555-02152024, clmNumber=99900468999, name=Bob Claim # 99900468999 Invoice.pdf, contentType=Email} 2024-02-15 09:10:48,772 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-202] Exception from executeScript: 0115101000 Document not found - Bob Claim # 99900468999 Invoice.pdf already exists in the location. 2024-02-15 09:41:16,762 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-200] The Upload Service /app1/service/citizens/upload failed in 0.138000 seconds, {comments=yyy-789, senderCompany=Company2, source=Web, title=Submitted via Site website, submitterType=Public Adjuster, senderName=Tristian, confirmationNumber=ND_52233-02152024, clmNumber=99900470018, name=Tristian CLAIM #99900470018 PACKAGE.pdf, contentType=Email} 2024-02-15 09:41:16,764 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-200] Exception from executeScript: 0115100953 Document not found - Tristian CLAIM #99900470018 PACKAGE.pdf", " ")
| mvexpand data
| rename data AS _raw
| rex "^(?<_time>\S+ \S+)"
| eval _time = strptime(_time, "%F %T,%3N")
| extract
``` the above emulates (index="myindex" "/app1/service/site/upload failed" AND "source=Web" AND "confirmationNumber=ND_*") OR (index="myindex" "Exception from executeScript") ```
| rex "\bname=(?<name>[^,]+)"
| rex "(?i) claim # *(?<claimNumber>\S+)"
```| rex "clmNumber=(?<ClaimNumber>[^,]+)" | rex "confirmationNumber=(?<SubmissionNumber>[^},]+)" | rex "contentType=(?<ContentType>[^},]+)" ```
| rex "(?<SessionID>\[http-nio-8080-exec-\d+\])"
| rex "Exception from executeScript: (?<Exception>[^:]+)"
| fields claimNumber confirmationNumber name Exception
| stats min(_time) as _time values(*) as * by claimNumber

claimNumber | _time | Exception | confirmationNumber | name
99900468430 | 2024-02-15 09:07:47.769 | 0115100898 Duplicate Child Exception - ROAMN Claim # 99900468430 Invoice.pdf already exists in the location. | ND_50249-02152024 | ROAMN Claim # 99900468430 Invoice.pdf
99900468999 | 2024-02-15 09:10:47.769 | 0115101000 Document not found - Bob Claim # 99900468999 Invoice.pdf already exists in the location. | ND_55555-02152024 | Bob Claim # 99900468999 Invoice.pdf
99900470018 | 2024-02-15 09:41:16.762 | 0115100953 Document not found - Tristian CLAIM #99900470018 PACKAGE.pdf | ND_52233-02152024 | Tristian CLAIM #99900470018 PACKAGE.pdf
The whole search is... strange, to say the least. You generate a single value (and do that in a strange way: by aggregating by a field and then ignoring the split by that field and summing up everything). Then you use appendcols to add another value which is obtained by joining two data sets from the same index. A very strange and possibly inefficient approach.

Even if you split your data by site, there is no guarantee that both result sets joined by appendcols will have the same order of results (and appendcols doesn't care about any field matching or anything like that, so it's up to you to make sure both result sets are compatible).

Anyway, I suspect there might be a more elegant (and possibly more efficient) way to do the same. Also remember that your search might be subject to subsearch limitations.
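As a sketch of the more elegant route (the index, sourcetype, and field names here are hypothetical, since the original search isn't shown), both aggregates can usually be computed in a single pass over the data instead of appendcols, which sidesteps the ordering problem entirely:

```
index=myindex sourcetype=calls
| stats count(eval(status="success")) as Success
        count(eval(status="failure")) as Failure
        by site
```

Because both counts come from the same stats command split by site, each row is guaranteed to pair the right Success and Failure values, with no reliance on two result sets happening to line up.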
Sorry - typo on my part - you need it like this:

| stats sum(count) as Success by site

You also need site on some of the other stats commands too.
Add the "All" choice to the input - you can have both fixed and dynamic choices in the same dropdown.
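As a minimal sketch (the token, lookup, and field names follow your snippet), a static "All" choice can sit right alongside the dynamically populated choices:

```
<input type="dropdown" token="application" searchWhenChanged="false">
  <label>Application</label>
  <!-- Static choice; "*" matches everything when used in the panel searches -->
  <choice value="*">All</choice>
  <fieldForLabel>application_Succ</fieldForLabel>
  <fieldForValue>application_Fail</fieldForValue>
  <search>
    <query>| inputlookup application_lists.csv | search country=$country$ | sort country application_Succ | fields application_Succ application_Fail</query>
  </search>
  <default>*</default>
</input>
```

The static `<choice>` entries are listed before the results of the `<search>`, and setting `<default>*</default>` makes "All" the initial selection.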
Created 2 dropdowns in a dashboard:

1. Country
2. Applications (getting data from a .csv file)

In the Applications dropdown I am seeing individual applications, but I need an "All" option in the dropdown. How can I do it?

<input type="radio" token="country">
  <label>Country</label>
  <choice value="india">India</choice>
  <choice value="australian">Australian</choice>
  <default>india</default>
  <initialValue>india</initialValue>
  <change>
    <condition label="India">
      <set token="sorc">callsource</set>
    </condition>
    <condition label="Australian">
      <set token="sorc">callsource2</set>
    </condition>
  </change>
</input>
<input type="dropdown" token="application" searchWhenChanged="false">
  <label>Application</label>
  <fieldForLabel>application_Succ</fieldForLabel>
  <fieldForValue>application_Fail</fieldForValue>
  <search>
    <query>| inputlookup application_lists.csv | search country=$country$ | sort country application_Succ | fields application_Succ application_Fail</query>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </search>
</input>
</fieldset>
Hi @ITWhisperer, I tried adding the missing line, but I am not getting the results by site. I think we need to make some changes in the query but I am not getting it. Can anyone help with this?
Hello @gcusello, thank you so much again for your quick reply. I tried that before and it is not working: all duplicate account_id values group together within one event. For example, we should have three separate events for account_id 121, but when I use |where updatedate>comparedate or |search updatedate>comparedate Name=*, those group together in one event. I couldn't remove that or separate them. Is there any way we can do that? Thank you again.
I did a quick and dirty test:

| makeresults count=10
| streamstats count
| eval size=count*10000
| map search="| makeresults count=100000 | streamstats count | search [ | makeresults count=$size$ | streamstats count | tail 1 | table count]"

And got 10 results, from 10k to 100k. So apparently, even though the initial makeresults creates 100k events, the important thing is that the subsearch only returns one result. But if I rephrase the search a bit:

| makeresults count=10
| streamstats count
| eval size=count*10000
| map search="| makeresults count=100000 | streamstats count | search [ | makeresults count=$size$ | streamstats count | table count] | stats count"

all my results are 10000, because all subsearches are finalized at 10k results.
@PickleRick You could be right - I thought I had a use case where it was events being found that caused a problem, but I can't reproduce it, although I can reproduce the problem when the results being returned exceed (or rather are truncated at) 50k.
Field extractions do _not_ change what the raw event looks like. They only extract parts of the original event into specific fields. BTW, instead of reinventing the wheel, why not check out the add-ons that are already on Splunkbase? BTW2, are you sure it's not that your field extractions are defined but not showing because you're searching in fast mode?
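A quick self-contained way to see this for yourself (the event text here is made up for illustration):

```
| makeresults
| eval _raw="user=alice action=login status=200"
| rex field=_raw "user=(?<user>\w+)"
| table _raw user
```

_raw comes back unchanged; the rex extraction only adds the user field alongside it.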
The result is coming like this for the first query:

SessionID: [http-nio-8080-exec-101]
_time: 2024-02-15 00:06:38.457
Exception (multiple values):
0115100018 Could not match parameter list [names, keep] to an operation. org.springframework.extensions.webscripts.WebScriptException
0115100062 Could not find document 20231009_00064.TIF in suspense. org.springframework.extensions.webscripts.WebScriptException
0115100104 Could not find document 20240103_00065.TIF in suspense. org.springframework.extensions.webscripts.WebScriptException
0115100168 Duplicate Child Exception - 02142024_17C0_Email.pdf already exists in the location. org.springframework.extensions.webscripts.WebScriptException
0115100375 Duplicate Child Exception - NB Doc Form 313652.8.24 already exists in the location. org.springframework.extensions.webscripts.WebScriptException
--- (Many More)
clmNumber, confirmationNumber, name: (empty)
Hi @SplunkDash, to remove the empty rows you should use the condition "Name=*". Anyway, I'd use a simpler search:

| inputlookup account_audit.csv
| eval updatedate=strptime(UPDATE_DATE, "%m/%d/%y %H:%M:%S"), comparedate=now()-86400*30
| search updatedate>comparedate Name=*
| table account_id Name Org_Code UPDATE_DATE

Even so, I'd use an index and not a lookup. Ciao. Giuseppe
BTW, when the first query runs, for a split second it looks like it is going to give the data as presented by query 2 (| makeresults), and then it mixes up and provides all the jumbled-up data with nothing in the last three columns. Not sure if this information helps.
Thanks a lot for your reply Yuanliu. When I tried to run the below code I get a very skewed result. The Session ID and Time columns get populated. For Exception, all exceptions for that "day" show up in one row (since I am running a day's worth of report), whether they are related to "confirmationNumber=ND_*" or not. The rest of the three fields are empty.

index="myindex" ("/app1/service/site/upload failed" AND "source=Web" AND "confirmationNumber=ND_*") OR ("Exception from executeScript")
| rex "\bname=(?<name>[^,]+)"
```| rex "clmNumber=(?<ClaimNumber>[^,]+)" | rex "confirmationNumber=(?<SubmissionNumber>[^},]+)" | rex "contentType=(?<ContentType>[^},]+)" ```
| rex "(?<SessionID>\[http-nio-8080-exec-\d+\])"
| rex "Exception from executeScript: (?<Exception>[^:]+)"
| fields clmNumber confirmationNumber name Exception SessionID
| stats min(_time) as _time values(*) as * by SessionID

Secondly, I have data that might have the same sessionID but a different dataset; I am not able to see _time for the second transaction for the same sessionID. Here is the sample data:

| makeresults
| eval data = split("2024-02-15 09:07:47,770 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-202] The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, {comments=xxx-123, senderCompany=Company1, source=Web, title=Submitted via Site website, submitterType=Others, senderName=ROMAN , confirmationNumber=ND_50249-02152024, clmNumber=99900468430, name=ROAMN Claim # 99900468430 Invoice.pdf, contentType=Email} 2024-02-15 09:07:47,772 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-202] Exception from executeScript: 0115100898 Duplicate Child Exception - ROAMN Claim # 99900468430 Invoice.pdf already exists in the location. 
2024-02-15 09:10:47,770 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-202] The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, {comments=xxx-123, senderCompany=Company1, source=Web, title=Submitted via Site website, submitterType=Others, senderName=Bob , confirmationNumber=ND_55555-02152024, clmNumber=99900468999, name=Bob Claim # 99900468999 Invoice.pdf, contentType=Email} 2024-02-15 09:10:48,772 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-202] Exception from executeScript: 0115101000 Document not found - Bob Claim # 99900468999 Invoice.pdf already exists in the location. 2024-02-15 09:41:16,762 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-200] The Upload Service /app1/service/citizens/upload failed in 0.138000 seconds, {comments=yyy-789, senderCompany=Company2, source=Web, title=Submitted via Site website, submitterType=Public Adjuster, senderName=Tristian, confirmationNumber=ND_52233-02152024, clmNumber=99900470018, name=Tristian CLAIM #99900470018 PACKAGE.pdf, contentType=Email} 2024-02-15 09:41:16,764 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-200] Exception from executeScript: 0115100953 Document not found - Tristian CLAIM #99900470018 PACKAGE.pdf", " ")

and here is the result:

SessionID | _time | Exception | clmNumber | confirmationNumber | name
[http-nio-8080-exec-200] | 2024-02-15 09:41:16.762 | 0115100953 Document not found - Tristian CLAIM #99900470018 PACKAGE.pdf | 99900470018 | ND_52233-02152024 | Tristian CLAIM #99900470018 PACKAGE.pdf
[http-nio-8080-exec-202] | 2024-02-15 09:07:47.769 | 0115100898 Duplicate Child Exception - ROAMN Claim # 99900468430 Invoice.pdf already exists in the location.; 0115101000 Document not found - Bob Claim # 99900468999 Invoice.pdf already exists in the location. | 99900468430; 99900468999 | ND_50249-02152024; ND_55555-02152024 | Bob Claim # 99900468999 Invoice.pdf; ROAMN Claim # 99900468430 Invoice.pdf

How can we fix the first query so that it provides data for all columns correctly? Thanks in advance for your time!