All Posts


Where should I add that - near the Application dropdown? Can you please share the code?
Thank you for providing the emulation!  It is really important to illustrate data characteristics when dealing with data analytics.  I made the assumption that each session would only handle one claim.  If that is not the case, we'll have to extract the claim number for correlation.  There are many ways to do this.  Because the claim number is always embedded in the file name, I will show the simplest method that applies to both INFO and ERROR events.  (An alternative is to simply use the file name for correlation.)  So:

(index="myindex" "/app1/service/site/upload failed" AND "source=Web" AND "confirmationNumber=ND_*") OR (index="myindex" "Exception from executeScript")
| rex "\bname=(?<name>[^,]+)"
| rex "(?i) claim # *(?<claimNumber>\S+)"
| rex "(?<SessionID>\[http-nio-8080-exec-\d+\])"
| rex "Exception from executeScript: (?<Exception>[^:]+)"
| fields claimNumber confirmationNumber name Exception
| stats min(_time) as _time values(*) as * by claimNumber

Here is the full emulation and result:

| makeresults
| eval data = split("2024-02-15 09:07:47,770 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-202] The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, {comments=xxx-123, senderCompany=Company1, source=Web, title=Submitted via Site website, submitterType=Others, senderName=ROMAN , confirmationNumber=ND_50249-02152024, clmNumber=99900468430, name=ROAMN Claim # 99900468430 Invoice.pdf, contentType=Email} 2024-02-15 09:07:47,772 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-202] Exception from executeScript: 0115100898 Duplicate Child Exception - ROAMN Claim # 99900468430 Invoice.pdf already exists in the location. 
2024-02-15 09:10:47,770 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-202] The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, {comments=xxx-123, senderCompany=Company1, source=Web, title=Submitted via Site website, submitterType=Others, senderName=Bob , confirmationNumber=ND_55555-02152024, clmNumber=99900468999, name=Bob Claim # 99900468999 Invoice.pdf, contentType=Email} 2024-02-15 09:10:48,772 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-202] Exception from executeScript: 0115101000 Document not found - Bob Claim # 99900468999 Invoice.pdf already exists in the location. 
2024-02-15 09:41:16,762 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-200] The Upload Service /app1/service/citizens/upload failed in 0.138000 seconds, {comments=yyy-789, senderCompany=Company2, source=Web, title=Submitted via Site website, submitterType=Public Adjuster, senderName=Tristian, confirmationNumber=ND_52233-02152024, clmNumber=99900470018, name=Tristian CLAIM #99900470018 PACKAGE.pdf, contentType=Email} 2024-02-15 09:41:16,764 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-200] Exception from executeScript: 0115100953 Document not found - Tristian CLAIM #99900470018 PACKAGE.pdf", " ")
| mvexpand data
| rename data AS _raw
| rex "^(?<_time>\S+ \S+)"
| eval _time = strptime(_time, "%F %T,%3N")
| extract
``` the above emulates (index="myindex" "/app1/service/site/upload failed" AND "source=Web" AND "confirmationNumber=ND_*") OR (index="myindex" "Exception from executeScript") ```
| rex "\bname=(?<name>[^,]+)"
| rex "(?i) claim # *(?<claimNumber>\S+)"
```| rex "clmNumber=(?<ClaimNumber>[^,]+)" | rex "confirmationNumber=(?<SubmissionNumber>[^},]+)" | rex "contentType=(?<ContentType>[^},]+)" ```
| rex "(?<SessionID>\[http-nio-8080-exec-\d+\])"
| rex "Exception from executeScript: (?<Exception>[^:]+)"
| fields claimNumber confirmationNumber name Exception
| stats min(_time) as _time values(*) as * by claimNumber

The result, one row per claimNumber:

claimNumber: 99900468430
_time: 2024-02-15 09:07:47.769
Exception: 0115100898 Duplicate Child Exception - ROAMN Claim # 99900468430 Invoice.pdf already exists in the location.
confirmationNumber: ND_50249-02152024
name: ROAMN Claim # 99900468430 Invoice.pdf

claimNumber: 99900468999
_time: 2024-02-15 09:10:47.769
Exception: 0115101000 Document not found - Bob Claim # 99900468999 Invoice.pdf already exists in the location.
confirmationNumber: ND_55555-02152024
name: Bob Claim # 99900468999 Invoice.pdf

claimNumber: 99900470018
_time: 2024-02-15 09:41:16.762
Exception: 0115100953 Document not found - Tristian CLAIM #99900470018 PACKAGE.pdf
confirmationNumber: ND_52233-02152024
name: Tristian CLAIM #99900470018 PACKAGE.pdf
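The correlation trick above works because the claim number appears both in the INFO event's name= field and in the ERROR event's exception text. Here is a minimal Python sketch of the two rex extractions, run over fragments of the sample events (Python's re is close enough to Splunk's PCRE for these patterns; the fragments are trimmed from the emulation data):

```python
import re

# Trimmed fragments of the sample INFO and ERROR events from the emulation
info = ("The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, "
        "{confirmationNumber=ND_50249-02152024, clmNumber=99900468430, "
        "name=ROAMN Claim # 99900468430 Invoice.pdf, contentType=Email}")
error = ("Exception from executeScript: 0115100898 Duplicate Child Exception - "
         "ROAMN Claim # 99900468430 Invoice.pdf already exists in the location.")

# The two rex patterns from the search, as Python named groups
name_re = re.compile(r"\bname=(?P<name>[^,]+)")
claim_re = re.compile(r"(?i) claim # *(?P<claimNumber>\S+)")

def extract(raw):
    """Apply both patterns to one raw event, keeping whichever fields match."""
    fields = {}
    for pattern in (name_re, claim_re):
        m = pattern.search(raw)
        if m:
            fields.update(m.groupdict())
    return fields

print(extract(info))   # claimNumber extracted from the name= field
print(extract(error))  # the same claimNumber extracted from the exception text
```

Both event types yield claimNumber=99900468430, which is exactly what lets `stats ... by claimNumber` bring them together into one row.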
The whole search is... strange, to say the least. You generate a single value (and do it in a strange way - by aggregating by a field and then ignoring the split by that field, summing up everything). Then you use appendcols to add another value, which is obtained by joining two data sets from the same index. A very strange and possibly inefficient approach. Even if you split your data by site, there is no guarantee that both result sets joined by appendcols will have the same order of results (and appendcols doesn't care about field matching or anything like that, so it's up to you to make sure both result sets are compatible). Anyway, I suspect there might be a more elegant (and possibly more efficient) way to do the same thing. Also remember that your search might be subject to subsearch limitations.
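To make the appendcols ordering hazard concrete: it pastes the second result set next to the first purely by row position, with no field matching. A hypothetical Python sketch (invented data, not Splunk internals) contrasting positional pasting with a key-based join:

```python
# Two result sets over the same sites, returned in different orders
success = [{"site": "A", "success": 10}, {"site": "B", "success": 20}]
failure = [{"site": "B", "failure": 2}, {"site": "A", "failure": 5}]

# appendcols-style: pair rows by position only.
# Row 0 pairs site A's successes with site B's failures - silently wrong.
pasted = [{**s, **f} for s, f in zip(success, failure)]

# Key-based join: pair rows by the site field, regardless of order.
by_site = {f["site"]: f["failure"] for f in failure}
joined = [{**s, "failure": by_site[s["site"]]} for s in success]

print(pasted[0])  # mismatched: success count from A, failure count from B
print(joined[0])  # correct: both values belong to site A
```

This is why computing both values in one `stats ... by site` pass (or joining on the site field) is safer than relying on appendcols to line the rows up.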
Sorry - typo on my part - you need it like this:

| stats sum(count) as Success by site

You also need site on some of the other stats commands too.
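For reference, `stats sum(count) as Success by site` sums per site rather than over everything. A rough Python equivalent of that grouping (the rows are invented sample data):

```python
from collections import defaultdict

# Invented sample rows standing in for the events feeding the stats command
rows = [
    {"site": "A", "count": 3},
    {"site": "B", "count": 7},
    {"site": "A", "count": 2},
]

# stats sum(count) as Success by site: accumulate counts per site
success_by_site = defaultdict(int)
for row in rows:
    success_by_site[row["site"]] += row["count"]

print(dict(success_by_site))  # {'A': 5, 'B': 7}
```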
Add the "All" choice to the input - you can have both fixed (static) and dynamic choices in the same dropdown.
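As a sketch, based on the dropdown from the question (the token, lookup, and field names are taken from there; the "*" value assumes the token is used as a wildcard-friendly search filter):

```xml
<input type="dropdown" token="application" searchWhenChanged="false">
  <label>Application</label>
  <!-- static choice: "*" matches every application when the token is used in a search -->
  <choice value="*">All</choice>
  <default>*</default>
  <fieldForLabel>application_Succ</fieldForLabel>
  <fieldForValue>application_Fail</fieldForValue>
  <search>
    <query>| inputlookup application_lists.csv | search country=$country$ | sort country application_Succ | fields application_Succ application_Fail</query>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </search>
</input>
```

The static `<choice>` renders above the dynamic entries returned by the lookup search.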
Created 2 dropdowns in a dashboard:
1. Country
2. Applications (getting data from a .csv file)

In the Applications dropdown I see the individual applications, but I also need an "All" option in the dropdown. How can I do it?

<fieldset>
  <input type="radio" token="country">
    <label>Country</label>
    <choice value="india">India</choice>
    <choice value="australian">Australian</choice>
    <default>india</default>
    <initialValue>india</initialValue>
    <change>
      <condition label="India">
        <set token="sorc">callsource</set>
      </condition>
      <condition label="Australian">
        <set token="sorc">callsource2</set>
      </condition>
    </change>
  </input>
  <input type="dropdown" token="application" searchWhenChanged="false">
    <label>Application</label>
    <fieldForLabel>application_Succ</fieldForLabel>
    <fieldForValue>application_Fail</fieldForValue>
    <search>
      <query>| inputlookup application_lists.csv | search country=$country$ | sort country application_Succ | fields application_Succ application_Fail</query>
      <earliest>-15m</earliest>
      <latest>now</latest>
    </search>
  </input>
</fieldset>
Hi @ITWhisperer, I tried adding the missing line, but I am not getting the results by site. I think we need to make some changes in the query, but I am not seeing what. Can anyone help with this?
Hello @gcusello, thank you so much again for your quick reply. I tried that before and it's not working: all duplicate account_id values group together within one event. For example, we should have three separate events for account_id 121, but when I use | where updatedate > comparedate or | search updatedate > comparedate Name=*, those group together in one event. I couldn't remove or separate them. Is there any way we can do that? Thank you again.
I did a quick and dirty test:

| makeresults count=10
| streamstats count
| eval size=count*10000
| map search="| makeresults count=100000 | streamstats count | search [ | makeresults count=$size$ | streamstats count | tail 1 | table count]"

and got 10 results, from 10k to 100k. So apparently, even though the initial makeresults creates 100k events, the important thing is that the subsearch only returns one result. But if I rephrase the search a bit:

| makeresults count=10
| streamstats count
| eval size=count*10000
| map search="| makeresults count=100000 | streamstats count | search [ | makeresults count=$size$ | streamstats count | table count] | stats count"

all my results are 10000, because all subsearches are finalized at 10k results.
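The behaviour in the second test - every subsearch silently finalizing once it hits the result cap - can be sketched as a plain truncation (10000 stands in for the default subsearch maxout from limits.conf; the function is a conceptual stand-in, not Splunk internals):

```python
MAXOUT = 10_000  # default subsearch result cap (limits.conf maxout)

def subsearch(results):
    """Return at most MAXOUT results; anything beyond the cap is silently dropped."""
    return results[:MAXOUT]

# A subsearch that would produce 100k rows only hands 10k back to the outer search
rows = subsearch(list(range(100_000)))
print(len(rows))  # 10000

# A subsearch under the cap is returned in full
print(len(subsearch(list(range(250)))))  # 250
```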
@PickleRick You could be right - I thought I had a use case where it was the events being found that caused a problem, but I can't reproduce it, although I can reproduce the problem when it is the results being returned that exceed (or rather get truncated at) 50k.
Field extractions do _not_ change what the raw event looks like. They only extract parts of the original event into specific fields. BTW, instead of reinventing the wheel, why not check out the add-ons that are already on Splunkbase? BTW2, are you sure your field extractions aren't already defined but simply not showing because you're searching in Fast mode?
The result is coming out like this for the first query:

SessionID: [http-nio-8080-exec-101]
_time: 2024-02-15 00:06:38.457
Exception (multivalue):
  0115100018 Could not match parameter list [names, keep] to an operation.
  org.springframework.extensions.webscripts.WebScriptException 0115100062 Could not find document 20231009_00064.TIF in suspense.
  org.springframework.extensions.webscripts.WebScriptException 0115100104 Could not find document 20240103_00065.TIF in suspense.
  org.springframework.extensions.webscripts.WebScriptException 0115100168 Duplicate Child Exception - 02142024_17C0_Email.pdf already exists in the location.
  org.springframework.extensions.webscripts.WebScriptException 0115100375 Duplicate Child Exception - NB Doc Form 313652.8.24 already exists in the location.
  --- (many more)
clmNumber: (empty)
confirmationNumber: (empty)
name: (empty)
Hi @SplunkDash, to remove the empty rows you should use a condition "Name=*". Anyway, I'd use a simpler search:

| inputlookup account_audit.csv
| eval updatedate=strptime(UPDATE_DATE, "%m/%d/%y %H:%M:%S"), comparedate=now()-86400*30
| search updatedate>comparedate Name=*
| table account_id Name Org_Code UPDATE_DATE

Even so, I'd use an index and not a lookup. Ciao. Giuseppe
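The cutoff logic here is: parse UPDATE_DATE to epoch time and keep rows newer than now minus 30 days. A Python sketch of the same comparison (assuming UPDATE_DATE uses the %m/%d/%y %H:%M:%S format from the search; is_recent is a hypothetical helper, not part of any Splunk API):

```python
import time
from datetime import datetime

def is_recent(update_date, days=30, fmt="%m/%d/%y %H:%M:%S"):
    """True if update_date (e.g. '02/15/24 09:07:47') falls within the last `days` days."""
    # strptime(UPDATE_DATE, fmt) -> epoch seconds, like SPL's strptime
    updatedate = time.mktime(datetime.strptime(update_date, fmt).timetuple())
    # comparedate = now() - 86400*30
    comparedate = time.time() - 86400 * days
    return updatedate > comparedate

print(is_recent("01/01/20 00:00:00"))  # an old row is filtered out -> False
```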
BTW, when the first query runs, it looks for a split second as if it is going to show the data as presented by query 2 (| makeresults), and then it mixes up and shows all the jumbled data without anything in the last three columns. Not sure if this information helps.
Thanks a lot for your reply, Yuanliu. When I try to run the code below I get a very skewed result. The SessionID and _time columns get populated, but for Exception, all exceptions for that day show up in the row (since I am running a day's worth of the report), whether they are related to "confirmationNumber=ND_*" or not. The rest of the three fields are empty.

index="myindex" ("/app1/service/site/upload failed" AND "source=Web" AND "confirmationNumber=ND_*") OR ("Exception from executeScript")
| rex "\bname=(?<name>[^,]+)"
```| rex "clmNumber=(?<ClaimNumber>[^,]+)" | rex "confirmationNumber=(?<SubmissionNumber>[^},]+)" | rex "contentType=(?<ContentType>[^},]+)" ```
| rex "(?<SessionID>\[http-nio-8080-exec-\d+\])"
| rex "Exception from executeScript: (?<Exception>[^:]+)"
| fields clmNumber confirmationNumber name Exception SessionID
| stats min(_time) as _time values(*) as * by SessionID

Secondly, I have data that might share the same SessionID across different transactions, and I am not able to see _time for the second transaction with the same SessionID. Here is the sample data:

| makeresults
| eval data = split("2024-02-15 09:07:47,770 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-202] The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, {comments=xxx-123, senderCompany=Company1, source=Web, title=Submitted via Site website, submitterType=Others, senderName=ROMAN , confirmationNumber=ND_50249-02152024, clmNumber=99900468430, name=ROAMN Claim # 99900468430 Invoice.pdf, contentType=Email} 2024-02-15 09:07:47,772 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-202] Exception from executeScript: 0115100898 Duplicate Child Exception - ROAMN Claim # 99900468430 Invoice.pdf already exists in the location. 
2024-02-15 09:10:47,770 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-202] The Upload Service /app1/service/citizens/upload failed in 0.124000 seconds, {comments=xxx-123, senderCompany=Company1, source=Web, title=Submitted via Site website, submitterType=Others, senderName=Bob , confirmationNumber=ND_55555-02152024, clmNumber=99900468999, name=Bob Claim # 99900468999 Invoice.pdf, contentType=Email} 2024-02-15 09:10:48,772 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-202] Exception from executeScript: 0115101000 Document not found - Bob Claim # 99900468999 Invoice.pdf already exists in the location. 
2024-02-15 09:41:16,762 INFO [com.mysite.core.app1.upload.FileUploadWebScript] [http-nio-8080-exec-200] The Upload Service /app1/service/citizens/upload failed in 0.138000 seconds, {comments=yyy-789, senderCompany=Company2, source=Web, title=Submitted via Site website, submitterType=Public Adjuster, senderName=Tristian, confirmationNumber=ND_52233-02152024, clmNumber=99900470018, name=Tristian CLAIM #99900470018 PACKAGE.pdf, contentType=Email} 2024-02-15 09:41:16,764 ERROR [org.springframework.extensions.webscripts.AbstractRuntime] [http-nio-8080-exec-200] Exception from executeScript: 0115100953 Document not found - Tristian CLAIM #99900470018 PACKAGE.pdf", " ")

and here is the result:

SessionID: [http-nio-8080-exec-200]
_time: 2024-02-15 09:41:16.762
Exception: 0115100953 Document not found - Tristian CLAIM #99900470018 PACKAGE.pdf
clmNumber: 99900470018
confirmationNumber: ND_52233-02152024
name: Tristian CLAIM #99900470018 PACKAGE.pdf

SessionID: [http-nio-8080-exec-202]
_time: 2024-02-15 09:07:47.769
Exception (multivalue):
  0115100898 Duplicate Child Exception - ROAMN Claim # 99900468430 Invoice.pdf already exists in the location.
  0115101000 Document not found - Bob Claim # 99900468999 Invoice.pdf already exists in the location.
clmNumber (multivalue): 99900468430, 99900468999
confirmationNumber (multivalue): ND_50249-02152024, ND_55555-02152024
name (multivalue): Bob Claim # 99900468999 Invoice.pdf, ROAMN Claim # 99900468430 Invoice.pdf

How can we fix the first query so that it provides data for all columns correctly? Thanks in advance for your time!
I have already sent a support request to MS Office 365. I hope they can give a better or more detailed reason than "SendAsDenied; ticket@eremote.nl not allowed to send as Splunk_eRemote@uBDC01;". In the meantime I will also follow my Splunk support-case route... Will post my findings here later...
The sendemail.py is apparently trying to send, but the server is rejecting the email. It's easiest to check the denying party's logs to see why it's happening. If really nothing has changed recently on either side (are you absolutely sure there was no change in policies in the mail environment?), maybe it's simply a case of an over-quota recipient mailbox.
Ok, firstly, we seem to be mixing up limits. 50000 is the default limit for a subsearch used by the join command. The general subsearch limit is 10000 results. But as I understand the wording in the limits.conf spec, it applies to the number of results returned by the subsearch, not to the initial events processed by the first part of the pipeline. I'll have to test it.
Currently I am feeding Splunk Zeek logs (formerly known as Bro) via the monitor command. Some of the logs in the Zeek index are being parsed correctly. Other logs, however, still appear as raw text. I remember in the past there was a certain link in the settings where I could specify how to extract each field in the event, what to call the field, and what data belonged to it. I also remember being able to test the specific settings I was applying against a log of the same index/sourcetype. Any help interpreting what I am trying to communicate, or guidance toward the specific page I am looking for, is very much appreciated.
The limit has to do with the events, not the results - i.e. the number of events returned by the first part of the subsearch (before the first pipe). So, as you have already stated, you had more than 50k events to get your 250+ results. You need to reframe the initial part of the subsearch so that fewer than 50k events are found.