All Posts



You possibly have duplicates/triplicates in your events.
Thanks one more time. Interestingly, your recent query is fetching only 77 values, whereas I have 182 values in the JSON file. Is this a Splunk limitation?
Try removing the other fields: | table _time _raw | spath | untable _time name state | eval date=strftime(_time,"%F") | xyseries name date state
Hello, I have a use case where I have a bunch of email alerts for which I need to determine the system name. For example, let's say I have these alerts:

1. File system alert on AAA
2. File system alert on server servernameaaaendservername
3. File system alert on server BBB

I have the list of these system names in a lookup table (around 100 unique names), so adding 100 lines of field_name LIKE "%systemname1%","systemname1" doesn't seem efficient. Is there a way to use a conditional statement with the lookup table to match the statements? I am trying to get the output below by using the system names found in the lookup table: if a system name from the lookup table matches what is found in the alert, output that system name.

Alert Name || System Name
File system alert on AAA || AAA
File system alert on server servernameaaaendservername || AAA
File system alert on server BBB || BBB
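In SPL, one common option for this is a lookup defined with wildcard matching rather than a chain of LIKEs. Outside Splunk, the matching logic being asked for boils down to a substring scan of each alert against the list of names. A minimal sketch, with the names taken from the examples in the question (the short list stands in for the ~100-row lookup table):

```python
# Hypothetical stand-ins: the list mimics the lookup table, and the
# alerts are the examples from the question.
system_names = ["AAA", "BBB"]

alerts = [
    "File system alert on AAA",
    "File system alert on server servernameaaaendservername",
    "File system alert on server BBB",
]

def find_system(alert, names):
    # Case-insensitive substring match, mirroring LIKE "%name%".
    lowered = alert.lower()
    for name in names:
        if name.lower() in lowered:
            return name
    return None

for alert in alerts:
    print(alert, "||", find_system(alert, system_names))
```

Note that a plain substring match is why the second alert resolves to AAA: "servernameaaaendservername" contains "aaa".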
The Windows TA on the search heads is 8.6.0, and the Windows TA on the HF is 9.0.6. Here is the inputs.conf stanza for Security.

[WinEventLog://Security]
disabled = 0
start_from = oldest
current_only = 0
evt_resolve_ad_obj = 1
checkpointInterval = 5
blacklist1 = EventCode="4662" Message="Object Type:(?!\s*groupPolicyContainer)"
blacklist2 = EventCode="566" Message="Object Type:(?!\s*groupPolicyContainer)"
index = test_i
renderXml=true

The events are stream processed, and come in as JSON.
Thanks again! You're a lifesaver; your query works. However, for some reason I see state twice. Also, I see source, host, etc. being listed in the table.
It would help to know what you've tried so far and how those attempts failed. One method that works well for me is the rex command. This regex matches everything after "Junk_Message>" until the following "<" and puts it into a field called Junk_Message. | rex "Junk_Message>(?<Junk_Message>[^<]+)" Note the + quantifier: without it, only a single character would be captured.
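The rex pattern can be sanity-checked outside Splunk with Python's re module, which accepts the same kind of named capture group. One detail worth checking in a sandbox like this: the character class needs a + quantifier to capture more than one character. The sample event is adapted from the question in this thread.

```python
import re

# Sample event adapted from the question in this thread.
event = "<Junk_Message> #body | Thing1 | Stuff2 | Blah4 </Junk_Message>"

# [^<]+ (note the +) captures everything up to the next "<";
# without the +, only a single character would be captured.
match = re.search(r"Junk_Message>(?P<Junk_Message>[^<]+)", event)
print(match.group("Junk_Message").strip())  # → #body | Thing1 | Stuff2 | Blah4
```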
I need to extract a string from a message body and make a new field for it. <Junk_Message> #body | Thing1 | Stuff2  | meh4 | so on 1 | extra stuff3 | Blah4 </Junk_Message> I just need the text that starts with #body and ends with Blah4. To make things more fun, everything after #body is generated randomly.
Here is what I am attempting to write SPL to show. I will have users logged into several hosts, all using a web application. I want to see the last (most recent) activity performed for each user logged in. Here is what I have so far:

index=anIndex sourcetype=aSourcetype
| rex field=_raw "^(?:[^,\n]*,){2}(?P<aLoginID>[^,]+)"
| rex field=_raw "^\w+\s+\d+_\w+_\w+\s+:\s+\w+\.\w+\.\w+\.\w+\.\w+\.\w+\.\w+\.\w+,(?P<anAction>\w+)"
| search aLoginID!=null
| stats max(_time) AS lastAttempt BY host aLoginID
| eval aTime = strftime(lastAttempt, "%Y-%m-%d %H:%M:%S %p ")
| sort -aTime
| table host aLoginID aTime
| rename host AS "Host", aLoginID AS "User ID", aTime AS "User Last Activity Time"

I am getting my data as expected by host and aLoginID, but I want to see only the most recent anAction. When I add anAction to my BY clause (host aLoginID anAction), the user ID is repeated in my results, as I would expect, since each anAction name is different, but I only see one row for each anAction name. I think I am on the right path, but I want to see one row for each user, not one row for each user ID and action.
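A common SPL approach here is stats latest(anAction) as lastAction max(_time) as lastAttempt by host aLoginID, which keeps the action from the most recent event per (host, user) pair. The equivalent logic, sketched in Python with made-up events and illustrative field names:

```python
# Made-up events; "time" stands in for _time (epoch seconds).
events = [
    {"host": "web1", "user": "alice", "action": "login",  "time": 100},
    {"host": "web1", "user": "alice", "action": "upload", "time": 250},
    {"host": "web2", "user": "bob",   "action": "login",  "time": 300},
]

# Keep, for each (host, user) pair, the event with the largest time:
# this is what latest(anAction) by host aLoginID computes.
latest = {}
for e in events:
    key = (e["host"], e["user"])
    if key not in latest or e["time"] > latest[key]["time"]:
        latest[key] = e

for (host, user), e in sorted(latest.items()):
    print(host, user, e["action"], e["time"])
```

The key point is that the action is carried along with the max-time event rather than added to the group-by, which is why each user gets exactly one row.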
Assuming your timestamp is in _time and your events (as shown) are in _raw, try this | spath | untable _time name state | eval date=strftime(_time,"%F") | xyseries name date state
I am trying to generate three reports with stats. The first is where jedi and sith have matching columns. The third is where jedi and sith do not match. Example: index=jedi | table saber_color, Jname, strengths index=sith | table saber_color, Sname, strengths I need to list where Jname=Sname. The third one is where Jname!=Sname. The caveat is that I cannot use join for this query. Any good ideas?
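One join-free way to reason about this: treat the two name columns as sets, so the matching report is the intersection and the non-matching report is the differences. (In SPL a common join-free pattern is searching index=jedi OR index=sith and grouping with stats on the shared field.) The set logic itself, sketched with made-up names:

```python
# Made-up stand-ins for the Jname and Sname columns.
jedi = {"Luke", "Rey", "Anakin"}
sith = {"Anakin", "Dooku"}

matching = jedi & sith    # Jname = Sname
jedi_only = jedi - sith   # Jname with no matching Sname
sith_only = sith - jedi   # Sname with no matching Jname

print(matching, jedi_only, sith_only)
```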
This is a simple search, which gives me this result. The result contains events whose fields contain "mobilePhoneNumber" OR "countryCode" OR both ("mobilePhoneNumber AND countryCode"). I want to return a count (in one line) of all events that contain both mobilePhoneNumber and countryCode.
Thanks again! Yes! The JSON files below are generated every day, and I would like to show them in a table format as below.

Source: Group01/1318/test.json
Generated timestamp: 11/12 12:00 AM
{ "Portfolio_Validate1":"skipped", "Portfolio_Validate2":"passed", "Portfolio_Validate3":"passed", "Portfolio_Validate4":"broken" }

Source: Group01/1319/test.json
Generated timestamp: 11/13 12:00 AM
{ "Portfolio_Validate1":"passed", "Portfolio_Validate2":"passed", "Portfolio_Validate3":"passed", "Portfolio_Validate4":"broken" }

Source: Group01/1320/test.json
Generated timestamp: 11/14 12:00 AM
{ "Portfolio_Validate1":"passed", "Portfolio_Validate2":"failed", "Portfolio_Validate3":"passed", "Portfolio_Validate4":"passed" }

Desired table (newest first):

                      11/14 12:00 AM   11/13 12:00 AM   11/12 12:00 AM
Portfolio_Validate1   passed           passed           skipped
Portfolio_Validate2   failed           passed           passed
Portfolio_Validate3   passed           passed           passed
Portfolio_Validate4   passed           broken           broken
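The spath/untable/xyseries answer in this thread performs this pivot inside Splunk. The same reshaping, sketched outside Splunk with a subset of the fields and the timestamps from the examples:

```python
import json

# Each tuple stands in for one generated test.json file and its
# timestamp (values taken from the examples in the question).
files = [
    ("11/12 12:00 AM", '{"Portfolio_Validate1":"skipped","Portfolio_Validate2":"passed"}'),
    ("11/13 12:00 AM", '{"Portfolio_Validate1":"passed","Portfolio_Validate2":"passed"}'),
    ("11/14 12:00 AM", '{"Portfolio_Validate1":"passed","Portfolio_Validate2":"failed"}'),
]

# Pivot: one row per validation name, one column per timestamp,
# which is what xyseries name date state builds.
table = {}
for ts, raw in files:
    for name, state in json.loads(raw).items():
        table.setdefault(name, {})[ts] = state

print(table["Portfolio_Validate2"])
```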
Hello, I have a system log which contains different DNS error messages (in the 'Message' field), and I am looking for an easy way to provide a short, meaningful description for those messages, either by adding a new field representing each unique DNS error message, or by adding text to the Message field. Here's an example; one event contains the following: Message="DNS name resolution failure (sos.epdg.epc.mnc720.mcc302.pub.3gppnetwork.org)" This error is related to WiFi calling, so I would like to associate a description or tag with that specific message, e.g. "WiFi calling". Thoughts?
Is the only advantage that append is not limited in the way join is? Thank you.
| rex field=field_id max_match=0 "/(?<key>[^/]+)/(?<value>[^/]+)" | eval row=mvrange(0,mvcount(key)) | streamstats count as _row | mvexpand row | eval name="field_".mvindex(key,row) | eval {name}=mvindex(value,row) | fields - key value name row | stats values(*) as * by _row
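The rex/mvindex pipeline above turns a /key/value/key/value string into field_<key> columns. The same parsing, sketched in Python with an illustrative field_id value (the key names here are made up, not from the original data):

```python
import re

# Made-up sample shaped like the /key/value/key/value field_id
# the SPL above parses.
field_id = "/env/prod/region/us-east/tier/web"

# Same pattern as the rex: alternating key and value segments.
pairs = re.findall(r"/([^/]+)/([^/]+)", field_id)

# Build field_<key> names, mirroring eval name="field_".mvindex(key,row).
row = {"field_" + k: v for k, v in pairs}
print(row)
```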
Thank you! The max = 0 flag is indeed what I was missing. The data I provided is the result of the search, which is why it's epoch. The logs I am managing are not that big, but I will keep that in mind for the future! Have a nice day!
The difficulty is that click.name2 is the group-by in the query, which is dynamically determined by the eval in the query (the AppName). The AppName is there to make the chart human-readable, but the drilldown needs to "convert" the AppName back into the search terms used for the clicked AppName. E.g., the app named GVMSAuth is really the search terms (SourceName="KmsService" AND Message="*GVMSAuth(Ver:*") as far as the event logs are concerned. Does that make sense?
I am using Splunk 8.2.12 and am trying to generate a PDF via an existing alert action using Splunk API calls. The action was originally developed for automated ticketing within another app when a Splunk alert is triggered. The end goal is to be able to upload a PDF of the search results, based on the alert, to the ticket in an automated way. Below is the current state of the code:

import os
import requests

# payload and post_headers are defined elsewhere in the alert action.

def create_pdf_for_ticket(payload, output_file):
    # Extract relevant information from the payload
    ticket_id = payload.get('sid')
    index = payload.get('result', {}).get('index')
    sourcetype = payload.get('result', {}).get('sourcetype')

    # Construct the search query based on the extracted information
    search_query = f'search index={index} sourcetype={sourcetype} sid={ticket_id}'

    # Make the API request to execute the search and get the results
    search_payload = {
        'search': search_query,
        'output_mode': 'json',
    }
    search_response = requests.get('http://localhost:8089/services/search/jobs/export',
                                   params=search_payload, headers=post_headers)

    # Check if the search request was successful
    if search_response.status_code == 200:
        # Save the search results to a file
        with open(output_file, 'wb') as pdf_file:
            pdf_file.write(search_response.content)
        print(f"PDF created successfully at: {output_file}")
    else:
        print(f"Error creating PDF: {search_response.status_code} - {search_response.text}")

def main():
    *****
    # Create PDF for the ticket
    output_file = os.environ['SPLUNK_HOME'] + '/etc/apps/Splunk_Ivanti/local/ticket.pdf'
    create_pdf_for_ticket(payload, output_file)
    *****
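One thing worth checking before treating the saved file as a PDF: the export endpoint is called with output_mode=json, so the bytes written to ticket.pdf are JSON search results, not a rendered PDF. A minimal, Splunk-agnostic sanity check is to look for the PDF magic number before saving (sample payloads below are illustrative):

```python
def looks_like_pdf(data: bytes) -> bool:
    # Real PDF files begin with the "%PDF-" header bytes.
    return data.startswith(b"%PDF-")

# Illustrative payloads: what the export endpoint returns with
# output_mode=json vs. what an actual PDF starts with.
json_payload = b'{"result": {"index": "test_i"}}'
pdf_payload = b"%PDF-1.7 ..."

print(looks_like_pdf(json_payload))  # False
print(looks_like_pdf(pdf_payload))   # True
```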
Without seeing the events, it is difficult to know what to suggest. Hopefully, previous answers will at least give you some ideas.