All Posts
VERY new to Splunk. I have a query that scans a vulnerability report for critical vulnerabilities:

index=vulnerability severity=critical
| eval first_found=replace(first_found, "T\S+", "")
| eval first_found_epoch=strptime(first_found, "%Y-%m-%d")
| eval last_found=replace(last_found, "T\S+", "")
| eval last_found_epoch=strptime(last_found, "%Y-%m-%d")
| eval last_found_65_days=relative_time(last_found_epoch, "-65d@d")
| fieldformat last_found_65_days_convert=strftime(last_found_65_days, "%Y-%m-%d")
| where first_found_epoch>last_found_65_days
| sort -first_found
| dedup cve
| rename severity AS Severity, first_found AS "First Found", last_found AS "Last Found", asset_fqdn AS Host, ipv4 AS IP, cve AS CVE, output AS Description
| streamstats count as "Row #"
| table Severity, "First Found", "Last Found", Host, IP, CVE, Description, Reason

which gives me output similar to this:

critical 2023-10-11 2023-11-20 host1.example.com 192.168.101.12  CVE-2021-0123 blah blah blah
critical 2023-03-25 2023-11-20 host2.example.com 192.168.101.25  CVE-2022-0219 blah blah blah
critical 2023-06-23 2023-11-20 host3.example.com 192.168.101.102 CVE-2023-0489 blah blah blah
critical 2023-08-05 2023-11-20 host4.example.com 192.168.101.145 CVE-2023-0456 blah blah blah

I also have a .csv lookup file where I keep extra information on certain hosts:

ScanHost          ScanIP         target-CVE    Reason
host2.example.com 192.168.101.25 CVE-2022-0219 CVE can not be mitigated

What I'm trying to do is take the Host from the search and, if it matches a ScanHost in the CSV, fill in the Reason field from the .csv.
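One way to approach this (a sketch, assuming the CSV is configured as a lookup named hosts_extra.csv — substitute your actual lookup name) is to add a lookup command after the rename, matching on both the host and the CVE so that Reason is only filled for the rows the CSV describes:

```spl
... existing search through the rename ...
| lookup hosts_extra.csv ScanHost AS Host, "target-CVE" AS CVE OUTPUT Reason
| table Severity, "First Found", "Last Found", Host, IP, CVE, Description, Reason
```

Rows with no match in the CSV will simply have an empty Reason; adding `| fillnull value="N/A" Reason` before the table command would fill those if you prefer.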
Hi ITWhisperer, I fixed it. Thank you very much for your help. With this, it is working properly (see the attached 2.jpg):

| sort StartTime
| eval row=mvrange(0,4)
| mvexpand row
| eval _time=case(row=0,strptime(StartTime,"%Y-%m-%d %H:%M:%S"),row=1,strptime(StartTime,"%Y-%m-%d %H:%M:%S"),row=2,strptime(EndTime,"%Y-%m-%d %H:%M:%S"),row=3,strptime(EndTime,"%Y-%m-%d %H:%M:%S"))
| eval value=case(row=0,1,row=1,1,row=2,1,row=2,0)    <- here is the difference
| table _time value
Sure @ITWhisperer 
But if I use this code on the content which I mentioned in the main description, I receive these results (see attachment 1.jpg). And this is quite good for me, except the triangle step. Any idea how to fix it?

| eval row=mvrange(0,4)
| mvexpand row
| eval _time=case(row=0,strptime(StartTime,"%Y-%m-%d %H:%M:%S"),row=1,strptime(StartTime,"%Y-%m-%d %H:%M:%S"),row=2,strptime(EndTime,"%Y-%m-%d %H:%M:%S"),row=3,strptime(EndTime,"%Y-%m-%d %H:%M:%S"))
| eval value=case(row=0,1,row=1,1,row=2,1,row=3,0)
| table _time value
Looks like you haven't evaluated _time:

| eval _time=case(row=0,strptime(StartTime,"%F %T.%6N"),row=1,strptime(StartTime,"%F %T.%6N"),row=2,strptime(EndTime,"%F %T.%6N"),row=3,strptime(EndTime,"%F %T.%6N"))
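As a side note, %F and %T are shorthand for the longer date and time specifiers, so the two format strings used in this thread parse the same timestamps (apart from the fractional-seconds part). A quick way to convince yourself, using literal values:

```spl
| makeresults
| eval t1=strptime("2023-11-20 14:30:05", "%F %T")
| eval t2=strptime("2023-11-20 14:30:05", "%Y-%m-%d %H:%M:%S")
| eval same=if(t1=t2, "yes", "no")
```

Both evals should produce the same epoch value, so `same` comes out as "yes".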
I was ready to say the dedup wasn't the issue because I thought I had previously crossed that off. The case_id is only supposed to have two events: one when the case is opened and one when it is closed. So I thought each id would only appear twice and the dedup was working in my favor. It looks like I didn't do my due diligence and make sure they're not updated again. Thanks for forcing me to check back and confirm that the case_ids do repeat. I'm glad the solution is simple and something I overlooked.
I ran into a similar issue, and there could be at least two reasons for this. Here is the search the wizard generates:

index=* OR index=_* _sourcetype="WinEventLog"
| where _sourcetype="WinEventLog"
| head 100

1. The Ingest Sample Data wizard uses the "where" search command, which is case sensitive. So make sure the sourcetype case matches how it actually shows up in events. WinEventLog is not the same as wineventlog.

2. The wizard also uses the _sourcetype field instead of the sourcetype field. That means that if there is any sourcetype transformation happening already, the _sourcetype field will have the original sourcetype. You can check this by searching for your events and adding the _sourcetype field (which is normally hidden):

index=* sourcetype="WinEventLog"
| head 100
| eval orig_sourcetype=_sourcetype

Patrick
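If matching the exact case turns out to be fiddly, a case-insensitive variant (a sketch, not what the wizard itself generates) is to normalise the value inside the where clause:

```spl
index=* OR index=_* _sourcetype="WinEventLog"
| where lower(_sourcetype)="wineventlog"
| head 100
```

This matches regardless of how the sourcetype was capitalised at ingest time.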
Hello Members, I would like to import/show data in a Splunk dashboard. This data is the result of a MySQL query run by PHP to create an HTML page with the results in an HTML table. Most likely the easiest way to do this would be to write the data to a CSV file and use a Splunk forwarder to send the data to Splunk. The data needs to be checked once a day.

I was wondering if there is a way to build a dashboard from the data via the Splunk REST API, or to import/forward the HTML page that gets created from the MySQL query. The query is run on a remote server. I looked at the splunk-sdk-python, but its implementation is not user friendly - it requires Docker, which I cannot get running for some reason.

I am open to any and all suggestions. Thanks, eholz
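If the nightly-CSV route is taken, the dashboard side reduces to an ordinary search over the forwarded file. The index and sourcetype names below are made up for illustration; with INDEXED_EXTRACTIONS=csv set in props.conf on the forwarder, the CSV columns arrive as fields automatically:

```spl
index=app_reports sourcetype=mysql_report_csv earliest=-1d@d
| table *
```

A dashboard panel driven by this search, with a daily time range, would refresh itself each time the forwarder picks up the new file.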
From your code I received this (the _time column is empty in every row):

_time   value
        0
        1
        1
        0
Hi @gcusello, I have a similar problem with my use case. I am looking to filter out from two lookup files. I am not using any index. Can you help me compare field values from two different lookup files? Below is the sample data from the two lookup files.

1. Firewall_NEW_Database.csv

Hostname Location Datacenter
ABCD     US       xyz
LMNO     SING     ABC

2. Firewall_OLD_Database.csv

Firewall Location Datacenter
LMNO     SING     ABC
ABCD     US       xyz
EFGH     CAN      PQR

In the above two lookups I want to compare similar values based on the Hostname and Firewall fields and filter them out with a count.
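One common pattern for comparing two lookups without any index (a sketch; the field and file names are taken from the samples above) is to append one lookup to the other, align the two key fields under a single name, and count occurrences:

```spl
| inputlookup Firewall_NEW_Database.csv
| rename Hostname AS Firewall
| inputlookup append=true Firewall_OLD_Database.csv
| stats count BY Firewall
| where count > 1
```

Here count=2 marks hosts present in both files (ABCD, LMNO in the samples); changing the final clause to `where count=1` instead lists hosts that appear in only one file (EFGH).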
You can have either SAML or LDAP authentication, but not both.  Splunk authentication is always available. To force Splunk authentication, go to http://<your Splunk URL>/en-us/account/login?loginType=Splunk.  The "en-us" part can be replaced with your own locale specifier.
Hello. Thanks for your help. I have tried with the regex you suggested and with this configuration:

[setindexHIGH]
SOURCE_KEY = _raw
REGEX = audits
DEST_KEY = _MetaData:Index
FORMAT = imp_high

The same result: it is not working. We are receiving the events in the index imp_low. If we run a search for the events, we can see the field named topic is being indexed. But if we set the view to the raw text of the event, I cannot see the words topic or audits in the event's raw text. It looks like that info is being removed from the event. Could it be because of the props settings?
I removed the "by DateTime" clause and used the timewrap command instead. It gives me the output for the last 24 hours correctly; however, I only receive files on the weekends, and if I try to use this command over a longer range it gives me too many unwanted fields with no values.
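One way to drop the empty series after timewrap (a sketch; the base search is made up, and this also discards legitimate zero buckets, so it may need tuning) is to untable the chart, keep only non-empty cells, and rebuild the table:

```spl
index=my_files
| timechart span=1h count
| timewrap 1w
| untable _time series value
| where value != 0
| xyseries _time series value
```

Series that had no data at all (the empty weekdays) disappear entirely after the where clause, since xyseries only recreates columns that still have at least one row.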
Hello, is it possible to configure Splunk to receive a webhook with some information added to it, and if so, can you give me a link to a tutorial?
I just saw your new message, it works even better and it's cleaner. Thank you for your help !  
We have recently set up SAML authentication on our Splunk search head, which will be accessed by our vendor using SSO authentication. I wanted to enquire whether LDAP authentication can also be enabled, local to my team. Also, what if SAML authentication or the group mapping on our IdP (Azure AD) breaks at some point and we are not able to get into Splunk? Is there, or can we enable, a local admin login on the Splunk search head which will be managed by our Splunk admin?
By manually setting it for each source, it works, even if it is not optimal:

| eval "Centreon"=if(isnull(Centreon),0,'Centreon')
Try like this:

| table Etat, "Control-M", "Dynatrace", "ServicePilot", "Centreon"
| fillnull value=0 "Control-M", "Dynatrace", "ServicePilot", "Centreon"
Hi @ITWhisperer, Thank you for your help, I have my source "Centreon" but it does not display 0 yet. I had already tried the "fillnull" but poorly because it created extra fields. Best Regards, Rajaion