| eval time_start=strptime(time_start,"%Y/%m/%d %H:%M")
| eval time_end=strptime(time_end,"%Y/%m/%d %H:%M")
| eval time_start=mvrange(time_start,time_end,60*60)
| mvexpand time_start
| eval time_end=time_start+(60*60)
| eval time_start=strftime(time_start,"%Y/%m/%d %H:%M")
| eval time_end=strftime(time_end,"%Y/%m/%d %H:%M")
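One thing worth noting: `mvrange(time_start, time_end, 60*60)` steps in 60-minute increments from the original start time, while the desired output in the question snaps the interval boundaries to clock hours (12:05→13:00, 13:00→14:00, ...). The clock-aligned variant of the splitting logic can be sketched in Python for clarity (the function name `split_hourly` is hypothetical, not a Splunk or library API):

```python
from datetime import datetime, timedelta

FMT = "%Y/%m/%d %H:%M"

def split_hourly(start_str, end_str):
    """Split [start, end) into sub-intervals that break on clock-hour boundaries."""
    start = datetime.strptime(start_str, FMT)
    end = datetime.strptime(end_str, FMT)
    out = []
    cur = start
    while cur < end:
        # next clock-hour boundary strictly after cur
        boundary = cur.replace(minute=0, second=0) + timedelta(hours=1)
        nxt = min(boundary, end)  # last piece may be a partial hour
        out.append((cur.strftime(FMT), nxt.strftime(FMT)))
        cur = nxt
    return out

# EV1 from the question: 12:05-13:05 splits into 12:05-13:00 and 13:00-13:05
print(split_hourly("2024/07/11 12:05", "2024/07/11 13:05"))
```

In SPL, the same alignment can usually be achieved by rounding the epoch start down to the hour (e.g. with `floor(time_start/3600)*3600`) before generating the range, then clamping the first and last rows back to the real start and end times.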
Hi @sintjm,
I suppose you have created the index and that it has the correct permissions. Did you explicitly specify the index name in the search?
<your_search> | table ............ | collect index=your_index
Ciao. Giuseppe
"does not allow" is a bit vague - what errors do you get? what shows up in the log? what other information do you have?
Hello,

I have this data set:

event, start_time, end_time
EV1, 2024/07/11 12:05, 2024/07/11 13:05
EV2, 2024/07/11 21:13, 2024/07/12 04:13
EV3, 2024/07/13 11:22, 2024/07/14 02:44

I need to split the time intervals into hourly intervals, one interval per row. Eventually I'm looking for an output like this:

event, start_time, end_time
EV1, 2024/07/11 12:05, 2024/07/11 13:00
EV1, 2024/07/11 13:00, 2024/07/11 13:05
EV2, 2024/07/11 21:13, 2024/07/11 22:00
EV2, 2024/07/11 22:00, 2024/07/11 23:00
EV2, 2024/07/11 23:00, 2024/07/12 00:00
EV2, 2024/07/12 00:00, 2024/07/12 01:00
EV2, 2024/07/12 01:00, 2024/07/12 02:00
EV2, 2024/07/12 02:00, 2024/07/12 03:00
EV2, 2024/07/12 03:00, 2024/07/12 04:00
EV2, 2024/07/12 04:00, 2024/07/12 04:13
EV3, 2024/07/13 11:22, 2024/07/13 12:00
EV3, 2024/07/13 12:00, 2024/07/13 13:00
EV3, 2024/07/13 13:00, 2024/07/13 14:00
EV3, 2024/07/13 14:00, 2024/07/13 15:00
EV3, 2024/07/13 15:00, 2024/07/13 16:00
EV3, 2024/07/13 16:00, 2024/07/13 17:00
EV3, 2024/07/13 17:00, 2024/07/13 18:00
EV3, 2024/07/13 18:00, 2024/07/13 19:00
EV3, 2024/07/13 19:00, 2024/07/13 20:00
EV3, 2024/07/13 20:00, 2024/07/13 21:00
EV3, 2024/07/13 21:00, 2024/07/13 22:00
EV3, 2024/07/13 22:00, 2024/07/13 23:00
EV3, 2024/07/13 23:00, 2024/07/14 00:00
EV3, 2024/07/14 00:00, 2024/07/14 01:00
EV3, 2024/07/14 01:00, 2024/07/14 02:00
EV3, 2024/07/14 02:00, 2024/07/14 02:44

I tried using the bin and timechart commands but they don't work for this. Do you have any suggestions?

Thank you, Tommaso
I have a search in my query where I pull data from an API, but then the collect command does not allow me to save the search results into my index. Any ideas?
Instead of

| join type=inner Hostname [|inputlookup device.csv | table Hostname]

do something like this

| lookup device.csv
| where isnotnull(<populated field from csv>)

https://docs.splunk.com/Documentation/Splunk/9.2.2/SearchReference/Lookup
What happens if you backfill for one day at a time (rather than all 30 days together)?
How can I do that?
As I said, instead of join, why not use lookup?
Yes, that's why I am inner joining device.csv to the other CSV file. The Hostname in device.csv will match some of the Hostname values from the first search.
Are you saying that device.csv has other fields apart from Hostname that you want to "join" to your events?
If it is only Col7 and Col10 that could have escaped commas, try something like this

(?<Col1>[^,]+),(?<Col2>[^,]+),(?<Col3>[^,]+),(?<Col4>[^,]+),(?<Col5>[^,]+),(?<Col6>[^,]+),(?<Col7>.+?(?<!\\)),(?<Col8>[^,]+),(?<Col9>[^,]+),(?<Col10>.+?(?<!\\)),(?<Col11>[^,]+)$

You may have to double up the backslashes

(?<Col1>[^,]+),(?<Col2>[^,]+),(?<Col3>[^,]+),(?<Col4>[^,]+),(?<Col5>[^,]+),(?<Col6>[^,]+),(?<Col7>.+?(?<!\\\\)),(?<Col8>[^,]+),(?<Col9>[^,]+),(?<Col10>.+?(?<!\\\\)),(?<Col11>[^,]+)$
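As a sanity check, the first pattern can be exercised outside Splunk against the sample row posted in this thread. The negative lookbehind `(?<!\\)` is what lets Col7 and Col10 run past commas that are preceded by a backslash. A quick verification with Python's `re` module (named groups use the `?P<name>` syntax there):

```python
import re

# The first pattern from the answer above, with Python-style named groups.
PATTERN = re.compile(
    r'(?P<Col1>[^,]+),(?P<Col2>[^,]+),(?P<Col3>[^,]+),(?P<Col4>[^,]+),'
    r'(?P<Col5>[^,]+),(?P<Col6>[^,]+),(?P<Col7>.+?(?<!\\)),(?P<Col8>[^,]+),'
    r'(?P<Col9>[^,]+),(?P<Col10>.+?(?<!\\)),(?P<Col11>[^,]+)$'
)

# Sample row from the thread; \, marks an escaped comma inside a field.
sample = (r'APIDEV,4xs54,000916,DEV,Update,Integrate,'
          r'String\,Set\,Number\,ID,Standard,2024-07-10T23:10:45.001Z,'
          r'Process_TIME\,URI\,Session_Key\,Services,Hourly')

m = PATTERN.match(sample)
print(m.group('Col7'))   # the escaped commas stay inside the field
print(m.group('Col10'))
```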
You were right. It turned out that there were additional _TCP_ROUTING settings in some custom local inputs.conf files. After fixing this, it worked.
I did the same steps and still have the same problem.
The first search has multiple fields including Hostname, and the second search only has Hostname. So I am trying to inner join to get more fields for each Hostname from device.csv. Does that make sense?
What is it that you are trying to achieve? Your join matches on the Hostname field against a search which returns only Hostname, which suggests you are just trying to determine whether Hostname (dns) exists in device.csv. Rather than using join (which is slow, has other limitations, and should be avoided where possible), perhaps you should just use lookup and then check for a field that exists in device.csv (assuming that device.csv contains more than one field).
Hello @gcusello and @PickleRick,
Thank you so much for your responses, truly appreciated. All values should be part of this extraction, and the outcome of the extraction should be, as an example:

APIDEV,4xs54,000916,DEV,Update,Integrate,String\,Set\,Number\,ID,Standard,2024-07-10T23:10:45.001Z,Process_TIME\,URI\,Session_Key\,Services,Hourly

Col1=APIDEV
Col2=4xs54
Col3=000916
Col4=DEV
Col5=Update
Col6=Integrate
Col7=String\,Set\,Number\,ID
Col8=Standard
Col9=2024-07-10T23:10:45.001Z
Col10=Process_TIME\,URI\,Session_Key\,Services
Col11=Hourly

In the case of regex, what would be the regex for this extraction within props.conf?
Why is the inner join not working? Both searches return results.

| inputlookup ABCD.csv
| eval CC=mvdedup(CC)
| rename CC as "Company Code"
| streamstats first(lastchecked) as scan_check
| eval key=_key, is_solved=if(lastchecked>lastfound OR lastchecked == 1,1,0), solved=finding."-".is_solved."-".key, blacklisted=if(isnull(blfinding),0,1), scandate=strftime(lastfound,"%Y-%m-%d %H:%M:%S"), lastchecked=if(lastchecked==1,scan_check,lastchecked), lastchecked=strftime(lastchecked,"%Y-%m-%d %H:%M:%S")
| fillnull value="N.A." Asset_Gruppe Scan-Company Scanner Scan-Location Location hostname "Company Code"
| search (is_solved=1 OR is_solved=0) (severity=informational) blacklisted=0 Asset_Gruppe="*" Scan-Company="*" Location="*" Scanner="*" dns="*" pluginname="*" ip="*" scandate="***" "Company Code"="*"
| rex field=scandate "(?<new_date>\A\d{4}-\d{2}-\d{2})"
| sort 0 -new_date
| eventstats first(new_date) as timeval
| rex field=new_date "-(?<date_1>\d\d)-"
| rex field=timeval "-(?<date_2>\d\d)-"
| strcat finding "#" NessusHost sid hostid pluginid finding
| where date_1=date_2
| fields dns ip lastchecked severity pluginid pluginname scandate Asset_Gruppe Location Scan-Company "Company Code" Scan-Location solved Scanner finding
| rename dns as Hostname, ip as IP
| join type=inner Hostname [|inputlookup device.csv | table Hostname]
Hi @tuts,
Please try this: from the list of Correlation Searches, clone yours (link on the right side), edit the new Correlation Search using the correct Security Domain, and save it. Then disable and delete the old Correlation Search.
Ciao. Giuseppe