All Posts

What about the logs after the collect?
You appear to be doing spath on the data, then only keeping the data field, which makes the spath redundant! What timestamps are in the events returned by the curl command?
There's no point doing spath if in the next step you keep only the original data field. But that's beside the point. The first step in debugging this would be to remove the collect command from your pipeline and see what the results look like.
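As a minimal sketch, assuming a search body and index name that weren't posted (both are placeholders here), that debugging step means running the pipeline without the final collect:

<your_api_search>
| spath
| table _time data

and only re-appending | collect index=your_index once those results look right.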
There were no logs showing after I searched the index, sorry.
Yes, I did just that:
| eval time_start=strptime(time_start,"%Y/%m/%d %H:%M")
| eval time_end=strptime(time_end,"%Y/%m/%d %H:%M")
| eval time_start=mvrange(time_start,time_end,60*60)
| mvexpand time_start
| eval time_end=time_start+(60*60)
| eval time_start=strftime(time_start,"%Y/%m/%d %H:%M")
| eval time_end=strftime(time_end,"%Y/%m/%d %H:%M")
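For reference, here is a self-contained sketch of the same pipeline that can be pasted into a search bar as-is, using makeresults (Splunk 8.2+) to supply two of the sample rows. Note the original post names the fields start_time/end_time while the snippet above uses time_start/time_end, so rename accordingly. One caveat: mvrange steps in 60*60-second increments from the original start time, so the boundaries keep the start minutes (12:05, 13:05, ...) rather than snapping to the top of the hour as in the desired output.

| makeresults format=csv data="event,time_start,time_end
EV1,2024/07/11 12:05,2024/07/11 13:05
EV2,2024/07/11 21:13,2024/07/12 04:13"
| eval time_start=strptime(time_start,"%Y/%m/%d %H:%M")
| eval time_end=strptime(time_end,"%Y/%m/%d %H:%M")
| eval time_start=mvrange(time_start,time_end,60*60)
| mvexpand time_start
| eval time_end=time_start+(60*60)
| eval time_start=strftime(time_start,"%Y/%m/%d %H:%M")
| eval time_end=strftime(time_end,"%Y/%m/%d %H:%M")
| table event time_start time_end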
Hi @sintjm, I suppose that you created the index and that it has the correct permissions. Did you explicitly specify the index name in the search?

<your_search> | table ............ | collect index=your_index

Ciao. Giuseppe
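If the index name is already explicit and the search still shows nothing, a quick sanity check (your_index being the same placeholder as above) is to confirm the index exists and is visible to your role:

| eventcount summarize=false index=your_index

If your_index does not appear in the output, it either does not exist or your role cannot search it.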
"does not allow" is a bit vague - what errors do you get? what shows up in the log? what other information do you have?
Hello,

I have this data set:

event, start_time, end_time
EV1, 2024/07/11 12:05, 2024/07/11 13:05
EV2, 2024/07/11 21:13, 2024/07/12 04:13
EV3, 2024/07/13 11:22, 2024/07/14 02:44

I need to split the time intervals into hourly intervals, one interval per row. In the end, I'm looking for an output like this:

event, start_time, end_time
EV1, 2024/07/11 12:05, 2024/07/11 13:00
EV1, 2024/07/11 13:00, 2024/07/11 13:05
EV2, 2024/07/11 21:13, 2024/07/12 22:00
EV2, 2024/07/11 22:00, 2024/07/12 23:00
EV2, 2024/07/11 23:00, 2024/07/12 00:00
EV2, 2024/07/11 00:00, 2024/07/12 01:00
EV2, 2024/07/11 01:00, 2024/07/12 02:00
EV2, 2024/07/11 02:00, 2024/07/12 03:00
EV2, 2024/07/11 03:00, 2024/07/12 04:00
EV2, 2024/07/11 04:00, 2024/07/12 04:13
EV3, 2024/07/13 11:22, 2024/07/14 12:00
EV3, 2024/07/13 12:00, 2024/07/14 13:00
EV3, 2024/07/13 13:00, 2024/07/14 14:00
EV3, 2024/07/13 14:00, 2024/07/14 15:00
EV3, 2024/07/13 15:00, 2024/07/14 16:00
EV3, 2024/07/13 16:00, 2024/07/14 17:00
EV3, 2024/07/13 17:00, 2024/07/14 18:00
EV3, 2024/07/13 18:00, 2024/07/14 19:00
EV3, 2024/07/13 19:00, 2024/07/14 20:00
EV3, 2024/07/13 20:00, 2024/07/14 21:00
EV3, 2024/07/13 21:00, 2024/07/14 22:00
EV3, 2024/07/13 22:00, 2024/07/14 23:00
EV3, 2024/07/13 23:00, 2024/07/14 00:00
EV3, 2024/07/13 00:00, 2024/07/14 01:00
EV3, 2024/07/13 01:00, 2024/07/14 02:00
EV3, 2024/07/13 02:00, 2024/07/14 02:44

I tried using the bin and timechart commands but they don't work. Do you guys have any suggestions?

Thank you, Tommaso
I have a search in my query where I spool data from an API, but then the collect command does not allow me to save the search results into my index. Any ideas?
Instead of

| join type=inner Hostname [| inputlookup device.csv | table Hostname]

do something like this

| lookup device.csv Hostname
| where isnotnull(<populated field from csv>)

https://docs.splunk.com/Documentation/Splunk/9.2.2/SearchReference/Lookup
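Pieced together with the field names from this thread (other_field stands in for whatever extra columns device.csv actually carries), the end of the search might look something like:

<your_first_search>
| lookup device.csv Hostname OUTPUT other_field
| where isnotnull(other_field)

Unlike join, lookup avoids the subsearch result limits and is generally faster.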
What happens if you backfill for one day at a time (rather than all 30 days together)?
How can I do that?
As I said, instead of join, why not use lookup?
Yes, that's why I am inner joining device.csv to the other csv file. The Hostname in device.csv will match some Hostname values from the first search.
Are you saying that device.csv has other fields apart from Hostname that you want to "join" to your events?
If it is only Col7 and Col10 that could have escaped commas, try something like this

(?<Col1>[^,]+),(?<Col2>[^,]+),(?<Col3>[^,]+),(?<Col4>[^,]+),(?<Col5>[^,]+),(?<Col6>[^,]+),(?<Col7>.+?(?<!\\)),(?<Col8>[^,]+),(?<Col9>[^,]+),(?<Col10>.+?(?<!\\)),(?<Col11>[^,]+)$

You may have to double up the backslashes

(?<Col1>[^,]+),(?<Col2>[^,]+),(?<Col3>[^,]+),(?<Col4>[^,]+),(?<Col5>[^,]+),(?<Col6>[^,]+),(?<Col7>.+?(?<!\\\\)),(?<Col8>[^,]+),(?<Col9>[^,]+),(?<Col10>.+?(?<!\\\\)),(?<Col11>[^,]+)$
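To see the pattern in action, here is a hedged, self-contained sketch (the sample line is made up; in practice you would point rex at your real field). It uses the doubled-backslash variant, since backslashes inside a quoted rex string must themselves be escaped:

| makeresults
| eval _raw="a,b,c,d,e,f,g1\\,g2,h,i,j1\\,j2,k"
| rex "(?<Col1>[^,]+),(?<Col2>[^,]+),(?<Col3>[^,]+),(?<Col4>[^,]+),(?<Col5>[^,]+),(?<Col6>[^,]+),(?<Col7>.+?(?<!\\\\)),(?<Col8>[^,]+),(?<Col9>[^,]+),(?<Col10>.+?(?<!\\\\)),(?<Col11>[^,]+)$"
| table Col*

Here Col7 comes out as g1\,g2 and Col10 as j1\,j2, with the escaped commas kept inside the fields.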
You were right. It turned out that there were additional _TCP_ROUTING settings in some custom inputs.conf local files. After fixing this, it worked.
I did the same steps and still have the same problem.
The first search has multiple fields including Hostname, and the second search only has Hostname. So I am trying an inner join to get more fields for Hostname from device.csv. Does that make sense?