I am facing an issue where events with the same timestamp do not show up in the results when I dedup based on time, but I want to keep all of those events even after the dedup. The epoch value is also the same for those events. Below is the sample query before the dedup and its result.
The results are attached as an image. I want both events to appear in the results even after the dedup. How can I achieve this?
index=com vendor_action=comment_create | stats count by created_at, created_by_name | eval point=if(count>0,1,0) | eval epoch=strptime(created_at, "%Y-%m-%dT%H:%M:%S+%z")
Now, the query with the dedup:
index=com vendor_action=comment_create | stats count by created_at, created_by_name | eval point=if(count>0,1,0) | eval epoch=strptime(created_at, "%Y-%m-%dT%H:%M:%S+%z") | dedup created_at
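To illustrate the problem with some synthetic data (the values below are made up), this run-anywhere sketch shows the same behaviour:
| makeresults count=2 | streamstats count as n | eval created_at="2019-08-01T10:00:00+0000", created_by_name=if(n==1,"userA","userB") | dedup created_at | table created_at, created_by_name
Even though created_by_name differs between the two rows, only one of them survives the dedup on created_at.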
I guess you want to remove duplicate values and not entire rows. dedup removes rows based on the column specified. In your case, instead of a dedup, you need this:
| stats values(*) as * by created_at
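As a rough, untested sketch, that would slot into your search in place of the dedup (same field names as your query):
index=com vendor_action=comment_create | stats count by created_at, created_by_name | eval point=if(count>0,1,0) | eval epoch=strptime(created_at, "%Y-%m-%dT%H:%M:%S+%z") | stats values(*) as * by created_at
This gives you one row per created_at while keeping every created_by_name (and the other fields) as multivalue fields, instead of dropping the extra rows.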
Let me know if this helps.
Cheers

Hi @pgadhari,
I don't understand why you want to use dedup and yet also keep the events.
dedup created_at
will remove all the events with the same created_at value, irrespective of the other fields' values.
In your case I would suggest trying dedup _raw instead: it will only remove the duplicate events where the time and all other fields are the same. So for the same created_at value, if the event data differs, the query will still return those events.
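As an untested sketch, the dedup would go on the raw summary events, before the stats:
index=com vendor_action=comment_create | dedup _raw | stats count by created_at, created_by_name | eval point=if(count>0,1,0) | eval epoch=strptime(created_at, "%Y-%m-%dT%H:%M:%S+%z")
Only the rows whose raw text is completely identical are removed, so two different users commenting at the same created_at both stay in the results.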
Accept & up-vote the answer if it helps.
I can try doing the dedup on _raw, but I am not sure how that would help. Please also see my reply above: https://answers.splunk.com/comments/779814/view.html.
I don't want to remove those events; I want to keep them.
In your case, it looks like you should just change the key you're using to dedup, for example to created_by_name. dedup returns one row per key value.
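For example, as a sketch of the semantics,
| dedup created_by_name
returns only the first row for each created_by_name value, regardless of created_at, so choose whichever field (or combination of fields) actually identifies a unique event in your data.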
@arjunpkishore5 - I think it is working after adding the above query. After adding it, I did an mvexpand on the other field name and it seems to be working. I need to monitor it for some time; once it is OK, I will accept this answer. Thanks.
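For reference, the combined search is roughly along these lines (assuming created_by_name is the field being expanded):
index=com vendor_action=comment_create | stats count by created_at, created_by_name | eval point=if(count>0,1,0) | eval epoch=strptime(created_at, "%Y-%m-%dT%H:%M:%S+%z") | stats values(*) as * by created_at | mvexpand created_by_name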
No, I don't want to remove those values; I want to keep them. Because it is a summary index, it generates duplicate events, which is why I am using "dedup created_at". But because of this dedup, when two events have the same timestamp, one of them gets removed from the result, so I cannot see that user in our statistics. I hope that makes sense.


Hi pgadhari,
sorry, but I don't understand (probably there's something I missed in translation): if you want all the events, why do you dedup?
Ciao.
Giuseppe
I forgot to mention: the index I mentioned is actually a summary index, and in it I am getting duplicate events on every run. A saved search puts data into the summary index; this scheduled search runs every 5 minutes and pulls the data of the last 15 minutes. Hence I am getting duplicated events and have to dedup, but the dedup removes one of the events that share the same timestamp. I hope that makes sense.
When I schedule it to pull the data of the last 5 minutes, running every 5 minutes, it skips some of the events, which is not helpful either.


Hi pgadhari,
did you try a dedup on all the fields you have in the summary, or at least the more important ones, not only _time?
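For example, something like this (untested; use whichever fields your summary actually contains):
index=com vendor_action=comment_create | dedup created_at created_by_name | stats count by created_at, created_by_name | eval point=if(count>0,1,0) | eval epoch=strptime(created_at, "%Y-%m-%dT%H:%M:%S+%z")
In this way only the rows where both created_at and created_by_name are the same are removed, so two users with the same timestamp are both kept.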
Ciao.
Giuseppe
I will try the dedup with more than one field and check. I will report back with the results.
