Hi,
I am joining several source files in Splunk to generate a total count. One thing to note: I am using "crcSalt=" to reindex all my source files every day, because only a few files change from day to day but my use case requires all of them to be reindexed.
When I add | stats count | timechart span=1d count(field) at the end of the search string, it does not return any results. I also tried xyseries; still no results.
How do i span the results for each day?
timechart requires a _time field. For example, the daily count is:
(your search)|timechart span=1d count
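To see why the original search returned nothing: a plain "stats count" collapses all events into a single row with no "_time" field, so any timechart that follows it has nothing to chart. A minimal sketch of an equivalent daily count that keeps "_time" (assuming your base search returns raw events):

```
(your search)
| bin _time span=1d
| stats count by _time
```

This produces one row per day, the same result timechart span=1d count would give, and the output still carries "_time" if you need further time-based commands afterwards.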
Hello,
I am able to generate the single-day count by adding (my search) | stats count, but if I use (my search) | timechart span=1d count or (my search) | stats count | timechart span=1d count, I am not getting any results, and the selected time range is All time.
(my search)| stats count
(my search) |timechart span=1d count
The first search works but the second doesn't, so the only explanation I can think of is that "_time" has been removed. Please provide your complete search statement.
(my search) | stats count | timechart span=1d count
→This doesn't work
Hello
Below is my actual query:
|table TestCaseName,SysReqID,TestCaseID,Verdict,CurrentTestcaseResultURL
You can't use "timechart" here because "_time" is gone.
Also, due to "dedup", there will be only the latest one for each "CurrentTestcaseResultURL".
Thanks for your reply. Yes, we use dedup to fetch only the latest URL for each day; that's why we re-index the data every day. Is there any other way to get a historical daily trend for this search?
For example, how about creating a "target date" field and always including it in your dedup, join, and stats clauses?
|eval target_date=strftime(_time,"%Y-%m-%d")
Ex.:
index="usa_*_test" ...
... | join type=inner DNGProjectAreaID, target_date
... | dedup LinkStartID, target_date
... | stats count by target_date
...
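Putting that idea together, here is a minimal sketch, assuming the index pattern from the example above and using fields mentioned earlier in the thread; the dedup key is illustrative and should be adjusted to your actual search:

```
index="usa_*_test"
| eval target_date=strftime(_time, "%Y-%m-%d")
| dedup CurrentTestcaseResultURL, target_date
| stats count by target_date
```

Because the dedup key now includes target_date, the search keeps the latest result per URL for each day rather than only the most recent overall, so running it over a longer time range yields one count per day even after daily re-indexing.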
Thanks for the update. I managed to come up with a solution by scheduling the report to generate a CSV and appending to it every day, which then feeds my visualization destination. I will also try out your suggestion. Thanks.