All Apps and Add-ons

TrackMe App - data_last_time_seen field not updating?

cit_student
Explorer

A number of sourcetypes are coming up as status=red because their data_last_time_seen field is "stuck".

All of these are coming from the Microsoft Teams Add-on for Splunk. 
New data is coming in: the Overview data source tab is recognising it, and the new events can also be seen using search. There does appear to be a change in the data format that may be responsible for data_last_time_seen failing to update; however, clearing and re-running data sampling had no effect, and refreshing also has no effect.

Is there a way to "refresh" this field? Or are there any other approaches that could be taken?

Thanks 

1 Solution

guilmxm
SplunkTrust

But that's the thing @cit_student 

The long term tracker runs over earliest=-24h, not -7d, so with the regular trackers the app will not discover entities that have not ingested data within the past 24 hours.

There might have been a change in an earlier release of TrackMe regarding the long term tracker's time range, which would explain why the entity was created previously (unless that entity was generating data within the past 24h).
However, TrackMe does maintain the state of the entity from the alerting point of view (so not data_last_time_seen, but whether it is seen as red, etc.).

You can handle this special case pretty easily using Elastic Sources:
https://trackme.readthedocs.io/en/latest/userguide.html#elastic-sources

You could create a dedicated elastic source targeting the data (index, sourcetype, etc.) with an appropriate earliest time range (you cannot use a shared source here because the shared tracker won't look back further than 24h).
This will create and maintain the entity in TrackMe natively.


View solution in original post


XOJ
Path Finder

@cit_student Did you end up having to use Elastic Sources for your issue? It still isn't making sense to me why I have more recent events and TrackMe can't pick them up.


cit_student
Explorer

Hi @XOJ 

Yes, I used Elastic Sources to fix my issue in this case. But thankfully, this issue seems to have been confined to this one app; other sources are behaving as expected.

guilmxm
SplunkTrust

Hi @cit_student 

You might be confused about why TrackMe is complaining about these data sources; data sampling is a different concept from the tracking of latency and delay.

You can check:

- the status tab when you click on the data source table
- the Smart Status report when you click on the data source table, then Smart Status

These will tell you the reason why TrackMe is complaining about it.

The data_last_time_seen represents the latest _time (i.e. the latest event) seen in the source. It's unlikely to actually be stuck if data is coming in, so I suspect you're mistaken about the root cause of the issue.
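Conceptually (a trivial illustration only, not TrackMe internals), data_last_time_seen behaves like max(_time) over the entity's events:

```python
# Illustrative only: model _time values as epoch integers; the latest one
# is what data_last_time_seen reflects (hypothetical sample values).
event_times = [1638655200, 1638658800, 1638662400]
data_last_time_seen = max(event_times)
print(data_last_time_seen)  # 1638662400
```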

Let me know

Guilhem


cit_student
Explorer

Hi @guilmxm 

Thanks for replying. I should have included the Status message originally.

The Status message reads: "monitoring conditions are not met due to lagging or interruption to data flow".

Looking at smart_result in Smart Status, the time after "while the latest data available is:" is the same date and time as data_last_time_seen.

I would agree that it is strange that new events coming in (which still have a _time field) are not being recognised as before.

Current beginning of log structure: {"reportRefreshDate": "2021-12-04T00:00:00Z", etc. etc.

Old beginning of log structure: {"@odata.type": "#microsoft.graph.teamsUserActivityUserDetail", "reportRefreshDate": "2021-11-02" etc. etc.
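For illustration only (the format strings are assumptions inferred from the two snippets above), the two reportRefreshDate shapes parse with different patterns, which could matter if timestamp extraction keys on this field:

```python
from datetime import datetime, timezone

# The two "reportRefreshDate" shapes quoted above (illustrative only).
new_value = "2021-12-04T00:00:00Z"   # new format: full ISO-8601 timestamp
old_value = "2021-11-02"             # old format: date only

new_dt = datetime.strptime(new_value, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
old_dt = datetime.strptime(old_value, "%Y-%m-%d").replace(tzinfo=timezone.utc)

print(new_dt.isoformat())  # 2021-12-04T00:00:00+00:00
print(old_dt.isoformat())  # 2021-11-02T00:00:00+00:00
```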

Regards,


guilmxm
SplunkTrust

@cit_student 

When you click the "search" button in the entity window, it opens a search with the exact constraints from the entity (depending on how it was built); check that out.

If TrackMe says it has no data for the entity, it is extremely likely that this is the case and you are not comparing exactly the same thing. (Remember that by default an entity is the index + sourcetype.)

Guilhem


cit_student
Explorer

A bit more information:

Running 

| inputlookup trackme_data_source_monitoring where data_name="<index>:<sourcetype>" | eval c_time=strftime(data_last_time_seen,"%m/%d/%y %H:%M:%S")


shows that data_last_time_seen is indeed 1636007052 (11/04/21 17:24:12).

However, running (on the same box as above)

| tstats max(_time) as data_last_time_seen WHERE index=<index> sourcetype=<sourcetype>

returns data_last_time_seen = 1638662400 (December 5, 2021 11:00:00).
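Outside Splunk, the two epochs quoted above can be sanity-checked with a quick Python sketch (the +11:00 offset is an assumption based on the timestamps elsewhere in this thread):

```python
from datetime import datetime, timedelta, timezone

tz = timezone(timedelta(hours=11))  # assumed local offset (+11:00)

lookup_epoch = 1636007052  # data_last_time_seen from the TrackMe lookup
tstats_epoch = 1638662400  # max(_time) reported by tstats

print(datetime.fromtimestamp(lookup_epoch, tz))  # 2021-11-04 17:24:12+11:00
print(datetime.fromtimestamp(tstats_epoch, tz))  # 2021-12-05 11:00:00+11:00

# The lookup is roughly a month behind what tstats actually sees:
print((tstats_epoch - lookup_epoch) / 86400)  # ~30.7 days
```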


cit_student
Explorer

Hi Guilhem,

Clicking Search from the entity window shows the same data as the "Overview data source" tab, which, for example, has events at _time = 2021-12-05T11:00:00.000+11:00. However, in the same entity window, data_last_time_seen and data_last_ingest = 04/11/2021 17:24 and latest_flip_time = 06/11/2021 05:30. The flip time lines up with data_max_lag_allowed, but, as discussed, these values do not line up with the events actually ingested (same index and sourcetype).
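As a simplified sketch (not TrackMe's actual implementation; the 3600-second lag value is purely hypothetical), the flip to red follows from the latest known event being older than data_max_lag_allowed:

```python
# Simplified sketch of the lagging condition: an entity flips red once the
# latest known event is older than data_max_lag_allowed (in seconds).
def is_red(now_epoch: int, data_last_time_seen: int, data_max_lag_allowed: int) -> bool:
    return (now_epoch - data_last_time_seen) > data_max_lag_allowed

# With data_last_time_seen stuck at 1636007052 (04/11/2021 17:24 +11:00),
# any reasonable allowed lag is exceeded a month later:
print(is_red(1638662400, 1636007052, 3600))  # True
```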

All the other sourcetypes are not showing this kind of behaviour and are functioning/being tracked as expected.

@guilmxm 


guilmxm
SplunkTrust

Hi @cit_student 

Hmm, right, that looks suspicious and shouldn't happen. The only condition I can think of would be TrackMe no longer being able to access this data source when running the tracker, such as after adding a blocklist, but then it would be automatically excluded by the UI.
Another option would be that this data scope is excluded via the top-level macro running in the tracker.

Can you test the following:

- note the priority and other settings on this data source (they will be logged in the audit changes anyway in the next step)
- via the UI, temporarily delete the data source (not permanently)
- run the short term tracker; if the data source generated data in the past 4 hours it should be re-created (if earlier than that, and up to 24 hours, you can run the long term tracker)
- check that the entity has been re-created in TrackMe
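The re-creation windows above can be sketched as follows (a simplified illustration; the helper name is hypothetical, and the -4h/-24h windows are those described in this thread):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical helper: which regular tracker could re-discover an entity,
# given the age of its most recent event (windows as described in the post).
def rediscovering_tracker(last_event: datetime, now: datetime) -> str:
    age = now - last_event
    if age <= timedelta(hours=4):
        return "short term tracker"  # runs over earliest=-4h
    if age <= timedelta(hours=24):
        return "long term tracker"   # runs over earliest=-24h
    return "none"                    # outside both discovery windows

now = datetime(2021, 12, 6, tzinfo=timezone.utc)
print(rediscovering_tracker(now - timedelta(hours=2), now))   # short term tracker
print(rediscovering_tracker(now - timedelta(hours=12), now))  # long term tracker
print(rediscovering_tracker(now - timedelta(days=2), now))    # none
```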

Guilhem


cit_student
Explorer

Hi Guilhem,

Performing this test of temporary deletion on one of the troublesome sourcetypes and running both the short and long term trackers has not re-created the entity in TrackMe.

I can also see that new events have been ingested since I deleted it. Even though there is some delay to ingestion due to the nature of the add-on, the new events would be more than 24 hours in the past but not more than 7 days, so the long term tracker should pick them up (-7d / +4h).

Regards,


guilmxm
SplunkTrust

But that's the thing @cit_student 

The long term tracker runs over earliest=-24h, not -7d, so with the regular trackers the app will not discover entities that have not ingested data within the past 24 hours.

There might have been a change in an earlier release of TrackMe regarding the long term tracker's time range, which would explain why the entity was created previously (unless that entity was generating data within the past 24h).
However, TrackMe does maintain the state of the entity from the alerting point of view (so not data_last_time_seen, but whether it is seen as red, etc.).

You can handle this special case pretty easily using Elastic Sources:
https://trackme.readthedocs.io/en/latest/userguide.html#elastic-sources

You could create a dedicated elastic source targeting the data (index, sourcetype, etc.) with an appropriate earliest time range (you cannot use a shared source here because the shared tracker won't look back further than 24h).
This will create and maintain the entity in TrackMe natively.

