
Receiving strange errors when searching messages over longer time ranges?

metylkinandrey
Communicator

I get strange errors when searching messages over longer time ranges.

If I run a search over more than two hours, I immediately get the following errors:

2 errors occurred while the search was executing. Therefore, search results might be incomplete.
  • 'stats' command: limit for values of field 'Time' reached. Some values may have been truncated or ignored.
  • 'stats' command: limit for values of field 'messageType' reached. Some values may have been truncated or ignored.

Over a four-day range:

4 errors occurred while the search was executing. Therefore, search results might be incomplete. 
  • 'stats' command: limit for values of field 'Time' reached. Some values may have been truncated or ignored.
  • 'stats' command: limit for values of field 'eventTime' reached. Some values may have been truncated or ignored.
  • 'stats' command: limit for values of field 'messageId' reached. Some values may have been truncated or ignored.
  • 'stats' command: limit for values of field 'messageType' reached. Some values may have been truncated or ignored.

One of my searches:

index="external_system" messageType="RABIS-HeartBeat"
| eval timeValue='eventTime'
| eval time=strptime(timeValue,"%Y-%m-%dT%H:%M:%S")
| sort -_time
| eval timeValue='eventTime'
| eval time=strptime(timeValue,"%Y-%m-%dT%H:%M:%S")
| eval Time=strftime(_time,"%Y-%m-%dT%H:%M:%S")
| stats list(Time) as Time list(eventTime) as EventTime list(messageType) as MessageType list(messageId) as Messag11eId by messageType

 

Message example:

curl --location --request POST 'http://mon.pd.dev.sis.org:8088/services/collector/raw' \
  --header 'Authorization: Splunk 02-93-48-9-27' \
  --header 'Content-Type: text/plain' \
  --data-raw '{
"messageType": "HeartBeat",
"eventTime": "2022-11-14T13:34:15",
"messageId": "ED280816-E404-444A-A2D9-FFD2D171F9999"
}'

Can you please tell me how to solve these problems?

 

1 Solution

yuanliu
SplunkTrust

From limits.conf:

 

[stats]
...
list_maxsize = 100
...

Just note that list() is very expensive in terms of RAM. If you have a lot of events, list(Time) as Time list(eventTime) as EventTime is practically suicidal. It is best to avoid such stats.
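
In case it helps to see where that stanza lives: a minimal sketch, assuming a default installation and that the change is made on the search head. The path and the example value are illustrative, not a recommendation, and limits.conf changes typically require a splunkd restart to take effect:

# $SPLUNK_HOME/etc/system/local/limits.conf
[stats]
# Maximum number of values the list() function keeps per field.
# The default of 100 is exactly the truncation reported in the errors above.
list_maxsize = 1000

Keep the warning above in mind, though: every extra value that list() keeps is held in memory for the duration of the search, so raising this limit trades memory for completeness.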

 


metylkinandrey
Communicator

Never mind, I'm sorry. I see now that these are settings in limits.conf.



metylkinandrey
Communicator

For example, I tried this:

| eval src_Msg_Id=if(len('srcMsgId')==0 OR isnull('srcMsgId')," ",'srcMsgId')
| stats list_maxsize = 10000
| stats list(diff) as TIME_DIF list(event_Time) as EventTime list(src_Msg_Id) as srcMsgId_Бизнес_Сообщения list(routepoint_ID) as RoutepointID list(t_i_d) as Tid list(GISGMP_Request) as GISGMPRequestID list(message_Type) as MessageType list(Packet_GIS_GMP_Id) as PacketGISGMPId count as Кол_Сообщений by srcMsgId_Исх_Сообщения

 

But it doesn't help.
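
One detail worth spelling out: list_maxsize is not an argument that the stats command accepts in SPL, so the | stats list_maxsize = 10000 line above has no effect on the limit; it is only read from limits.conf, as in the accepted answer. A minimal sketch of the same search with that line dropped (field names are copied verbatim from the attempt above):

| eval src_Msg_Id=if(len('srcMsgId')==0 OR isnull('srcMsgId')," ",'srcMsgId')
| stats list(diff) as TIME_DIF list(event_Time) as EventTime list(src_Msg_Id) as srcMsgId_Бизнес_Сообщения list(routepoint_ID) as RoutepointID list(t_i_d) as Tid list(GISGMP_Request) as GISGMPRequestID list(message_Type) as MessageType list(Packet_GIS_GMP_Id) as PacketGISGMPId count as Кол_Сообщений by srcMsgId_Исх_Сообщения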


metylkinandrey
Communicator

Hello, can you tell me where I should add this in the search?
Search example:

index="main"

| eval srcMsgId_Исх_Сообщения=if(len('Correlation_srcMsgId')==0 OR isnull('Correlation_srcMsgId'),'srcMsgId','Correlation_srcMsgId')

| eval timeValue='eventTime'

| eval time=strptime(timeValue,"%Y-%m-%dT%H:%M:%S.%3N%Z") | sort -eventTime | streamstats values(time) current=f  window=1 as STERAM_RESULT  global=false by srcMsgId_Исх_Сообщения

| eval diff=STERAM_RESULT-time

| stats list(diff)  as TIME_DIF list(eventTime) as eventTime list(srcMsgId) as srcMsgId_Бизнес_Сообщения list(routepointID) as routepointID count as  Кол_Сообщений by srcMsgId_Исх_Сообщения
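
Since the accepted answer recommends avoiding heavy list() aggregations altogether, here is one possible variant of this search, sketched under the assumption that duplicate values per srcMsgId_Исх_Сообщения are not needed. Note that values() deduplicates and sorts its results, so the row-by-row alignment that list() gives across TIME_DIF, eventTime and srcMsgId is lost, and values() is subject to its own limits.conf setting (maxvalues) rather than list_maxsize:

index="main"
| eval srcMsgId_Исх_Сообщения=if(len('Correlation_srcMsgId')==0 OR isnull('Correlation_srcMsgId'),'srcMsgId','Correlation_srcMsgId')
| eval time=strptime('eventTime',"%Y-%m-%dT%H:%M:%S.%3N%Z")
| sort -eventTime
| streamstats current=f window=1 global=false values(time) as STERAM_RESULT by srcMsgId_Исх_Сообщения
| eval diff=STERAM_RESULT-time
| stats values(diff) as TIME_DIF values(eventTime) as eventTime values(srcMsgId) as srcMsgId_Бизнес_Сообщения values(routepointID) as routepointID count as Кол_Сообщений by srcMsgId_Исх_Сообщения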
