
After 10,000 records in a lookup, new records are not getting appended. Any solution?

spkriyaz
Path Finder

Hi,

I have a saved search (below) that appends data to a lookup file every 15 minutes. I use the lookup file in my dashboard, where I update a field on a record and use outputlookup to store the modified information.

Currently I am not able to see any new records in the lookup file; when I search it, the statistics show only 10,000 records. If I remove the old records from the lookup, then I am able to load new data into it.

Is there any solution here? I want to store at least a month of data in the lookup, and every day close to 2,000 to 7,000 records will be written to it.
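At that rate a month is roughly 60,000 to 210,000 rows (2,000 to 7,000 records a day for about 30 days), so the lookup has to be able to grow well past 10K.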

Saved search:

index="application_xyz" source=abc.log SEVERITY_LEVEL=ERROR
| eval Date_Time = strftime(_time,"%d-%m-%Y %H:%M:%S") 
| eval "Package_Name, Procedure_Name, File_Name, Host_Name" =  PACKAGE_NAME+","+PROCEDURE_NAME+","+FILE_NAME+","+HOST_NAME 
| makemv delim="," "Package_Name, Procedure_Name, File_Name, Host_Name"
| rename RECORD_NUMBER as ID PLATFORM_INSTANCE AS Platform INSTITUTION_NUMBER AS Institution MESSAGE_TEXT as Message_Text PROGRAM_ID as Program_ID PROGRAM_RUN_ID as Program_Run_ID PROGRAM_NAME as Program_Name
| eval Ack = "", Ack_By = ""
| table ID Platform Institution Date_Time Program_ID, Program_Run_ID, Program_Name "Package_Name, Procedure_Name, File_Name, Host_Name" Message_Text Ack 
|  outputlookup ram_error.csv append=true


DalJeanis
Legend

That SPL should not limit the lookup to 10K; we have lookups in the millions.

Try this:

| inputlookup ram_error.csv 
| stats count

If that number is over 10K, then the problem is not with the part you've told us but with the search you're using to bring the data back. Show us that, and we can help you fix it.

spkriyaz
Path Finder

Hi,

Now I am facing the issue after storing 50K records: new records are not getting appended. I used the same search to load the lookup file. When my saved search runs it can store more than 50K records, but after the update query (the second query below) runs, the lookup is restricted to only 50K records.

The second query is used to update the lookup field "Ack_By". When I use outputlookup a second time with the updated records, the lookup keeps only the first 50K records, which makes my new records vanish.

I am not sure what the problem is here. I removed the sort command to fix the 10K issue as well, but the data still fails to append to the lookup.

Initial saved search:

index="application" source=xyz.log SEVERITY_LEVEL=ERROR
| eval Date_Time = strftime(_time,"%d-%m-%Y %H:%M:%S") 
| eval "Package_Name, Procedure_Name, File_Name, Host_Name" =  PACKAGE_NAME+","+PROCEDURE_NAME+","+FILE_NAME+","+HOST_NAME 
| makemv delim="," "Package_Name, Procedure_Name, File_Name, Host_Name"
| rename RECORD_NUMBER as ID PLATFORM_INSTANCE AS Platform INSTITUTION_NUMBER AS Institution MESSAGE_TEXT as Message_Text PROGRAM_ID as Program_ID PROGRAM_RUN_ID as Program_Run_ID PROGRAM_NAME as Program_Name
| eval Ack = "", Ack_By = ""
| table ID Platform Institution Date_Time Program_ID, Program_Run_ID, Program_Name "Package_Name, Procedure_Name, File_Name, Host_Name" Message_Text Ack 
|  outputlookup ram_error.csv append=true

Search that updates records in the lookup and appends the data back to it:

| inputlookup ram_error.csv
| where ID IN ($mytoken$)
| eval Ack=now(), Ack_By="$env:user_email$"+","+strftime(Ack,"%c")
| append [| inputlookup ram_error.csv | where NOT ID IN ($mytoken$)]
| outputlookup ram_error.csv


DalJeanis
Legend

Subsearches are limited to 50K records, so this append isn't working for you:

| append [| inputlookup ram_error.csv | where NOT ID IN ($mytoken$)]


There is a slightly different format you can use...

| eval rectype="newrec"
| inputlookup ram_error.csv append=true
| where (rectype="newrec") OR (NOT ID IN ($mytoken$))
| fields - rectype
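
To make the whole flow concrete, here is how the complete update search might look with that approach. This is a sketch assembling the fragment above with the original update search (same lookup name and token), not code posted in the thread:

| inputlookup ram_error.csv
| where ID IN ($mytoken$)
| eval Ack=now(), Ack_By="$env:user_email$"+","+strftime(Ack,"%c")
| eval rectype="newrec"
| inputlookup ram_error.csv append=true
| where (rectype="newrec") OR (NOT ID IN ($mytoken$))
| fields - rectype
| outputlookup ram_error.csv

Because inputlookup append=true runs in the main pipeline rather than inside an append subsearch, the 50K subsearch cap no longer applies.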


spkriyaz
Path Finder

Thank you @DalJeanis, it was an issue with the sort command I used to sort the output of the data above, which limited it to 10K. I fixed it 🙂

rnowitzki
Builder

Hi @spkriyaz,

As you don't use sort or any subsearch (which are limited to 10,000 results by default), my best guess is that it is affected by this setting in savedsearches.conf:

action.email.maxresults = <integer>
* This value affects all methods of result inclusion by email alert: inline,
  CSV, and PDF.

https://docs.splunk.com/Documentation/Splunk/latest/Admin/Savedsearchesconf 
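
If that limit turns out to be the cause, it could be raised for the saved search in its savedsearches.conf stanza. The stanza name and value below are illustrative only, not taken from this thread:

[your_saved_search_name]
action.email.maxresults = 100000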

Does it work if you execute it manually?

--
Karma and/or Solution tagging appreciated.

spkriyaz
Path Finder

Looks like the sort command in my other query was limiting it to 10K. I used it as below and it is working fine:
| sort 0 <field name>
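
For reference, sort truncates its output at 10,000 results by default, and a count of 0 removes that cap. With the Date_Time field from the searches above as a hypothetical example,

| sort Date_Time

silently keeps only the first 10,000 results before outputlookup runs, while

| sort 0 Date_Time

keeps them all.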

Thanks
