Getting Data In

How to override props.conf to change event size limit?

New Member

I was trying to override props.conf to change event size limit. These are the steps I have tried so far:
- Add a blank props.conf file in $SPLUNK_HOME/etc/system/local
- Edit props.conf with this three-line configuration:
[linux_secure]
TRUNCATE = 0
MAX_EVENTS = 20000
- Restart Splunk.

However, I still get only 10000 lines when I export the search results to a CSV file. Am I missing something?

Thank you,
Hudan.

1 Solution

SplunkTrust

Wait a second, you changed the question on me. If you're exporting to CSV, MAX_EVENTS in props.conf has nothing to do with this.

You can use the | outputcsv command instead, or the REST API. Check out this article:

http://blogs.splunk.com/2009/08/07/help-i-cant-export-more-than-10000-events/
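A hedged sketch of the REST API approach, assuming a local instance on the default management port 8089 and placeholder credentials; the export endpoint streams the complete result set instead of stopping at the UI's 10000-row cap:

```shell
# Placeholders: host, port, credentials, and the search itself.
curl -k -u admin:changeme \
     https://localhost:8089/services/search/jobs/export \
     -d search="search sourcetype=linux_secure" \
     -d output_mode=csv \
     -o results.csv
```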

If you're talking about the CSV that is attached to email alerts, that can be configured in alert_actions.conf.
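For the email-alert case, a minimal sketch of the relevant alert_actions.conf override; the 20000 value is just an example, so check the setting against the spec file shipped with your version:

```
# $SPLUNK_HOME/etc/system/local/alert_actions.conf
[email]
maxresults = 20000
```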

The limit is there to protect you from your browser's limitations, etc.


SplunkTrust

Yeah, I was going to mention sort, subsearch, and join limits, etc., but you said the search was producing 12000 results without issue. Glad to help!


New Member

I am sorry for changing the focus of the question. I previously thought that exporting search results to CSV was closely related to MAX_EVENTS, and I just realized that was wrong. Sorry for the mistake. Exporting the search results to CSV is now working like a charm!

In addition, if you use the sort command, use sort 0 <field> to export more than 10000 rows.
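A minimal SPL sketch of that tip (the index, sourcetype, and output filename are placeholders):

```
index=main sourcetype=linux_secure
| sort 0 _time
| outputcsv all_results.csv
```

sort 0 disables the sort command's default 10000-result limit, so outputcsv receives the full result set.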
Thank you for your answer and I have marked it as Accepted.


SplunkTrust

Yes, you should reindex the data, because those are index-time configurations that do not take effect at search time.


SplunkTrust

Remove the data from your indexes (not a requirement, but you don't want duplicates), then either add a crcSalt to your inputs.conf or clear the fishbucket on your forwarder(s), and the files should reindex.
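A minimal inputs.conf sketch of the crcSalt approach (the monitor path and sourcetype are placeholders for your setup):

```
# $SPLUNK_HOME/etc/system/local/inputs.conf on the forwarder
[monitor:///var/log/secure]
sourcetype = linux_secure
crcSalt = <SOURCE>
```

With the literal string <SOURCE>, the file's full path is mixed into the CRC, so Splunk treats the already-seen files as new and reindexes them.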

This shows how to clear the fishbucket on a forwarder (delete its directory), and how to clear it on a standalone instance (a Splunk CLI command the author says is for indexers).

https://answers.splunk.com/answers/72562/how-to-reindex-data-from-a-forwarder.html


New Member

I tried these steps but it still fails:
- Add a blank props.conf file to $SPLUNK_HOME/etc/system/local
- Edit it with this three-line configuration:
[linux_secure]
TRUNCATE = 0
MAX_EVENTS = 20000
- Remove the data to avoid duplicates using the | delete command
- Stop Splunk
- Clean the fishbucket index: splunk clean eventdata -index _thefishbucket
- Start Splunk
- Upload the data again (I checked in Splunk Web that the index is empty)
- Run the search again.

However, I still get only 10000 lines when I export the search results to a CSV file. Still no luck with this.

Note: I use a standalone Splunk instance, version 6.3.3.
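Before reindexing again, it may be worth confirming that the stanza is actually taking effect; btool shows the merged configuration and which file each setting comes from:

```shell
# Run on the standalone instance; linux_secure is the sourcetype in question.
$SPLUNK_HOME/bin/splunk btool props list linux_secure --debug
```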


New Member

How can I efficiently reindex the data? Another answer on the Splunk forum suggests deleting the data first and then re-adding the log files with oneshot. I have 392,829 lines across 166 log files. It seems I need to create a shell script to automatically re-add all the log files. What do you think?
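A hedged sketch of such a script, assuming the 166 files live under one directory; the path, index, and sourcetype are placeholders:

```shell
#!/bin/sh
# Re-add each archived log file as a oneshot input.
for f in /var/log/archive/*.log; do
    "$SPLUNK_HOME/bin/splunk" add oneshot "$f" \
        -sourcetype linux_secure -index main
done
```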


SplunkTrust

You could also wipe out the fishbucket and reindex the data without scripts:

https://answers.splunk.com/answers/72562/how-to-reindex-data-from-a-forwarder.html


SplunkTrust

Did you reindex the data?


New Member

No, I did not reindex the data. Should I reindex it to see the effect of the overridden props.conf?


SplunkTrust

MAX_EVENTS is used for grouping multiple lines into a single event when Splunk indexes data. Do you want 20000 lines grouped together as a single event in Splunk? Or is this issue more towards exporting, such as to PDF, where your search result has more than 20000 rows and you get only 10000 rows when you export?

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

New Member

My problem is that the search returns 12000 rows, but when I export it to CSV, I only get 10000 lines. Do you have any idea how to solve this?


SplunkTrust

While exporting, are you checking the unlimited option?
On Splunk Enterprise 6.4 we exported all events for backup, and it worked as expected.
Based on your concern, you definitely do not need to reindex the data, and you should correct your question, since MAX_EVENTS is not related to CSV export, as I mentioned before.

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

New Member

Thank you for your comments. Please check jkat54's answer; I have marked it as accepted.
