Deployment Architecture

Why are bundle.info files getting created at $SPLUNK_HOME/var/run/ on the search head?

sarvesh_11
Communicator

Hey People!

My search head has recently been running out of disk space. While checking, I found that many *.bundle.info files are being created at $SPLUNK_HOME/var/run.
As a workaround, I have found the following so far:

1) Upgrade my search head version (currently it is 6.6.3).
I'm not sure whether that will fix my issue.

Also, in many similar replies I've seen that this is a known bug, but trust me, I'm unable to find the mentioned issue number:
SPL-140831 in https://docs.splunk.com/Documentation/Splunk/7.2.6/ReleaseNotes/KnownIssues

TIA.

1 Solution

soskaykakehi
Path Finder

Hi sarvesh_11

SPL-140831 was fixed in the 6.6.3 release. You can find this issue code in:
https://docs.splunk.com/Documentation/Splunk/6.6.3/ReleaseNotes/6.6.3
If you are already on 6.6.3, I think it is a different issue, but I couldn't find another similar known issue for 6.6.3.

"Why bundle files are getting created"

These bundle files include apps, query data, and many lookup files.
If you use a search head in a distributed environment, the search head creates a bundle and syncs it to the indexers and other search peers whenever it changes.
Search jobs are distributed to the indexers and run there (not only on the search head),
so the indexers need to know the search head's query details (query string, lookup files, etc.) before the search runs.
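
To see how much space the bundles are actually taking, a quick check on the search head could look like this (a rough sketch; it just inspects the $SPLUNK_HOME/var/run path you mentioned):

    # List the bundle files with sizes, newest first
    ls -lht "$SPLUNK_HOME"/var/run/*.bundle*

    # Total space used by the directory that is filling up
    du -sh "$SPLUNK_HOME"/var/run/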

If there are only two or three bundle files (.bundle, .bundle.info, .delta), that looks normal.
If the bundle size is very large, you have three options to fix it.

1) Delete lookup files that are large and unused (please delete them from Splunk Web, not from the bundle directory directly).
2) If you are using many large lookup files (".csv" files), you can compress them to ".gz" files to reduce the bundle size; the bundle itself is not compressed (it is just a tarball). After you upload a ".gz" file, you need to change the lookup definition to point at it and delete the old ".csv" file (see the sketch after this list).
3) You can use a blacklist to exclude specific lookup files from the bundle (see the distsearch.conf sketch further below). But you then need to add an option to your search queries, and it affects search performance, so I don't recommend this option.
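
For option 2, a rough sketch of the compression step (the app name and lookup file name below are just placeholders, and the upload/definition change itself is best done through Splunk Web as described above):

    # Compress a large CSV lookup outside Splunk (illustrative paths)
    cd "$SPLUNK_HOME"/etc/apps/your_app/lookups
    gzip -k big_lookup.csv    # creates big_lookup.csv.gz and keeps the original

    # Then in Splunk Web: Settings > Lookups, upload the .gz file,
    # point the lookup definition at big_lookup.csv.gz, and delete the old .csv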

Some apps (like the Machine Learning Toolkit) include many large sample lookup files, and in many cases we don't need them in the bundle.
You can delete these lookup files from Splunk Web or exclude them from the bundle.
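
For option 3 (and for keeping large sample lookups such as the MLTK ones out of the bundle), the exclusion normally goes into distsearch.conf on the search head. A minimal sketch, assuming a hypothetical app and lookup name; please check the distsearch.conf spec for your version before using it:

    # $SPLUNK_HOME/etc/system/local/distsearch.conf  (illustrative)
    [replicationBlacklist]
    # Each entry is <name> = <regex>, matched against file paths under $SPLUNK_HOME/etc.
    # "your_app" and "big_sample.csv" are placeholders.
    exclude_big_sample = apps[/\\]your_app[/\\]lookups[/\\]big_sample\.csv

Any search that still needs an excluded lookup then has to run it on the search head (for example with the lookup command's local=true option), which is the extra query option and performance cost I mentioned.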

I hope this is of some help.




sarvesh_11
Communicator

Thanks @soskaykakehi ,

That was really helpful!
We are going to blacklist certain unwanted files.


koshyk
Super Champion

My advice is to increase the filesystem space where /opt/splunk resides in a Search Head Cluster. There is a clean-dispatch option (Link), but clean-dispatch is only a temporary fix (a rough example follows below).
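
For reference, the dispatch cleanup is a CLI command, roughly as below; this is from memory, so please verify the exact syntax with the built-in help before running it (the destination directory and age are just examples):

    # Show the exact syntax for your version
    $SPLUNK_HOME/bin/splunk help clean-dispatch

    # Roughly: move dispatch job directories older than the given time
    # out of var/run/splunk/dispatch into a holding directory
    $SPLUNK_HOME/bin/splunk clean-dispatch /opt/splunk/old-dispatch-jobs/ -7d@d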

In my experience, as a minimum:
- a standalone system requires 20 GB for /opt/splunk
- a clustered system requires 50 GB or more for /opt/splunk (for bundle replication)


sarvesh_11
Communicator

Hi @koshyk ,

/opt/splunk already has 100 GB on the standalone host.
And we don't have a clustered environment; it's an on-prem standalone hybrid search head, and the indexer is in Splunk Cloud.


koshyk
Super Champion

Seems like a bug in that Splunk version then. Splunk 6.6.9 is quite stable if you want a similar version.

How many users do you have? Maybe there are genuinely too many bundles/search items?

Please check the files in /opt/splunk/etc/users/ to see whether the config items are large in number (see the sketch below).
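
A quick way to see which users carry the most configuration (a simple shell sketch; adjust the path if your $SPLUNK_HOME differs):

    # Largest per-user config directories
    du -sh /opt/splunk/etc/users/* | sort -rh | head -20

    # Rough count of config and lookup files under etc/users
    find /opt/splunk/etc/users -type f | wc -l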
