Getting Data In

Splunk Deployment App Filtering SharePoint Logs

tbarn005
Loves-to-Learn

 

Hi Splunk Community,

I'm trying to reduce disk space usage on my Splunk Universal Forwarder by filtering out unnecessary SharePoint logs and only forwarding those with a severity of High, Error, or Warning in the message.

I created a deployment app named SharePoint. Here is what's in that folder:

 

[screenshot: contents of the SharePoint deployment app folder]

 

I attempted to create props.conf and transforms.conf files to filter out the data that was unnecessary. I only need to see the log files in that directory that contain certain keywords, not all of them. Here is what I wrote in the files. I didn't write the regex myself; I found something similar online and tried to adapt it to my environment.

[screenshot: props.conf and transforms.conf contents]

After deploying this, I no longer see any of my SharePoint logs indexed for this specific server, not even the ones with High. As you can see from the config, I even pointed them at a test index that I made, so I should be seeing them. I'm not sure what's going on.
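To double-check that nothing is arriving at all, the sort of search I've been running looks roughly like this (sp_test is just a placeholder for my test index name, searched over All time):

index=sp_test source="E:\SPLogs\*"

It comes back empty since the deployment.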

 

 

0 Karma

livehybrid
Super Champion

Hi @tbarn005 

Can I just check: you want to reduce your storage usage on your Universal Forwarder, but the UF isn't storing the data it ingests, it's only sending it on.

UFs are typically not used for parsing the data. Did you apply the screenshotted configuration to your UF or a different (HF/IDX) instance?
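If you're not sure which instance is actually applying the config, btool will show where each effective setting comes from on a given box; something like the following, run on the UF and on the indexer/HF (on Windows use %SPLUNK_HOME%\bin\splunk and findstr instead of grep):

# List effective props/transforms settings and the files they come from
$SPLUNK_HOME/bin/splunk btool props list --debug | grep -i SPLogs
$SPLUNK_HOME/bin/splunk btool transforms list --debug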

🌟 Did this answer help you? If so, please consider:

  • Adding karma to show it was useful
  • Marking it as the solution if it resolved your issue
  • Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing

0 Karma

tbarn005
Loves-to-Learn

I may have misspoken. I want to reduce the storage usage on my indexer. I have a SharePoint server with the Splunk UF on it, and it's ingesting unnecessary data that is eating a lot of storage on my indexer. The screenshots come from my indexer. I'm doing a bit of research now, and it looks as if I can use ingest actions to filter out some of that unnecessary data from that SharePoint UF?

0 Karma

Prewin27
Contributor

@tbarn005 

Your props and transforms look OK. Make sure you are applying this on an HF or indexer, not on the UF. Also add one more transform to filter out the other noise.

props.conf

[source::E:\\SPLogs\\CLGDEVSPAPPSO1*]
TRANSFORMS-debug = drop_noise,route_high_to_debug

In transforms.conf
[drop_noise]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

Restart Splunk and check again. Also make sure new High-category logs have come in from this server since the restart.
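One note on ordering: drop_noise needs to come before route_high_to_debug in the TRANSFORMS list, because the last matching transform to set the queue key wins, so the keep rule has to run after the drop-everything rule. Also, route_high_to_debug isn't shown here, so this is only a sketch of what it typically looks like, assuming it matches the severity keywords and puts those events back on the index queue (the regex and the debug index name are placeholders):

[route_high_to_debug]
REGEX = (?i)\b(High|Error|Warning)\b
DEST_KEY = queue
FORMAT = indexQueue

# Hypothetical extra transform, only if matching events should also land in a separate debug/test index
[route_high_to_debug_index]
REGEX = (?i)\b(High|Error|Warning)\b
DEST_KEY = _MetaData:Index
FORMAT = debug

If you add the extra index-routing transform, append it to the TRANSFORMS-debug list in props.conf as well.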


Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!

0 Karma

richgalloway
SplunkTrust
SplunkTrust

There appear to be a few problems here.

1) The SharePoint app should have a single folder called 'default'.  The default folder should contain the files shown in the first screenshot.

2) Universal Forwarders do not consume disk space so filtering will not save any there.  Caveat: if you use persistent queuing then the UF will use disk space, but the space will be returned once the queue is drained.

3) Universal Forwarders do not process transforms so they cannot filter events this way.  Put the props and transforms on the first full instance that touches the data (indexer or heavy forwarder).
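For example, on the indexer or heavy forwarder the same two files would typically sit in an app like this (the app name just mirrors the deployment app from the question; local/ also works if you edit the files directly on that instance):

$SPLUNK_HOME/etc/apps/SharePoint/default/props.conf
$SPLUNK_HOME/etc/apps/SharePoint/default/transforms.conf

Then restart that instance ($SPLUNK_HOME/bin/splunk restart) so the new index-time settings take effect.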

---
If this reply helps you, Karma would be appreciated.
0 Karma