Can anyone share how to get data from a SharePoint Online list into Splunk Enterprise?
I have to get user custom action details from a SharePoint application into Splunk Enterprise.
Please share code and samples too, if available.
Integrating SharePoint Online data with Splunk is a powerful way to gain operational insights. The most robust and scalable method is to use a scripted input that leverages the Microsoft Graph API. Below is a detailed, step-by-step guide.
The recommended approach involves three key components:
Microsoft Azure App Registration: To authenticate and authorize Splunk to access SharePoint data.
Python Script (Using the msal library): To handle authentication and fetch data from the Microsoft Graph API.
Splunk Scripted Input: To execute the Python script on a schedule and ingest the data.
Create the App Registration:
This creates an identity for your Splunk instance in Azure AD.
Go to the Azure Portal.
Navigate to Azure Active Directory > App registrations > New registration.
Give your application a name (e.g., "Splunk-SharePoint-Ingestion").
Under Supported account types, select "Accounts in this organizational directory only".
Leave the "Redirect URI" blank for now.
Note down the Application (client) ID and Directory (tenant) ID.
Configure API Permissions:
In your new App Registration, go to API permissions > Add a permission.
Select Microsoft Graph > Application permissions.
Add the Sites.Read.All permission (this is sufficient for reading list data from all sites).
Click "Grant admin consent" to approve these permissions.
Create a Client Secret:
Go to Certificates & secrets > Client secrets > New client secret.
Add a description and select an expiry duration.
Crucially, copy the Secret's Value immediately—you won't be able to see it again.
You now have three vital pieces of information:
TENANT_ID
CLIENT_ID
CLIENT_SECRET
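Before wiring anything into Splunk, you can sanity-check these three values with a direct call to the Azure AD token endpoint; this is the same client credentials flow that msal performs for you in the script below. A minimal sketch using only the requests library (the placeholder values are yours to fill in):

import requests

TENANT_ID = "<YOUR_TENANT_ID>"
CLIENT_ID = "<YOUR_CLIENT_ID>"
CLIENT_SECRET = "<YOUR_CLIENT_SECRET>"

# OAuth 2.0 client credentials grant against the v2.0 token endpoint
resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://graph.microsoft.com/.default",
    },
)
resp.raise_for_status()
print("Token acquired:", resp.json()["access_token"][:40] + "...")

If this prints a truncated token, the app registration, permissions, and secret are all working.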
This script handles the OAuth 2.0 client credentials flow and calls the Microsoft Graph API. You must install the msal and requests libraries first: pip install msal requests.
Script: sharepoint_to_splunk.py
#!/usr/bin/env python3
import msal
import requests
import json
import sys
from datetime import datetime, timezone

# 1. Configuration - Fill in these details
config = {
    "authority": "https://login.microsoftonline.com/<YOUR_TENANT_ID>",
    "client_id": "<YOUR_CLIENT_ID>",
    "client_secret": "<YOUR_CLIENT_SECRET>",
    "scope": ["https://graph.microsoft.com/.default"],
    "site_id": "<YOUR_SITE_ID>",      # e.g., "contoso.sharepoint.com,<SITE_GUID>,<WEB_GUID>"
    "list_name": "<YOUR_LIST_NAME>"   # The list GUID (or display name)
}

def get_access_token(config):
    app = msal.ConfidentialClientApplication(
        config["client_id"],
        authority=config["authority"],
        client_credential=config["client_secret"],
    )
    result = app.acquire_token_for_client(scopes=config["scope"])
    if "access_token" in result:
        return result["access_token"]
    else:
        print(f"Error acquiring token: {result.get('error_description')}", file=sys.stderr)
        sys.exit(1)

def get_list_items(access_token, site_id, list_name):
    url = f"https://graph.microsoft.com/v1.0/sites/{site_id}/lists/{list_name}/items?expand=fields"
    headers = {'Authorization': 'Bearer ' + access_token}
    response = requests.get(url, headers=headers)
    if response.status_code == 200:
        return response.json()
    else:
        print(f"Error fetching list items: {response.status_code} - {response.text}", file=sys.stderr)
        sys.exit(1)

if __name__ == "__main__":
    # 2. Authenticate and get data
    access_token = get_access_token(config)
    data = get_list_items(access_token, config["site_id"], config["list_name"])

    # 3. Output to STDOUT for Splunk to ingest
    # Each list item is printed as a JSON string, one per line.
    if 'value' in data:
        for item in data['value']:
            # Add a timestamp for Splunk parsing
            item['_splunk_ingest_time'] = datetime.now(timezone.utc).isoformat()
            print(json.dumps(item))
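Note that the Graph API paginates results; for large lists, get_list_items above only returns the first page. When more items remain, the response carries an @odata.nextLink URL pointing to the next page. A minimal extension that follows the links (a sketch, not part of the original script):

def get_all_list_items(access_token, site_id, list_name):
    # Follow @odata.nextLink until every page has been fetched
    url = f"https://graph.microsoft.com/v1.0/sites/{site_id}/lists/{list_name}/items?expand=fields"
    headers = {"Authorization": "Bearer " + access_token}
    items = []
    while url:
        response = requests.get(url, headers=headers)
        response.raise_for_status()
        data = response.json()
        items.extend(data.get("value", []))
        url = data.get("@odata.nextLink")  # absent on the last page
    return items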
How to find your site_id and list_name:
Site ID: Use the Graph Explorer tool to query https://graph.microsoft.com/v1.0/sites/{hostname}:/sites/{site-path} (or https://graph.microsoft.com/v1.0/sites/root for the root site). The id field in the response is your site_id.
List Name: Go to your SharePoint list and click Settings > List settings. The URL in your browser's address bar will contain a parameter like List=%7B...%7D; the GUID after List= is your list's ID, which you can use as list_name (the Graph endpoint accepts either the list GUID or its display name). Both values can also be looked up programmatically, as in the sketch below.
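If you would rather resolve these values in code, the same Graph API exposes them. A sketch, assuming a tenant hostname of contoso.sharepoint.com and a site named MySite (both placeholders), with access_token obtained via get_access_token from the main script:

import requests

def find_site_and_lists(access_token):
    headers = {"Authorization": "Bearer " + access_token}

    # Resolve a site by hostname and server-relative path
    site = requests.get(
        "https://graph.microsoft.com/v1.0/sites/contoso.sharepoint.com:/sites/MySite",
        headers=headers,
    ).json()
    print("site_id:", site["id"])

    # Enumerate the site's lists to find the GUID or display name you need
    lists = requests.get(
        f"https://graph.microsoft.com/v1.0/sites/{site['id']}/lists",
        headers=headers,
    ).json()
    for lst in lists.get("value", []):
        print(lst["id"], "-", lst["displayName"])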
Place the Script: Save the Python script on your Splunk Heavy Forwarder or Indexer (e.g., in $SPLUNK_HOME/bin/scripts).
Create the Input:
Navigate to Settings > Data inputs > Scripts.
Click New Local Script.
Provide the full path to your Python script.
Set the Source type (e.g., _json) and Index.
Configure the Schedule (e.g., run every 5 minutes).
Set Permissions and Environment:
Ensure the msal and requests libraries are installed in a Python environment the Splunk user can access. Note that Splunk invokes scripted inputs with its own bundled Python interpreter, which does not ship with msal; a common workaround is a small shell wrapper that calls a system Python where the libraries are installed.
Ensure the script is executable.
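If you prefer configuration files over the UI, the equivalent scripted input can be declared in inputs.conf. A sketch, assuming the script path from above and a sharepoint index (adjust the path, index, and interval to your environment):

# $SPLUNK_HOME/etc/system/local/inputs.conf (or an app's local/ directory)
# Run the script every 300 seconds and ingest its stdout as JSON events
[script://$SPLUNK_HOME/bin/scripts/sharepoint_to_splunk.py]
interval = 300
sourcetype = _json
index = sharepoint
disabled = false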
This script outputs each list item as a JSON object. Splunk will automatically extract the fields, making them instantly searchable. You can then create alerts, dashboards, and reports based on your SharePoint list data.
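For example, once events are flowing, a search along these lines surfaces the list data (assuming the sharepoint index and _json sourcetype from the input above; the fields.* names depend on your list's columns):

index=sharepoint sourcetype=_json
| table _time, fields.Title, fields.Modified, webUrl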
For a quick validation test before full Splunk integration, run the script manually and confirm that each output line is valid JSON (for example, by piping a line through python3 -m json.tool).
This method is scalable, secure, and leverages modern APIs, providing a solid foundation for your monitoring and analytics needs.
How can organizations efficiently handle and extract relevant data, such as webcam activity, from Office 365 audit logs, particularly when leveraging tools like the "Splunk Add-on for Microsoft Office 365"?
Audit logs in O365 contain the data that you seek. You can use the add-on "Splunk Add-on for Microsoft Office 365" to collect the audit logs, which include SharePoint Online activity, but it will also pull in logs you may not want, such as Azure Active Directory and Exchange Online. However, the additional data can easily be filtered out using props.conf and transforms.conf, provided the SharePoint data can be identified with a regex; see the sketch below the link.
https://splunkbase.splunk.com/app/4055
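For example, to keep only SharePoint events from the add-on's o365:management:activity sourcetype and discard the rest, the standard null-queue routing pattern looks roughly like this (a sketch: the Workload field in O365 audit records identifies the originating service, but test the regex against your own events, and place these settings on the instance that parses the data, typically the heavy forwarder running the add-on):

# props.conf
[o365:management:activity]
TRANSFORMS-filter_o365 = o365_drop_all, o365_keep_sharepoint

# transforms.conf
# First transform: send every event on this sourcetype to the null queue
[o365_drop_all]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

# Second transform: route SharePoint workloads back to the index queue
[o365_keep_sharepoint]
REGEX = "Workload"\s*:\s*"SharePoint"
DEST_KEY = queue
FORMAT = indexQueue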
++If this helps, please consider accepting as an answer++