Dashboards & Visualizations

How to get SharePoint list data into Splunk using owssvr.dll

walkerhound
Path Finder

I realize this question has been asked before, but I don't see an answer.

Here is what I do in QlikView. It uses owssvr.dll.
If I browse to https://review.pcapp.bsii.....com/PMO/_vti_bin/owssvr.dll?Cmd=Display&RowLimit=0&List={5c534761-0fea...

I get an XML file, which I can't attach due to lack of Karma Points.

In QlikView, this is what I do:
LOAD ows_Team_x0020_Name as TeamName,
date(ows_Date,'YYYY-MM-DD') as MeetingDate,
ows_Number_x0020_of_x0020_Members as NumberOfMembers,
ows_Number_x0020_of_x0020_Members_x0 as MembersAttending,
ows_Percent_x0020_of_x0020_Members_x as PercentAttending,
ows_Number_x0020_of_x0020_Vistors_x0 as NumberOfVisitors,
ows_Shift_x0020_Workers_x0020_Unable as ShiftUnable,
ows_Secial_x0020_Event as SpecialEvent,
1 as RealDate
FROM https://review.pcapp.bsii.bechtel.com/PMO/_vti_bin/owssvr.dll?Cmd=Display&RowLimit=0&List={5c534761-...;


Richfez
SplunkTrust

If you have the SharePoint list as XML, you should be able to create an input to get it into Splunk.

If you need a way to schedule and automate the retrieval, or to pull it from a *nix machine, you can use wget. I ran into this specific problem recently and figured out at least one way to solve it. I wrote it up as an answer for "Sharepoint list into Splunk"; basically, it's just building a URL from the various pieces of connection information (e.g. from when you dump a test copy into Excel), wrapping that URL in a wget call, and scheduling it via cron.
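The approach above can be sketched roughly like this. The site URL, list GUID, output path, and script name are hypothetical placeholders, not the real values from this thread; the URL shape follows the owssvr.dll pattern shown in the question.

```shell
#!/bin/sh
# Hedged sketch: build the owssvr.dll URL from its pieces, then fetch it
# with wget. SITE and LIST_GUID are placeholder values -- substitute your own.
SITE="https://sharepoint.example.com/PMO"
LIST_GUID="{00000000-0000-0000-0000-000000000000}"
URL="${SITE}/_vti_bin/owssvr.dll?Cmd=Display&RowLimit=0&List=${LIST_GUID}"
echo "$URL"

# Fetch the XML to a spot a Splunk file monitor could watch
# (commented out here; supply real credentials for your site):
# wget --user 'DOMAIN\user' --ask-password -O /opt/splunk_feeds/pmo_list.xml "$URL"

# Example cron entry to run this script hourly (path hypothetical):
# 0 * * * * /opt/splunk_feeds/fetch_pmo_list.sh
```

The wget line is left commented because authentication details vary by SharePoint setup; the point is just that the whole fetch reduces to one URL and one command, which cron can repeat on a schedule.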

I don't document the scheduling there; it's just cron, and plenty of information on that is available elsewhere. An enterprising person could turn it into a modular input and let Splunk handle the scheduling. I haven't gotten that far yet. 🙂

https://answers.splunk.com/answers/236497/sharepoint-list-into-splunk.html

manderson7
Contributor

This worked great for me, thank you!


Richfez
SplunkTrust

You are welcome!


manderson7
Contributor

So were you able to figure out how to download multiple pages? I can't see anything in the GUID/URL that selects a page, or else I'd iterate the wget command.

Any help you can provide would be great.


Richfez
SplunkTrust

Have you found out how to handle this?

If you have an XML file, could you just use that as an XML input to Splunk?
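A minimal sketch of what that XML input might look like, assuming the scheduled wget writes the file to a fixed path. The monitor path, sourcetype name, and index here are hypothetical; KV_MODE = xml is the props.conf setting that tells Splunk to auto-extract fields from XML events at search time.

```ini
# inputs.conf -- watch the file the scheduled wget writes (path is hypothetical)
[monitor:///opt/splunk_feeds/pmo_list.xml]
sourcetype = sharepoint:list
index = main

# props.conf -- auto-extract fields from the XML at search time
[sharepoint:list]
KV_MODE = xml
```

With that in place, the ows_* attributes from the SharePoint rows should show up as search-time fields without any manual extraction.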
