macOS Sierra - How to get all logs from the unified logging database?
Couldn't find a similar question to this one. How are people retrieving logs from macOS Sierra that live in the unified logging database? This is a new logging technology introduced with Sierra (I believe it's stored in a binary database), and it carries far better and more detailed logs than the deprecated system.log file. Practically nothing goes to the system.log file in newer macOS versions. Ideally, I'd like to output data from the database and append it to the system.log file so it gets picked up with the rest of our old-fashioned syslog (and forwarded by an old-fashioned forwarding server over udp/514). The asl.conf mechanism appears to be superseded by unified logging as well. Any ideas?
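For reference, the built-in "log" command can dump the unified log in a syslog-like format, which is the closest thing to the old system.log. A minimal sketch of the idea above, run from cron or launchd; the output path and the five-minute window are placeholders, and the command needs to run as root or an admin to see more than your own user's entries:

# Sketch only: append the last five minutes of unified log entries, in syslog
# style, to a file that syslogd/splunkd already watches. A real version would
# track the last end time instead of a fixed window to avoid gaps/duplicates.
/usr/bin/log show --style syslog --last 5m >> /var/log/unified_export.log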

Since Splunk 9.x supports Apple Unified Logging but Splunk didn't release a corresponding TA, I decided to publish a technology add-on that makes the data CIM compliant for Splunk Enterprise Security:
https://splunkbase.splunk.com/app/6561/
I also published an app to visualize key security-relevant events from the macOS data source:
https://splunkbase.splunk.com/app/6562/
Hi,
I would like to onboard browser history from our Mac machines into Splunk. Please suggest the best way to onboard these logs from the following locations:
~/Library/Safari/
/Users/<username>/Library/Application Support/Firefox/Profiles
/Users/<username>/Library/Application Support/Google/Chrome/Default/History
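One caveat before onboarding these: all three histories are SQLite databases, and Chrome keeps its History file locked while the browser is running, so pointing a monitor input at the raw files won't give you readable events. A rough sketch of a scripted-input approach instead, using the sqlite3 binary that ships with macOS; the Chrome table/column names and the timestamp conversion below are assumptions to verify against your browser version:

#!/bin/sh
# Sketch: copy Chrome's History DB (it is locked while Chrome runs), then dump
# recent visits as pipe-delimited text on stdout for Splunk to index.
SRC="$HOME/Library/Application Support/Google/Chrome/Default/History"
TMP="/tmp/chrome_history_copy.db"
cp "$SRC" "$TMP" || exit 1
# Chrome stores last_visit_time as microseconds since 1601-01-01.
/usr/bin/sqlite3 -separator '|' "$TMP" \
  "SELECT datetime(last_visit_time/1000000 - 11644473600, 'unixepoch'), url, title, visit_count FROM urls;"
rm -f "$TMP"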

Hi all, there is a Splunk Idea tracking this issue: https://ideas.splunk.com/ideas/EID-I-562
You're welcome to follow the idea, vote for it, and add comments. There is active engineering work on this, and the best way to track that progress and help shape the outcome is to comment on that Splunk Idea.

I ended up kludging together a fairly generic scripted input (sketched below) that:
- Runs the log show command from start_date to end_date.
- Greps for what you want using an include file.
- Greps out unwanted entries using an exclude file.
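A minimal sketch of that kind of script, assuming a last_run_date.txt checkpoint and include.txt/exclude.txt pattern files next to it, plus enough privileges to run log show (the file names and date handling are assumptions, not the original add-on):

#!/bin/sh
# Scripted input sketch for the Splunk UF: pull unified log entries since the
# last run, filter them with include/exclude pattern files, print to stdout.
DIR="$(cd "$(dirname "$0")" && pwd)"
CHECKPOINT="$DIR/last_run_date.txt"   # seed this by hand before the first run
INCLUDE="$DIR/include.txt"            # egrep patterns to keep
EXCLUDE="$DIR/exclude.txt"            # egrep patterns to drop
END_DATE="$(date '+%Y-%m-%d %H:%M:%S')"
START_DATE="$(cat "$CHECKPOINT")"
log show --style syslog --start "$START_DATE" --end "$END_DATE" \
  | egrep -f "$INCLUDE" | egrep -vf "$EXCLUDE"
# Note: this advances the checkpoint even if log show fails; a fix for that is
# discussed further down the thread.
echo "$END_DATE" > "$CHECKPOINT"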
Your tar file isn't available anymore; the download is just an empty file. Can you re-share it?

Thanks for posting. How did you manage to deal with "log show" permissions? Is there any way other than putting the "splunk" user into the admin group?
dseditgroup -o edit -a splunk -t user admin

I don't currently have a Mac to test with and I'm not a Mac guy, but something like this might work.
Add this to /etc/sudoers (via visudo) to permit the splunk user to run the log command without a password.
splunk ALL = NOPASSWD: /path/to/log
Edit uf_macintosh/bin/mac_log_monitor.sh and add sudo to the command.
FROM
log show --style syslog --start "$START_DATE" --end "$END_DATE" | egrep -f $INCLUDE | egrep -vf $EXCLUDE
TO
sudo /path/to/log show --style syslog --start "$START_DATE" --end "$END_DATE" | egrep -f $INCLUDE | egrep -vf $EXCLUDE
Let me know how it goes!
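One way to sanity-check the sudoers rule before wiring it into the script (assuming the rule targets the log binary as above) is to list what the splunk user may run:

# Run as root; should show the NOPASSWD entry for the log binary.
sudo -l -U splunk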

Thanks for the quick response and advice. I had to modify the config entry a little, but not much.
splunk ALL=(ALL) NOPASSWD: /usr/bin/log
Just one thing I've noticed: if "log show" is not allowed to run, or some other error occurs, the script still updates the last_run_date.txt file. I'm thinking of modifying the script so that it only updates last_run_date.txt after the log show command has completed successfully.
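One way to do that without reworking the whole script is to capture the log show output in a temp file first, so its exit status can be checked before the checkpoint moves. A sketch, assuming the same variable names as the script above (START_DATE, END_DATE, INCLUDE, EXCLUDE, plus a CHECKPOINT variable pointing at last_run_date.txt):

# Only advance the checkpoint after "log show" itself has exited successfully.
TMP="$(mktemp)" || exit 1
if sudo /usr/bin/log show --style syslog --start "$START_DATE" --end "$END_DATE" > "$TMP"; then
  egrep -f "$INCLUDE" "$TMP" | egrep -vf "$EXCLUDE"
  echo "$END_DATE" > "$CHECKPOINT"
fi
rm -f "$TMP"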

So are we in a state where Splunk basically does not work with the new logging system on newer macOS versions?
Please provide step-by-step guidance on how to get logs from Sierra into Splunk.
One possibility is to use osquery to pull the data from ASL and put it into a file monitored by the Splunk forwarder. And of course osquery exposes lots of other stuff you could grab too.
https://medium.com/@clong/osquery-for-security-b66fffdf2daf
https://blog.kolide.com/monitoring-macos-hosts-with-osquery-ba5dcc83122d
This works - the part I'm struggling with is figuring out what to grab.
Working with the log command in Sierra lets you play with the logged data, but I don't see any guidance or recommendations on what to grab to meet standard audit requirements. If you can grab everything, great - but if you are concerned about license capacity, most of what goes to ASL looks like noise and should be filtered at the host.
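If you go the osquery route, the usual pattern is a scheduled query whose results osquery writes to its filesystem results log, which the forwarder then monitors. A rough sketch of an osquery.conf schedule entry; osquery had an asl table on macOS at the time, but table availability and column names may differ in your version, so treat the query as an assumption:

{
  "schedule": {
    "asl_warnings_and_worse": {
      "query": "SELECT time, host, sender, facility, level, message FROM asl WHERE level <= 4;",
      "interval": 300
    }
  }
}

With the default filesystem logger the results end up in /var/log/osquery/osqueryd.results.log, which a plain monitor input can pick up.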
https://github.com/droe/xnumon might also help; it's "sysmon for macOS".
Bumping xnumon as a pretty complete solution to this problem. You'll need to transform the input to be CIM compliant, since there isn't an app available at this time, but out of the box it's fairly on par with what sysmon offers.
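Until such an app exists, the transformation layer is basically props.conf: parse xnumon's JSON output and alias its fields onto the CIM Endpoint.Processes field names. A sketch only; the sourcetype name and the source field names on the left are hypothetical placeholders, so check the JSON xnumon actually writes before using this:

[xnumon]
KV_MODE = json
# Left-hand names are hypothetical xnumon JSON fields; right-hand names are CIM.
FIELDALIAS-xnumon_process = "image.path" AS process_path "image.pid" AS process_id
FIELDALIAS-xnumon_parent = "subject.pid" AS parent_process_id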
I'd be very happy to add a sample Splunk config for CIM-compliant field extraction to xnumon; feel free to submit one on GitHub as an issue or pull request.
One other rabbit hole I went down to get audit log data is using auditreduce + praudit.
Again, this works - audit data goes to Splunk - but it produces mostly noise. It checks a compliance box without being particularly useful.
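For completeness, the pipeline for that approach looks roughly like this (the start date is a placeholder; both tools ship with macOS and need root to read /var/audit):

# Select BSM audit records after a given date/time and print one XML record
# per line for ingestion.
sudo auditreduce -a 20180101000000 /var/audit/* | praudit -xl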
I'll check out xnumon. Thanks.
It is probably best to contact Splunk if you need the data from unified logging; that way they can push SPL-129734 internally. For now we rely on some scripts from the Unix TA; I have heard that others use https://osquery.io/
The Splunk forwarder can be installed and configured to index information from unified logging; there is just no out-of-the-box functionality for that.
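Concretely that means wrapping "log show" in a script and registering it as a scripted input on the forwarder. A minimal inputs.conf sketch, reusing the script name mentioned earlier in this thread; the app name, sourcetype, index, and interval are assumptions:

[script://$SPLUNK_HOME/etc/apps/uf_macintosh/bin/mac_log_monitor.sh]
interval = 300
sourcetype = macos:unifiedlog
index = macos
disabled = 0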
I'm in the same boat. Need to send/ingest the unified logs from 10.12+ clients.
For anyone else stumbling across this question: Splunk has an open enhancement request for this, SPL-129734. If this is something you need, opening a case with a reference to this question might accelerate the implementation.

Some docs from the macOS side:
https://www.mac4n6.com/blog/2016/11/13/new-macos-sierra-1012-forensic-artifacts-introducing-unified-...
You could create scripted or modular inputs to run the "log show" command and ingest the events.
The difficulties will be:
- deciding whether you want live tailing or a backlog collection of the logs (see the sketch below for the live-tailing option);
- specifying and maintaining a position checkpoint to avoid reindexing the same events over and over. A modular input may be more appropriate; see http://docs.splunk.com/Documentation/Splunk/6.6.0/AdvancedDev/ModInputsIntro
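For the live-tailing option, one low-effort pattern is to let "log stream" run continuously under launchd and have the forwarder monitor its output file, which avoids checkpointing entirely at the cost of losing events whenever the stream isn't running. A sketch; the output path and the error-only predicate are placeholders:

# Stream unified log entries continuously in syslog style into a file that a
# [monitor://] input picks up; run under launchd so it survives reboots.
/usr/bin/log stream --style syslog --predicate 'messageType == error' \
  >> /var/log/unified_stream.log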
