Splunk Enterprise Security

When merging asset lists in Splunk Enterprise Security, what does the error "'NoneType' object has no attribute 'iteritems'" mean?

Path Finder


I'm trying to add a new asset list to Splunk Enterprise Security. I can see the lookup in Configuration -> Data Enrichment -> Identity Management, but the new assets aren't showing up when I search for them: the `assets` search returns only the demonstration assets.

I tried to force a merge with `$SPLUNK_HOME/bin/splunk cmd splunkd print-modinput-config identity_manager | $SPLUNK_HOME/bin/python $SPLUNK_HOME/etc/apps/SA-IdentityManagement/bin/identity_manager.py --username=admin`, but this had no effect.

The search suggested in the documentation to verify the merge (index=_internal source=*python_modular_input.log "Updated: target lookup table") shows 0 results, but searching these logs for errors returns numerous instances of the error in the title.

Sample log:

2016-07-22 09:08:38,411 ERROR pid=80905 tid=asset file=lookup_modinput.py:streaming_merge_task:298 | status="Error processing record" count="1518" record="fields(ip='xx.x.xx.xx', mac='xxxxxx', nt_host='xxxxxxxx', dns='', owner='', priority='', lat='', long='', city='', country='', category='', pci_domain='', is_expected='', should_timesync='', should_update='', requires_av='')"
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/SA-Utils/lib/SolnCommon/lookup_conversion/lookup_modinput.py", line 288, in streaming_merge_task
    success = output_handler.process_streaming_record(result)
  File "/opt/splunk/etc/apps/SA-Utils/lib/SolnCommon/lookup_conversion/output.py", line 147, in process_streaming_record
    for field, value in [(k, v) for k, v in record.iteritems() if k != self.KEY_FIELD]:
AttributeError: 'NoneType' object has no attribute 'iteritems'

Does anybody know what this error is referring to, and how I can fix or get around it?

Many thanks.

Path Finder

In my case, it turned out to be EXTRA fields in the lookup table that are not listed in the docs (http://docs.splunk.com/Documentation/ES/4.2.1/User/AssetandIdentityCorrelation#Asset_lookup_header).

Once I removed the extra fields, it worked without any problems.
BUG-FIX request: Can this be fixed so that the Python script just ignores extra fields?
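Until the script tolerates extra columns, you can pre-filter the CSV yourself. A minimal sketch, assuming the asset header fields shown in the error record above (check the documented header list for your ES version; the function name and file paths are illustrative):

```python
import csv

# Asset header fields as they appear in the error record above.
ALLOWED = ["ip", "mac", "nt_host", "dns", "owner", "priority", "lat", "long",
           "city", "country", "category", "pci_domain", "is_expected",
           "should_timesync", "should_update", "requires_av"]

def strip_extra_fields(src, dst):
    """Copy src CSV to dst, keeping only the allowed asset columns."""
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=ALLOWED)
        writer.writeheader()
        for row in reader:
            # Unknown columns are dropped; missing ones are written empty.
            writer.writerow({k: row.get(k, "") for k in ALLOWED})
```

Run this over your asset file before dropping it into the lookup directory, so ES only ever sees the documented columns.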



Usually the script will throw an error about this. Field names with spaces are another cause (they have to use underscores).
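If spaces in header names are the problem, rewriting the header row is a one-liner. A sketch (function name and in-place rewrite are my own choice, not anything ES ships):

```python
def fix_header(path):
    """Rewrite a CSV header so field names use underscores instead of
    spaces, e.g. "nt host" -> "nt_host". Edits the file in place."""
    with open(path) as f:
        lines = f.readlines()
    header = [name.strip().replace(" ", "_") for name in lines[0].split(",")]
    lines[0] = ",".join(header) + "\n"
    with open(path, "w") as f:
        f.writelines(lines)
```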


Path Finder

I had a heck of a time getting assets recognized by ES a year or two ago. I had to do two main things in Notepad++, though I'm sure you can do them in other text editors:

  • UTF-8 w/o BOM encoding
  • UNIX line endings

You can use dos2unix on most *nix systems to convert an ASCII file to *nix line endings. I made sure to convert my CSV to *nix line endings after saving it in Excel, and left a blank line at the bottom. UTF-8 is likely fine with or without BOM encoding, whatever that is.
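If you don't have dos2unix handy, both conversions can be done in a few lines of Python; a sketch (the file name is an example, and as noted above the BOM strip may not strictly be required):

```python
def normalize(path):
    """Drop a UTF-8 BOM if present and convert Windows (CRLF) line
    endings to Unix (LF), rewriting the file in place."""
    with open(path, "rb") as f:
        data = f.read()
    if data.startswith(b"\xef\xbb\xbf"):   # UTF-8 byte-order mark
        data = data[3:]
    data = data.replace(b"\r\n", b"\n")
    with open(path, "wb") as f:
        f.write(data)
```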

I haven't seen your error specifically, but it looks to me like it simply can't find any items in the list. Definitely ensure *nix line endings. @nnmiller is likely on the right track as well -- hidden characters, mis-matched quotes, special characters, etc.



You will need to review your asset files for problems. Based on cases I have dealt with (from my Splunker hat), I would check for IPv6 addresses which can be represented multiple ways, not all of which we parse well. (Samples in a support case to file bugs would be helpful and appreciated.)

This looks like the key error:
2016-07-22 09:08:38,411 ERROR pid=80905 tid=asset file=lookup_modinput.py:streaming_merge_task:298 | status="Error processing record" count="1518" record="fields(ip=...

Find the line mentioned following the above part of the error, and look at that specific record, as well as the records before and after it. Something around there is causing the Python script to fail. Check for hidden characters in a text editor, mis-matched quotes, etc.
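A quick script can flag the lines most likely to trip the parser before you open the file in an editor. A sketch (the heuristics and function name are my own, not part of ES):

```python
def find_suspect_lines(path):
    """Return (line_number, reason) pairs for lines containing
    non-printable/non-ASCII bytes or an odd number of double quotes."""
    suspects = []
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, 1):
            # Allow tab (9), LF (10), CR (13); flag other control bytes
            # and anything outside printable ASCII.
            if any(b > 126 or (b < 32 and b not in (9, 10, 13)) for b in raw):
                suspects.append((lineno, "non-printable or non-ASCII byte"))
            if raw.count(b'"') % 2:
                suspects.append((lineno, "unbalanced double quotes"))
    return suspects
```

Cross-reference the flagged line numbers against the record count in the error message to narrow down the bad record.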

If you are feeling industrious (make a backup copy first), you can modify the output.py file to change its logging level:

# normal logging level

# additional logging

Change should be made on line 19 of output.py.
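The actual contents of line 19 aren't reproduced here, but a typical Python logging-level switch looks like this (logger name and both lines are illustrative, not the real output.py code):

```python
import logging

logger = logging.getLogger("lookup_conversion")  # illustrative name

logger.setLevel(logging.INFO)   # normal logging level

logger.setLevel(logging.DEBUG)  # additional logging while debugging
```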

Switch the logging level back after you've tested, or you will cause your logs to fill up with unnecessary stuff. Modular inputs run every 5 minutes.
