Splunk SOAR

Why don't I see Splunk events for containers with Phantom-ingested emails?

gf13579
Communicator

If I search Phantom container events by label, status, or several other fields, I don't see events relating to containers created by Phantom's email poll-based ingestion feature.

Why don't they show up?

1 Solution

gf13579
Communicator

The JSON logged by Phantom breaks Splunk's parsing. Early fields in the JSON will be available for you to search on, but later fields such as label and status won't be automatically extracted.

I think the raw_email field is the one that breaks things. Based on a quick test, a JSON linter had no problem with what Phantom was sending to Splunk, so the issue seems to be with Splunk's parsing of the data rather than with the JSON itself.
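
To illustrate, here is a hypothetical, heavily truncated container event; the field names are the ones discussed above, but the values and exact ordering are made up. If the (very large) raw_email value sits before label and status, that would explain why only the earlier fields get extracted automatically:

{"id": 1234, "create_time": "2024-01-01T00:00:00.000000+00:00", "raw_email": "Received: from ... <many kilobytes of MIME content> ...", "label": "email", "status": "new"}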

As a workaround, extract the relevant fields at search time with rex (or define your own search-time extractions in a local props.conf; a sketch follows the search below):

index=phantom_container 
| rex "\x22label\x22: \x22(?<label>[^\x22]+)\x22" 
| rex "\x22status\x22: \x22(?<status>[^\x22]+)\x22" 
| search label!="servicenow-poll" status=new 
| eval _time = strptime(create_time,"%Y-%m-%dT%H:%M:%S") + 10*60*60 
| stats min(_time) as _time, values(label) as label by id 
| timechart span=10m count by label
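
If you'd rather not repeat the rex commands in every search, the props.conf option amounts to a pair of search-time extractions. A minimal sketch, assuming the Phantom events carry a sourcetype named phantom_container (substitute whatever sourcetype your events actually use):

# $SPLUNK_HOME/etc/system/local/props.conf (or a local props.conf in an app)
# Assumption: the container events arrive with sourcetype "phantom_container"
[phantom_container]
# Pull label and status out of the raw JSON at search time
EXTRACT-phantom_label = "label": "(?<label>[^"]+)"
EXTRACT-phantom_status = "status": "(?<status>[^"]+)"

With those in place, label and status should be searchable directly, e.g. index=phantom_container status=new label!="servicenow-poll", without the rex steps.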

 
