Splunk Search

Error in 'eval' command: The expression is malformed. The factor is missing.

ramprakash
Explorer

Hello Splunkers,

Today I upgraded my Splunk environment from 6.0.1 to 6.6.1. Every dashboard and Splunk query is working fine except this one.
Can someone please explain why I am suddenly seeing this error after the upgrade?

index=ip_lux_metadata s_event="MSG_*"
| eval q_precedence=if(s_event=="MSG_RECEIVE" AND like(s_comp_id,replace("*","[*]","%")) AND e_to=="SMC", 0, 1)
| eval q_event=if(q_precedence==0, "MSG_SEND", s_event)
| eval s_comp_id=if(e_to=="SMC", "SMC", s_comp_id)
| search s_comp_id="*"
| fillnull value="" e_id, e_ppg_id, e_crl_id, e_action
| search e_id="*" e_ppg_id="*" e_crl_id="*" e_action="*"
| eval q_proc_id = mvindex(s_proc_id, -1)
| sort q_precedence
| dedup s_proc_id
| fillnull value="%DEFAULT_STATUS%" s_proc_outcome
| eval q_time=_time
| eval q_status_time=q_time
| eval q_agent=coalesce(a_agent, s_comp_id)
%START%
| join type=outer s_proc_id [search index=ip_lux_metadata s_comp_id="*" s_proc_outcome="*" | eval q_status_time=_time | fields s_proc_id, s_proc_outcome, q_status_time]
%END%
| search s_proc_outcome="*"
| eval q_time=_time
| sort -q_time, s_comp_id, e_path
| eval q_info=q_time + "," + q_status_time + "," + q_proc_id + "," + s_event + "," + s_proc_outcome
| eval q_proc=q_time + "," + q_status_time + "," + q_proc_id + "," + s_proc_outcome
| eval q_proc_outcome=q_proc_id + "," + s_proc_outcome
| table q_info, q_proc, q_time, s_comp_id, q_event, e_action, q_proc_outcome, e_path, q_agent, e_ppg_id, e_id, e_crl_id
| eval q_event=if(q_event == "MSG_RECEIVE", "Received", "Sent")
| convert timeformat="%d/%m/%y %H:%M:%S.%3N" ctime(q_time)
| rename q_time as Time, s_comp_id as Endpoint, q_event as "Received / sent", q_agent as Agent, e_action as Action, q_proc_outcome as Status, e_path as Path, e_ppg_id as "Propagation id", e_id as "Message id", e_crl_id as "Correlation id", q_info as " ", q_proc as "  "

The error is:

Error in 'eval' command: The expression is malformed. The factor is missing.
The search job has failed due to an error. You may be able view the job in the Job Inspector.

1 Solution

woodcock
Esteemed Legend

Try this (but keep in mind that you DEFINITELY should get rid of the join because it is almost certainly silently dropping events):

index=ip_lux_metadata s_event="MSG_*" 
| eval q_precedence=if(s_event=="MSG_RECEIVE" AND like(s_comp_id,replace("*","[*]","%")) AND e_to=="SMC", 0, 1) 
| eval q_event=if(q_precedence==0, "MSG_SEND", s_event) 
| eval s_comp_id=if(e_to=="SMC", "SMC", s_comp_id) 
| search s_comp_id="*" 
| fillnull value="" e_id, e_ppg_id, e_crl_id, e_action 
| search e_id="*" e_ppg_id="*" e_crl_id="*" e_action="*" 
| eval q_proc_id = mvindex(s_proc_id, -1) 
| sort 0 q_precedence 
| dedup s_proc_id 
| fillnull value="%DEFAULT_STATUS%" s_proc_outcome 
| eval q_time=_time 
| eval q_status_time=q_time 
| eval q_agent=coalesce(a_agent, s_comp_id)
| join type=outer s_proc_id 
    [ search index=ip_lux_metadata s_comp_id="*" s_proc_outcome="*" 
    | eval q_status_time=_time 
    | fields s_proc_id, s_proc_outcome, q_status_time]
| search s_proc_outcome="*" 
| eval q_time=_time 
| sort 0 -q_time, s_comp_id, e_path 
| eval q_info=q_time + "," + q_status_time + "," + q_proc_id + "," + s_event + "," + s_proc_outcome 
| eval q_proc=q_time + "," + q_status_time + "," + q_proc_id + "," + s_proc_outcome 
| eval q_proc_outcome=q_proc_id + "," + s_proc_outcome 
| table q_info, q_proc, q_time, s_comp_id, q_event, e_action, q_proc_outcome, e_path, q_agent, e_ppg_id, e_id, e_crl_id 
| eval q_event=if(q_event == "MSG_RECEIVE", "Received", "Sent") 
| convert timeformat="%d/%m/%y %H:%M:%S.%3N" ctime(q_time) 
| rename q_time as Time, s_comp_id as Endpoint, q_event as "Received / sent", q_agent as Agent, e_action as Action, q_proc_outcome as Status, e_path as Path, e_ppg_id as "Propagation id", e_id as "Message id", e_crl_id as "Correlation id", q_info as " ", q_proc as "  "
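
As noted above, the outer join can silently drop events, since join runs its second search as a subsearch and subsearches are subject to result and runtime limits. As a rough sketch only (not from the original post), the same lookup of s_proc_outcome and q_status_time per s_proc_id could be attempted join-free by pulling both event sets in one base search and merging with stats; the exact field handling would need adapting to the real data:

index=ip_lux_metadata (s_event="MSG_*" OR (s_comp_id="*" s_proc_outcome="*"))
| eval q_status_time=if(isnotnull(s_proc_outcome), _time, null())
| stats latest(s_proc_outcome) AS s_proc_outcome latest(q_status_time) AS q_status_time latest(_time) AS q_time values(s_event) AS s_event BY s_proc_id

This avoids the subsearch entirely, so no events are dropped by join limits; the rest of the original pipeline would then be reattached after the stats.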


ramprakash
Explorer

It worked. Thanks!


woodcock
Esteemed Legend

What are the %START% and %END% for? If I remove them, I do not get any errors. Your question doesn't really make sense to me...


FrankVl
Ultra Champion

Have you tried building your query up step by step, to see at which line it fails?

What is that %START% %END% doing there? If I try the following simple test, I get the same error you got, so I think that is your issue. Is that perhaps some old, no-longer-supported commenting syntax? (I haven't used anything before 6.4 or so.)

| makeresults
%START%
| eval test = "foo"
%END%
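
For comparison (a minimal check, not part of the original post), the same test runs cleanly once the tokens are stripped out:

| makeresults
| eval test = "foo"

This suggests the %START% / %END% tokens themselves are what the parser chokes on, since they are not valid SPL.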

ramprakash
Explorer

Hi @FrankVl, I only have experience on the admin side 😞

If I remove these parentheses as well, I receive a similar error:

Error in 'eval' command: The expression is malformed. Expected ).
The search job has failed due to an error. You may be able view the job in the Job Inspector.


FrankVl
Ultra Champion

What do you mean by removing the parentheses?

And again: it's best to simply start with just the first few lines and, if those don't give an error, add the rest line by line until you get the error. That should help pinpoint which part of your search is triggering it.
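
For example (a hypothetical first step, using the opening lines of the original search), start with just:

index=ip_lux_metadata s_event="MSG_*"
| eval q_precedence=if(s_event=="MSG_RECEIVE" AND like(s_comp_id,replace("*","[*]","%")) AND e_to=="SMC", 0, 1)

and then append one pipe at a time; the first addition that reproduces the error is the offending command.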


ramprakash
Explorer

Thanks, I will try.

By parentheses I meant %START% and %END%.

It was working this morning, but after the upgrade I am seeing this error. If you find any details on this, please let me know.


FrankVl
Ultra Champion

Ok, thanks for that clarification. As mentioned, you're getting a different error now. Maybe you accidentally removed a closing ) somewhere while stripping out that %START% %END%?

Anyway, taking it step by step should help you get to the root cause of the error. Keep us posted on your progress 🙂
