Sorry, I miscounted - it does look right. The issue here is that the trade_id does not match the first field in the mx_to_sky event.
Sorry, how do I rectify it? 32265376;DEAD;3887.00000000;XAU;CURR;FXD;FXD;CM TR GLD AUS;X_CMTR XAU SWAP Did you mean this?
Your regex assumes (insists!) that the event has 9 fields separated by 8 semi-colons - your sample data has only 8 fields separated by 7 semi-colons.
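The field-count requirement can be checked outside Splunk. A minimal Python sketch, using the field names from the question's rex (the pattern is reconstructed here for illustration, not copied from a working config):

```python
import re

# Pattern from the question: it requires 9 semicolon-separated fields,
# i.e. 8 literal semicolons in the event.
pattern = re.compile(
    r"(?P<NB>\d+);(?P<TRN_STATUS>[^;]+);(?P<NOMINAL>[^;]+);(?P<CURRENCY>[^;]+);"
    r"(?P<TRN_FMLY>[^;]+);(?P<TRN_GRP>[^;]+);(?P<TRN_TYPE>[^;]*);"
    r"(?P<BPFOLIO>[^;]*);(?P<SPFOLIO>[^;]*)"
)

# Sample event from the thread: 9 fields, 8 semicolons - this one matches.
event = "32265376;DEAD;3887.00000000;XAU;CURR;FXD;FXD;CM TR GLD AUS;X_CMTR XAU SWAP"
print(event.count(";"))             # 8
print(bool(pattern.search(event)))  # True

# Drop the last field and the regex no longer matches at all,
# so every extracted field would stay empty in Splunk.
short = "32265376;DEAD;3887.00000000;XAU;CURR;FXD;FXD;CM TR GLD AUS"
print(bool(pattern.search(short)))  # False
```

This is why events with fewer fields than the pattern expects show up with all join columns empty: the rex simply does not match them.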
Thank you for your reply. My apologies for how I described the problem! Our sample data is as follows (刘志强 is a user name, 物流部 a department name, and 是 means "yes"):

2024-12-12 00:30:12", 0699075634,"刘志强","物流部","是"
2024-12-12 08:30:14", 0699075634,"刘志强","物流部","是"
2024-12-12 11:30:12", 0699075634,"刘志强","物流部","是"
2024-12-13 15:30:55", 0699075634,"刘志强","物流部","是"
2024-12-13 00:30:12", 0699075634,"刘志强","物流部","是"
2024-12-14 19:30:30", 0699075634,"刘志强","物流部","是"
2024-12-14 22:30:12", 0699075634,"刘志强","物流部","是"

The field headers are: opr_time oprt_user_acct oprt_user_name blng_dept_name is_cont_sens_acct
Since the first part is just determining values for earliest and latest, you might be able to avoid map like this

index=edwapp sourcetype=ygttest is_cont_sens_acct="是"
    [search index=edwapp sourcetype=ygttest is_cont_sens_acct="是"
    | stats earliest(_time) as earliest_time latest(_time) as latest_time
    | addinfo
    | table info_min_time info_max_time earliest_time latest_time
    | eval earliest_time=strftime(earliest_time,"%F 00:00:00")
    | eval earliest_time=strptime(earliest_time,"%F %T")
    | eval earliest_time=round(earliest_time)
    | eval searchEarliestTime2=if(info_min_time == "0.000", earliest_time, info_min_time)
    | eval searchLatestTime2=if(info_max_time="+Infinity", relative_time(latest_time,"+1d"), info_max_time)
    | eval earliest=mvrange(searchEarliestTime2, searchLatestTime2, "1d")
    | mvexpand earliest
    | eval latest=relative_time(earliest,"+7d")
    | where latest <= searchLatestTime2
    | eval latest=round(latest)
    | fields earliest latest]
| dedup day oprt_user_name blng_dept_name oprt_user_acct
| stats count as "fwcishu" by day oprt_user_name blng_dept_name oprt_user_acct
| eval a=$a$
| eval b=$b$
| stats count as "day_count", values(day) as "qdate", max(day) as "alert_date" by a b oprt_user_name, oprt_user_acct " maxsearches=500000
| where day_count > 2
| eval alert_date=strptime(alert_date,"%F")
| eval alert_date=relative_time(alert_date,"+1d")
| eval alert_date=strftime(alert_date, "%F")
| table a b oprt_user_name oprt_user_acct day_count qdate alert_date
And the problem is that these columns are empty for some rows and populated for others. For the empty ones, I carefully checked that NB matches in both searches: TRN_STATUS, NOMINAL, CURRENCY, TRN_FMLY, TRN_GRP, TRN_TYPE, BPFOLIO, SPFOLIO
Hi, my query joins 2 searches with a left join. It matches some rows and not others, although the column I join on is clearly present in both searches.

index=sky sourcetype=sky_trade_murex_timestamp
| rex field=_raw "trade_id=\"(?<trade_id>\d+)\""
| rex field=_raw "mx_status=\"(?<mx_status>[^\"]+)\""
| rex field=_raw "sky_id=\"(?<sky_id>\d+)\""
| rex field=_raw "event_id=\"(?<event_id>\d+)\""
| rex field=_raw "operation=\"(?<operation>[^\"]+)\""
| rex field=_raw "action=\"(?<action>[^\"]+)\""
| rex field=_raw "tradebooking_sgp=\"(?<tradebooking_sgp>[^\"]+)\""
| rex field=_raw "portfolio_name=\"(?<portfolio_name>[^\"]+)\""
| rex field=_raw "portfolio_entity=\"(?<portfolio_entity>[^\"]+)\""
| rex field=_raw "trade_type=\"(?<trade_type>[^\"]+)\""
| rename trade_id as NB
| dedup NB
| eval NB = tostring(trim(NB))
| table sky_id, NB, event_id, mx_status, operation, action, tradebooking_sgp, portfolio_name, portfolio_entity, trade_type
| join type=left NB
    [ search index=sky sourcetype=mx_to_sky
    | rex field=_raw "(?<NB>\d+);(?<TRN_STATUS>[^;]+);(?<NOMINAL>[^;]+);(?<CURRENCY>[^;]+);(?<TRN_FMLY>[^;]+);(?<TRN_GRP>[^;]+);(?<TRN_TYPE>[^;]*);(?<BPFOLIO>[^;]*);(?<SPFOLIO>[^;]*)"
    | eval NB = tostring(trim(NB))
    | table TRN_STATUS, NB, NOMINAL, CURRENCY, TRN_FMLY, TRN_GRP, TRN_TYPE, BPFOLIO, SPFOLIO]
| table sky_id, NB, event_id, mx_status, operation, action, tradebooking_sgp, portfolio_name, portfolio_entity, trade_type, TRN_STATUS, NOMINAL, CURRENCY, TRN_FMLY, TRN_GRP, TRN_TYPE, BPFOLIO, SPFOLIO

The above is my source code, and the raw data is:

Time: 27/12/2024 17:05:39.000
Event: 32265376;DEAD;3887.00000000;XAU;CURR;FXD;FXD;CM TR GLD AUS;X_CMTR XAU SWAP
host = APPSG002SIN0117
source = D:\SkyNet\data\mx_trade_report\MX2_TRADE_STATUS_20241227_200037.csv
sourcetype = mx_to_sky

Time: 27/12/2024 18:05:36.651
Event: 2024-12-27 18:05:36.651, system="murex", id="645131777", sky_id="645131777", trade_id="32265483", event_id="100023788", mx_status="DEAD", operation="NETTING", action="insertion", tradebooking_sgp="2024/12/26 01:02:01.0000", eventtime_sgp="2024/12/26 01:01:51.7630", sky_to_mq_latency="-9.-237", portfolio_name="I CREDIT INC", portfolio_entity="ANZSEC INC", trade_type="BondTrade"
host = APPSG002SIN0032
source = sky_trade_murex_timestamp
sourcetype = sky_trade_murex_timestamp
It appears you have multiple stats for the same transaction in the event. Try using mvdedup:

| spath
| eval date=strftime(_time,"%m-%d %k:%M")
| table date *.pct2ResTime
| foreach *.pct2ResTime
    [| eval <<FIELD>> = mvdedup('<<FIELD>>')]
| untable date transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| xyseries "Transaction Name" date pct2ResTime
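For reference, mvdedup removes duplicate values from a multivalue field while keeping the first occurrence of each. The equivalent logic, sketched in Python (an illustration of the semantics, not Splunk's implementation):

```python
def mvdedup(values):
    """Drop duplicate values, keeping the first occurrence of each
    (the same semantics as Splunk's mvdedup eval function)."""
    seen = set()
    result = []
    for v in values:
        if v not in seen:
            seen.add(v)
            result.append(v)
    return result

# A multivalue field holding repeated percentile readings for one transaction.
print(mvdedup(["120", "120", "135", "120", "135"]))  # ['120', '135']
```

Order preservation matters here: the deduplicated values stay aligned with the chart's time buckets.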
This seems to be different from your previous description. Counting is one thing, listing sessions is another. Furthermore, we don't know your data.
Thank you, but the client wants to obtain the dimensions every 7 days, with approximately 1200 result sets. The output needs to include: start time, end time, username, department, number of days visited, the multivalue query time, and the alert time.
You're overcomplicating your search. If you want to calculate how many days during a week your users connected to a service, there are probably several ways to go about it. The easiest and most straightforward would probably be to

| bin _time span=1d

so that all visits during the same day get the same timestamp (the alternative would be to use strftime). Now you need to count the distinct days for each user

| stats dc(_time) by user

(the alternative is the timechart command).
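The bin-then-distinct-count idea can be sketched outside SPL. A small Python illustration with hypothetical timestamps: the truncation to a date mirrors `bin _time span=1d`, and the set size per ISO week mirrors a distinct count per user per week:

```python
from datetime import datetime

# Hypothetical visit timestamps per user (multiple visits a day are common).
visits = {
    "alice": ["2024-12-12 00:30", "2024-12-12 08:30", "2024-12-13 15:30"],
    "bob":   ["2024-12-12 11:30", "2024-12-14 19:30", "2024-12-14 22:30"],
}

def distinct_days_per_week(timestamps):
    """Bucket each visit to its day (like `bin _time span=1d`), then count
    distinct days within each ISO week (like a dc() grouped by week)."""
    weeks = {}
    for ts in timestamps:
        day = datetime.strptime(ts, "%Y-%m-%d %H:%M").date()
        week = day.isocalendar()[:2]  # (ISO year, ISO week number)
        weeks.setdefault(week, set()).add(day)
    return {week: len(days) for week, days in weeks.items()}

for user, ts in visits.items():
    print(user, distinct_days_per_week(ts))
```

Repeated visits on the same day collapse into one bucket, so alice's three visits count as 2 distinct days in week (2024, 50).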
More words please. What do you want to achieve and why?
Hi, can you try the following regex?

Rule:\s(?P<Rule>(.*?)(?=,\d+))

It uses a positive lookahead (?=) and captures everything until it finds "," followed by a digit. If the end of the rule is always followed by a digit then this will work. Keep in mind that if a word inside the rule is followed by a comma and a digit, this will not work. Please try it, and if it works an upvote is appreciated.
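A quick way to check this behaviour is to run the pattern against a made-up rule string in Python (the sample lines below are invented for illustration):

```python
import re

# Positive lookahead: capture lazily until a comma followed by a digit.
pattern = re.compile(r"Rule:\s(?P<Rule>(.*?)(?=,\d+))")

# Invented sample: the rule name itself contains a comma,
# but the fields after the rule start with digits.
line = "Rule: Block outbound SMTP, high priority,443,2024"
print(pattern.search(line).group("Rule"))  # Block outbound SMTP, high priority

# The caveat from the post: if a digit follows a comma *inside* the rule,
# the capture stops too early.
bad = "Rule: Retry,5 times allowed,99"
print(pattern.search(bad).group("Rule"))  # Retry
```

The lazy quantifier stops at the first position where the lookahead succeeds, which is exactly why the second case truncates.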
What happened if Splunk SOAR license expired? I cannot find a document to explain it.
Thank you, I forgot about the code format since I rarely use it.
There is indeed no problem with the SPL I provided in the production environment. I want to produce statistics for the interval from 2019 to the present, where multiple visits by a user in a single day count as one visit. I want to calculate, for every 7-day interval since 2019, the number of consecutive days each user visited.
In addition to @ITWhisperer's comments, there is an alternative way to set/unset pairs of tokens using the <eval> token mechanism, i.e.

<drilldown>
  <eval token="ModuleA">if($click.value2$="AAA", "true", null())</eval>
  <eval token="ModuleOther">if($click.value2$="AAA", null(), $trellis.value$)</eval>
</drilldown>

I prefer this mechanism over <condition>, where null() is equivalent to unsetting a token. It avoids the &quot; usage and keeps the number of lines down. Again, click.value2 may not be the right one.
Can you summarise what you are trying to do? Your SPL contains some errors, e.g. using info_min_time in your mvrange() eval statement, which does not exist, and the fact that you have max_searches set to half a million indicates you're going about this the wrong way. Describe the problem you are trying to solve, your inputs, and your expected outputs.
Can you edit and format your SPL as a code block using the </> symbol in the Body menu? It makes long SPL far easier to digest.