Activity Feed
- Posted Re: Extract multivalue field using transforms mv_add=true not working as expected on Splunk Search. 09-17-2024 06:26 AM
- Posted Re: Extract multivalue field using transforms mv_add=true not working as expected on Splunk Search. 09-17-2024 04:53 AM
- Posted Extract multivalue field using transforms mv_add=true not working as expected on Splunk Search. 09-16-2024 11:35 PM
- Posted Is it possible to use map search twice? on Splunk Search. 02-13-2023 10:47 AM
- Tagged Is it possible to use map search twice? on Splunk Search. 02-13-2023 10:47 AM
- Got Karma for Bar Chart Color change for dynamically changing string values of one field. 02-01-2023 05:34 AM
- Karma Re: is there a way to pass field value from search to write kind of an event in the same search using eval command ? for gcusello. 09-15-2022 02:43 AM
- Posted Re: is there a way to pass field value from search to write kind of an event in the same search using eval command ? on Splunk Search. 05-20-2022 08:17 AM
- Posted Is there a way to pass field value from search to write kind of an event in the same search using eval command? on Splunk Search. 05-20-2022 07:38 AM
- Posted Re: Help with Regex on Splunk Enterprise. 09-21-2021 12:18 PM
- Karma Re: Help with Regex for bowesmana. 09-21-2021 12:18 PM
- Posted Re: Help with Regex on Splunk Enterprise. 09-20-2021 01:32 PM
- Posted Re: Help with Regex on Splunk Enterprise. 09-20-2021 12:17 PM
- Posted Re: Help with Regex on Splunk Enterprise. 09-20-2021 11:09 AM
- Posted Re: Help with Regex on Splunk Enterprise. 09-17-2021 01:26 AM
- Posted Help with Regex on Splunk Enterprise. 09-16-2021 09:30 AM
- Tagged Help with Regex on Splunk Enterprise. 09-16-2021 09:30 AM
09-17-2024
06:26 AM
@ITWhisperer, this works well if I am doing a transforming search using mvexpand, but any idea how I can achieve the same results through search-time field extractions using props.conf & transforms.conf?
09-17-2024
04:53 AM
Thanks @ITWhisperer, the additional backslash seems to do the trick for the rex command, but still no luck getting this to work with the transforms.conf mv_add=true setting. Basically, I need these fields to be available at search time, hence trying to figure out a way to do that. Also, when you say to extract each group of fields as a whole, what do you mean by that? Can you please help me with an example to better understand that approach?
09-16-2024
11:35 PM
Hi,
I am having a hard time extracting multivalue fields present in an event using transforms with mv_add=true. It seems to be partially working: it extracts only the first and third values present in the event, skipping the second and fourth.
The regex I am using matches all the values perfectly on regex101, but I am not sure why Splunk is unable to capture them all.
Following are the sample event and regex I am using:
Event -
postreport=test_west_policy\;passed\;(first_post:status:passed:pass_condition[clear]:fail_condition[]:skip_condition[]\;second_post:status:skipped:pass_condition[clear]:fail_condition[]:skip_condition[timed_out]\;third_post:status:failed:pass_condition[]:fail_condition[error]:skip_condition[]\;fourth_post:status:passed:pass_condition[clear]:fail_condition[]:skip_condition[])
Regex - https://regex101.com/r/r66eOz/1
(?<=\(|]\\;)(?<post>[^:]+):status:(?<status>[^:]*):pass_condition\[(?<passed_condition>[^\]]*)\]:fail_condition\[(?<failed_condition>[^\]]*)\]:skip_condition\[(?<skipped_condition>[^\]]*)\]
So Splunk is matching all values for first_post and third_post in the above event and skipping second_post & fourth_post.
I tried the same regex with the rex command, and there it matches only the first_post field values:
|rex field=raw_msg max_match=0 "(?<=\(|]\\;)(?<post>[^:]+):status:(?<status>[^:]*):pass_condition\[(?<passed_condition>[^\]]*)\]:fail_condition\[(?<failed_condition>[^\]]*)\]:skip_condition\[(?<skipped_condition>[^\]]*)\]"
Can someone please help me figure out if I am missing something here?
Thanks.
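One way to narrow this down: the capture pattern itself does match all four groups when run outside Splunk, which suggests the problem is in how the pattern reaches Splunk's regex engine (the extra quoting/escaping layers) rather than in the pattern. A quick sanity check in Python — illustrative only; Python's re requires fixed-width lookbehinds, so the (?<=\(|]\\;) alternation is split into two single-width lookbehinds, and everything else is the pattern from the question:

```python
import re

# Sample event from the question (raw string, so backslashes are literal)
event = (
    r"postreport=test_west_policy\;passed\;("
    r"first_post:status:passed:pass_condition[clear]:fail_condition[]:skip_condition[]\;"
    r"second_post:status:skipped:pass_condition[clear]:fail_condition[]:skip_condition[timed_out]\;"
    r"third_post:status:failed:pass_condition[]:fail_condition[error]:skip_condition[]\;"
    r"fourth_post:status:passed:pass_condition[clear]:fail_condition[]:skip_condition[])"
)

# Same pattern as in the question, with the lookbehind alternation split so
# each alternative is fixed-width (a Python re requirement).
pattern = re.compile(
    r"(?:(?<=\()|(?<=]\\;))"
    r"(?P<post>[^:]+):status:(?P<status>[^:]*)"
    r":pass_condition\[(?P<passed_condition>[^\]]*)\]"
    r":fail_condition\[(?P<failed_condition>[^\]]*)\]"
    r":skip_condition\[(?P<skipped_condition>[^\]]*)\]"
)

# All four post groups are found when the pattern is applied as-is
posts = [m.group("post") for m in pattern.finditer(event)]
print(posts)
```

Since the pattern is sound, the usual suspects are the SPL string-escaping layer for rex (each literal backslash needs doubling inside the quoted string) and the literal pattern as written in transforms.conf.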
Labels: field extraction, regex, rex
02-13-2023
10:47 AM
Not sure if this is possible through a Splunk query, but what I am trying to do is basically retrieve a field value from one search and pass it into another, and do the same twice to get the desired result. Let's consider the 3 different events below as _raw data:
14:06:06.932 host=xyz type=xyz type_id=123
14:06:06.932 host=xyz type=abc category=foo status=success
14:30:15.124 host=xyz app=test
The 1st and 2nd events go into the same index and sourcetype, but the 3rd event is in a different index & sourcetype.
The 1st and 2nd events happen at exactly the same time.
The expected result is to return the following field values: host type type_id category status app
Below is my search, in which I am able to successfully correlate and find the category and status fields from the second event:
index=foo sourcetype=foo type=xyz |eval earliest = _time |eval latest = earliest + 0.001 |table host type type_id earliest latest |map search="search index=foo sourcetype=foo type=abc host=$host$ earliest=$earliest$ latest=$latest$ |stats values(_time) as _time values(type) as type values(category) as category values(status) as status by host |append [search index=foo sourcetype=foo type=xyz |stats values(type) as type values(type_id) as type_id by host] |stats values(*) as * by host"
The problem comes when I try to add another map command to retrieve the app value present in the 3rd event. Basically, the following mapping should provide me that result:
|map search="search index=pqr sourcetype=pqr host=$host$ category=$category$ earliest=-1d latest=now |stats count by app"
Then this app value is to be searched against one of the lookup files to get some details.
I have tried multiple ways to incorporate this into the search but no luck. Any help is appreciated.
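For illustration only (not working SPL), the chained correlation described above — join event 1 to event 2 by host and timestamp within the same index, then pull app from the other index by host — can be sketched procedurally with the three sample events:

```python
# The three sample events from the question, split by index
idx_foo = [
    {"_time": "14:06:06.932", "host": "xyz", "type": "xyz", "type_id": "123"},
    {"_time": "14:06:06.932", "host": "xyz", "type": "abc", "category": "foo", "status": "success"},
]
idx_pqr = [
    {"_time": "14:30:15.124", "host": "xyz", "app": "test"},
]

# Step 1: take the type=xyz event, then find the type=abc event
# with the same host and the same timestamp (the first map search)
base = next(e for e in idx_foo if e["type"] == "xyz")
match = next(
    e for e in idx_foo
    if e["type"] == "abc" and e["host"] == base["host"] and e["_time"] == base["_time"]
)
result = {**base, **{k: match[k] for k in ("category", "status")}}

# Step 2: use the correlated host to find the app in the other index
# (the second map search in the question)
app_event = next(e for e in idx_pqr if e["host"] == result["host"])
result["app"] = app_event["app"]

print(result)
```

The sketch just shows that each step only needs the host (plus timestamp or category) from the previous step's result, which is the value being passed via $host$ and $category$ in the map searches.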
05-20-2022
08:17 AM
That's exactly what I needed, thanks much @gcusello!
05-20-2022
07:38 AM
Hey Splunkers,
I am not sure if this is possible, but what I am trying to do is pass the values of a search into an eval command, to basically form a statement or an event.
So for example, consider that the search below returns multiple users' first name, last name, and country details.
Now, with those field values, what I am trying to do is create an eval statement like below:
index=foo source=user_detail
|table first_name last_name country
|eval statement = My name is "$first_name $ $last_name$ and i come from $country$
|table statement
But this is not passing those field values into the eval statement, so does anyone know if there is a way to do this?
Thanks.
Labels: fields, search job inspector
09-20-2021
01:32 PM
@PickleRick Yes, there are some events in which it is extracting exactly the way you mentioned. But there are also some events which come in as K/V pairs inside another field, so the extraction I am doing is for a field where I am getting data like:

field1="field2=\"some_string_value\", field3=\"some_path_value\", field4=\"again_some_value\", ... fieldN=\"valueN\""

Everything from field1 to field4 is getting formatted perfectly fine using EVAL-field = replace(field,"\\\\(.),"")

The issue comes with "fieldN", where an extra double quote is added to close the quote opened at field1. The problem is that my fieldN can be any field and can contain multiple backslashes, double quotes, and any string, so I need a robust solution that will work on any kind of field value. If I can somehow make the regex below work with the replace command, then I guess it should solve my problem, but unfortunately it gives me an error with something like:

"|ev...{snipped} {errorcontext = ce(conn1,[\\"]+],"")}"

https://regex101.com/r/glIfKz/1

|makeresults | eval conn1="\\\"08/24/2021\\\"\"" |eval conn=replace(conn1,[\\"]+],"")
09-20-2021
12:17 PM
@ashvinpandey, thank you for your response. I am trying to achieve this using props.conf. Also, the data in the "conn" field can be a mix of IPs, strings, and other special characters, just like the "loc" field which you can see in the question.
09-20-2021
11:09 AM
@PickleRick The eval I am using removes the double quotes too, but it is just not removing the additional quote which comes in some events. In the run-anywhere search below, you will see one double quote remaining at the end of the conn2 field, which I don't want.

|makeresults |eval conn="\\\"08/24/2021\\\"\"" |eval conn2=replace(conn,"\\\\(.)","")

Also, there is no specific extraction for this in my props.conf; it roughly looks something like below:

[mysourcetype]
EVAL-conn = replace(conn,"\\\\(.),"")

With the following regex I am able to replace all the backslashes and double quotes the way I want, but I am unable to make it work with the Splunk replace command: https://regex101.com/r/glIfKz/1

@niketn @ITWhisperer @bowesmana Thanks in advance.
09-17-2021
01:26 AM
@PickleRick thank you for your response. The EVAL statements shared by you, and the one I shared above, both work fine only if the field looks like conn=\"success\", but not if there is an additional double quote at the end, like conn=\"pass\"". In both cases the result looks like conn=pass" , but what I want is conn=pass
09-16-2021
09:30 AM
Hello, I want to remove all the backslashes and double quotes from the following fields:

conn=\"pass\""
ip=\"10.23.22.1\""

I am trying to extract these with EVAL-conn = replace(conn,"\\\\(.),"") and EVAL-ip = replace(ip,"\\\\(.),"") in my props.conf, but it is not removing the last double quote, giving me the following results:

conn=pass"
ip=10.23.22.1"

Results I want: conn=pass & ip=10.23.22.1

Can someone please help/guide me with this extraction? Thanks in advance.
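As a sanity check outside Splunk: the character-class approach from the regex101 link (strip every run of backslashes and double quotes in one pass) does produce the wanted results on these sample values. A minimal Python sketch, illustrative only:

```python
import re

# Sample values as they appear in the question (backslashes/quotes are literal)
conn = r'\"pass\""'
ip = r'\"10.23.22.1\""'

def strip_escapes(s):
    # Remove every run of backslashes and double quotes, as in the
    # regex101 pattern [\\"]+ from the question.
    return re.sub(r'[\\"]+', '', s)

print(strip_escapes(conn))  # pass
print(strip_escapes(ip))    # 10.23.22.1
```

If that is the intended behavior, the equivalent inside a Splunk string literal would presumably need the class double-escaped, something like replace(conn, "[\\\\\"]+", "") — unverified here, since SPL quoting is exactly what the original attempt was tripping over.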
Labels: Other
09-10-2021
01:16 PM
@ITWhisperer Thanks, I tried this but it is missing some of the fields because the _raw data format is inconsistent. For example, for the field report_name it gives me path= , but that field is actually empty. Also, I am looking to extract this using props & transforms. Thanks.
09-10-2021
01:08 PM
@richgalloway Basically I was trying something like this in props.conf:

[my_srctype]
REPORT-extract_conn = extract_conn

and in transforms.conf:

[extract_conn]
REGEX = conn="connmsg=([^,]*)|^conn=(.*)
MV_ADD = true
FORMAT = conn::$1::$2

But this doesn't seem to work, as REGEX/FORMAT does not extract two values for a single field. So currently I have figured out an alternative where I first extract the entire conn field and then the other key-value pairs, in the following way:

[my_srctype]
REPORT-extract_conn = extract_conn,extract_connmsg,extract_user
...

[extract_conn]
REGEX = conn="(.*)"
MV_ADD = true
FORMAT = conn::$1

[extract_connmsg]
REGEX = conn="connmsg=([^,]*)
MV_ADD = true
FORMAT = connmsg::$1

[extract_user]
REGEX = conn=.*user=([^,]*)
MV_ADD = true
FORMAT = user::$1

... and so on. If you can think of or suggest a better and more efficient way to do this, I will be happy to try that too. Thanks
09-09-2021
11:56 AM
Hi, I am having difficulty extracting key=value pairs from one of the auto-extracted fields. The problem is that this field may contain just a text value, but it could also contain multiple key=value pairs, so whenever there are multiple key=value pairs in the event I am not getting the desired results. Following are some of my _raw events:

2021-08-10T11:35:00.505 ip=10.1.10.10 id=1 event="passed" model="t1" conn="connmsg=\"controller.conn_download::message.clean\", file=\"/home/folder1/filename_8555c5s.ext\", time=\"21:22:02\", day=\"08/24/2021\""
2021-08-10T11:35:00.508 ip=10.1.10.10 id=1 event="running" model="t1" conn="connmsg=\"model.log::option.event.view.log_view_conn, connname=\"model.log::option.event.view.log_view_conn_name\", user=\"xyz\", remote_conn=10.23.55.54, auth_conn=\"Base\""
2021-08-10T11:35:00.515 ip=10.1.10.10 id=1 event="failed" model="t1" conn="Failed to connect to the file for \"file_name\""
2021-08-10T11:35:00.890 ip=10.1.10.10 id=1 event="extracting" model="t1" conn="connmsg=\"model.log::option.event.view.logout.message\", user=, job_id=65, report_name=", path=\"{\"type\":1,\"appIds\":\"\",\"path\":\"2021-08-10T11:35:00+00:00_2021-08-10T12:35:00+00:00\\/ip_initiate\\/10.1.120.11\\/http_code\\/200\",\"restrict\":null}\""
2021-08-10T11:36:00.090 ip=10.1.10.10 id=1 event="extracting" model="t1" conn="connmsg=\"model.log::option.event.view.audit.message, user=\"qic\\abc_pqr\, reason_msg=\"component.auth::message:unknown_user\", path=/abc/flows/timespan/2021-08-10T11:35:00+00:00_2021-08-10T12:35:00+00:00/ip_initiate/10.101.10.20/data.ext"
2021-08-10T11:36:00.380 ip=10.1.10.10 id=1 event="triggered" model="t1" conn="Rule 'Conn Web Service' was triggered by Indicator:'Conn Web Service'"
2021-08-10T11:36:00.880 ip=10.1.10.10 id=1 event="triggered" model="t1" conn="connmsg=\"model.log::option.event.report.finished\", user=, job_id=65, report_name=", path=\"{\"type\":1,\"namespace\":\"flows\",\"appIds\":\"10,11,12\",\"path_bar\":\"[\\\"ip_initiate=10.1.120.11\\\"]\",\"2021-08-10T11:35:00+00:00_2021-08-10T12:35:00+00:00\\/ip_initiate\\/10.1.120.11\\/http_code\\/200\",\"restrict\":null}\""

The field I am facing issues with is the "conn" field, and I want the data to be extracted into conn in somewhat the following manner:

conn
\"controller.conn_download::message.clean\"
model.log::option.event.view.log_view_conn
Failed to connect to the file for \"file_name\"
\"model.log::option.event.view.logout.message\"
\"model.log::option.event.view.audit.message\"
"Rule 'Conn Web Service' was triggered by Indicator:'Conn Web Service'"

But currently it just extracts the next value coming after conn= , so the current data in my conn field based on the above raw events looks like:

conn
connmsg=\
connmsg=\
Failed to connect to the file for \"file_name\"
connmsg=\
connmsg=\
Rule 'Conn Web Service' was triggered by Indicator:'Conn Web Service'

The "conn" field might contain even more key-value pairs, so I also wanted to know if there is a dynamic way to capture any new key-value pair that pops up in the conn field other than those specified. Also, the other key-value pairs in the conn field are sometimes auto-extracted and sometimes not. I am trying to write a search-time field extraction using props and transforms, but no luck so far in getting what I want. Can someone please help? Thanks in advance.
Labels: field extraction, fields, regex
09-25-2020
02:07 AM
Thank you @thambisetty for your response, but it is not appending the results the way I want; it is creating whole new rows of results. For example, I currently have the following data in my csv file:

title   07 Jul 2020   08 Aug 2020
Title1  99.998        99.561
Title2  99.700        99.760
Title3  99.989        99.973

Now, when I append the new data for the month of September, I want it to append just the new month's column, like below:

title   07 Jul 2020   08 Aug 2020   09 Sep 2020
Title1  99.998        99.561        99.577
Title2  99.700        99.760        99.389
Title3  99.989        99.973        99.123

But it is appending the results like below:

title   07 Jul 2020   08 Aug 2020   09 Sep 2020
Title1  99.998        99.561
Title2  99.700        99.760
Title3  99.989        99.973
Title1                              99.577
Title2                              99.389
Title3                              99.123

How do I fix this?
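Conceptually, the fix is to merge the new month's values into the existing rows keyed by title (column-wise), rather than appending new rows. A small Python sketch of that merge, using the numbers from the example above — illustrative only; in Splunk this would correspond to reading the lookup back in and joining on title before writing it out again:

```python
# Existing lookup rows, keyed by title, with one column per month
existing = [
    {"title": "Title1", "07 Jul 2020": "99.998", "08 Aug 2020": "99.561"},
    {"title": "Title2", "07 Jul 2020": "99.700", "08 Aug 2020": "99.760"},
    {"title": "Title3", "07 Jul 2020": "99.989", "08 Aug 2020": "99.973"},
]

# The new month's availability numbers, keyed by title
new_month = {"Title1": "99.577", "Title2": "99.389", "Title3": "99.123"}

# Merge the new column into each existing row by title, instead of
# appending rows (which is what produces the duplicated-title layout)
for row in existing:
    row["09 Sep 2020"] = new_month.get(row["title"], "")

print(existing[0])
```

The key point the sketch shows: because the merge is keyed on title, each title keeps a single row and only gains a new column, which is the desired shape.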
09-08-2020
05:41 AM
Hi @thambisetty, thank you. I am now able to get the data in the below format in Splunk:

title   07 Jul 2020   08 Aug 2020
Title1  99.998        99.561
Title2  99.700        99.760
Title3  99.989        99.973

Is there a way to just append it to a csv file every month? For example, I will store this file on the search head, and on the 1st of October I would like to append the data for the month of September to the csv before sending. If so, what changes would I need to make in the following search so that it just adds the latest month's data?

my search .. | fillnull value=1000 response_code | eval success=case(response_code>=400, 0, timed_out == "True", 0, response_code="",0) | fillnull value=1 success | eval month=strftime(_time,"%m %b %Y") | stats count as total, sum(success) as successes by title month | eval availability=round(100*(successes/total),3) | chart list(availability) over title by month | eval month=replace(month, "\d+ (.*)","\1")

Also, I added the month number (%m) to sort the data by month, but when I try to remove it using replace it is not working. Any idea what I am doing wrong there?
09-06-2020
07:28 AM
Hi, I want to create a report through Splunk that sends out an email containing each month's stats, automatically appending the past month's data to the excel file before sending. Following is my query, which gives me the required data. What I don't know is how to distribute that data by month, append it to an excel file, and send it over email.

my search.. | fillnull value=1000 response_code | eval success=case(response_code>=400, 0, timed_out == "True", 0, response_code="",0) | fillnull value=1 success | stats count as total, sum(success) as successes by title | eval availability=round(100*(successes/total),2) | stats count by title availability

I want the data in the excel file to look something like below. I want this to be done automatically through Splunk scheduled reports at the beginning of each month. Can someone please help me figure out whether this is possible through Splunk? Thanks in advance.
Labels: lookup
08-31-2020
02:42 AM
@thambisetty this helps.. Thank You!!
08-31-2020
01:46 AM
Oops, my bad @niketn. Actually I did use sort before, but I was viewing the data in the stats table itself, and since the values were separated I didn't pay much attention that it had already been sorted 🙂 Anyway, thanks again for your help.
08-30-2020
04:12 AM
Hi, I have a transaction that goes through multiple statuses before it is completed. The challenge I am facing is that one status can be mapped multiple times before the transaction is completed, and in some cases the same status can keep repeating until it moves to the next state. For example, my logs look something like below for one transaction ID (note: there can be many more statuses than the ones below):

2020-08-27 08:00:40.000, ID="20", STATUS="CREATE"
2020-08-27 08:01:11.000, ID="20", STATUS="POST"
2020-08-27 08:01:42.000, ID="20", STATUS="POST"
2020-08-27 08:02:24.000, ID="20", STATUS="POST"
2020-08-27 08:03:46.000, ID="20", STATUS="REPAIR"
2020-08-27 08:03:56.000, ID="20", STATUS="PENDING"
2020-08-27 08:04:00.000, ID="20", STATUS="UPDATE"
2020-08-27 08:04:12.000, ID="20", STATUS="UPDATE"
2020-08-27 08:04:30.000, ID="20", STATUS="POST"
2020-08-27 08:04:46.000, ID="20", STATUS="COMPLETE"
2020-08-27 08:04:56.000, ID="20", STATUS="COMPLETE"

What I want to do is calculate the total duration of time the transaction spent in each particular status, so the final results should look something like below:

ID  STATUS    max(_time)               duration (sec)
20  CREATE    2020-08-27 08:00:40.487  31
20  POST      2020-08-27 08:02:24.265  155
20  REPAIR    2020-08-27 08:03:46.529  10
20  PENDING   2020-08-27 08:03:56.097  4
20  UPDATE    2020-08-27 08:04:12.715  30
20  POST      2020-08-27 08:04:30.366  16
20  COMPLETE  2020-08-27 08:04:56.517

As of now, with the query below, I am able to map the statuses according to time, but the duration is not being calculated accurately. Can someone please help me figure this out?

my search ... | sort 0 _time | streamstats current=false last(STATUS) as newstatus by ID | reverse | streamstats current=false last(_time) as next_time by ID | eval duration=next_time-_time | reverse | streamstats count(eval(STATUS!=newstatus)) as order BY ID | stats max(_time) as _time, sum(duration) as "duration(sec)" BY ID order STATUS

Thanks in advance.
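As a cross-check of the expected numbers, the underlying logic — collapse consecutive identical statuses into runs, and take a run's duration as the gap from its first event to the first event of the next run — reproduces the durations in the desired table. A Python sketch, illustrative only (not the SPL fix):

```python
from datetime import datetime
from itertools import groupby

# The sample events from the question (milliseconds dropped for brevity)
events = [
    ("2020-08-27 08:00:40", "CREATE"),
    ("2020-08-27 08:01:11", "POST"),
    ("2020-08-27 08:01:42", "POST"),
    ("2020-08-27 08:02:24", "POST"),
    ("2020-08-27 08:03:46", "REPAIR"),
    ("2020-08-27 08:03:56", "PENDING"),
    ("2020-08-27 08:04:00", "UPDATE"),
    ("2020-08-27 08:04:12", "UPDATE"),
    ("2020-08-27 08:04:30", "POST"),
    ("2020-08-27 08:04:46", "COMPLETE"),
    ("2020-08-27 08:04:56", "COMPLETE"),
]
parsed = [(datetime.strptime(t, "%Y-%m-%d %H:%M:%S"), s) for t, s in events]

# Collapse consecutive events with the same status into runs,
# keeping each run's status and first timestamp
runs = []
for status, grp in groupby(parsed, key=lambda e: e[1]):
    grp = list(grp)
    runs.append((status, grp[0][0]))

# A run's duration is the gap to the start of the next run
# (the final run has no successor, so it gets no duration)
durations = []
for i, (status, start) in enumerate(runs):
    if i + 1 < len(runs):
        durations.append((status, int((runs[i + 1][1] - start).total_seconds())))

print(durations)
```

Note that POST appears twice in the output, once per run, matching the desired table — which is why the grouping has to be on consecutive runs rather than on the status value alone.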
Labels: correlation search
08-30-2020
02:49 AM
@niketn Thanks a lot for documenting it once again; this was something I was looking for. Just one more question though: how do I sort the statuses by count?
08-28-2020
01:08 AM
1 Karma
Hello, I have a very basic query which gives me the desired results and visualization, but even after a lot of research I am not able to color the bars according to the field values. Below is something like what my query looks like:

my search ...| chart count by status

This status field has around 15-20 values like complete, pending, repair, canceled, posted, etc. As of now my bar chart has the status field on the X-axis with the field value names mapped to it (complete, cancelled, etc.), and my Y-axis has the count of those statuses. What I want is to keep the visualization as it is and change the colors of a few statuses: green for complete, red for canceled, amber for repair, and so on. I tried

charting.fieldColors">{"COMPLETE":#32a838,"CANCELED":#e81324,"REPAIR":#FFC200}

but it is not helping. I also tried transposing rows to columns, with which I am able to change the colors, but then the mapping of field values onto the Y-axis is removed and converted to legends, which does not look good. Can this be achieved while keeping my current visualization intact? I have gone through multiple pages here on Splunk Answers but no luck. Thanks in advance.
Labels: chart, panel, simple XML
08-07-2020
05:52 AM
Hi @rwardwell, have you had any luck achieving this? If yes, can you please share your inputs, as I am struggling with the same.
07-10-2020
05:06 AM
@bowesmana perfect ..Thanks!