All Posts


Also, /test1/folder1/scripts/monitor/log/env/dev/Error.log is a dynamic source field value, not a hardcoded value, so I need to integrate the makeresults example with a real index search, something like index="monitoring" source="/test1/folder1/.scripts/monitor/log/env/dev/Error.log", and extract the values of env and dev, which can change, into separate fields.
Yes, for /test1/folder1/.scripts/monitor/log/env/dev/Error.log I want field 1 = value of env and field 2 = value of dev, as there is scope for these changing later.
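A rough sketch of combining that with a real search (assumptions: the index is monitoring as mentioned above, the two directory names directly above Error.log are the values wanted, and envName/stageName are placeholder field names you can rename):

index="monitoring" source="*/Error.log"
| rex field=source "/(?<envName>[^/]+)/(?<stageName>[^/]+)/[^/]+$"

The rex anchors on the end of the path, so it keeps working if the directories above those two levels change.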
Sorry for the confusion. I want to skip the first 7 lines of the default field "_raw" and then copy everything from the 7th line onwards into a new field "_raw_refined".
The makeresults / eval is an example you can run to see how this works. In your first post you said you wanted lines 5 and 6, now you want to skip the first 7 lines, and your post says you want to skip the first 10 lines? Confused... Please give more precise information about your requirement.
Not clear what you are saying - your original post says you want the word "dev", but you also want the word "env"? Is "env" something that can change?
I want to execute this "skip the first 7 lines" on the Splunk default field "_raw", and not on eval _raw="line 1 line 2 line 3 line 4 line 5 line 6".
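Adapting the earlier example to the real _raw field would be something like this (a sketch, assuming the lines in _raw are separated by linefeeds; raw_refined is just a placeholder name for the copied field):

<your base search>
| rex "(?ms)^(?:[^\n]*\n){7}(?<raw_refined>.*)"

The {7} is the number of leading lines to discard, so adjust it if the requirement changes.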
Thanks for your response, but my file location is /test1/folder1/.scripts/monitor/log/env/dev/Error.log, so I am interested in getting both values, env and dev.
There doesn't appear to be anything wrong with it - but it would require that field to be extracted so it could be searched. Do you know if it's an indexed field or extracted at search time? If you add | stats count by dataName to your search, do you get any results? If not, then that field is not extracted. If you run the search in verbose mode, does the dataName field show up in the fields in the left-hand panel?
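For example, two quick checks along those lines (index and sourcetype are placeholders for whatever your search uses): to test a search-time extraction, run

<your existing search>
| stats count by dataName

and to test whether dataName exists as an indexed field, a tstats search should return rows only if it is indexed:

| tstats count where index=your_index by dataName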
You should look at using streamstats - here's an example that creates 10 events where every 4th event changes from warning to critical.

| makeresults count=10
| streamstats c
| eval _time=now() - c
| eval type=if(c % 4 = 0, "critical", "warning")
| fields - c
| sort - _time
| streamstats count reset_after="("type=\"warning\"")" by type
| where count=1 AND type="critical"

To give you an exact solution I would need to know more about your requirement. This will give 2 results, when the type changes from warning to critical.
Are the time ranges for both searches the same - if the search is to "now" as latest time, then naturally they could come up with different results depending on when the search is dispatched and how long it takes to run. I am guessing these are some kind of requests, so MA->COSMOS->PHB - is a negative figure not possible? Presumably there can be requests from COSMOS->PHB at the start of the search window that do not have corresponding requests inside the range from MA->COSMOS - without knowing your environment it's impossible to know.
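One way to rule out the dispatch-time effect (a sketch with placeholder search terms) is to pin both searches to the same snapped window instead of ending at "now":

index=your_index sourcetype=your_sourcetype earliest=-24h@h latest=@h
| stats count

With both searches snapped to the same hour boundaries, any remaining difference is down to the data rather than when the searches were run.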
Are you saying you want to remove the milliseconds and timezone specifier, or are you saying that your epoch time does not convert correctly? The time in your message, 1714363262.904000, is not actually the time 2024-04-29T12:01:15.710Z. When you use strptime to parse that time, you will get a time in your local time zone. If you are in GMT then it is the same, but here in Australia I get a time that represents 2024-04-29 22:01:15.710 AEST, i.e. 10 hours later than the Zulu time.

If you are just looking to remove the milliseconds and time zone indicator, then just reformat using

| eval latest_time=strftime(strptime(latest_time, "%FT%T.%Q%Z"), "%F %T")

Note that %F is shorthand for %Y-%m-%d, %T is a shortcut for %H:%M:%S, and the new time will be in your local time.

If you don't care about time zones at all and simply want to remove the T, milliseconds and Z, then you could just use sed, i.e.

| rex mode=sed field=latest_time "s/\.\d+Z// s/T/ /"
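A runnable illustration of that reformat (the sample value is the one quoted above, and the format strings are the same ones suggested there; latest_time_clean is just a placeholder name):

| makeresults
| eval latest_time="2024-04-29T12:01:15.710Z"
| eval latest_time_clean=strftime(strptime(latest_time, "%FT%T.%Q%Z"), "%F %T")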
Need a little more information about the real data and its format, but if you want to ignore the first 4 lines, which are terminated by a linefeed, and then get the rest of the data, see this example

| makeresults
| fields - _time
| eval _raw="line 1
line 2
line 3
line 4
line 5
line 6"
| rex "(?ms)([^\n]*\n){4}(?<copyofraw>.*)"
Here's an example you can run in the search window - you are interested in the last two lines: the rex statement and the final eval statement.

| makeresults
| fields - _time
| eval source=split("/test1/folder1/scripts/monitor/log/env/dev/Error.log,/test1/folder1/scripts/monitor/log/env/test/Error.log", ",")
| mvexpand source
| rex field=source ".*\/(?<env>\w+)\/.*"
| eval environment=case(env="dev","development",env="test","loadtest",true(), "unknown:".env)

There are several ways you can assign the name to the environment - if you have lots of environments you can do this from a lookup, or just use the case statement.
_raw=
line 1
line 2
line 3
line 4
line 5
line 6

How do I define another new field "copyofraw" to contain just line 5 and line 6?
Hi, how do I extract the word "dev" from the below file location, source=/test1/folder1/scripts/monitor/log/env/dev/Error.log, and add some if-condition statements in a Splunk query, like: if word=dev, change it to development; if word=test, change it to loadtest. Thanks
This is not how rex works - you need to provide a pattern as a regular expression to identify what you want to extract. For example, do you want everything from "change" to "}}"? Does this pattern hold true for all your events where you want to extract this field? Aside from that, this looks like JSON - why aren't you using spath or the other JSON functions to extract the JSON field?
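For illustration only - the pattern and the field name change_detail are guesses at what is wanted here, assuming the text of interest sits between "change" and "}}":

| rex "(?s)change(?<change_detail>.*?)\}\}"

If the events are valid JSON, running | spath on its own will extract all of the JSON fields without needing a regular expression at all.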
Try it this way

| eval successtime=if(status=200,_time,null())
| streamstats window=3 reset_on_change=t range(successtime) as successrange count(successtime) as successcount by status
| where successcount=3 AND successrange > 10
Thank you for your time and response. I now don't see double quotes in the search query, which is helpful.

startswith="my start msg" endswith="my end msg" --> works

startswith IN ("my start msg1", "my start msg2", "my start msg3") endswith="my end msg" --> this honors only the endswith flag and does not return events starting with "my start msg1", "my start msg2" or "my start msg3". I notice that the Splunk search returns events before these matching startswith values. I will open a different question for that.
I am new to administering Splunk Enterprise Server. I'm guessing the answer is obvious to some, but I'm getting confused trying to figure out a solution from the documentation. We are using Splunk Enterprise Server v9.2.1 stand-alone on an isolated network. We primarily collect and report on multiple systems' audit logging. The server is set up and I can see ingested logs arriving and can create reports on the data.

But I need one more thing. I must archive all the original data exactly as it is received on the TCP receiver and copy it to offline storage for safe keeping. I need to be able to re-ingest the raw data at some future date, but that seems pretty straightforward. How can I do this? Is there some way I can grab the data being received on my TCP port listener in raw form, or some magic I need to do with some indexer or forwarder->receiver chain?

I'm sure I'm not the first person to need this... How do others accomplish this? Thank you!
Can I change the default message in the Alert Trigger "Send Email"? I have been looking around and can't find anything where I could change this. My goal is to create a template message so we can streamline our alert messages. Any help would be great!