All Posts


Still I get "Pending" for all the files, even though the transfer was successful and the timestamp is there.
Thanks @bowesmana @sainag_splunk, I tried both and the results were nearly the same! Since the CN field is already extracted, I modified the search like this, in case anyone runs into this thread later:

... base search ...
| rex field=cn "(?<ipAddr>\d{1,3}[._]\d{1,3}[._]\d{1,3}[._]\d{1,3})"
| eval cn = coalesce(replace(ipAddr, "_", "."), cn)

Much appreciated!
Where is this used? In my search SPL or in the alert config?

alert.suppress.fields = <comma-separated-field-list>
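In case it helps: alert.suppress.fields is a savedsearches.conf setting, not part of the SPL; it works together with alert.suppress and alert.suppress.period to throttle an alert per field value. A minimal sketch, where the stanza name my_alert, the schedule, and the field host are all placeholders:

# savedsearches.conf -- placeholder values throughout
[my_alert]
enableSched = 1
cron_schedule = */15 * * * *
# suppress repeat alerts for the same host for one hour
alert.suppress = 1
alert.suppress.fields = host
alert.suppress.period = 1h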
Assuming you already have the filenames extracted, try something like this:

| eval file_name=coalesce('FileTransfer.FileName', file_name)
| stats values(eval(if(sourcetype="filecopy",_time,null()))) as FileCopyLocation values(eval(if(sourcetype="transfer",_time,null()))) as TargetLocation by file_name
| eval FileCopyLocation=strftime(FileCopyLocation,"%F %T")
| eval TargetLocation=strftime(TargetLocation, "%F %T")
| fillnull value="Pending" TargetLocation
The only thing that comes to my mind is that you should try to find matching entries in other logs, including on the SH (search head) side, to see which process or query initiated this query on the indexer side, and look for more information there. Another option is to create a support case with Splunk.
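If it helps, the _audit index is one place to look for who or what initiated a search. A minimal sketch to run on the search head, purely as a starting point:

index=_audit action=search info=granted
| table _time user search_id search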
I think so. I haven't tried those ARM-based Linux UFs myself, so I don't know whether they also work on Kali.
Here is a link to the docs: https://docs.splunk.com/Documentation/Splunk/9.4.0/Indexer/Setupmultipleindexes Remember that if you are editing .conf files, you must do a restart, or in some cases a reload is enough. If you want to avoid that, you should use the GUI or CLI commands to modify those values.
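As a rough illustration of both routes (the index name my_index is a placeholder):

# indexes.conf on the indexer
[my_index]
homePath = $SPLUNK_DB/my_index/db
coldPath = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb

The CLI equivalent, which avoids the manual restart:

splunk add index my_index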
Can you share your instrumented code?
Hello everybody, I am facing some challenges with a custom log file containing bits of XML surrounded by some sort of headers. The file looks something like this:

[1][DATA]BEGIN --- - 06:03:09[012]
<xml>
<tag1>value</tag1>
<nestedTag>
<tag2>another value</tag2>
</nestedTag>
</xml>
[1][DATA]END --- - 06:03:09[012]
[1][DATA]BEGIN --- - 07:03:09[123]
<xml>
<tag1>some stuff</tag1>
<nestedTag>
<tag2>other stuff</tag2>
</nestedTag>
</xml>
[1][DATA]END --- - 07:03:09[123]
[1][DATA]BEGIN --- - 08:03:09[456]
<xml>
<tag1>some more data</tag1>
<nestedTag>
<tag2>fooband a bit more</tag2>
</nestedTag>
</xml>
[1][DATA]END --- - 08:03:09[456]

It is worth noting that the XML parts can be very large. I would like to take advantage of Splunk's automatic XML parsing, as it is not realistic to do it manually in this case, but the square-bracket lines around each XML block seem to prevent the XML parser from doing its job and I get no field extraction. So, what I would like to do is:

- Convert the "data begin" line with the square brackets, before each XML block, into an XML-formatted line, so that I can use it for the time of the event (the date itself is encoded in the filename...) and let Splunk parse the rest of the XML data automatically
- Strip out the lines with the "data end" bit after each block of XML. These are not useful, as they provide the same time as the "data begin" line.
- Aggregate the XML lines of the same block into one event

What I have tried with props.conf and transforms.conf:

props.conf:

[my_sourcetype]
BREAK_ONLY_BEFORE_DATE =
DATETIME_CONFIG =
KV_MODE = xml
LINE_BREAKER = \]([\r\n]+)\[1\]\[DATA\]BEGIN
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Custom
pulldown_type = true
TRANSFORMS-full=my_transform # only with transforms.conf v1
TRANSFORMS-begin=begin # only with transforms.conf v2
TRANSFORMS-end=end # only with transforms.conf v2

transforms.conf (version 1):

[my_transform]
REGEX = (?m)\[1\]\[DATA\]BEGIN --- - (\d{2}:\d{2}:\d{2}).*([\r\n]+)([^\[]*)\[1\]\[DATA\]END.*$[\r\n]*
FORMAT = <time>$1</time>$2$3
WRITE_META = true
DEST_KEY = _raw

transforms.conf (version 2):

[begin]
REGEX = (?m)^\[1\]\[DATA\]BEGIN --- - (\d{2}:\d{2}:\d{2}).*$
FORMAT = <time>$1</time>
WRITE_META = true
DEST_KEY = _raw

[end]
REGEX = (?m)^\[1\]\[DATA\]END.*$
DEST_KEY = queue
FORMAT = nullQueue

With the various combinations listed here, I got all sorts of results:

- well separated events, but with the square brackets left over
- one big block with all events aggregated together and no override of the square-bracket lines
- one event with the begin square-bracket line, truncated at 10k characters
- 4 events with one "time" XML tag but nothing else...

Could anybody help me out with this use case? Many thanks, Alex
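One hedged observation on the configs above, in case it helps anyone with the same problem: WRITE_META = true directs a transform's output to _meta (it is effectively DEST_KEY = _meta, intended for index-time field extractions), so combining it with DEST_KEY = _raw is contradictory. A sketch of version 2 without it, keeping the same made-up stanza names, offered as a starting point rather than a confirmed fix:

[begin]
REGEX = (?m)^\[1\]\[DATA\]BEGIN --- - (\d{2}:\d{2}:\d{2}).*$
FORMAT = <time>$1</time>
DEST_KEY = _raw

[end]
REGEX = (?m)^\[1\]\[DATA\]END.*$
FORMAT = nullQueue
DEST_KEY = queue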
Hi @devsru, You can use makeresults for that:

| makeresults
| eval msg="Daylight savings is scheduled tomorrow, please be alerted"
| fields - _time

Create a cron-scheduled alert based on this SPL, triggering when the number of results is greater than 0, and configure the 'Send Email' alert action.
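Roughly what that alert could look like in savedsearches.conf (stanza name, schedule, and address are all placeholders):

[dst_reminder]
search = | makeresults | eval msg="Daylight savings is scheduled tomorrow, please be alerted" | fields - _time
enableSched = 1
cron_schedule = 0 9 * * *
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = you@example.com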
| makeresults count=365
| streamstats count
| eval DayOfYear=strftime(round(relative_time(now(), "-0y@y"))+((count-1)*86400),"%Y-%m-%d")
| eval FirstOfMonth=strftime(strptime(DayOfYear, "%Y-%m-%d"),"%Y-%m-01")
| eval Sunday=strftime(relative_time(strptime(FirstOfMonth, "%Y-%m-%d"),"+2w@w0"), "%Y-%m-%d")
| eval Match=if((Sunday=DayOfYear AND (strftime(round(relative_time(now(), "-0y@y"))+((count-1)*86400),"%m")=="03" OR strftime(round(relative_time(now(), "-0y@y"))+((count-1)*86400),"%m")=="11")),"TRUE","FALSE")
| table _time DayOfYear FirstOfMonth Sunday Match
| search Match=TRUE

This search will find the second Sunday of every March and November for the current year. You actually need to identify whether today is the day before the change in order to trigger an alert, which you can configure to send an email. There might be easier methods to identify the DST change, but my research has not found one yet this morning. Also, this assumes the DST change is for the Americas; other portions of the globe may not share the same DST days.
The transaction command is costly, and it has limitations over wider timeframes and with larger datasets.
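A common alternative is to rebuild the grouping with stats, which avoids those limits. A minimal sketch, assuming a hypothetical session_id field that ties the events together:

... base search ...
| stats min(_time) as start_time max(_time) as end_time values(action) as actions by session_id
| eval duration = end_time - start_time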
Right! If you have only one indexer.
@ITWhisperer  Can you please help?
Hi All, I've installed the Splunk Add-on for Unix and Linux in both Splunk Enterprise as well as my forwarder, which is running 9.3.2. However, I keep running into the error below:

12-19-2024 15:54:30.303 +0000 ERROR ExecProcessor [1376795 ExecProcessor] - message from "/opt/splunkforwarder/etc/apps/Splunk_TA_nix/bin/vmstat_metric.sh" /opt/splunkforwarder/etc/apps/Splunk_TA_nix/bin/hardware.sh: line 62: /opt/splunkforwarder/var/run/splunk/tmp/unix_hardware_error_tmpfile: No such file or directory

The above is coming from splunkd.log after I have stopped and restarted the SplunkForwarder.service. I am very new to Splunk and do not possess any certifications. My company has tasked me with learning and configuring Splunk, and I am enjoying it, except that I am unable to get this data sent to my indexer so that I can see the data in Search and Reporting. These are the steps taken so far:

- Installed the Splunk Add-on for Unix and Linux on my Enterprise UI machine
- Installed the Splunk Add-on for Unix and Linux on my UF
- As the local directory was not created at /opt/splunkforwarder/etc/apps/Splunk_TA_nix/, I created it, copied inputs.conf from default to local, and then enabled the scripts I wanted
- Made sure the splunk user was the owner and had the privileges needed on the local directory
- Stopped the splunk service and restarted it
- Ran: cat /opt/splunkforwarder/var/log/splunk/splunkd.log | grep ERROR

Almost every error is "unix_hardware_error_tmpfile: No such file or directory". If I create the tmpfile, it disappears and is not recreated. I'm sure there are many other things I didn't mention, because I honestly don't remember; I have been trying to figure this issue out since yesterday and am not getting anywhere. PLEASE HELP!
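One hedged guess, based purely on the path in that error message: hardware.sh may be failing because the parent tmp directory is missing rather than the file itself, so creating the directory (not the tmpfile) and giving it to the splunk user could clear it:

mkdir -p /opt/splunkforwarder/var/run/splunk/tmp
chown splunk:splunk /opt/splunkforwarder/var/run/splunk/tmp

Then restart the forwarder and grep splunkd.log for the error again.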
I am trying to set up a synthetic browser test that makes use of variables. I can't seem to find information about the usage of variables in Browser Tests other than this. So far I tried to:

- Access a global variable as a URL in a "go to URL" step -> which leads to "https:// is not a valid URL", although I can see the variable in the "variables" tab
- Access global/predefined variables in "execute JavaScript" steps -> undefined
- Set a variable via "Save return value from JavaScript" to variable var and try to reuse it in both "assert text present" with {{custom.$var}} -> undefined, and "execute JavaScript" with custom.var -> undefined
- Assign var/const in an "execute JavaScript" step and reference it in consecutive "execute JavaScript" steps -> undefined
- Access built-in variables in "execute JavaScript" tests (except those that are only available in API tests) -> undefined

Which raises the following questions for me:

- What is the idiomatic way to use variables in Synthetic Browser Tests? So far it seems to me that they can only be used to fill a field, as that is the only action mentioned here, and no other action I tried seems to support variables, which would quite honestly be really disappointing.
- Am I overlooking any documentation?
- Which kinds of actions support the use of variables created by other steps?

Thank you
Thank you for the help. I always get null for TargetLocation in stats, and thus it shows "Pending". I notice that latest(TargetLocation) has multiple values and null is the latest. Is there a way to eliminate null so that the latest time can be displayed?
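For later readers, one hedged way to keep empty values out of latest() is to null them inside an eval so stats skips them (the field names follow this thread, but treat it as a sketch, not a tested fix):

| stats latest(eval(if(TargetLocation!="", TargetLocation, null()))) as TargetLocation by file_name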
Is there a way to capture the Thread Name for Business Transactions in Java? I see RequestGUID and URL are captured when looking through the UI. Thanks.
Thank you for the detailed explanation; I truly appreciate it.
Happens in Splunk Enterprise v9.4.0 for Windows too.