All Posts

Thank you, but I cannot share much information because it is confidential. It's better to close the post. Thanks for your help, and excuse me for being upset.
Hi @AL3Z, as I said: does your regex match the string you want to search or not? If it matches, it's correct; if not, it isn't! Ciao. Giuseppe
Hello @ITWhisperer, thank you for your response. Can you please help with an example of how to write this?
|inputlookup myTable.csv |where _time=relative_time(now(),"-1d@d")
Now I need to apply the regular expression to fieldA and store the extracted data from each row in a field named res. It would be very helpful if you could assist. Thank you
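One way to sketch this, using the rex command with the regex from the thread turned into a named capture group (the lookup name, fieldA, and res are taken from the posts above; the time filter is kept as posted):

```
| inputlookup myTable.csv
| where _time=relative_time(now(),"-1d@d")
| rex field=fieldA "xxx[\_\w]+:(?<res>[a-z_]+)"
| table fieldA res
```

rex applies the pattern to each row's fieldA and, where it matches, puts the captured group into res; rows that don't match simply get no res value.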
Hi, I scripted something myself and I want to share it with you: https://github.com/Gotarr/Splunkbase-Download-script (Python script). My inspiration came from @tfrederick74656, but his script doesn't work very well for me. Happy Splunking, and let me know if something doesn't work.
@yuanliu Your suggestion worked for me, but is there a way to put comments with carriage returns across multiple lines? See below. Thanks
{
  "visualizations": {
    "viz_OQMhku6K": {
      "type": "splunk.ellipse",
      "_comment": " ==================================
        This is created by Person1 on 1/1/2023 @companyb
        On 2/1/2023 - added base search
        On 2/5/2023 - added dropdown box "
    }
  }
}
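A literal newline inside a JSON string is invalid JSON, which is why the multi-line value above is a problem. One common workaround, assuming the dashboard loader tolerates an array value under an unused key (this is an assumption, not documented behavior), is to make the comment an array of strings, one per line:

```
"visualizations": {
  "viz_OQMhku6K": {
    "type": "splunk.ellipse",
    "_comment": [
      "Created by Person1 on 1/1/2023 @companyb",
      "2/1/2023 - added base search",
      "2/5/2023 - added dropdown box"
    ]
  }
}
```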
You need to direct the "unwanted" events to the nullQueue.
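A minimal sketch of the usual nullQueue setup, applied on a heavy forwarder or indexer (the sourcetype stanza and the REGEX are placeholders you'd replace with your own):

```
# props.conf
[my:sourcetype]
TRANSFORMS-drop = drop_unwanted

# transforms.conf
[drop_unwanted]
REGEX = pattern_that_matches_unwanted_events
DEST_KEY = queue
FORMAT = nullQueue
```

Events whose raw text matches REGEX are routed to the nullQueue and never indexed, so they don't count against the license.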
You need to specify a field that you wish the extracted pattern to be put in - for example: xxx[\_\w]+:(?<fieldname>[a-z_]+)
How do I eliminate partial user ID characters coming out of a search query? Here are examples of incomplete userIDs, which shouldn't appear at all. The middle GSA line is the correct example userID; the rest is garbage that I want to eliminate:
01022703
021216
07602381
"1206931120@GSA.GOV"
177
177670
1969412
232789
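Since the valid value is the only one containing the @GSA.GOV domain, one sketch is to keep only results whose ID matches that shape (the field name userID is an assumption; adjust to your actual field):

```
... your search ...
| regex userID="^\"?\d+@GSA\.GOV\"?$"
```

The regex command drops any result whose userID is not an all-digit string followed by @GSA.GOV, optionally wrapped in quotes, which removes the truncated fragments.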
Hello All, I have a lookup file with multiple fields. I am reading it with the inputlookup command and applying some filters. Now I need to apply a regex on a field and extract the matched string from each row of the lookup into a separate field. The regex is: xxx[\_\w]+:([a-z_]+) I would appreciate your guidance and input on how to build this. Thank you, Taruchit
I want to filter the Palo Alto logs at the forwarder level by looking at the packet before indexing (for licensing), based on certain conditions like zone, firewall name (enterprise), etc. The logs come to both our UF and HF; what is the best way to achieve this? I was looking into a few docs suggesting applying an ingest-time eval, is that feasible? Can anyone please help me with this?
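A universal forwarder generally does not parse unstructured events, so this kind of filtering is usually done where parsing happens, on the heavy forwarder (or indexer). A minimal sketch routing unwanted zones to the nullQueue (the sourcetype name and the zone string are placeholders for your environment):

```
# props.conf on the heavy forwarder
[pan:traffic]
TRANSFORMS-filterzones = drop_unwanted_zones

# transforms.conf
[drop_unwanted_zones]
REGEX = ,untrust-zone,
DEST_KEY = queue
FORMAT = nullQueue
```

INGEST_EVAL in transforms.conf is a feasible alternative when the condition is easier to express as an eval expression than as a single regex, but it likewise runs at the parsing tier, not on the UF.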
But this regex with SED will replace in all events; I only need it replaced when action=Allowed and service=23 appear in the raw event. Your regex will not satisfy the event below.
2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513221|hostname=firewallhost|product=Firewall|action=Denied|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=00|time=1700513221|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|protection_type=geo_protection|proto=99|s_port=1234|service=67|src=111.11.1.111|src_country=Other
Use SEDCMD in props.conf [mysourcetype] SEDCMD-rm-geo_protection = s/protection_type=geo_protection/---/g  
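SEDCMD always applies to every event of the sourcetype, so to restrict the replacement you have to fold the conditions into the pattern itself. A rough, order-dependent sketch, assuming action appears before protection_type and service appears after it (as in the sample event), with the sourcetype name as a placeholder:

```
[mysourcetype]
# only rewrites events where action=Allowed precedes protection_type
# and service=23 follows it; field order must be stable for this to work
SEDCMD-rm-geo-allowed = s/(action=Allowed.*)protection_type=geo_protection(.*service=23)/\1protection_type=---\2/
```

If field order varies between events, an INGEST_EVAL-based rewrite is likely a better fit than SED, since eval can test the conditions independently of their position in the raw text.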
Well, this is how it's supposed to work. list() or values() gives you a multivalued field with a list of values. If you need something else, you need to do something else.
It's possible, but it's not possible to "add" this to an already finished search, because at each "pipe point" in the pipeline you lose all the data that isn't passed to the next step, so you can't "gather" additional data (unless you do some fancy and inefficient things like the map command).

So you'd have to start with | tstats, not just as an overall count, but with the additional clause "BY _time span=1d" to get a separate data point for each day. With few distinct users you could probably then do a timechart (you could use the prestats=t mode of tstats for that case) and use streamstats count, resetting on zero-count values for a given day. Otherwise you'd probably have to use streamstats to find the last date each user showed a count, then an eval to mark consecutive days, and another streamstats to count those consecutive days. Kind of complicated, but doable.
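The second approach above can be sketched roughly like this (the index name is a placeholder, and the exact reset logic will need tuning to your data):

```
| tstats count where index=myindex by _time span=1d, user
| sort 0 user _time
| streamstats current=f last(_time) as prev_time by user
| eval consecutive=if(_time - prev_time == 86400, 1, 0)
| streamstats reset_after="(consecutive==0)" sum(consecutive) as run_length by user
```

The first streamstats carries each user's previous active day forward, the eval marks whether the current day directly follows it (86400 seconds apart), and the second streamstats accumulates the run length, resetting whenever a gap breaks the streak.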
You're much more likely to get a relevant answer if you post a new question instead of digging up an old thread (especially one that old).
No, it's about the unescaped quotes in the searchmatch() argument. If it needs embedded strings, the quotes for those strings should be escaped.
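For example (the field name and value here are hypothetical, just to show the escaping), a string embedded inside the searchmatch() argument needs backslash-escaped quotes:

```
| eval is_failure=if(searchmatch("status=\"login failed\""), "yes", "no")
```

Without the backslashes, the inner quote terminates the searchmatch() string early and the expression becomes malformed.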
Got a customer wanting the UFs to send data to the Forcepoint DLP and then on to the intermediate heavy forwarder. The reason is to have the DLP mask PII, even though we could have Splunk do that. Is there any reason this data flow wouldn't work? The admin I have to go through for the UF installs and the DLP setup tells me that Forcepoint can't read Splunk data sent to it, which I don't believe. Apologies if this is an easy question, but any documentation to support the above architecture would be great.
This is my raw data; I have to calculate the last 3 months' executions.
month      countries  Jobs type
November   AUS        Execution
October    JER        Execution
September  IND        Execution
August     ASI        Execution
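If the underlying events carry a real timestamp, one sketch is to let the time-range filter do the month selection and then count per month (the index name and field names here are assumptions based on the table above):

```
index=myindex "Jobs type"=Execution earliest=-3mon@mon latest=now
| stats count by date_month, countries
```

If month only exists as a string field with no event timestamp, you'd instead need to map the month names to dates (for example with a case() eval) before you can filter on "the last 3 months."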
It is not working; it's as if the if() function doesn't accept AND and NOT. I'm getting this error: Error in 'EvalCommand': The expression is malformed. Expected ).
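eval's if() does accept AND and NOT, but each operand must be a complete boolean expression; "Expected )" usually means a bare field name or a search-style clause was used where a comparison was needed. A minimal sketch with hypothetical field names:

```
| eval result=if(action=="Allowed" AND NOT (service=="23"), "drop", "keep")
```

Equivalently, NOT (service=="23") can be written service!="23", which is often less error-prone inside if().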