All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


It is important where you put your settings. Parsing is done on the first "heavy" component in the event's path to the indexers. So if you have a HF as an intermediate forwarder, you need to put your props/transforms there. Of course you will still see the already-indexed events when searching; index-time transforms are applied only to new events.
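As a minimal sketch of what "putting your props/transforms on the HF" looks like for index-time filtering — the sourcetype name, stanza name, and regex below are hypothetical placeholders, not values from this thread:

```ini
# props.conf on the heavy forwarder (the first "heavy" component in the path)
# "my_sourcetype" is a placeholder for your actual sourcetype
[my_sourcetype]
TRANSFORMS-filter_debug = drop_debug_events

# transforms.conf on the same heavy forwarder
# "level=DEBUG" is an example pattern; events matching it are discarded
[drop_debug_events]
REGEX = level=DEBUG
DEST_KEY = queue
FORMAT = nullQueue
```

If these files sit only on the indexers while a HF is in the middle, they will never fire, because the HF has already cooked the events.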
OK. But do you have just one column with multiple values? Or do you have multiple columns? How would your lookup contents match the data you want to search for?
It highly depends on the components involved. But this is fairly normal functionality for a SOAR playbook: get an artifact, manipulate it, check it using configured external services, and return a report, or use the result of such a check to modify behaviour in a later part of the playbook. You can download the community version of Splunk SOAR and see for yourself.
Thank you for your response! Could you please share your insights on how we can achieve this in a Splunk SOAR environment? Additionally, if there are any apps on Splunkbase that provide similar functionality, I would greatly appreciate your recommendations.
I have a lookup file saved with a single column containing values of a specific field. I want to use it in a search query to match against the values of that field. Example: lookup name: test.csv, column name: column1, field name: field1
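One common pattern for this — sketched under the assumption that the events live in an index called `your_index` (a placeholder) and using the names from the example above — is a subsearch over the lookup that renames the lookup column to the event field:

```
index=your_index
    [| inputlookup test.csv
     | rename column1 AS field1
     | fields field1 ]
```

The subsearch expands to `(field1="value1" OR field1="value2" OR ...)`, so only events whose `field1` matches a lookup value are returned. An alternative is `| lookup test.csv column1 AS field1 OUTPUT column1 AS matched | where isnotnull(matched)` after the base search.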
Yes, heavy forwarders are being used, but how will that impact the data filtration?
Hi @redmandba , if a search gives results, it can be used in a dropdown. Can you share the code of your dropdown? Maybe the issue is in the other parameters. Ciao. Giuseppe
Hi @shoaibalimir , the formula is always the same, but on Splunk Cloud you don't need to think about the required storage: you only have to think about how many logs must be indexed every day; the required storage is a problem for the Splunk Cloud administrators. In your contract you should have defined the daily indexed volume and the retention period, so storage isn't your problem. The license consumption and the storage entitlement are two related but different values; you have to pay attention only to the license consumption, to avoid exceeding the limit too many times. Ciao. Giuseppe
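As a rough sketch of how the two values relate — the 100 GB/day ingest, 90-day retention, and ~50% on-disk compression factor below are illustrative assumptions, not figures from this thread:

```
| makeresults
| eval daily_ingest_gb=100, retention_days=90, compression_factor=0.5
``` comment: license consumption is measured against daily ingest;
``` comment: estimated on-disk storage is ingest x retention x compression
| eval license_gb_per_day=daily_ingest_gb
| eval est_storage_gb=daily_ingest_gb*retention_days*compression_factor
```

With those example numbers the estimate would be 100 × 90 × 0.5 = 4500 GB of storage, while the license is consumed at 100 GB/day — two related but different numbers, which is exactly the distinction above. On Splunk Cloud only the first one is yours to watch.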
Extracting the new messages is fine. I tried 'Extract New Fields', but it isn't easy to work with.
You can use "rule_description" as the field for the above description.
I want to extract the 'description' field. It can be for the new messages.
Ah yes, this example needs to run on its own and will create sample events. But see my other reply; this needs more logic.
Just noticed that will not work; it will need some loop hopping to get months and then days....
Hi MuS, thanks for the help. However, when I run the query I am getting an error message: Error in 'makeresults' command: This command must be the first command of a search.
Hi there, try this:   | makeresults | eval alert_value=1060, months=floor(alert_value/28), days=alert_value%28, BatteryAge=months." months ".days." days"   but I'm not sure you can then use it in a single value panel. Just give it a try. Hope this helps ... Cheers, MuS Update: This is based on the simple assumption that every month has 4 weeks (28 days), because I'm not a mathematician nor a scientist. Note that strftime() won't work here: it interprets alert_value as a Unix timestamp (seconds since 1970), not as a duration in days, so converting with a simple division and modulo is the way to go.
Hi, @sainag_splunk  I entered your search command in my Splunk search app, but the results were not shown: no results from my sourcetype, "my_json". I am confused about how to resolve this issue; it may cause critical errors for analysing our data.  Is there anything to try to resolve the issue? Here is what I have tried: the data has a line break after ':', which I think caused the parsing error. I tried changing the value to "LINE_BREAKER=[}|,]+[\r\n]+", meaning that if a line ends with ":\r\n", the UF won't break the line there. But despite changing the LINE_BREAKER value, the parsing errors are still raised.  24/10/23 12:02:22.193   10-23-2024 12:02:22.193 +0900 ERROR JsonLineBreaker [7804 structuredparsing] - JSON StreamId:15916142412051242565 had parsing error:Unexpected character: ':' - data_source="C:\splunk\<my_path>.bin", data_host="<my_host>", data_sourcetype="my_json"
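For context: JsonLineBreaker errors come from the structured-parsing pipeline, i.e. a sourcetype configured with INDEXED_EXTRACTIONS on the forwarder, where LINE_BREAKER from a downstream props.conf is not what breaks the events. A hedged props.conf sketch for the forwarder — the sourcetype name matches this thread, everything else is an assumption to illustrate the two options, not a confirmed fix:

```ini
# props.conf on the universal forwarder
[my_json]
# Option A: keep structured parsing; this requires each event to be
# well-formed JSON, otherwise JsonLineBreaker raises parsing errors
INDEXED_EXTRACTIONS = json

# Option B: if the file is not clean one-object-per-event JSON,
# comment out INDEXED_EXTRACTIONS above and break events on the
# indexer/HF instead, e.g. on a newline followed by an opening brace:
# LINE_BREAKER = ([\r\n]+)\{
# SHOULD_LINEMERGE = false
```

The key point is the placement: with INDEXED_EXTRACTIONS active, tuning LINE_BREAKER anywhere else will not stop the errors, which may be why your change had no effect.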
Hi @gcusello, Thank you for sharing the formula for the storage, but is it applicable to Splunk Cloud? Also, regarding the average license consumption: is it the data ingestion or the storage entitlement we are talking about? Thanks in advance!
Hi @afeng Do you want to extract it for the already ingested/existing logs at the Splunk indexer (search time), or for new logs yet to be ingested into Splunk? (Are you using any add-ons or TAs? Are you using a UF and/or HF?)
I allowed port 8000 through Windows Firewall and enabled the firewall log. Then, from a browser on the local server, I accessed https://192.168.0.8:8000. The browser access timed out, and no access entry appeared in the firewall log. I think the browser access is being blocked by something before Windows Firewall gets to allow or block it, but I don't know what would deny local access other than Windows Firewall. I use Windows Defender; I don't use any other firewall application. What could be stopping browser access on the local server? Does anyone have any ideas? Thank you.
Over a decade later, but here is my RPi info and which forwarder worked on it: @raspberrypi:/opt# uname -a Linux raspberrypi 6.1.53-v8+ #1680 SMP PREEMPT Wed Sep 13 18:09:06 BST 2023 aarch64 GNU/Linux From the previous releases page: Splunk Universal Forwarder 8.1.9 / ARMv6 / 2.6+, 3.x+, 4.x+, or 5.x+ kernel Linux distributions, 32-bit