All Posts

Hi @cherrypick, are you using INDEXED_EXTRACTIONS = JSON in your sourcetype? Ciao. Giuseppe
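For reference, a minimal props.conf sketch of the kind of stanza being asked about (the sourcetype name is hypothetical; KV_MODE = none is often paired with indexed extractions to avoid double extraction at search time):

[my:json:sourcetype]
INDEXED_EXTRACTIONS = JSON
KV_MODE = none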
I am trying this out. I will let you know whether it worked! Thanks.
Hi @BRFZ, you have only one solution: use it and maintain it yourself. Otherwise you would have to create your own custom add-on, which amounts to the same thing! Ciao. Giuseppe
Hi All, I have two queries which search for users that use an app. The apps are not in the same fields, which is why I had to split the queries. But now I want to join the queries to get the results.

Query 1:

index=db_it_network sourcetype=pan* url_domain="www.perplexity.ai"
| table user, url_domain, date_month
| stats count by user url_domain date_month
| chart count by url_domain date_month
| sort url_domain 0

Query 2:

index=db_it_network sourcetype=pan* app=claude-base OR app=google-gemini* OR app=openai* OR app=bing-ai-base
| table user, app, date_month
| stats count by user app date_month
| chart count by app date_month
| sort app 0

Example of the results I want:

App                  August   July
claude-base          123      120
google-gemini        124      42
openai               153      123
bing-ai-base         212      232
www.perplexity.com   14       12
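For what it's worth, a rough sketch of one way to get this from a single search instead of two (the eval that folds the perplexity domain into the app column is an assumption; adjust it if events can match both conditions):

index=db_it_network sourcetype=pan* (app=claude-base OR app=google-gemini* OR app=openai* OR app=bing-ai-base OR url_domain="www.perplexity.ai")
| eval app=if(url_domain="www.perplexity.ai", url_domain, app)
| chart count over app by date_month

This keeps everything in one pass and lets chart build the app-by-month matrix directly, instead of joining two result sets after the fact.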
Hello, I have successfully integrated Cloudflare with Splunk Enterprise using the pull method. This integration was set up on a Heavy Forwarder, so the logs are first received by the HF before being forwarded to the Indexers. While the integration itself is working correctly, I encountered an issue with the time zone in the logs.

The API we are using requires the timestamps to be in UTC. As a result, when the API fetches the logs, the events are recorded in the UTC timezone. However, I need to convert these timestamps from UTC to UTC+5 (Pakistan Standard Time, PKT).

Here is a sample log event from Cloudflare:

---
EdgeEndTimestamp: 2024-08-26T09:07:43Z
EdgeResponseBytes: 72322
EdgeResponseStatus: 206
EdgeStartTimestamp: 2024-08-26T09:07:43Z
---

We are extracting the EdgeStartTimestamp and using it for the _time field, but this timestamp is in UTC format. In my props.conf file on the Heavy Forwarder, I have the following configuration:

[cloudflare:json]
disabled = false
TIME_PREFIX = \"EdgeStartTimestamp\":\"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19

I also tried adding the TZ setting to props.conf:

[cloudflare:json]
TZ = Asia/Karachi

However, this didn't work because the events themselves contain timezone information (UTC), so the TZ setting doesn't have any effect. I then tried using TZ_ALIAS in props.conf:

[cloudflare:json]
TZ_ALIAS = Z=UTC+5

This didn't work either. Finally, I tried the following in props.conf, but it still didn't resolve the issue:

[cloudflare:json]
EVAL-_time = _time + 5*3600

Any help would be appreciated.
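One thing worth noting here: Splunk stores _time as epoch seconds, which are timezone-neutral, and renders them according to each viewing user's timezone preference, so setting that preference to Asia/Karachi is usually a cleaner fix than shifting at parse time. If a shifted display field is still wanted, a search-time sketch (the index name is hypothetical, and this will double-shift for users whose preference is already PKT):

index=cloudflare sourcetype=cloudflare:json
| eval pkt_time=strftime(_time + 5*3600, "%Y-%m-%d %H:%M:%S")
| table _time pkt_time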
Hi @gcusello Thanks for your reply. When I check, my "ES Content Updates" is version 4.0.0 and an update to 4.38.0 is available. I have a question: if I update it, will it affect other use cases that have been enabled? Or will it be safe for other use cases?
Hi, I am currently learning Splunk and trying to set it up for myself on my local machine. I am looking at the Splunk BOTS v2 guide and can see there are a number of apps to be added. There is one app which I am unsure how to download and add via the web GUI, as there are no links to download.

App: Collectd App for Splunk Enterprise https://splunkbase.splunk.com/app/2875/

Upon visiting the site (it goes to GitHub), I am presented with some instructions to configure things, which is a little confusing for new starters, and I am also not able to see the app download link. Am I missing something here, or is it just no longer relevant for v2? I am not using any forwarders, indexers etc., just one host to try to set this up.

Thanks.
I managed to add extra Python modules to Python for Scientific Computing by building it from source. In my case I added xgboost for Linux (64-bit).

Cloned the GitHub repo: https://github.com/splunk/Splunk-python-for-scientific-computing
Just for increased stability I checked out the latest available git tag (currently 4.2.1; this step might not be necessary).
I then added the Python module that I want to environment.nix.yml (in my case: - xgboost=2.1.1).
Afterwards, followed the readme and ran:

make freeze
make build
make dist

Finally, copied the tarball from the build directory to the user_apps directory on Splunk (replacing the existing Python for Scientific Computing app if already installed).

When using Python for Scientific Computing, copy the exec_anaconda and util Python scripts to the bin directory of your app. Also copy lib/splunklib from Python for Scientific Computing to the bin directory of your app. Add these lines to the start of your script:

import exec_anaconda
exec_anaconda.exec_anaconda()
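To sanity-check the rebuilt app, a minimal sketch of a script that uses the newly added module (the key detail is the import order: PSC-bundled modules can only be imported after the exec_anaconda() switch):

import exec_anaconda
exec_anaconda.exec_anaconda()  # re-executes this script under PSC's bundled Python

# PSC-bundled modules are only importable after the switch
import sys
import xgboost as xgb

sys.stderr.write("xgboost %s loaded under %s\n" % (xgb.__version__, sys.executable))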
Here I have 2 event statistics (id=superman & id=batman) in JSON format. How do I arrange it in a table format based on the id and enemy information values? This is slightly different to previous questions: for the "enemies" key values, "enemy_information" holds the value headers while "enemy_information_values" contains a list of values matching the "enemy_information" headers. For example, I want the result to look something like the table below. I know the data needs pre-processing, but I wanted to know if it is possible to do so via SPL commands. The reason to avoid pre-processing is that I already have previous data in the same format that is too far back for me to re-ingest.

[{
    "id": "Superman",
    "birthName": "Kal-El",
    "origin": "Krypton",
    "enemies": [{
        "enemy_information": ["name", "location", "powers"],
        "enemy_information_values": [
            ["Doomsday", "Kryptonian Prison", ["Super Strength", "Invulnerability", "Regeneration", "Adaptation", "Enhanced Durability", "Immunity to Kryptonite"]],
            ["Lex Luthor", "Metropolis", ["Genius-level Intellect", "Skilled Strategist", "Advanced Technology and Weaponry", "Political Influence", "Expert in Kryptonite"]]
        ]
    }]
},
{
    "id": "Batman",
    "birthName": "Bruce Wayne",
    "origin": "Gotham City",
    "enemies": [{
        "enemy_information": ["name", "location", "powers"],
        "enemy_information_values": [
            ["Joker", "Gotham City", ["Genius-level Intellect", "Master of Psychological Manipulation", "Skilled Hand-to-Hand Combatant", "Expert in Criminal Psychology", "Master of Disguise"]],
            ["Two-Face", "Gotham City", ["Expert Marksman", "Skilled Hand-to-Hand Combatant", "Access to Advanced Weaponry", "Strategic Mind", "Psychological Trauma"]]
        ]
    }]
}]
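For reference, a rough sketch of the kind of SPL unpacking this would take (index and sourcetype are made up, and spath's positional indexing may need adjusting for your Splunk version, so treat this as a starting point rather than a working solution):

index=heroes sourcetype=hero:json
| spath id
| spath path=enemies{}.enemy_information_values{} output=enemy
| mvexpand enemy
| eval name=spath(enemy, "{1}"), location=spath(enemy, "{2}"), powers=spath(enemy, "{3}{}")
| table id name location powers

The idea is to pull each inner value array out as its own row with mvexpand, then index into it positionally with the eval spath() function, mirroring the positions declared in enemy_information.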
Hi, I have a customer using Splunk for just syslog. There has recently been a DDoS attack, and we are looking to report on how much traffic came from the known DDoS hosts.

In the syslog the router has flagged the known IPs as:

msg="torproject.org:Anonymizers, SSI:N" note="ACCESS BLOCK"

We can search for this fine; however, there is a preceding entry for the sending IP address in the syslog, where the router has forwarded this from the firewall to its IP address check phase. We are looking to get total rows of all traffic from DDoS hosts. So we search for "torproject", then we want to search again for all IPs that appeared in that first search: extract from that search every src="103.76.173.203:7627", then search for all of those.

Any ideas please? End goal = how much traffic was from DDoS hosts and how much wasn't (as a rough %).

Thanks in advance
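A rough sketch of one approach (the index name and the rex are assumptions; adjust them to your log format):

index=syslog
| rex "src=\"(?<src_ip>\d{1,3}(?:\.\d{1,3}){3})"
| eventstats max(eval(if(match(_raw, "torproject"), 1, 0))) as is_ddos by src_ip
| stats count by is_ddos
| eventstats sum(count) as total
| eval pct=round(100*count/total, 1)

The eventstats pass flags every source IP that ever produced a torproject-blocked event, so all rows from those hosts count as DDoS traffic; the last two lines turn the split into a rough percentage.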
Hello, I've noticed that the application is marked as archived and unsupported. When I try to download it from your link, I receive the following message: "Archived apps are unsupported. These apps were removed from Splunkbase or archived by the developer. Splunk does not provide support for these apps."
Hi @BRFZ, you can use the Stormshield_TA (https://splunkbase.splunk.com/app/3069). Ciao. Giuseppe
Hello, I need to collect logs from a Stormshield firewall. Do you have any suggestions on how to gather these logs, or is there a specific add-on available for this purpose? Thank you in advance.
Hi @vasudevahebri, when you click on the NFR license generation, Splunk sends you an email containing the licenses and the links. Ciao. Giuseppe
Hi @zksvc, this lookup belongs to the ES Content Updates app, so it should be updated when you update this app. Ciao. Giuseppe
Hi @VijaySrrie assuming you are collecting the logs on a syslog server and then forwarding them to Splunk with a UF? You can check whether the UF is reaching its thruput limit, which could cause indexing lag:

index=_internal sourcetype=splunkd component=ThruputProcessor "has reached maxKBps"
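If the limit is being hit, a minimal limits.conf sketch for the UF (the value is an example; 0 means unlimited, so raise it with care):

[thruput]
maxKBps = 0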
Hi @nawab123 most likely there is an issue with either your authentication.conf settings or with the connection to the LDAP/SAML system. Check splunkd.log for those kinds of errors.
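A broad starting search for surfacing those errors (component names differ between LDAP and SAML setups, hence the loose filter):

index=_internal sourcetype=splunkd log_level=ERROR (LDAP OR SAML OR authentication)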
Hi everyone, I want to ask where I can get the latest update for legit_domains.csv. I am asking here because when I check it in the lookups it says no owner, so I think it was created automatically by Splunk. I know it can be updated manually, but that takes time. It would be very helpful if you could point me to the latest update for this .csv.
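One way to check which app actually ships the lookup, as a sketch run from the search bar:

| rest /services/data/lookup-table-files
| search title=legit_domains.csv
| table title eai:acl.app updated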
Checking the "host" details these complaints are all coming from the heavy forwarders, so I assumed this is where I should be checking. Running the command on a heavy forwarder produced the followin... See more...
Checking the "host" details these complaints are all coming from the heavy forwarders, so I assumed this is where I should be checking. Running the command on a heavy forwarder produced the following output: /opt/splunk/etc/system/default/props.conf DATETIME_CONFIG = /etc/datetime.xml  While on any search head or indexer the output is as you suggested: /opt/splunk/etc/apps/Splunk_TA_nix/default/props.conf DATETIME_CONFIG = CURRENT There is no '/etc/datetime.xml', however there is an '/opt/splunk/etc/datetime.xml' file. I have no idea of the configuration is reffering to a non-existing file or the built in Splunk one. I don't know if or why anyone would have modified this setting, or the consequences of doing such. I'll do some reasearch on my own but any feedback and/or suggestions are more than welcome