All Posts

I managed to add extra Python modules to Python for Scientific Computing by building it from source. In my case I added xgboost for Linux (64-bit):

1. Cloned the GitHub repo: https://github.com/splunk/Splunk-python-for-scientific-computing
2. For increased stability I checked out the latest available git tag (currently 4.2.1; this step might not be necessary).
3. Added the Python module I wanted to environment.nix.yml (in my case: - xgboost=2.1.1).
4. Followed the readme and ran:

make freeze
make build
make dist

5. Finally, copied the tarball from the build directory to the user_apps directory on Splunk (replacing the existing Python for Scientific Computing app if already installed).

When using Python for Scientific Computing, copy the exec_anaconda and util Python scripts to the bin directory of your app. Also copy lib/splunklib from Python for Scientific Computing to the bin directory of your app. Then add these lines to the start of your script:

import exec_anaconda
exec_anaconda.exec_anaconda()
Here I have two event statistics (id=superman and id=batman) in JSON format. How do I arrange them in a table based on the id and the enemy information values? This is slightly different from previous questions: under the "enemies" key, "enemy_information" holds the value headers, while "enemy_information_values" contains a list of value rows matching those headers. For example, I want the result to look something like the table below. I know the data could use pre-processing, but I want to know whether it is possible via SPL commands alone; I already have older data in the same format that is too far back for me to re-ingest.

[{
    "id": "Superman",
    "birthName": "Kal-El",
    "origin": "Krypton",
    "enemies": [
        {
            "enemy_information": ["name", "location", "powers"],
            "enemy_information_values": [
                ["Doomsday", "Kryptonian Prison",
                 ["Super Strength", "Invulnerability", "Regeneration", "Adaptation",
                  "Enhanced Durability", "Immunity to Kryptonite"]],
                ["Lex Luthor", "Metropolis",
                 ["Genius-level Intellect", "Skilled Strategist",
                  "Advanced Technology and Weaponry", "Political Influence",
                  "Expert in Kryptonite"]]
            ]
        }
    ]
},
{
    "id": "Batman",
    "birthName": "Bruce Wayne",
    "origin": "Gotham City",
    "enemies": [
        {
            "enemy_information": ["name", "location", "powers"],
            "enemy_information_values": [
                ["Joker", "Gotham City",
                 ["Genius-level Intellect", "Master of Psychological Manipulation",
                  "Skilled Hand-to-Hand Combatant", "Expert in Criminal Psychology",
                  "Master of Disguise"]],
                ["Two-Face", "Gotham City",
                 ["Expert Marksman", "Skilled Hand-to-Hand Combatant",
                  "Access to Advanced Weaponry", "Strategic Mind",
                  "Psychological Trauma"]]
            ]
        }
    ]
}]
Hi, I have a customer using Splunk for just syslog. There has recently been a DDoS attack, and we are looking to report on how much traffic came from the known DDoS hosts. In the syslog, the router has flagged the known IPs as:

msg="torproject.org:Anonymizers, SSI:N" note="ACCESS BLOCK"

We can search for this fine. However, there is a preceding entry in the syslog for the sending IP address, where the router has forwarded this from the firewall to its IP address check phase. We are looking to get total rows of all traffic from the DDoS hosts. So: we search for "torproject", then we want to search again for all IPs that appeared in that first search, i.e. extract every src="103.76.173.203:7627" from those results and then search for all of those. Any ideas please? End goal: how much traffic was from DDoS hosts and how much wasn't (as a rough %). Thanks in advance.
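In Splunk itself this two-pass pattern is usually done with a subsearch feeding the outer search. Outside of Splunk, the core of it is pulling the src= IP out of each flagged line and then matching all traffic against that set. A minimal Python sketch (the sample log lines are invented for illustration, modeled on the snippet in the question):

```python
import re

# Invented syslog-style lines in the shape described above.
logs = [
    'src="103.76.173.203:7627" msg="torproject.org:Anonymizers, SSI:N" note="ACCESS BLOCK"',
    'src="10.0.0.5:443" msg="normal traffic"',
    'src="103.76.173.203:8080" msg="more traffic"',
]

SRC_RE = re.compile(r'src="(\d{1,3}(?:\.\d{1,3}){3}):\d+"')

# Pass 1: collect source IPs from the flagged ("torproject") events.
ddos_ips = {m.group(1) for line in logs if "torproject" in line
            for m in [SRC_RE.search(line)] if m}

# Pass 2: count all rows from those IPs versus the rest.
hits = sum(1 for line in logs
           if (m := SRC_RE.search(line)) and m.group(1) in ddos_ips)
pct = 100 * hits / len(logs)
print(ddos_ips, hits, f"{pct:.0f}%")
```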
Hello, I've noticed that the application is marked as archived and unsupported. When I try to download it from your link, I receive the following message: "Archived apps are unsupported. These apps were removed from Splunkbase or archived by the developer. Splunk does not provide support for these apps."
Hi @BRFZ , you can use the Stormshield_TA (https://splunkbase.splunk.com/app/3069). Ciao. Giuseppe
Hello, I need to collect logs from a Stormshield firewall. Do you have any suggestions on how to gather these logs, or is there a specific add-on available for this purpose? Thank you in advance.
Hi @vasudevahebri , when you click the NFR License generation link, Splunk sends you an email containing the licenses and the links. Ciao. Giuseppe
Hi @zksvc , this lookup belongs to the ES Content Updates App, so it should be updated when you update this app. Ciao. Giuseppe
Hi @VijaySrrie, am I right in assuming you are collecting the logs on a syslog server and then forwarding them to Splunk with a UF? You can check whether the UF is reaching its thruput limit, which can cause indexing lag:

index=_internal sourcetype=splunkd component=ThruputProcessor "has reached maxKBps"
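If that search returns hits, the forwarder is being throttled by its [thruput] maxKBps setting in limits.conf. As a side note, pulling the reported throughput out of such a line outside Splunk is a one-line regex; a small sketch (the sample log line is invented, not a verbatim splunkd message):

```python
import re

# Invented sample splunkd log line reporting a throttled forwarder.
line = ('10-01-2024 12:00:00.000 +0000 INFO ThruputProcessor - '
        'Current data throughput (256 kb/s) has reached maxKBps; '
        'consider increasing the value of maxKBps in limits.conf.')

m = re.search(r'\((\d+)\s*kb/s\)', line)
current_kbps = int(m.group(1)) if m else None
print(current_kbps)
```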
Hi @nawab123, most likely there is an issue with either your authentication.conf settings or with connecting to the LDAP/SAML system. Check splunkd.log for those kinds of errors.
Hi everyone, I want to ask: where can I get the latest update for legit_domains.csv? I'm asking here because when I check it in Lookups it says it has no owner, so I think it was created automatically by Splunk. I know it can be updated manually, but that takes time. It would be very helpful if you could point me to the latest update for this .csv.
Checking the "host" details these complaints are all coming from the heavy forwarders, so I assumed this is where I should be checking. Running the command on a heavy forwarder produced the followin... See more...
Checking the "host" details these complaints are all coming from the heavy forwarders, so I assumed this is where I should be checking. Running the command on a heavy forwarder produced the following output: /opt/splunk/etc/system/default/props.conf DATETIME_CONFIG = /etc/datetime.xml  While on any search head or indexer the output is as you suggested: /opt/splunk/etc/apps/Splunk_TA_nix/default/props.conf DATETIME_CONFIG = CURRENT There is no '/etc/datetime.xml', however there is an '/opt/splunk/etc/datetime.xml' file. I have no idea of the configuration is reffering to a non-existing file or the built in Splunk one. I don't know if or why anyone would have modified this setting, or the consequences of doing such. I'll do some reasearch on my own but any feedback and/or suggestions are more than welcome
I've parsed my input file with a JSON parser, and right before one of the missing events there is an error, something like an unexpected non-whitespace character. So I think it is not a problem with Splunk!
Thank you for your questions @PickleRick. I'm using the forwarding mechanism. Here are the stanzas from the forwarder:

inputs.conf

[monitor:///daten/datasources/data/mg_test/entry2group/*.json]
disabled = false
index = mg_test
sourcetype = json_test
crcSalt = <SOURCE>
whitelist = .*\d{8}_Q\d_entry_entry2group\.v\d\.(\d\d\.){2}json$

props.conf

[json_test]
DATETIME_CONFIG =
TIMESTAMP_FIELDS = test.sys_created_on
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Structured
description = test json
disabled = false
pulldown_type = true

I copied this props.conf from my first try to upload (over Splunk Web). Here is the stanza from ../etc/system/local/props.conf:

[test_json]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
TIMESTAMP_FIELDS = test.sys_created_on
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = true

Further investigation shows you are on the right track! I found the following event in _internal:

08-25-2024 19:31:28.338 +0200 ERROR JsonLineBreaker [1737739 structuredparsing] - JSON StreamId:1586716756715697390 had parsing error:Unexpected character while looking for value: ',' - data_source="daten/datasources/data/mg_test/entry2group/20240825_Q2_entry_entry2group.v0.03.01 .json]", data_host="socmg_local_fw", data_sourcetype="json_test"

So in the next step I will isolate one event (object) that is lost when there are special characters in the data.
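The JsonLineBreaker error above ("Unexpected character while looking for value: ','") can be reproduced outside Splunk to pinpoint the offending object before re-ingesting. A minimal sketch using Python's json module with `JSONDecoder.raw_decode` (the helper name and the inline sample with a stray comma are mine, not from the file in question):

```python
import json

def find_bad_objects(text):
    """Walk a stream of concatenated/comma-separated JSON objects,
    counting objects until decoding first fails, and report where."""
    decoder = json.JSONDecoder()
    pos, count, errors = 0, 0, []
    while pos < len(text):
        # Skip whitespace and separating commas between objects.
        while pos < len(text) and text[pos] in " \r\n\t,":
            pos += 1
        if pos >= len(text):
            break
        try:
            _, pos = decoder.raw_decode(text, pos)
            count += 1
        except json.JSONDecodeError as e:
            errors.append((e.pos, e.msg))
            break
    return count, errors

# Invented sample: the second object has a stray comma before a value.
sample = '{"a": 1}\n{"b": , 2}\n{"c": 3}'
count, errors = find_bad_objects(sample)
print(count, errors)
```

Running this over the real file (read with the same encoding the forwarder sees) should stop at the same character the JsonLineBreaker complains about.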
There is something not right about this. If your events are indeed formed this way (multiline entries) and your LINE_BREAKER is set to ([\r\n]+), there is no way they are ingested whole. Tell us more about how you are ingesting it (and if you're reading a file with a forwarder, show us the relevant inputs.conf and props.conf stanzas from the forwarder).
Further investigation: I shortened the JSON objects from 44 to 43 lines.

{
"1.Entry": "1.Data",
...
"43.Entry": "43.Data"
},
... 48,186 similar entries ...
{
"1.Entry": "1.Data",
...
"43.Entry": "43.Data"
}

But forwarding the JSON file led to a count of 45,352 events (representing 45,352 JSON objects) instead of 48,188 objects. That's a little bit 'loco', I think.
We don't have access to your Splunk so we can't provide links to it. To find your license page, go to Settings->Licensing.  You must have (or inherit) the Admin role and be using Splunk Enterprise.
Hi, this path thing really did my head in, but I found a way: I tried the LongPath Tool program and that sorted it.
The 260-character limit got you? Yeah, it can be a real headache. I tried the LongPath Tool program, which helped a lot.