All Posts

Hi @Arun, first of all, I suggest opening a new question rather than adding to one that is already closed; that way you are more likely to get answers. Anyway, did you disable the local firewall (firewalld) on Red Hat? I haven't experienced the issues you describe, so check the local firewall and the permissions of the user you're using. Ciao. Giuseppe
Hi @rferg06, be sure that in your dataSources section you are specifying queryParameters with your dateRange token: "queryParameters": { "earliest": "$dateRange.earliest$", "latest": "$dateRange.latest$" }   You can also add a submit button to your "layout" section so that the tokens are only applied when the submit button is clicked: "layout": { "options": { "submitButton": true, "height": 1750 } }   If you already have these in place, you may want to clear your Splunk Web cache by adding _bump to the end of your URL.
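Putting it together, a minimal sketch of a complete dataSources entry wired to the dateRange token might look like the following (the data source name ds_example and the query are placeholders, not taken from your dashboard):

"dataSources": {
    "ds_example": {
        "type": "ds.search",
        "options": {
            "query": "index=_internal | timechart count",
            "queryParameters": {
                "earliest": "$dateRange.earliest$",
                "latest": "$dateRange.latest$"
            }
        }
    }
}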
Hi @inventsekar, as @isoutamo said, if you don't need local indexing, you can use the Forwarder License (it was created just for this purpose!). With this license you have all the features of a Splunk instance except indexing; in other words, you can preprocess (mainly parse) your data, so you can put this job on the HFs instead of the IDXs. With the Forwarder License you don't need to communicate with the LM, unless you want a local copy of your logs: in that case you need a unidirectional connection to the LM on port 8089. Ports between the HF and the LM aren't otherwise relevant. On the HF you only need 9997 to send data to the IDXs and 8089 to manage the HF from your DS. Ciao. Giuseppe
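For completeness, a minimal outputs.conf sketch on the HF for sending data to the indexers over 9997 (the indexer host names are hypothetical placeholders, adapt them to your environment):

[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
# hypothetical indexer host names, replace with your own
server = idx1.example.com:9997, idx2.example.com:9997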
No, it's not about "base search" functionality in dashboards. I'm not talking about that. I'm talking about the base search conceptually. But conceptually I'd do something like that: <your base sea... See more...
No, it's not about "base search" functionality in dashboards. I'm not talking about that. I'm talking about the base search conceptually. But conceptually I'd do something like that: <your base search> | eval success=if(your_criteria_for_success,1,0) | stats count as total sum(success) as susccessed by UserAgent | eval ratio=successed/total
Hello @gcusello,
I've followed the same migration method you referred to here, but after the entire setup was migrated from CentOS to RHEL, everything looks fine from the backend except that the web UI does not come up. I have tried multiple things but no luck:
- inputs.conf and server.conf have the same host name as the servers
- port 8000 never comes up to a listening state; other ports, such as 8089 and 8088, are working as expected
- I tried untarring the Splunk package on the new server with the same version, but no luck
The web log and splunk.log files under /opt/splunk/var/log/splunk also don't say anything about the issue.
Hello, thanks for your detailed answer. What I'm trying to achieve is to calculate the ratio between the attempts and the successes. I've actually searched for details about the base search and how to use it in other parts of the query (I also want to use the base search in another panel on the same dashboard), and I couldn't find any information. I would really appreciate it if you could correct my query, as I'm not familiar with a different way to achieve the same result. P.S. In the end, the final query should append the results from this query to the results of a similar query for the BE part, so the final query will be larger and heavier.
Hi, I'm struggling to confirm in the docs whether this is permitted or not. I'm working on a TA for Netgear Wi-Fi; the log format is not brilliant to work with, but I want to extract the ssid (Wi-Fi network name), and there are two formats of log containing it. I have written two extraction classes, EXTRACT-ssid and EXTRACT-wifi_join_leave_ssid. The relevant settings, all in Wi-Fi/default/props.conf, are:
EVAL-src_mac = bssid
EXTRACT-bssid = \"bssid\"\:\"(?<bssid>\w+\-\w+\w+\-\w+\-\w+\-\w+\-\w+)"
EXTRACT-ssid = \"ssid\"\:\"(?<ssid>.*?)"
EXTRACT-wifi_join_leave_ssid = (disconnected\sfrom\s|connected\sto\s)(?<ssid>.+?)(?: with an RSSI|}$)
Both of these extractions appear to work just fine at search time, which really surprised me; I was obsessing over trying to combine a long regex with an OR. I've obviously referred to https://docs.splunk.com/Documentation/Splunk/9.1.1/Admin/Propsconf, which makes it clear that the CLASS must be unique (no problem), but the capture group name gets no mention?
Recently we upgraded Splunk from version 8.0.5 to 9.1.1. After the upgrade we get the message "Failed to load source for Wordcloud visualization" when we select the Wordcloud visualization. We use the most recent version, 1.11 (Wordcloud Custom Visualization | Splunkbase), which should be compatible with version 9.1 according to Splunkbase. Does anyone have the same issue or know a solution for this?
OK. Some housekeeping stuff first:
1) Don't use wildcards at the beginning of your search term! Never! (Or at least not until you fully understand why you shouldn't.) It forces Splunk to read every single event in the given time range, which makes it sloooooow.
2) Inclusion is always better than exclusion, so Request.url!=*twofactor* not only has that dreaded wildcard at the beginning, but is also an exclusion, which again requires parsing every single event (as if point 1 didn't force Splunk to do that anyway).
3) Both of your searches have search index=clientlogs sourcetype=clientlogs Categories="*networkLog*" "Request.url"="*v3/auth*" Request.url!=*twofactor* "Request.actionUrl"!="*dev*" AND "Request.actionUrl"!="*staging*" as the "base search". There's no point in running this heavy (see point 1) search twice. Just run the search once and mark some events if needed (see the sketch at the end of this post).
4) I'm always very cautious about the dedup command. I find it unintuitive and prone to producing "strange" results; I prefer stats values() or similar.
5) dedup by _time - what are you trying to achieve here? Especially since you don't use any binning?
Also, as @ITWhisperer already mentioned, appendcols just pastes a set of fields alongside the given fields without any correlation between them (not to mention subsearch limitations, which might skew the results even further). So - what's the business case? Because this is surely better done another way, most probably with one search (even if we leave aside for now the wildcards and exclusions).
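To illustrate point 3, here is a rough sketch of the one-search approach, assuming the goal is simply to compare how many auth events would be excluded versus kept (the index, sourcetype, and field names are taken from your search; everything else is hypothetical):

index=clientlogs sourcetype=clientlogs Categories="*networkLog*" "Request.url"="*v3/auth*"
| eval excluded=if(like('Request.url', "%twofactor%") OR like('Request.actionUrl', "%dev%") OR like('Request.actionUrl', "%staging%"), 1, 0)
| stats count AS total, sum(excluded) AS excluded_count
| eval kept=total-excluded_count

The expensive part of the search runs once, and you can still slice the result however you need afterwards.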
@gcusello may I know your advice please?
Yes, you can.  Whether you should or not is a different (and better) question. CPU utilization is not a good measure of how well Splunk is using a VM.  Recall that the number of concurrent searches Splunk can run is based on how many CPUs are available.  Reduce that number and you reduce the number of searches you can run. If your maximum concurrent searches is less than the number of CPUs available then a smaller VM might make sense; otherwise, look for other instance options.
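For reference, a rough sketch of the limits.conf settings that tie search concurrency to CPU count (the values shown are, as far as I recall, the defaults; they are illustrative, not a recommendation):

[search]
# Historical search concurrency is roughly:
#   max_searches_per_cpu * number_of_cpus + base_max_searches
max_searches_per_cpu = 1
base_max_searches = 6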
What product are you trying to onboard? If you name it, then perhaps someone who's worked with it before will respond. Have you contacted the vendor to see if they have a private add-on available?
The lack of an add-on does not imply an API is needed. There are other ways to get data into Splunk:
- Install a universal forwarder on the server to send log files to Splunk (see the sketch below)
- Have the server send syslog data to Splunk via a syslog server or Splunk Connect for Syslog
- Use Splunk DB Connect to pull data from the server's SQL database
- Have the application send data directly to Splunk using HTTP Event Collector (HEC)
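For the universal forwarder option, a minimal inputs.conf sketch on the server might look like this (the log path, index, and sourcetype are placeholders, not specific to your product):

# monitor a hypothetical application log file
[monitor:///var/log/myapp/myapp.log]
index = main
sourcetype = myapp:log
disabled = 0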
It depends on what your goal is - just presenting us with a bunch of SPL without a clear definition of what it is intended to do and without some sample events (anonymised of course) makes it challenging for volunteers to spend their time trying to figure out how to help you.
Hi @kattey, how well do you know Splunk? If you're starting from scratch, you need to learn how to ingest data into Splunk and how to search in Splunk. Data sources come from your infrastructure; if you don't have any, you could use an automatic data generator (and it isn't another stack to learn!). The best way is to search the Community answers for basic learning material (e.g. the Search Tutorial) and for getting data in. Then you should define a perimeter to identify the data sources to ingest. Ciao. Giuseppe
Hi @kattey ... please check these things:
1) As I've heard, the Splunk Essentials app has some sample data: https://splunkbase.splunk.com/app/3435
2) You can also find some sample data in this repo: https://github.com/splunk/botsv3
3) There is also an app called EventGen; it is very difficult to configure and the documentation is very poor, so I would suggest it only as a last resort.
4) Splunk Datasets Add-On: this Splunk add-on provides a variety of sample data sets, including security logs, for you to work with. You can download and install the add-on directly from Splunkbase: https://splunkbase.splunk.com/app/3245/
5) Boss of the SOC (BOTS) datasets: you've already mentioned BOTS v1-3, but don't forget about BOTS v4, which was released later. You can find it here: https://github.com/splunk/botsv4
6) Elastic Common Schema (ECS) sample data: although intended for the Elastic Stack, you can adapt these sample logs for use in Splunk. The repository contains logs from various sources, such as network traffic, security events, and web server logs: https://github.com/elastic/ecs/tree/master/generated/samples
7) Sample Log Generator: this tool generates synthetic logs that you can customize to fit your needs. While not real-world data, it can be useful for testing specific scenarios or queries: https://github.com/ErikEJ/SqlQueryStress
8) NIST National Vulnerability Database (NVD) data feeds: NVD provides various data feeds containing vulnerability information. While not logs per se, this data can be useful for exploring security-related data in Splunk: https://nvd.nist.gov/vuln/data-feeds
9) SecRepo: you've already mentioned this repository, but I'd like to emphasize its value, as it contains various sample logs from different sources: https://www.secrepo.com/
10) https://github.com/gfek/Real-CyberSecurity-Datasets
11) https://github.com/shramos/Awesome-Cybersecurity-Datasets
Hope this helps you and other Splunkers. Thanks. Karma / upvotes appreciated, thanks.
Hello, good day. I am very new to Splunk. My team and I want to work on a mini project using Splunk Cloud with the topic "Splunk Enterprise: An organization's go-to in detecting cyberthreats". How/where can I easily get datasets/logs that I can use in Splunk for monitoring and analysis? And what is the best way to go about this topic?
Hi Peter,   how has it been solved and when?   Thanks! Alessandro
OK, so what is your suggestion? How can I achieve my goal?
Thanks for your reply, I appreciate it. I intentionally got all <hosts> from the lookup table first to reduce the search footprint. I get this error: Error in 'lookup' command: Must specify one or more lookup fields.
Do you mean something like this? | spath list.entry{}.fields output=items | mvexpand items | spath input=items | fields - _raw items