All Topics

Hello, I am currently using a lookup table and definition to compare a list of IPs, domains, URLs, etc. against certain fields in Splunk for matches. This query is used in a dashboard with multiple panels. Below is my query after the lookup tables and definitions are established:

index="INDEX" [| inputlookup FILE.csv | return 50000 $indicator] | table action, src_ip, source, dst, destination, dst_ip, dstprt, filehash_md5, filehash_sha1, filehash_sha256, affectedFileHash | stats count

Sometimes I come across a URL that contains an equal sign '=' and it causes the query to fail with one of the following errors:

Error in 'search' command: Unable to parse the search: Comparator '=' has an invalid term on the left hand side: "http://IP/ies/api.cgi?act"=getConfig&id.

Error in 'search' command: Unable to parse the search: unbalanced parenthesis.

Both seem to be tied to the same URLs that have equal signs in them, and I have been unable to find a solution or workaround. The lookup table is put together using Python pandas, so I could always do some data wrangling if need be, but so far my attempts have failed. I also noticed that the search bar in Splunk accepts the URL string if I wrap it in double quotes rather than single quotes, but I am not sure how to make that the standard output when using the inputlookup in the dashboards.
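The `return 50000 $indicator` subsearch emits the lookup values unquoted into the outer search, so any value containing `=` or parentheses breaks the SPL parser. Since the poster already builds the CSV with pandas, one hedged workaround is to embed literal double quotes around each indicator before writing the file, so that `return` emits already-quoted terms. A sketch under the assumption that the lookup column is named `indicator` and the file name is illustrative:

```python
import pandas as pd

# Hypothetical indicator values; "indicator" matches the field name used
# by | inputlookup FILE.csv | return 50000 $indicator in the dashboard.
df = pd.DataFrame({"indicator": [
    "1.2.3.4",
    "http://10.0.0.1/ies/api.cgi?act=getConfig&id",
]})

# Embed literal double quotes in each value (escaping any existing ones),
# so the subsearch emits a quoted term the SPL parser accepts.
df["indicator"] = '"' + df["indicator"].astype(str).str.replace('"', '\\"', regex=False) + '"'

# The csv writer quotes fields containing quote characters itself, so the
# embedded quotes survive the round trip through inputlookup.
df.to_csv("FILE.csv", index=False)
```

This is a sketch of the data-wrangling direction the post mentions, not a confirmed fix; whether Splunk treats the quoted term exactly as intended should be verified against a URL that previously failed.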
Greetings. This may be elementary, but I have our Cisco ASA 5516 sending logs via a syslog server to Splunk. I configured a basic inputs.conf file to do so. The logs get into Splunk, but the parsing isn't very good: I seem to have to extract most fields manually (like I saw in another question about the message_id field). Shouldn't those fields be parsed automatically? I don't see an AnyConnect TA or app except for NVM, which my infrastructure team says we're not using. Any guidance would be much appreciated. Thanks!
I have a query like this:

| mstats rate(request_total) as request_rate prestats=true WHERE index="index-metrics" AND namespace="namespace" AND pod="podname-*" span=30s BY pod

I was looking to mimic the dashboards that Grafana provides for Requests per Second, Response Latency, etc. Can anyone help me with how to achieve this in Splunk? I use Splunk 7.3.2.
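Grafana's requests-per-second panel is essentially `rate()` charted over time, one series per pod. One hedged way to get the same shape in Splunk (a sketch reusing the index, namespace, and metric names from the query above) is to drop `prestats=true` and pivot the result with `timechart`:

```
| mstats rate(request_total) as request_rate
    WHERE index="index-metrics" AND namespace="namespace" AND pod="podname-*"
    span=30s BY pod
| timechart span=30s max(request_rate) AS request_rate BY pod
```

For latency-style panels the same pattern should apply with a different metric and aggregation, e.g. avg() or a percentile over a duration metric, assuming such a metric is being collected.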
Hello all, I'm currently stumped trying to figure out why my notable event token is not working. I verified that the field the token uses exists in the correlation search result (example below).

| stats dc("dest") AS host_count

Notable Event Title: on $host_count$ hosts.

For some reason the token doesn't expand and output the number 13. Can you help me figure this out? Thank you for your time.
So, I'm looking at deploying the Splunk *nix Add-on to gather some data from some Linux servers. I don't want the incoming data to end up in the default index, so I've created a new index on our indexer cluster, and I've added a new local/inputs.conf to override the add-on's default inputs.conf. This has been deployed to a server I'm monitoring and everything is working fine. However, I'm a bit confused as to what I need to do with the instance of the add-on that's supposed to be installed on the search head and indexers. I don't need these to input any data at all (at least, not from the Splunk servers they're sitting on). The documentation says I do need these to run on the indexers as I'm using a universal forwarder and not a heavy forwarder, though I'm not sure why. Do I need to do anything about the inputs.conf? I don't want the instance on the indexers or search head to index the Splunk servers. Do I need to apply the add-on as is, the add-on with my custom inputs.conf, or alter it in some other way? The documentation doesn't seem to mention anything along these lines. Thanks, Dave
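On the search head and indexers the add-on is generally wanted only for its knowledge objects (sourcetypes, field extractions), so one hedged approach is to ship it there with every input switched off. A minimal local/inputs.conf sketch (the scripted-input stanza names follow the Unix add-on's conventions but should be checked against the shipped default/inputs.conf):

```
# local/inputs.conf on the indexers / search head:
# keep the add-on's parsing configuration, collect nothing locally.
[script://./bin/vmstat.sh]
disabled = 1

[script://./bin/cpu.sh]
disabled = 1
```

Any stanza not overridden keeps its default/inputs.conf state, so each enabled default input would need a matching disabled stanza in local/.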
I was removing a different application and accidentally removed these Splunk ES supporting add-ons and other applications. It would be very helpful if you could suggest the steps to get these applications back on my server. Note: there is no backup of these apps.
Is it possible to run a kvstore backup (or other commands that ask for a username/password) using a Windows Task with a gMSA account without having to install/run an external script to grab the hash? We do not want to store credentials on the system. -Tony
Hello, I have a string field like: View. How can I remove the tag so that only View is displayed in the search? Thanks,
I am new to Splunk. Please help me out with this. My dashboard has 5 text fields that allow users to key in data.

1. I want to make all of the text fields optional.
2. I need to construct the search query based on the user input, i.e. dynamically build the search filter from whatever the user supplies.

Ex 1: with input in 3 text fields, I need to generate the query with three filters.
Ex 2: with no input from the user, it should be a generic search without filters.
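A common pattern for optional filters (a sketch, with the token and field names invented for illustration) is to give each text input a `<default>` of `*`; an empty box then contributes a wildcard that matches everything, so the same query works whether 0, 3, or all 5 fields are filled in:

```xml
<form>
  <fieldset>
    <input type="text" token="user_tok">
      <label>User</label>
      <!-- empty input falls back to *, i.e. no filtering -->
      <default>*</default>
    </input>
    <input type="text" token="host_tok">
      <label>Host</label>
      <default>*</default>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>index=main user=$user_tok$ host=$host_tok$ | stats count by user host</query>
        </search>
      </table>
    </panel>
  </row>
</form>
```

With this approach the query string never has to be built conditionally: each token always expands to either the user's value or `*`.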
Hi Community members. I need your help identifying where I am going wrong in a regex field extraction. There are email logs which contain data like:

sender=abc@ibn.com message_id=xxxxxxx@ibn.com _time=13:24:23:445
sender=xyz@xyz.com message_id=yyyyy@xyz.com _time=12:34:13:1344
sender=utr@tbc.com message_id=uuuuu@tbc.com _time=12:12:53:1233

I wrote a regex to extract the data after the @ to see what domains appear in the message_id field. The regex works on https://regex101.com/, but in Splunk I am not getting the expected output: Splunk returns the full message_id value, e.g. xxxx@ibn.com, not ibn.com.

Query: index=email_logs earliest=-30m | regex message_id="(?<=@).+" | stats count by message_id

Current Splunk output:
xxxxxxx@ibn.com
yyyyy@xyz.com
uuuuu@tbc.com

Required output under message_id:
ibn.com
xyz.com
tbc.com
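For what it's worth, `regex` is a filtering command: it keeps or drops events and never rewrites a field, which is why the full message_id comes back unchanged. Extracting the domain needs `rex`, which creates a new field from a named capture group. A sketch using the index and field names from the post (the group name `domain` is my own):

```
index=email_logs earliest=-30m
| rex field=message_id "@(?<domain>.+)$"
| stats count by domain
```

The `stats` then groups on the newly extracted `domain` field rather than on the original message_id.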
Hi Everyone, recently I'm struggling with a small issue. I have a Splunk dashboard where I use a text box to search for a product number. But now I need to check multiple product numbers at the same time, because searching product numbers one by one takes too long. I tried the multiselect input feature but somehow it's not working. Can someone suggest an easy way to do this? I have multiple tables where I have to check whether the product numbers are present, and I need to search several product numbers at once. Thanks in advance, @saibal6
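A multiselect often "doesn't work" here because its token expands to several raw values with no operator between them; setting `valuePrefix`, `valueSuffix`, and `delimiter` turns the selection into a valid OR expression. An illustrative sketch, assuming a field named `product_number` (the token and field names are my own):

```xml
<input type="multiselect" token="prod_tok">
  <label>Product numbers</label>
  <choice value="*">All</choice>
  <default>*</default>
  <!-- selecting "111" and "222" expands the token to:
       product_number="111" OR product_number="222" -->
  <valuePrefix>product_number="</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter> OR </delimiter>
</input>
```

The panel search can then embed it directly, e.g. `index=main ($prod_tok$) | ...`.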
Hello, I need help with what I thought would be easy: I need to execute the 2nd select depending on the result of the 1st select, and append the results of the 2nd to the 1st.

|dbxquery query="select top 1 m.host, s.service_name,
    round(m.used_memory_size/1024/1024/1024) as used_memory_size_gb,
    round(i.memory_allocation_limit/1024/1024/124) as memory_allocation_limit_gb,
    to_int(round(m.used_memory_size * 100 / i.memory_allocation_limit)) percent
  from isp_iop_kpistore.sys.m_service_component_memory as m
  join (select host, port, MEMORY_ALLOCATION_LIMIT
          from isp_iop_kpistore.sys.M_LOAD_HISTORY_SERVICE
          where time >= add_seconds(now(), -60)) as i
    on m.host = i.host and m.port = i.port
  join isp_iop_kpistore.sys.m_services as s
    on i.host = s.host and i.port = s.port
  where m.component = 'Statement Execution & Intermediate Results'
    and m.used_memory_size >= i.memory_allocation_limit * 0.01
  order by round(m.used_memory_size/1024/1024/1024) desc" connection="HANA_MLBSO"
|where PERCENT > 20
|append [ | dbxquery query="select now() from dummy
  /* select top 100 component, category,
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 1) as "TYPE_1",
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 2) as id_1,
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 3) as "TYPE_2",
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 4) as id_2,
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 5) as "TYPE_3",
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 6) as id_3,
     sum(round(inclusive_allocated_size/1024/1024/1024,2)) as allocated_gb,
     sum(round(inclusive_size_in_use/1024/1024/1024, 2)) as used_gb
   from isp_iop_kpistore.sys.m_context_memory
   group by component, category,
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 1),
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 2),
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 3),
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 4),
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 5),
     SUBSTRING_REGEXPR('[^/]+' IN category FROM 1 OCCURRENCE 6)
   order by used_gb desc */" connection="HANA_MLBSO"]

Somehow there is no way I can force it to work the way I want. When the where condition is met, then yes, I get both results appended. However, when the where condition is false, the second SQL still gets executed, the results of the first are erased, and what I get is only the result of the second SQL. I guess I am making some conceptual mistake here. Could you please advise? Kind regards, Kamil
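In SPL, the subsearch inside `append` always runs; it cannot be gated on whether any rows survived the outer `where`. One hedged workaround (a sketch, with both dbxquery bodies abbreviated) is to tag each part and discard the appended rows afterwards whenever the first part came back empty:

```
|dbxquery query="select top 1 ..." connection="HANA_MLBSO"
| where PERCENT > 20
| eval part="first"
| append
    [| dbxquery query="select now() from dummy" connection="HANA_MLBSO"
     | eval part="second"]
| eventstats count(eval(part=="first")) as first_rows
| where first_rows > 0
```

`eventstats` spreads the count of surviving first-query rows across every row, so the final `where` keeps nothing (including the appended rows) when the first query produced no matches above 20 percent.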
What is the recommended sequence for upgrading Splunk Enterprise to 8.0? Should I upgrade all apps & add-ons first and then Splunk Enterprise, or vice versa? On 7.x.x versions I'd always done Enterprise first and then apps/add-ons, but this link says differently: https://docs.splunk.com/Documentation/Splunk/8.0.2/Installation/Python3LowEffort Also, looking at some app documentation, some apps state you need Splunk 8.0 first and then the app upgrade. Confused!
I have been waiting for this message for a long time. The agent is for .NET.
I am trying to test Splunk DB Connect (on a DEVTEST instance: CentOS 8, Splunk 8.0.2, Splunk DB Connect 3.3.0, OpenJDK 11.0.6, mysql-connector-java-5.1.48-bin.jar) connecting to a remote MariaDB 5.5.5 server. I think I followed the install instructions correctly, except I cannot do the "Check DB Connect installation health" steps, as Health Check doesn't seem to be available on my DEVTEST instance (browser console shows: GET http://splunk:8000/en-US/splunkd/__raw/services/search/distributed/groups?output_mode=json&_=1585579039878 402 (Payment Required)).

Using SQL Explorer in Splunk DB Connect, I can select my Connection, Catalog and Table from the dropdowns on the left, which auto-creates a SELECT statement (the connection must be up and properly authenticated to populate the Catalog and Table dropdowns), but when I click Run no results are returned and the following is written to splunk_app_db_connect_dbxquery.2020-03-30.log:

2020-03-30 15:32:58.031 [main] INFO com.splunk.dbx.command.DbxQueryServer - operation= connection_name= stanza_name= action=dbxquery_server got request
2020-03-30 15:32:58.063 [main] INFO com.splunk.dbx.command.DbxQueryServer - operation= connection_name= stanza_name= action=dbxquery_server got request
2020-03-30 15:32:58.093 [main] INFO com.splunk.dbx.command.DbxQueryServer - operation= connection_name= stanza_name= action=dbxquery_server got request
2020-03-30 15:32:58.121 [main] INFO com.splunk.dbx.command.DbxQueryServer - operation= connection_name= stanza_name= action=dbxquery_server got request
2020-03-30 15:32:58.156 24727@splunk [DBX-QUERY-WORKER-60] ERROR com.splunk.dbx.command.DbxQueryCommand - operation= connection_name= stanza_name= action=dbxquery_command failed to get connection com.splunk.dbx.exception.NotFoundException: Can not find object MISPReader of type connection.
at com.splunk.dbx.command.DbxQueryCommand.lambda$getConnection$1(DbxQueryCommand.java:180)
at java.base/java.util.Optional.orElseThrow(Optional.java:408)
at com.splunk.dbx.command.DbxQueryCommand.getConnection(DbxQueryCommand.java:180)
at com.splunk.dbx.command.DbxQueryCommand.generate(DbxQueryCommand.java:359)
at com.splunk.search.command.GeneratingCommand.process(GeneratingCommand.java:183)
at com.splunk.search.command.ChunkedCommandDriver.execute(ChunkedCommandDriver.java:110)
at com.splunk.search.command.AbstractSearchCommand.run(AbstractSearchCommand.java:50)
at com.splunk.search.command.GeneratingCommand.run(GeneratingCommand.java:15)
at com.splunk.dbx.command.DbxQueryCommand.runCommand(DbxQueryCommand.java:256)
at com.splunk.dbx.command.DbxQueryServer.lambda$handleQuery$1(DbxQueryServer.java:144)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)

I've installed the MySQL client on the Splunk server and can use it to connect and run the same SELECT query successfully, so permissions on the DB are correct. When I try to set the dbxquery log level to debug, the UI reports:

('%s stanza in commands conf file is not valid because there must exist one and only one attribute with a value -DDBX_COMMAND_LOG_LEVEL=${LOG_LEVEL}', 'dbxquery')

I've been searching for a few hours trying to find an answer but haven't been able to. Any suggestions welcome. Thanks, Joe
Hi All, for a report I would like to read a value from a website daily: https://www.broadcom.com/support/security-center/definitions?pid=sep14

With the app "Website Input" it works great on a Windows platform when I choose Firefox as my browser. Unfortunately, it doesn't work on our production Linux platform. There I can only choose the built-in client as the browser; Firefox and Chrome don't work. If I then enter the CSS selector, I don't get a match, although the website can be reached.

Result preview:
Response Code: 200
Response Time: 22.4 ms
Response Size: 3 KB
Encoding: utf-8
URL: https://www.broadcom.com/support/security-center/definitions?pid=sep14
No fields found

CSS selector: dd.col-sm-7:nth-child(6)

Any idea? Thanks for your help.
I am trying to add some field extractions for a log file created by the Entrust IdentityGuard authentication solution. Currently I read it in with a sourcetype of log4j, the format the application says it writes its logs in. Things look okay, but the fields specific to the log are not being extracted. I am looking into how to build a custom extraction myself because I have always wanted to learn how it works, but figured I would also post the question here to get some tips and best practices. Here is an example of one event in the log file:

[2020-03-29 18:37:51,020] [IG Audit Writer] [INFO ] [IG.AUDIT] [AUD6012] [UserNameHere] EventMessageHere

Basically, all the fields I want are wrapped in square brackets [] and the message itself is just added at the end with no square brackets. I think I will have to build out my own custom sourcetype in SplunkHome\etc\system\local\props.conf that will just be a copy of the log4j stanza, but with either a REPORT key that references a corresponding extraction in transforms.conf, or an EXTRACT key with the regex inline. Am I on the right path?
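For a fixed bracketed layout like this, an inline EXTRACT in props.conf is usually enough; a REPORT plus transforms.conf pair mainly pays off when the regex is reused across sourcetypes. A hedged sketch matched against the sample event above (the sourcetype name and all field names are my own inventions):

```
# props.conf
[entrust:identityguard]
EXTRACT-ig_fields = ^\[(?<log_time>[^\]]+)\]\s+\[(?<thread>[^\]]+)\]\s+\[(?<level>[^\]]+)\]\s+\[(?<logger>[^\]]+)\]\s+\[(?<event_id>[^\]]+)\]\s+\[(?<user>[^\]]+)\]\s+(?<message>.+)$
```

Against the sample this would yield thread="IG Audit Writer", event_id="AUD6012", user="UserNameHere", and so on; note that level captures the padded "INFO " including its trailing space, which may warrant a trim.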
Hi, I am setting up the Deep Learning Toolkit (https://splunkbase.splunk.com/app/4607/) but the connection to the Docker instance does not work. What I did:

- Installed Docker and added the needed containers
- Docker runs as root
- Installed Splunk + Machine Learning Toolkit (+ Python math libraries) + Deep Learning Toolkit
- The user running Splunk belongs to the docker group

When I fill in the form on the app's setup page with the information about Docker (single-instance type), I get this error:

UnixHTTPConnectionPool(host='localhost', port=None): Max retries exceeded with url: /v1.35/_ping (Caused by : [Errno 13] Permission denied)

In the internal logs, I also see entries like this:

ConnectionError: UnixHTTPConnectionPool(host='localhost', port=None): Max retries exceeded with url: /v1.35/containers/json?size=0&filters=%7B%22label%22%3A+%5B%22mltk_container%22%5D%7D&limit=-1&all=0&trunc_cmd=0 (Caused by : [Errno 13] Permission denied)

Can you help me make this work?
Let's say I have 4 indexers at one site 'AB' and 4 indexers at another site 'CD' (the DR site), with:

site_replication_factor = origin:2,total:3
site_search_factor = origin:1,total:2

Question 1: I understand from the documentation that in a situation where 3 of my indexers go down at site 'AB', my 4th indexer will keep ingesting data and will hold copies in reserve to be distributed when the other indexers come back. Please confirm.

Question 2: What if all 4 of my indexers go down at site 'AB'? How would ingestion be managed then? Would the cluster master automatically direct ingestion to the DR site 'CD' indexers?

Question 3: Since I have a site_replication_factor of origin:2, total:3, let's say the two indexer machines at site 'AB' that both hold copies of the same bucket go down. Now that all (two) copies of that bucket are unavailable at site 'AB', would the cluster master instruct site 'CD' to send a copy so it can be replicated to the 2 running indexers at site 'AB'?
This isn't working properly - the color is not changing based on the value. Any ideas? I have the latest version of the app running on 7.3.1. Thanks