All Topics


Hello dear Splunkers, I have a small question about the AWS app. In the Security tab there are some views that use custom commands such as "command_nadefault.py" or "command_acl_inputlookup.py". I wanted to know, if it's possible to ask, what their use is in the different dashboards that are available in the app. I'm not that familiar with Python, so I'm not able to understand how they are being used in the app. Thanks in advance, Etai
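One way to see what those scripts do without reading the Python: custom search commands are registered in the app's commands.conf, which maps an SPL command name to its script file. A minimal sketch of what such a stanza typically looks like (the command name "nadefault" is an assumption inferred from the file name, not confirmed from the app):

[nadefault]
filename = command_nadefault.py

Once you know the command name from the real stanza, you can search the dashboard XML for "| nadefault" (or whatever the stanza is called) to see exactly where and with what arguments each dashboard invokes it.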
Hello, I have a JSON log and I cannot figure out how to break the lines correctly. This is what it looks like: (sample screenshot not shown). How can I break the lines so that each event is on its own?
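Since the sample isn't shown, here is only a hedged sketch: for multi-line JSON objects, a props.conf stanza along these lines is a common starting point (the sourcetype name my_json is hypothetical, and LINE_BREAKER assumes each event begins with an opening brace at the start of a line):

[my_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\{
KV_MODE = json
TRUNCATE = 0

Adjust LINE_BREAKER to whatever actually marks the start of each of your events; only the first capture group is discarded, so the opening brace stays with the next event.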
Hi, I have a scheduled report that runs daily, but it often fails! The number of events is about 80,000,000; the job inspection log is attached to the post. Any idea? Thanks
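One hedged place to start, assuming the failures are scheduler-side rather than in the search itself: the scheduler records the outcome of every run in _internal, so a search like the following (the saved search name is a placeholder) shows whether runs are failing, being skipped, or hitting quota:

index=_internal sourcetype=scheduler savedsearch_name="your_report_name"
| stats count by status, reason

With roughly 80,000,000 events, common culprits are the search overrunning its scheduled window or hitting memory limits, which this breakdown usually makes visible.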
Hello everyone, I have an ISF (Independent Stream Forwarder) sending stream logs to my Splunk indexer. The logs in "/opt/streamfwd/var/log" show that this error is happening:
2021-12-03 22:56:08 ERROR [140397528004352] (HTTPRequestSender.cpp:1408) stream.SplunkSenderHTTPEventCollector - (#7) HTTP request [https://splunk:8088/services/collector/event?index=_internal&host=stream&source=stream&sourcetype=stream:log] response = 400 Bad Request {"text":"Incorrect index","code":7,"invalid-event-number":1}
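That 400 "Incorrect index" response generally means the HEC token is not allowed to write to the index named in the request URL (here _internal). A hedged sketch of what the token stanza in inputs.conf on the Splunk side looks like, where the indexes setting lists what the token may write to (stanza name and token value are placeholders):

[http://stream_token]
token = <your-hec-token-guid>
indexes = _internal, main
index = main

Either add the target index to the token's allowed list, or reconfigure the ISF to send to an index the token already permits.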
Hi, I am running multiple applications on one JVM (Tomcat). How can I segregate each application when there is a single Java agent running?
I have a search query that looks like this:
index="myindex" sourcetype="mysource" earliest=@d latest=now
| append [ search index="myindex" sourcetype="mysource" earliest=-1mon@mon latest=@mon | stats avg(Price) as past_avg by ID ]
| stats values(*) as * by ID
| table Date, ID, Price, past_avg
This gives me: (result table not shown). What I'm trying to do is display only those values in the Price column that are smaller than past_avg. Does anyone know how I could achieve that?
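If the aim is to show a Price value only when it is below past_avg, an eval after the final stats is one way; this hedged variant keeps every row but blanks the Price cell when the condition fails:

index="myindex" sourcetype="mysource" earliest=@d latest=now
| append [ search index="myindex" sourcetype="mysource" earliest=-1mon@mon latest=@mon | stats avg(Price) as past_avg by ID ]
| stats values(*) as * by ID
| eval Price=if(Price < past_avg, Price, null())
| table Date, ID, Price, past_avg

Swap the eval for | where Price < past_avg if you would rather drop those rows entirely.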
Hi, Thank you for all the responses, and glad this is helping me resolve issues while learning Splunk. Appreciate your help.
1. How do I change the source for only one column in a table and filter/sort it based on another column? Below, I need to change the source for the Mainline column and associate the data with the Project column. I also need to filter and sort the data alphabetically (see the sort sketch after this post).
<table>
  <search>
    <query>index="wtqlty" source=pdf-fc-002-rh sourcetype="release_pdf_results_json"
| table pdf_name, pdf_state, main_line, Req_report, patch_name, started_on, Stream_start, Handover, planned_stopped_on, fco_state, snapshot, stakeholders.project_leader.name, stakeholders.developer.name, air_issues{}.short_description, Quality, questionnaire
| rename pdf_name AS PDF, pdf_state AS "PDF State", main_line AS "Mainline", patch_name AS Project, started_on AS "PDF start", planned_stopped_on AS "Planned Stop", fco_state AS "FCO State", stakeholders.project_leader.name AS PL, stakeholders.developer.name AS Developer, air_issues{}.short_description AS Description, questionnaire AS Questionnaire</query>
    <earliest>0</earliest>
    <latest></latest>
  </search>
  <option name="drilldown">cell</option>
  <option name="refresh.display">progressbar</option>
  <format type="color" field="FCO State">
    <colorPalette type="expression">case (match(value,"DRAFT_DEV"), "#DC4E41", match(value,"ACCEPTED"), "#53A051", true(), "#C3CBD4")</colorPalette>
  </format>
  <format type="color" field="PDF State">
    <colorPalette type="expression">case (match(value,"DRAFT_DEV"), "#DC4E41", match(value,"accepted"), "#53A051", true(), "#C3CBD4")</colorPalette>
  </format>
  <drilldown>
    <condition field="PDF State"><set token="form.pdf_state_token">$click.value2$</set></condition>
    <condition field="PL"><set token="form.PL">$click.value2$</set></condition>
    <condition field="FCO State"><set token="form.fco_name">$click.value2$</set></condition>
    <condition field="Developer"><set token="form.developer">$click.value2$</set></condition>
    <condition field="Snapshot"><set token="form.snap_shot">$click.value2$</set></condition>
    <condition field="Mainline">
      <set token="form.main_line">$click.value2$</set>
      <set token="form.show_clear_filter">*</set>
    </condition>
    <condition field="Project">
      <set token="form.patch_name">&gt;$click.value2$</set>
      <link target="_blank">https://stream-dashboard.asml.com/db/overall/$click.value2$/</link>
    </condition>
    <condition field="PDF">
      <set token="form.pdf_name">$click.value$</set>
      <link target="_blank">https://at.patchtooling.asml.com/pdf/RH/ML/patches/$row.Project$/</link>
    </condition>
  </drilldown>
</table>
2. How do I change the format of text inside a cell? Say the text in a cell is a URL; I need to underline it so users know it's a clickable link.
3. How can we strike through a number in a cell and update it with a new number (in the case of weeks)?
Appreciate your help. Thank you.
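For the sorting half of question 1, a hedged fragment: appending a sort to the end of the query orders the rows before the table renders (sort 0 lifts the default 10,000-row cap):

... | rename main_line AS "Mainline", patch_name AS Project, ...
| sort 0 +Project

For questions 2 and 3, plain Simple XML options cannot underline or strike through cell text; both generally require a custom table cell renderer in JavaScript plus CSS, so treat those as separate dashboard-extension work.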
I'm trying to write a search that will return a table where all average values of the field Price, grouped by ID, are lower than they were 1 month ago. This is my attempt:
index="myindex" sourcetype="mysourcetype" earliest=-1mon@mon latest=@mon
| stats avg(Price) as avg by ID
| where avg > [search index="myindex" sourcetype="mysourcetype" earliest=@d | stats avg(Price) as new_avg by ID | return $new_avg]
| table *
This, however, always returns 0 results even though there are events in both time periods. I even tried substituting the subsearch with a fixed number, and that produces a table. Does anyone know why this isn't working and maybe how to fix it?
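The likely cause of the 0 results: return $new_avg hands back a single value (from the first row of the subsearch), so the where clause compares every ID's average against one number rather than against that ID's own current average. A hedged rework using append, so both averages land on the same row per ID before the comparison:

index="myindex" sourcetype="mysourcetype" earliest=-1mon@mon latest=@mon
| stats avg(Price) as past_avg by ID
| append [ search index="myindex" sourcetype="mysourcetype" earliest=@d | stats avg(Price) as new_avg by ID ]
| stats values(past_avg) as past_avg, values(new_avg) as new_avg by ID
| where new_avg < past_avg

Flip the final comparison if the intent was the other direction.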
I have a first query.
First Query:
search criteria
| rex field=_raw ".* IPAddress=(?<IPAddress>.+?) "
| table IPAddress
The above query returns a table with all IP addresses. I want this data to be used as a filter in the second query. How can we write the two queries as a single one?
Second Query:
search criteria
| rex field=_raw ".* IPAddress=(?<IPAddress>.+?)\""
| where IPAddress in (first query results)
| rex field=_raw ".* value=(?<value>.+?)\""
| table IPAddress, value, _time
I tried the below, but it returns empty results:
<first search>
| rex field=_raw ".* IPAddress=(?<IPAddress>.+?)\""
| where IPAddress in ([search <second search> | rex field=_raw ".* IPAddress=(?<IPAddress>.+?) " | fields IPAddress ])
| rex field=_raw ".* value=(?<value>.+?)\""
| table IPAddress, value, _time
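One common pattern, as a hedged sketch: let the subsearch return just the IPAddress field and feed it to | search, which expands the results into (IPAddress="..." OR IPAddress="...") terms automatically. Note the outer search should be the second query (the one producing the final table), with the first query inside the brackets:

search criteria
| rex field=_raw ".* IPAddress=(?<IPAddress>.+?)\""
| search [ search criteria | rex field=_raw ".* IPAddress=(?<IPAddress>.+?) " | dedup IPAddress | fields IPAddress ]
| rex field=_raw ".* value=(?<value>.+?)\""
| table IPAddress, value, _time

Keep in mind subsearches are capped (10,000 results by default), so this quietly breaks down if the first query can return more IPs than that.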
Could someone who is an SPL expert help me reduce this:
| eval dest=replace(dest, "dstdomain|src|any-of|dst|# ", ""),
  dest=replace(mvjoin(dest, " "), "/32", "|"),
  dest=split(dest, "|"),
  dest=split(dest, " ")
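Without sample data this is only a hedged guess at intent, but if the goal is "strip the keywords and the /32 suffixes, then split on spaces", the mvjoin can come first and the two replaces collapse into one regex pass:

| eval dest=split(replace(mvjoin(dest, " "), "dstdomain|src|any-of|dst|#\s|/32", ""), " ")

If /32 was meant to act as an extra separator rather than simply disappear, keep that replace separate, since folding it in changes the result.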
I am looking for an understandable Splunk_TA_linux deployment guide for a single-site instance:
1 deployer
3 clustered search heads
1 cluster master
3 clustered indexers
1 deployment server
I searched and could not find an install guide for this add-on. I have found some, but none deep enough to prevent a messy end. Help would be greatly appreciated.
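A hedged sketch for the forwarder-facing part: the add-on's data-collection pieces usually go out to the Linux hosts from the deployment server, with a serverclass.conf along these lines (class name and whitelist are placeholders); the search-time parts of the TA go to the search head cluster via the deployer, and the index-time parts to the indexers via the cluster master's master-apps:

[serverClass:linux_hosts]
whitelist.0 = linux-*

[serverClass:linux_hosts:app:Splunk_TA_linux]
restartSplunkd = true
stateOnClient = enabled

That three-way split (deployment server for forwarders, deployer for search heads, cluster master for indexers) is the usual shape for distributing any TA in a clustered environment.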
Hi! I've been struggling a lot with a pretty simple problem, but my Splunk rex skills are insufficient for the task. I want to match and list ANY value containing letters, digits, and characters between parentheses at the end of line/end of string. Examples:
bla bla bla (My Value0/0)
bla bla blb (My OtherValue0/1)
bla blb blc (My thirdValue0/0/0/0)
As you can see, the text BEFORE the ending value inside parentheses can be anything. There can also be MULTIPLE similar values within parentheses along the string, but I ONLY want to match the one at end of line ($). The match must include every letter, space, number, or typically "/" character between the parentheses. Using other regex dev tools I get a fairly decent result with a simple pattern like this: \(.*\)$
\( matches the character ( (index 40 decimal, 28 hex, 50 octal) literally (case sensitive)
. matches any character (except for line terminators)
* matches the previous token between zero and unlimited times, as many times as possible, giving back as needed (greedy)
\) matches the character ) (index 41 decimal, 29 hex, 51 octal) literally (case sensitive)
$ asserts position at the end of a line
I have also used variants of this, and they all end up working very well in regex testers and dev tools, and also in Linux (when pasting the entire table of messages into a file and applying them). But not in Splunk. I believe there is a big coin-drop moment somewhere along my Splunk path when everything will make sense to me; unfortunately I'm not there yet. Please help me out!
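As a hedged sketch, the usual Splunk-side fix is to stop the greedy .* from crossing parentheses and to anchor explicitly; with rex that looks like this (the field name paren_value is just an illustration):

... | rex field=_raw "\((?<paren_value>[^)]+)\)\s*$"

[^)]+ matches everything up to the next closing parenthesis without crossing it, so only the final (...) group is captured, and \s* before $ absorbs any trailing whitespace or carriage returns that often make $ fail in Splunk even though the same pattern works in desktop regex testers.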
Hello, We recently upgraded our controller to version 21.4.8-1411. After the upgrade, however, our SMS alerts are not working. According to the health rule's Evaluation Events > Actions Executed, it says "SMS Message Sent", but we're not getting any text alerts. Is this a known issue?
I have a somewhat unwieldy log file I'm trying to wrangle. Each log entry is contained between two lines, like so:
<TIMESTAMP> BEGIN LOG DECODE
log data
log data
log data
<TIMESTAMP> END LOG DECODE
What's the best way to grab everything in between and start to extract fields and such?
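A hedged props.conf sketch (the sourcetype name log_decode is hypothetical), using the BEGIN marker as the event boundary so each BEGIN..END block becomes one event:

[log_decode]
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = BEGIN\sLOG\sDECODE

Once events arrive whole, field extraction can start at search time, for example with a rex that captures the payload between the markers (the field name body is illustrative):

... | rex "BEGIN LOG DECODE(?<body>[\s\S]*?)END LOG DECODE"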
Hey all, I have 2 source types with the following properties:
source_1: id, value
source_2: name, description
So my events might look similar to:
source_1: id=abc-123, value="blah"
source_2: name=abc-123, description="some_description"
The values of source_1.id and source_2.name are equal. I'm trying to display the id/name, description, and value in a table. I've come up with the following query to do so:
index=main sourcetype=source_2
| rename name AS id
| join id [search index=main sourcetype=source_1 id=*]
| table id, value, description
Is my query the best way to achieve this? Are there any alternatives?
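A common alternative, as a hedged sketch: join is bounded by subsearch limits, so many people normalize the key and let stats stitch the two sourcetypes together instead:

index=main (sourcetype=source_1 OR sourcetype=source_2)
| eval id=coalesce(id, name)
| stats values(value) as value, values(description) as description by id
| table id, value, description

This usually scales better than join; the trade-off is that every field you want in the output has to be pulled through the stats explicitly.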
Hi All, I had this error and it took a while to understand and fix it. Here is my environment:
Splunk 8.0.5
Splunk DB Connect 3.6.0
Java /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.275.b01-0.el6_10.x86_64/jre/bin/java
Red Hat Enterprise Linux Server release 6.10 (Santiago)
Target DB is PostgreSQL
We have several queries, all running properly; just one was giving the error. The query is the following:
index=myindex sourcetype=mysourcetype etc…
| dbxoutput output=my_stanza
"my_stanza" refers to one present in db_outputs.conf. The error on the Splunk search head was:
rx.exceptions.OnErrorNotImplementedException
at rx.internal.util.InternalObservableUtils$ErrorNotImplementedAction.call(InternalObservableUtils.java:386)
at rx.internal.util.InternalObservableUtils$ErrorNotImplementedAction.call(InternalObservableUtils.java:383)
at rx.internal.util.ActionSubscriber.onError(ActionSubscriber.java:44)
at rx.observers.SafeSubscriber._onError(SafeSubscriber.java:153)
at rx.observers.SafeSubscriber.onError(SafeSubscriber.java:115)
at rx.exceptions.Exceptions.throwOrReport(Exceptions.java:212)
at rx.observers.SafeSubscriber.onNext(SafeSubscriber.java:139)
at rx.internal.operators.OperatorBufferWithSize$BufferExact.onCompleted(OperatorBufferWithSize.java:128)
at rx.internal.operators.OnSubscribeMap$MapSubscriber.onCompleted(OnSubscribeMap.java:97)
at rx.internal.operators.OperatorPublish$PublishSubscriber.checkTerminated(OperatorPublish.java:423)
at rx.internal.operators.OperatorPublish$PublishSubscriber.dispatch(OperatorPublish.java:505)
at rx.internal.operators.OperatorPublish$PublishSubscriber.onCompleted(OperatorPublish.java:305)
at rx.internal.operators.OnSubscribeFromIterable$IterableProducer.slowPath(OnSubscribeFromIterable.java:134)
at rx.internal.operators.OnSubscribeFromIterable$IterableProducer.request(OnSubscribeFromIterable.java:89)
at rx.Subscriber.setProducer(Subscriber.java:211)
at rx.internal.operators.OnSubscribeFromIterable.call(OnSubscribeFromIterable.java:63)
at rx.internal.operators.OnSubscribeFromIterable.call(OnSubscribeFromIterable.java:34)
at rx.Observable.unsafeSubscribe(Observable.java:10327)
at rx.internal.operators.OperatorPublish.connect(OperatorPublish.java:214)
at rx.observables.ConnectableObservable.connect(ConnectableObservable.java:52)
at com.splunk.dbx.command.DbxOutputCommand.process(DbxOutputCommand.java:161)
at com.splunk.search.command.StreamingCommand.process(StreamingCommand.java:58)
at com.splunk.search.command.ChunkedCommandDriver.execute(ChunkedCommandDriver.java:109)
at com.splunk.search.command.AbstractSearchCommand.run(AbstractSearchCommand.java:50)
at com.splunk.search.command.StreamingCommand.run(StreamingCommand.java:16)
at com.splunk.dbx.command.DbxOutputCommand.main(DbxOutputCommand.java:100)
Caused by: java.lang.NullPointerException
at java.math.BigDecimal.<init>(BigDecimal.java:809)
at com.splunk.dbx.service.output.OutputServiceImpl.setParameterAsObject(OutputServiceImpl.java:288)
at com.splunk.dbx.service.output.OutputServiceImpl.setParameter(OutputServiceImpl.java:270)
at com.splunk.dbx.service.output.OutputServiceImpl.processInsertion(OutputServiceImpl.java:216)
at com.splunk.dbx.service.output.OutputServiceImpl.output(OutputServiceImpl.java:76)
at rx.internal.util.ActionSubscriber.onNext(ActionSubscriber.java:39)
at rx.observers.SafeSubscriber.onNext(SafeSubscriber.java:134)
... 19 more
Looking at search.log from the job inspector:
12-03-2021 17:26:18.187 INFO DispatchExecutor - END OPEN: Processor=noop
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: Exception in thread "main" java.lang.IllegalStateException: I/O operation on closed writer
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: at com.splunk.search.command.AbstractWriteHandler.checkValidity(AbstractWriteHandler.java:100)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: at com.splunk.search.command.AbstractWriteHandler.flush(AbstractWriteHandler.java:228)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: at com.splunk.search.command.ChunkedWriteHandler.flush(ChunkedWriteHandler.java:69)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: at com.splunk.search.command.AbstractWriteHandler.close(AbstractWriteHandler.java:233)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: at com.splunk.search.command.ChunkedCommandDriver.execute(ChunkedCommandDriver.java:120)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: at com.splunk.search.command.AbstractSearchCommand.run(AbstractSearchCommand.java:50)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: at com.splunk.search.command.StreamingCommand.run(StreamingCommand.java:16)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: at com.splunk.dbx.command.DbxOutputCommand.main(DbxOutputCommand.java:100)
I solved it in this way (adding fillnull):
index=myindex sourcetype=mysourcetype etc…
| fillnull value=0.00 mbytes_in
| fillnull value=0.00 mbytes_out
| dbxoutput output=my_stanza
There were 2 records in the extraction with the "mbytes_in" and "mbytes_out" fields empty. I am sure it was working properly before the upgrade to Splunk DB Connect 3.6.0. The target DB is PostgreSQL, and the table is defined as below; as you can see, "mbytes_in" and "mbytes_out" can accept NULL values (and I can see several records in the PostgreSQL DB populated in the past with "mbytes_in" and "mbytes_out" having NULL values). Here is the table definition in PostgreSQL:
CREATE TABLE myschema.mytable (
  field01 integer NOT NULL,
  field02 character varying(6) NOT NULL,
  field03 character varying(6),
  field04 character varying(15) NOT NULL,
  field05 timestamp(6) with time zone NOT NULL,
  mbytes_in numeric(12, 2),
  mbytes_out numeric(12, 2),
  field06 character varying(15) NOT NULL,
  field07 character varying(50),
  field08 character varying(50),
  field09 character varying(50) NOT NULL,
  field10 character varying(255) NOT NULL,
  field11 character varying(15) NOT NULL,
  field12 character varying(255),
  field13 date NOT NULL,
  field14 character varying(255) NOT NULL,
  CONSTRAINT my_pkey PRIMARY KEY (field01)
) WITH (OIDS = FALSE) TABLESPACE mytablespace;
ALTER TABLE myschema.mytable OWNER to myuser;
GRANT ALL ON TABLE myschema.mytable TO myuser;
The log error that pointed me to a solution was the following:
at com.splunk.dbx.command.DbxOutputCommand.main(DbxOutputCommand.java:100)
Caused by: java.lang.NullPointerException
at java.math.BigDecimal.<init>(BigDecimal.java:809)
By the way, no valuable logs were present in the Splunk _internal index; usually when an SPL query fails to insert into our PostgreSQL DB, I find valuable information there like SQL codes and SQL errors. This time it was not present. I hope this post will help someone having the same issue. Best Regards, Edoardo
I've configured via the app instructions and pushed the files I want to be tracked. Yay. The app install went well also. The issue I'm having is that the push from Splunk to the repository is failing with these messages:
EXITCODE: 0
repo_size=181149490
COMMAND: git push
OUTPUT: fatal: could not read Username for 'https://github.ibm.com': No such device or address
EXITCODE: 128
An exception of type Exception occurred. Arguments: ('Error occured - is authentication to remote site correct? and network path available?',) runtime=0.23 status=1
Is there additional configuration I need to do?
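The "could not read Username" line is git itself failing non-interactive authentication, not anything Splunk-specific: there are no stored credentials for that HTTPS remote when the app runs git push unattended. A hedged sketch of one common fix, embedding a personal access token in the remote URL (user, token, org, and repo below are placeholders):

git remote set-url origin https://<user>:<personal-access-token>@github.ibm.com/<org>/<repo>.git
git push

A credential helper (git config credential.helper store) or switching the remote to SSH with a deploy key are alternatives, and SSH tends to be the cleaner choice for an unattended service account.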
I'm new to Splunk; how can I import syslog from my local computer into Splunk?
- When I search, it says this can be done via a universal forwarder, but I want to collect my syslog logs on localhost.
- I opened UDP port 514 and created my settings in Splunk, but nothing shows up in search.
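A hedged sketch of the inputs.conf stanza the UI settings should amount to (the index name is just the default assumption):

[udp://514]
sourcetype = syslog
index = main
connection_host = ip

Two common reasons nothing appears: on Linux, binding ports below 1024 requires Splunk to run as root (using a higher port such as 5514, or a local redirect, is the usual workaround), and a host firewall may still be dropping the packets. Also confirm you are searching the index the input actually writes to.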
Hi, We have 1000 EC2 instances; how can we install forwarders on all instances in one go? If we use a script, from where do we need to push the forwarder config to all 1000 instances?
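There is no built-in one-shot mass install, so the usual pattern is two-part: script or bake the universal forwarder install itself (EC2 user data, SSM, Ansible, or similar), and have every instance phone home to a deployment server, which then owns all further forwarder configuration. A hedged sketch of the deploymentclient.conf each UF would be installed with (hostname and port are placeholders):

[deployment-client]

[target-broker:deploymentServer]
targetUri = deploy-server.example.com:8089

After that, inputs and outputs are assigned centrally on the deployment server through serverclass.conf rather than pushed to each of the 1000 instances individually.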
Hi, With the HEC token we see loss in logs.
1. Is there a way to get the logs that were lost?
2. How will we know that there is log loss?
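For question 2, a hedged starting point: splunkd logs HEC-level failures to _internal, so a search along these lines surfaces rejected or failed events (review the raw messages for the specific reason, such as a bad index or malformed payload):

index=_internal sourcetype=splunkd component=HttpInputDataHandler log_level=ERROR
| stats count by log_level

For question 1, events Splunk already rejected are generally not recoverable from the Splunk side; preventing loss depends on the sender retrying on non-2xx responses, and HEC's indexer acknowledgment feature (enabled per token, useACK) lets a client hold data until Splunk confirms it was indexed.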