All Topics


Hello, we had this error on an output query set up in Splunk DB Connect. Basically, the Splunk query inserts data into an external database.     2023-11-08 01:58:32.712 +0100 [QuartzScheduler_Worker-9] ERROR org.easybatch.core.job.BatchJob - Unable to read next record java.lang.RuntimeException: javax.xml.stream.XMLStreamException: ParseError at [row,col]:[836463,5] Message: Premature EOF at com.splunk.ResultsReaderXml.getNextEventInCurrentSet(ResultsReaderXml.java:128) at com.splunk.ResultsReader.getNextElement(ResultsReader.java:87) at com.splunk.ResultsReader.getNextEvent(ResultsReader.java:64) at com.splunk.dbx.server.dboutput.recordreader.DbOutputRecordReader.readRecord(DbOutputRecordReader.java:82) at org.easybatch.core.job.BatchJob.readRecord(BatchJob.java:189) at org.easybatch.core.job.BatchJob.readAndProcessBatch(BatchJob.java:171) at org.easybatch.core.job.BatchJob.call(BatchJob.java:101) at org.easybatch.extensions.quartz.Job.execute(Job.java:59) at org.quartz.core.JobRunShell.run(JobRunShell.java:202) at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573) Caused by: javax.xml.stream.XMLStreamException: ParseError at [row,col]:[836463,5] Message: Premature EOF at com.sun.org.apache.xerces.internal.impl.XMLStreamReaderImpl.next(XMLStreamReaderImpl.java:599) at com.sun.xml.internal.stream.XMLEventReaderImpl.nextEvent(XMLEventReaderImpl.java:83) at com.splunk.ResultsReaderXml.getResultKVPairs(ResultsReaderXml.java:306) at com.splunk.ResultsReaderXml.getNextEventInCurrentSet(ResultsReaderXml.java:124) ... 9 common frames omitted     The issue was related to a query timeout. We have set up the upsert_id in the Splunk DB Connect output configuration so that Splunk performs an insert/update.
Looking into the _internal log, we understood that when using the upsert_id, Splunk performs a SELECT query for each record it has to insert and then commits every 1000 records (by default):   2023-11-10 01:22:28.215 +0100 [QuartzScheduler_Worker-12] INFO com.splunk.dbx.connector.logger.AuditLogger - operation=dboutput connection_name=SPLUNK_CONN stanza_name=SPLUNK_OUTPUT state=success sql='SELECT FIELD01,FIELD02,FIELD03 FROM MYSCHEMA.MYTABLE WHERE UPSERT_ID=?'       2023-11-10 01:22:28.258 +0100 [QuartzScheduler_Worker-12] INFO com.splunk.dbx.connector.logger.AuditLogger - operation=dboutput connection_name=SPLUNK_CONN stanza_name=SPLUNK_OUTPUT state=success sql='INSERT INTO MYSCHEMA.MYTABLE (FIELD01,FIELD02,FIELD03) values (?,?,?)'     The upsert_id is very useful to avoid an SQL duplicate-key error, and whenever you want to recover data because the output failed for some reason: you simply re-run the output query, and if a record already exists it is replaced in the SQL table. The side effect is that the WHERE condition of the SELECT statement can be very inefficient once the database table gets huge. The solution is to create an SQL index on the upsert_id field in the output database table.   The output run went from 11 minutes to 11 seconds, avoiding the Splunk DB Connect timeout (30 seconds by default, applied to every commit).   Best Regards, Edoardo
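For reference, the fix described above can be sketched as a plain SQL index; the schema, table, and column names are taken from the audit-log example and should be adjusted to your own database:

```sql
-- Sketch: index the upsert_id column so DB Connect's per-record
-- SELECT ... WHERE UPSERT_ID=? becomes an index seek instead of a full table scan.
CREATE INDEX IDX_MYTABLE_UPSERT_ID ON MYSCHEMA.MYTABLE (UPSERT_ID);
```

On most databases a unique index (or making UPSERT_ID the primary key) works as well, since the value is expected to identify a single row.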
I am trying to install the Events Service from the Enterprise Console but don't know how to handle this error. This is the error: Task failed: Starting the Events Service api store node on host: newMachineAp as user: root with message: Connection to [http://newMachineAp:9080/_ping] failed due to [Failed to connect to newmachineap/192.168.27.211:9080].
Please check the above screenshot. How can I convert the count column to GB, and how can I tell what unit of measurement the count is in?
Hello everyone, I am encountering an issue with the Alert Manager Enterprise application; following the triggering of an alert, no event is created in my dedicated index. The status of the health check is okay, and we are able to create test events:    Another point to note is that in the application's troubleshooting logs, when an alert is triggered, the event creation occurs but nothing is created in the index: There are no permission issues, as I have confirmed by manually writing a search that we can create events in the index: | makeresults | eval user="TEST", src="192.168.0.1", action="create test event" | sendalert create_alert param.title="Hello $result.user$" param.template=default This successfully creates my event in my index. I have exhausted my troubleshooting ideas, do you have any suggestions on how to resolve this issue? Thank you for your help. MCH
Hi, I implemented an input filter, but I want to improve it. Customers want to select multiple values from the filter and then add more values later. In the current situation they need to select 'All' and then select the values again (each time they want to add values they need to select All-->select values-->remove All). In addition, they want to select all values except one, which currently takes time to do. Is there a smarter filter input in Splunk? My code: <input type="multiselect" token="WinTimeStamp" searchWhenChanged="true"> <label>Time</label> <choice value="%">All</choice> <default>%</default> <prefix>(</prefix> <suffix>)</suffix> <valuePrefix>(WinTimeStamp like("</valuePrefix> <valueSuffix>"))</valueSuffix> <delimiter> OR </delimiter> <fieldForLabel>WinTimeStamp</fieldForLabel> <fieldForValue>WinTimeStamp</fieldForValue> <search> <query> | where $Name$ | dedup WinTimeStamp | sort WinTimeStamp</query> </search> </input> Thanks, Maayan
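For the "all values except one" case, one common workaround (a sketch, not a built-in feature) is a second multiselect that builds an exclusion clause wrapped in NOT (...). The `__none__` choice value is a hypothetical placeholder that matches no real WinTimeStamp, so leaving the input at its default excludes nothing:

```xml
<input type="multiselect" token="ExcludeTimeStamp" searchWhenChanged="true">
  <label>Exclude Time</label>
  <choice value="__none__">None</choice>
  <default>__none__</default>
  <prefix>NOT (</prefix>
  <suffix>)</suffix>
  <valuePrefix>(WinTimeStamp like("</valuePrefix>
  <valueSuffix>"))</valueSuffix>
  <delimiter> OR </delimiter>
  <fieldForLabel>WinTimeStamp</fieldForLabel>
  <fieldForValue>WinTimeStamp</fieldForValue>
  <search>
    <query> | where $Name$ | dedup WinTimeStamp | sort WinTimeStamp</query>
  </search>
</input>
```

The panel search would then combine both tokens, e.g. `... | where $WinTimeStamp$ AND $ExcludeTimeStamp$`, so users can keep "All" selected in the first input and only tick the values to drop in the second.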
Hi All, Trying to create a dashboard in Studio with dynamic coloring elements for single value searches (i.e. traffic lights). When I go to the Coloring > Dynamic Elements dropdown to select Background, it will not select. A click just makes it disappear and show None in the dropdown. The search works fine and displays the correct value. Running Enterprise on-prem version 9.0.0 BUILD 6818ac46f2ec. I have not found anything on the net or here to suggest this version has an issue. Looking to update to 9.1.1, but as this is production, that is a planned exercise. The search is  index=SEPM "virus found" |stats count(message) as "Infected hosts" and I have this traffic light working in the normal dashboard build. Just trying Studio to see how it is.     Any help appreciated!
Hi Team, At present, SSL encryption is enabled between the Universal Forwarder (UF) and the Heavy Forwarder (HF), while communication from HF to Indexers occurs without SSL encryption. However, there are plans to establish an SSL channel between the HF and Indexers in the future. Additionally, communication between Indexers and the License Master, as well as between HF and the License Master, currently operates through non-SSL channels. There is a requirement to transition these communications to SSL-enabled connections. Could you provide guidance or documentation outlining the necessary implementation steps for securing the communication from Indexers & HF to License Master to facilitate these changes?
My dataset has historical monthly average temperatures for the years 1745 to 2013. Since my source is a csv file, I used the following so that the _time field represents the timestamp in each event :   source="Global warming trends.zip:*" source="Global warming trends.zip:./GlobalLandTemperaturesByMajorCity.csv" Country=Canada City=Montreal dt=*-01-* AverageTemperature="*" | eval _time=strptime(dt,"%Y-%m-%d")   However, all the events dated 1970 and prior don't have their timestamp in the 'Time' column, as per the attached capture. I suspect this has to do with Epoch time (dates before 1970 map to negative epoch values), but how do I fix this so I can visualize my entire data set in a line chart?
We had a vendor set up a Splunk instance for us a while ago, and one of the things they did was set up a Brute Force attack alert using the following search: | tstats summariesonly=t allow_old_summaries=t count from datamodel=Authentication by Authentication.action, Authentication.src | rename Authentication.src as source, Authentication.action as action | chart last(count) over source by action | where success>0 and failure>20 | sort -failure | rename failure as failures | fields - success, unknown Now this seems to work OK as I'm getting regular alerts, but these alerts contain little if any detail. Sometimes they contain a server name, so I've checked that server. I can see some failed login attempts on that server, but again, no detail. No account details, no IPs, no server names. It may be some sort of scheduled task, as I get an alert from Splunk every hour and every time it has about the same number of Brute Force attacks (24). But I can't see any scheduled tasks that may cause this. Does anyone have any suggestions on how to track down what is causing these false alerts?
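To dig up the missing detail, one sketch (field availability depends on how your data is mapped into the Authentication datamodel) is to pull descriptive fields alongside the counts instead of charting them away:

```spl
| tstats summariesonly=t count
    values(Authentication.user) as users
    values(Authentication.dest) as targets
  from datamodel=Authentication
  where Authentication.action=failure
  by Authentication.src
| rename Authentication.src as source
| where count>20
```

Running this over the same hourly window as the alert should show which accounts and destinations the 24 failures belong to, which usually makes a misconfigured service account or scheduled job stand out.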
I'm trying to get specific results if two values in the same field are true, but I keep failing. I want to count the number of times sc_action=REQ_PASSED occurs when sc_action=REQ_CHALLENGE_CAPTCHA was required.   I tried this: My search | eval activity=if(IN(sc_action, "REQ_CHALLENGE_CAPTCHA", "REQ_PASSED")"passed","captcha") | stats count by activity I tried if/where and evals; I either get an error or I get all the results where both are true. Maybe I'm overthinking it.
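A sketch of one way to count both actions side by side: since the two values live in different events, they need to be correlated by some shared key first. The grouping field `src_ip` below is a hypothetical stand-in for whatever ties a challenge to its pass (session id, client IP, request id):

```spl
My search sc_action IN ("REQ_CHALLENGE_CAPTCHA", "REQ_PASSED")
| stats count(eval(sc_action="REQ_CHALLENGE_CAPTCHA")) as captcha
        count(eval(sc_action="REQ_PASSED")) as passed
        by src_ip
| where captcha>0 AND passed>0
| stats sum(passed) as passed_after_captcha
```

The `count(eval(...))` pattern counts each value separately within one stats pass, which avoids the `if()`/`IN()` syntax issue in the attempt above (that eval is also missing a comma before "passed").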
Hi, I am running a monthly report to show unique users logged in to the system (API) each month until the current time.  . . . . .  earliest=@mon latest=now | stats dc(CN)    But I have difficulty calculating the number of new users who have logged in per month as a running total, to show a trend of new users over time.  The query should run as in the following example:  If 50 distinct users log in in October and none of the 50 has logged in before, the total is 50. If 75 distinct users log in in November but 50 of them are the same that logged in in October, the number of new users is 25. Combined with the total for October, the total for November becomes 75.
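One sketch for this "new users per month plus running total" trend: find each user's first-ever login, then bucket those first logins by month. This assumes you can afford to search the full history (earliest=0); for large datasets a lookup that remembers already-seen users scales better:

```spl
<your base search> earliest=0 latest=now
| stats earliest(_time) as first_seen by CN
| eval _time=first_seen
| timechart span=1mon dc(CN) as new_users
| streamstats sum(new_users) as running_total
```

Here `new_users` is 50 for October and 25 for November in your example, and `running_total` gives the cumulative 50 and 75.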
Hello, I need to generate the below report, can someone help please? thank you!!   format: .csv  List of events: authentication failure activity, user logon failure : bad password, user logon failure: bad username, table with subset of fields: user, date/time, VendorMsgID, account, class, process name, object, subject, logMsg) grouped by user schedule: daily search window: -24 hours Expiration= 30 days
Hi, we currently have our on prem Splunk infrastructure running on CentOS 7 servers.  CentOS 7 is going EOL in 2024.  We would like to migrate these servers to a supported Linux OS, preferably one with long term support.  What do you recommend for this?  We are strongly considering RHEL 9.2.  Thank you!
I haven't been able to find any documentation stating whether or not DB Connect is IPv6 compatible. My customer has a requirement to migrate to IPv6, and I need to know if this app will still be able to pull data from an MS-SQL DB once the protocol changes.
Please help me with the below. Requirements: Once 3 events are met, the next event should be published immediately. If that event is not published within 5 minutes, I need an alert. Example: We have one customer number. For that customer number, I have to search whether the logs for all 3 events are available in Splunk or not.  Ex: index= 1 source type ="abc" "s1 event received" and "s2 event received" and "s3 event received"     When I search the above query, I will get results like: S1 received for 12345 customer S2 received for 12345 customer S3 received for 12345 customer   If all 3 events are met for one customer, next I want to search whether a "created" message is available in Splunk for the same customer (12345). The "created" message has a different index and sourcetype. If the "created" message is not available for customer 12345 within 5 minutes of all 3 events being met, I need an alert. Please help with this query.
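A sketch of one way to correlate the two searches. Everything here is an assumption to illustrate the shape: the rex patterns, the index/sourcetype names `other_index`/`other_st`, and the message formats must be adapted to your actual events:

```spl
index=1 sourcetype="abc" ("s1 event received" OR "s2 event received" OR "s3 event received")
| rex "(?<step>[Ss]\d) received for (?<customerno>\d+) customer"
| stats dc(step) as steps latest(_time) as last_step_time by customerno
| where steps=3
| join type=left customerno
    [ search index=other_index sourcetype="other_st" "created"
      | rex "created.*?(?<customerno>\d+)"
      | stats latest(_time) as created_time by customerno ]
| where isnull(created_time) AND now() - last_step_time > 300
```

Scheduled every few minutes as an alert, this returns only customers whose 3 events completed more than 5 minutes ago (300 seconds) with no matching "created" message, which is exactly the condition to alert on.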
Hi, I use a Splunk alert with a 24-hour time window. What is strange is that this alert shows me an event older than 24 hours, so I have 2 questions: 1) How is it possible that an alert fires on an event outside the specified time window? 2) How do I customize the alert to be sure it shows only new events and not events already shown? That is, I need the alert to fire just one time when an event is detected. Thanks
Hi, over the weekend we had an electrical problem in our secondary datacenter, which currently doesn't have power. One indexer was in that datacenter; it was in a cluster with another indexer.  Are there any tasks that I must do in the meantime? The estimated recovery time for the datacenter is 3 to 4 days. Maybe put the indexers in maintenance mode? I've read https://docs.splunk.com/Documentation/Splunk/9.1.1/Indexer/Whathappenswhenapeergoesdown but it only talks about what happens with bucket fixing. Regards.
Hello all, I have a lookup with a single column that lists source file names and paths.  I want to search an index, look up the sources, and then show the latest time of those sources.  I also want to show if a file hasn't logged at all in a given timeframe. I set the lookup to use WILDCARD() in the lookup definition, but I am now struggling with the search. I basically want the search to look up each source file, then search the index and tell me the latest time of the log, as well as show "No Logs Found" if the source doesn't exist. I was toying with this, but the wildcards aren't working, and I think it is because I am not using the definition.  But even so, I can't wrap my head around the search.     | inputlookup pvs_source_list | join type=left source [| search index=pvs | stats latest(_time) as TimeAx by source]     Thank you!
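A sketch that avoids join entirely, using the field names from the post. For simplicity it assumes exact source values in the lookup; WILDCARD() matching only applies when the `lookup` command uses the definition, not with `inputlookup`:

```spl
| tstats latest(_time) as TimeAx where index=pvs by source
| append [| inputlookup pvs_source_list]
| stats max(TimeAx) as TimeAx by source
| eval TimeAx=if(isnull(TimeAx), "No Logs Found", strftime(TimeAx, "%F %T"))
```

The append adds every lookup row (with no TimeAx), so after the stats, any source that never appeared in the index keeps a null TimeAx and gets labelled "No Logs Found"; sources that did log show their latest timestamp.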
Hi, we are testing the Splunk Add-on for Sysmon for Linux to ingest Sysmon data from Linux systems. Data ingestion and the majority of the extractions are working fine, except the Data part.   <Data Name="FieldName">    It appears that Splunk completely skips over this. We have Sysmon for Windows working as well, and the same attribute gets extracted just fine. The data format between Sysmon on Linux vs. Windows is identical, and so are the transforms stanzas in the TAs. The only difference I could see is that the field name in Windows is enclosed in single quotes whereas for Linux it is double quotes. Could this be causing the regex in the TA to not work for Data? Including some examples here.  Sample Data from Linux Sysmon   <Event><System><Provider Name="Linux-Sysmon" Guid="{ff032593-a8d3-4f13-b0d6-01fc615a0f97}"/><EventID>3</EventID><Version>5</Version><Level>4</Level><Task>3</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime="2023-11-13T13:34:45.693615000Z"/><EventRecordID>140108</EventRecordID><Correlation/><Execution ProcessID="24493" ThreadID="24493"/><Channel>Linux-Sysmon/Operational</Channel><Computer>computername</Computer><Security UserId="0"/></System><EventData><Data Name="RuleName">-</Data><Data Name="UtcTime">2023-11-13 13:34:45.697</Data><Data Name="ProcessGuid">{ba131d2e-2a52-6550-285f-207366550000}</Data><Data Name="ProcessId">64284</Data><Data Name="Image">/opt/splunkforwarder/bin/splunkd</Data><Data Name="User">root</Data><Data Name="Protocol">tcp</Data><Data Name="Initiated">true</Data><Data Name="SourceIsIpv6">false</Data><Data Name="SourceIp">x.x.x.x</Data><Data Name="SourceHostname">-</Data><Data Name="SourcePort">60164</Data><Data Name="SourcePortName">-</Data><Data Name="DestinationIsIpv6">false</Data><Data Name="DestinationIp">x.x.x.x</Data><Data Name="DestinationHostname">-</Data><Data Name="DestinationPort">8089</Data><Data Name="DestinationPortName">-</Data></EventData></Event>   Sample data from Windows Sysmon   <Event 
xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Sysmon' Guid='{5770385f-c22a-43e0-bf4c-06f5698ffbd9}'/><EventID>3</EventID><Version>5</Version><Level>4</Level><Task>3</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime='2023-11-13T13:26:31.064124600Z'/><EventRecordID>1571173614</EventRecordID><Correlation/><Execution ProcessID='2988' ThreadID='5720'/><Channel>Microsoft-Windows-Sysmon/Operational</Channel><Computer>computername</Computer><Security UserID='S-1-5-18'/></System><EventData><Data Name='RuleName'>-</Data><Data Name='UtcTime'>2023-11-13 13:26:13.591</Data><Data Name='ProcessGuid'>{f4558f15-1db6-654f-8400-000000007a00}</Data><Data Name='ProcessId'>4320</Data><Data Name='Image'>C:\..\..\image.exe</Data><Data Name='User'>NT AUTHORITY\SYSTEM</Data><Data Name='Protocol'>tcp</Data><Data Name='Initiated'>true</Data><Data Name='SourceIsIpv6'>false</Data><Data Name='SourceIp'>127.0.0.1</Data><Data Name='SourceHostname'>computername</Data><Data Name='SourcePort'>64049</Data><Data Name='SourcePortName'>-</Data><Data Name='DestinationIsIpv6'>false</Data><Data Name='DestinationIp'>127.0.0.1</Data><Data Name='DestinationHostname'>computername</Data><Data Name='DestinationPort'>4932</Data><Data Name='DestinationPortName'>-</Data></EventData></Event>   Transforms on both sides are also identical except the difference for single Vs double quotes.   Linux [sysmon-data] REGEX = <Data Name="(.*?)">(.*?)</Data> FORMAT = $1::$2 Windows [sysmon-data] REGEX = <Data Name='(.*?)'>(.*?)</Data> FORMAT = $1::$2    Any clues on what could be causing Splunk to not extract Data attribute for Linux? Transforms for other elements such as Computer, Keywords are working fine, it just skips this Data part completely. Thanks,
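As a quick test (a sketch; the actual root cause may be elsewhere, e.g. props.conf not applying the transform to the Linux sourcetype at all), a single quote-agnostic stanza covers both the single-quoted Windows and double-quoted Linux formats:

```ini
# transforms.conf (sketch) - match Data elements regardless of quote style
[sysmon-data]
REGEX = <Data Name=["'](.*?)["']>(.*?)</Data>
FORMAT = $1::$2
```

If this still extracts nothing for the Linux events, the quoting is not the problem, and the next place to look is whether the Linux sourcetype's props.conf stanza actually references the transform via REPORT-.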
Hi, what could be the issue regarding the absence of Windows Security command-line events: EventCode=4688 ParentProcessName="C:\\Windows\\System32\\cmd.exe"? I have not blacklisted it in any of the apps. Thanks.