All Posts

Are you getting the logs in via a Kaspersky app? If so, is it possible to set the app to debug mode, or perhaps do so on the Kaspersky side, to get more detailed log messages describing which component is not working?
I have a feeling that Splunk is automatically capping the number of rows when you use | timechart span=1s (this could produce up to 86,400 rows per day), which would explain why your search works fine with 1-2 days but not with more than three. Maybe you could try binning _time to a 1s value and then running stats on it:

index=test elb_status_code=200
| bin _time span=1s
| stats count as total by _time
| stats count as num_seconds by total
| sort 0 total

I am also curious how you got it to show values for a total of 0. The count() function does not do that by default.
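Thinking about it more, that is probably also the answer to the total=0 question: plain stats only produces rows for seconds that actually contain events, whereas timechart fills the gaps with zeros. If you need the zero-count seconds back with the bin approach, something like this might work (untested sketch - makecontinuous and fillnull do the gap-filling, and makecontinuous can get expensive over long time ranges):

index=test elb_status_code=200
| bin _time span=1s
| stats count as total by _time
| makecontinuous _time span=1s
| fillnull value=0 total
| stats count as num_seconds by total
| sort 0 total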
@PickleRick Permissions are already set to global for the field alias.
Assuming your naming is OK, check the permissions.
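If you want to double-check the sharing on disk, the alias's ACL ends up in the app's metadata - roughly like this (the app, sourcetype, and alias names here are placeholders for yours):

# $SPLUNK_HOME/etc/apps/<your_app>/metadata/local.meta
[props/<your_sourcetype>/FIELDALIAS-<your_alias>]
export = system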
Well, since you need to connect to an existing database, it must have been set up and be maintained by someone. The easiest way to find out is to go and ask. You can find the most popular engines here: https://en.wikipedia.org/wiki/Relational_database#List_of_database_engines (among other sources).
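And if you already have query access to it, most engines will identify themselves. The exact statement varies by engine, so treat these as sketches:

SELECT version();        -- PostgreSQL, MySQL
SELECT @@VERSION;        -- Microsoft SQL Server
SELECT * FROM v$version; -- Oracle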
@gcusello Thanks for your help in understanding the issue.

Case 1: Actually, there is no timestamp present in the provided csv. The snapshot you're seeing shows data from a sample I ingested from the dev machine via UF; even there I am not able to see any "timestamp" field in the events.

Case 2: When I upload the csv via the data inputs, after selecting the sourcetype as "cmkcsv", it does show the timestamp field. But whatever settings I add under advanced, it does not remove the warning "failed to parse timestamp defaulting to file modtime".

[cmkcsv]
DATETIME_CONFIG = CURRENT
INDEXED_EXTRACTIONS = csv
KV_MODE = none
LINE_BREAKER = \n\W
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y-%m-%d %H:%M:%S
TRUNCATE = 200
category = Structured
description = Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled = false
pulldown_type = true
TIME_PREFIX = ^\w+\s*\w+,\s*\w+,\s*
MAX_TIMESTAMP_LOOKAHEAD = 20
Hi @PickleRick,

Thank you for your reply. As you have probably noticed, I'm not a database person. Can you please explain how I can find out what kind of database I have, and what kinds of databases there are?
Not sure if this will help, but you could try | sort 0 total
Hello Splunkers!! Below is a sample event, and I want to extract some fields in Splunk while indexing. I have used the props.conf below to extract fields, but nothing is appearing in Splunk under interesting fields. I have also attached a screenshot of the Splunk UI results. Please guide me on what I need to change in the settings.

[demo]
KEEP_EMPTY_VALS = false
KV_MODE = xml
LINE_BREAKER = <\/eqtext:EquipmentEvent>()
MAX_TIMESTAMP_LOOKAHEAD = 24
NO_BINARY_CHECK = true
SEDCMD-first = s/^.*<eqtext:EquipmentEvent/<eqtext:EquipmentEvent/g
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3f%Z
TIME_PREFIX = ((?<!ReceiverFmInstanceName>))<eqtext:EventTime>
TRUNCATE = 100000000
category = Custom
disabled = false
pulldown_type = true
FIELDALIAS-fields_scada_xml = "eqtext:EquipmentEvent.eqtext:ID.eqtext:Location.eqtext:PhysicalLocation.AreaID" AS area "eqtext:EquipmentEvent.eqtext:ID.eqtext:Location.eqtext:PhysicalLocation.ElementID" AS element "eqtext:EquipmentEvent.eqtext:ID.eqtext:Location.eqtext:PhysicalLocation.EquipmentID" AS equipment "eqtext:EquipmentEvent.eqtext:ID.eqtext:Location.eqtext:PhysicalLocation.ZoneID" AS zone "eqtext:EquipmentEvent.eqtext:ID.eqtext:Description" AS description "eqtext:EquipmentEvent.eqtext:ID.eqtext:MIS_Address" AS mis_address "eqtext:EquipmentEvent.eqtext:Detail.State" AS state "eqtext:EquipmentEvent.eqtext:Detail.eqtext:EventTime" AS event_time "eqtext:EquipmentEvent.eqtext:Detail.eqtext:MsgNr" AS msg_nr "eqtext:EquipmentEvent.eqtext:Detail.eqtext:OperatorID" AS operator_id "eqtext:EquipmentEvent.eqtext:Detail.ErrorType" AS error_type "eqtext:EquipmentEvent.eqtext:Detail.Severity" AS severity

=================================

<eqtext:EquipmentEvent xmlns:eqtext="http://vanderlande.com/FM/EqtEvent/EqtEventExtTypes/V1/1/5" xmlns:sbt="http://vanderlande.com/FM/Common/Services/ServicesBaseTypes/V1/8/4" xmlns:eqtexo="http://vanderlande.com/FM/EqtEvent/EqtEventExtOut/V1/1/5"><eqtext:ID><eqtext:Location><eqtext:PhysicalLocation><AreaID>8503</AreaID><ZoneID>3</ZoneID><EquipmentID>3</EquipmentID><ElementID>0</ElementID></eqtext:PhysicalLocation></eqtext:Location><eqtext:Description> LMS not healthy</eqtext:Description><eqtext:MIS_Address>0.3</eqtext:MIS_Address></eqtext:ID><eqtext:Detail><State>WENT_OUT</State><eqtext:EventTime>2024-04-02T21:09:38.337Z</eqtext:EventTime><eqtext:MsgNr>4657614997395580315</eqtext:MsgNr><Severity>LOW</Severity><eqtext:OperatorID>WALVAU-SCADA-1</eqtext:OperatorID><ErrorType>TECHNICAL</ErrorType></eqtext:Detail></eqtext:EquipmentEvent>
The path looks good. Assuming your index=sysmon exists, it should bring in logs. Give it a shot and see if the logs come in.
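Once the UF has been restarted with that config, a quick sanity check from the search head (index and source values copied from your stanza) should show whether events are arriving:

index=sysmon source="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational" earliest=-15m
| stats count by host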
Hi, I have this search, for example:

index=test elb_status_code=200
| timechart count as total span=1s
| stats count as num_seconds by total
| sort by total

When I search this over 1-2 days, my results include totals of 0, 1, 2, 3, etc. When I go above that, 3 days for example, I lose all the data about the 0 value and my results start with 1, 2, 3, etc. Can anyone explain this? Am I doing something wrong, or could this be a bug somewhere?
Hi, I got an error while trying to get logs from Kaspersky Console. I've done all the tasks to add it, such as port, IP, ....

index="kcs" Type=Error Message="Cannot start sending events to the SIEM system. Functionality in limited mode. Area: System Management."
Hi @phanikumarcs, is the timestamp field one of the columns of your csv file, or is it automatically generated by Splunk because it isn't present in the csv file? I don't see the timestamp field in the screenshot you shared. In your screenshot and in your table there are only the following fields: Subscription Name, Resource Group Name, Key Vault Name, Secret Name, Expiration Date, Months. Ciao. Giuseppe
@gcusello Yeah, I tried adding the data via upload; when I select the sourcetype as csv there, I can see the timestamp field.
Hi experts, I am seeking assistance with configuring Sysmon in inputs.conf on a Splunk Universal Forwarder. The configuration is based on the Splunk Technology Add-on (TA) for Sysmon.

[WinEventLog://Microsoft-Windows-Sysmon/Operational]
disabled = false
renderXml = 1
source = XmlWinEventLog:Microsoft-Windows-Sysmon/Operational
index = sysmon

Is this the correct config?
I created an API test with Synthetics, but I can't set up a detector to check whether 2 consecutive requests (2 in a row) are in error. Is there any way to configure the detector to raise an alarm if 2 requests in a row fail?
This is actually a question for your local Splunk sales representative (or your local Partner, since you're probably purchasing licenses through a Partner). Typically your ES license must match your main SE license volume-wise, and I've never seen it otherwise. The general idea is that all data you are indexing can be used for ES purposes, so the only case where you might be able to have a "partial" ES situation would be if you had two separate licenses for two separate environments - one with ES and one without. But that in itself is not something you'll easily get; honestly, I'm not sure you can get such a deal unless you are a huge customer with completely disconnected environments (otherwise you're supposed to use a single license manager and split your license into separate licensing stacks). Bottom line: your ES license must match the size of your main SE license, and it is highly unlikely you'll get anything else.
Hi @phanikumarcs, probably Splunk doesn't recognize the timestamp field and format you configured: in your data I don't see a "timestamp" field with the format %Y-%m-%d %H:%M:%S - where is it? Try to manually add a sample of these data using the Add Data function, which guides you through the sourcetype creation. Ciao. Giuseppe
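Also, one thing to check in the props you posted: DATETIME_CONFIG = CURRENT tells Splunk to skip timestamp parsing entirely, so it conflicts with TIME_FORMAT / TIME_PREFIX, and with INDEXED_EXTRACTIONS = csv the column-based setting is TIMESTAMP_FIELDS rather than TIME_PREFIX. A minimal sketch, assuming you either want index time or want to parse the Expiration Date column from your sample:

[cmkcsv]
INDEXED_EXTRACTIONS = csv
SHOULD_LINEMERGE = false
# Option A: no timestamp in the data - stamp events with index time
DATETIME_CONFIG = CURRENT
# Option B (use instead of Option A): parse a column from the csv
# TIMESTAMP_FIELDS = Expiration Date
# TIME_FORMAT = %Y-%m-%d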
Hi Team, @ITWhisperer @gcusello

I am sending CSV data to Splunk, testing from a UF on a dev Windows machine. This is the sample csv data:

Subscription Name  Resource Group Name  Key Vault Name  Secret Name     Expiration Date  Months
SUB-dully          core-auto            core-auto       core-auto-cert  2022-07-28       -21
SUB-gully          core-auto            core-auto       core-auto-cert  2022-07-28       -21
SUB-pally          core-auto            core-auto       core-auto-cert  2022-09-01       -20

The output I am getting has all events combined into a single event. I created inputs.conf and a sourcetype where the sourcetype configurations are as shown. Can anyone help me understand why it's not breaking?
I don't know how your licensing differentiates between "Splunk Core" logs and "ES" logs, but if you can find out the technical measure that Splunk uses to decide which category the logs belong to, then there should be a way to configure your Splunk environment to shift logs one way or the other. For example:
- if it's based on sourcetypes or indexes, then those can be changed with props.conf and transforms.conf (rough sketch below)
- if it's based on data models, then those can be changed with eventtypes and tags
- if it's based on indexers, then that can be changed with the indexer architecture
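As an illustration of the first bullet, rerouting events of one sourcetype to a different index at parse time looks roughly like this (the sourcetype, transform, and index names are made up for the example):

# props.conf
[my_sourcetype]
TRANSFORMS-route_to_core = route_to_core

# transforms.conf
[route_to_core]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = core_index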