Hi, we are facing the below error when running searches on the search head. It occurs frequently and we have been unable to solve it. We have checked the bundle size and the network connectivity between the indexers and the search heads. Everything looks good, but we still get the error below. Please check and suggest a solution. Unable to distribute to peer named uswaa-dopsidt01.cgdop.com at uri https://XXXX:8089 because replication was unsuccessful. ReplicationStatus: Failed - Failure info: failed_because_BUNDLE_DATA_TRANSMIT_FAILURE. Verify connectivity to the search peer, that the search peer is up, and that an adequate level of system resources are available. See the Troubleshooting Manual for more information.
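If connectivity checks out, the bundle push itself may be timing out, or the bundle may contain large files that don't need to be replicated. As a sketch (the timeout and size values below are illustrative assumptions, and the denylist pattern is hypothetical), replication can be tuned in distsearch.conf on the search head:

```
# distsearch.conf on the search head -- example values, adjust to your environment
[replicationSettings]
connectionTimeout = 60
sendRcvTimeout = 120
maxBundleSize = 2048

# exclude a large lookup from the bundle (pattern shown is a hypothetical example)
[replicationDenylist]
huge_lookup = apps[/\\]search[/\\]lookups[/\\]bigfile\.csv
```

Checking splunkd.log on both the search head and the failing peer around the time of the error usually narrows down whether the failure is a timeout, a disk-space issue, or an oversized bundle.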
Hi @altink, retention is defined at the index level, so you cannot have a different retention per sourcetype. If you didn't modify the retention of this index, by default it's 30 days, and it is usually reduced to save disk space. Check in the Monitoring Console or in [Settings > Indexes] what the latest event in your _internal index is to understand its retention, and if needed enlarge it by changing the frozenTimePeriodInSecs parameter in the [_internal] stanza of the $SPLUNK_HOME/etc/system/local/indexes.conf file. If you haven't modified this file before, it won't exist in the local folder, so you have to copy the stanza from the same file in the default folder. Ciao. Giuseppe
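For example, to extend _internal retention to roughly 90 days (a sketch; the value is an assumption to adjust to your disk budget), the stanza copied into the local folder would look like:

```
# $SPLUNK_HOME/etc/system/local/indexes.conf
[_internal]
# retention in seconds: 90 days * 86400 s/day = 7776000
frozenTimePeriodInSecs = 7776000
```

A restart of splunkd is needed for the change to take effect, and remember that a longer retention means more disk used by the _internal index.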
@gcusello I have a standalone Windows Splunk server, and from that same server I am able to access the network folder shown in the screenshot I provided earlier.
Hi @BTB, yes, as @yuanliu said, <my_index> means that you have to use the name of your index, without the angle brackets, and it should work. Ciao. Giuseppe
Hi @richgalloway, I added the year and it is working fine now. I would like to show only the calendar week in the chart; is there any solution for this? My understanding is that we cannot change the labels.
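One possible approach (a sketch, assuming the chart currently splits over _time) is to build the week label yourself with strftime and chart over that field instead of _time:

```
... your base search ...
| eval week = "CW " . strftime(_time, "%V")
| chart count over week
```

Note that %V is the ISO week number without the year, so if the time range spans more than one year, the same week number from different years will collapse into one bucket.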
Hi @uagraw01, could you please describe your architecture in more detail? Do you have a standalone Splunk server? Do you have a Forwarder, or are the folders to monitor accessed directly by the Splunk server? Which user are you using to run Splunk on the system accessing the folders to monitor? Does this user have permission to read the files? Ciao. Giuseppe
Hello Splunkers, I want to monitor the below files, which are under a network folder. I have configured indexes.conf, props.conf, inputs.conf and transforms.conf, but nothing is working and no data is getting into Splunk. Please check my config and suggest any changes that are required.

inputs.conf:

[monitor://\\WALVAU-SCADA-1\d$\CM\alarmreports\outgoing*]
disabled = false
index = scada
host = WALVAU-SCADA-1
sourcetype = cm_scada_xml

indexes.conf:

[scada]
coldPath = $SPLUNK_DB/scada/colddb
enableDataIntegrityControl = 0
enableTsidxReduction = 0
homePath = $SPLUNK_DB/scada/db
maxTotalDataSizeMB = 512000
thawedPath = $SPLUNK_DB/scada/thaweddb

props.conf:

[cm_scada_xml]
KEEP_EMPTY_VALS = false
KV_MODE = xml
LINE_BREAKER = <\/eqtext:EquipmentEvent>()
MAX_TIMESTAMP_LOOKAHEAD = 24
NO_BINARY_CHECK = true
SEDCMD-first = s/^.*<eqtext:EquipmentEvent/<eqtext:EquipmentEvent/g
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3f%Z
TIME_PREFIX = ((?<!ReceiverFmInstanceName>))<eqtext:EventTime>
TRUNCATE = 100000000
category = Custom
disabled = false
pulldown_type = true
TRANSFORMS-remove-xml-footer = remove-xml-footer
TRANSFORMS-keep-came-in-and-went-out-states = keep-came-in-and-went-out-states
FIELDALIAS-fields_scada_xml = "eqtext:EquipmentEvent.eqtext:ID.eqtext:Location.eqtext:PhysicalLocation.AreaID" AS area "eqtext:EquipmentEvent.eqtext:ID.eqtext:Location.eqtext:PhysicalLocation.ElementID" AS element "eqtext:EquipmentEvent.eqtext:ID.eqtext:Location.eqtext:PhysicalLocation.EquipmentID" AS equipment "eqtext:EquipmentEvent.eqtext:ID.eqtext:Location.eqtext:PhysicalLocation.ZoneID" AS zone "eqtext:EquipmentEvent.eqtext:ID.eqtext:Description" AS description "eqtext:EquipmentEvent.eqtext:ID.eqtext:MIS_Address" AS mis_address "eqtext:EquipmentEvent.eqtext:Detail.State" AS state "eqtext:EquipmentEvent.eqtext:Detail.eqtext:EventTime" AS event_time "eqtext:EquipmentEvent.eqtext:Detail.eqtext:MsgNr" AS msg_nr "eqtext:EquipmentEvent.eqtext:Detail.eqtext:OperatorID" AS operator_id "eqtext:EquipmentEvent.eqtext:Detail.ErrorType" AS error_type "eqtext:EquipmentEvent.eqtext:Detail.Severity" AS severity

transforms.conf:

[remove-xml-footer]
REGEX = <\/eqtexo:EquipmentEventReport>
DEST_KEY = queue
FORMAT = nullQueue

[keep-came-in-and-went-out-states]
REGEX = <State>(?!CAME_IN|WENT_OUT).*?<\/State>
DEST_KEY = queue
FORMAT = nullQueue
I need to post some custom metrics to AppDynamics for analytics purposes, so I am trying to create a new Transaction from application source code using the code below. Transaction transaction = AppdynamicsAgent.getTransaction(getProcessorName().name(), null, EntryTypes.POJO, false) However, I get the following error while loading the AppdynamicsAgent class: Caused by: java.lang.ClassNotFoundException: com.appdynamics.apm.appagent.api.NoOpInvocationHandler
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 92 common frames omitted
Does anyone know how to fix this?
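This class is normally supplied at runtime by the AppDynamics Java agent, so a ClassNotFoundException here usually means the application was started without the agent attached. The SDK jar you compile against contains only stubs; the real classes come from the installed agent. A sketch of the launch command (the agent path is a placeholder assumption, adjust to your installation):

```
java -javaagent:/path/to/appdynamics/javaagent.jar -jar your-app.jar
```

If the agent is already attached and the error persists, check that the version of the SDK jar on the compile classpath matches the installed agent version.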
Hi SMEs, logs from one of our applications are arriving as one single event. How can I split them into separate log events? For example, these 3 different logs are tagged to one event:

type=USER_ACCT msg=audit(Thu Sep 22 09:09:09 2023.333.12221): pid=12345 uid=0 auid=424242424 ses=6535872 subj=system_u:system_r:crond_t:s0-s0:c0.c1111 msg='op=PAM:accounting grantors=pam_access,pam_faillock,pam_unix,pam_localuser acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=USER_ACCT msg=audit(Thu Sep 22 09:09:09 2023.333.12223): pid=12345 uid=0 auid=424242424 ses=6535872 subj=system_u:system_r:crond_t:s0-s0:c0.c1111 msg='op=PAM:accounting grantors=pam_access,pam_faillock,pam_unix,pam_localuser acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
type=USER_ACCT msg=audit(Thu Sep 22 09:09:09 2023.333.12229): pid=12345 uid=0 auid=424242424 ses=6535872 subj=system_u:system_r:crond_t:s0-s0:c0.c1111 msg='op=PAM:accounting grantors=pam_access,pam_faillock,pam_unix,pam_localuser acct="root" exe="/usr/sbin/crond" hostname=? addr=? terminal=cron res=success'
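One way to split records like these (a sketch; the sourcetype name and the regex are assumptions based on the sample shown) is to break before each new `type=` header with LINE_BREAKER in props.conf on the indexer or heavy forwarder:

```
# props.conf for this sourcetype (stanza name is hypothetical)
[linux_audit_multiline]
SHOULD_LINEMERGE = false
# break wherever a new "type=... msg=audit(" record starts;
# the capture group consumes the whitespace between records
LINE_BREAKER = (\s+)(?=type=\w+ msg=audit\()
TIME_PREFIX = msg=audit\(
MAX_TIMESTAMP_LOOKAHEAD = 40
```

LINE_BREAKER works even when the records are concatenated on a single physical line, because the lookahead matches the start of each record without consuming it.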
Hi SMEs, good morning. I have a situation where logs from a recently on-boarded application are coming in the format below. They appear to contain JSON and should be parsed with key/value extraction. Any suggestions on how to fix this? Many thanks in advance.

<11>1 2024-02-27T03:22:53.376823921Z hostname-1 ipsec ipsecd[85] log - {"time":"2024-02-27T03:22:53.376823921Z","type":"log","level":"error","log":{"msg":"et_backend: connection failed while getting et keys"},"process":"ipsecd[85]","service":"ipsec","system":"hostname-1","neid":"414399","container":"784722400000","host":"hostname-1","timezone":"UAT"}
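One common pattern for events like this (a sketch; the sourcetype name is an assumption) is to strip the syslog header up to the first `{` at index time with SEDCMD, after which the remaining JSON can be parsed at search time with KV_MODE = json:

```
# props.conf (sourcetype name hypothetical)
[ipsec_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
# drop everything before the first "{" (the syslog header)
SEDCMD-strip_header = s/^[^{]+//
# parse the remaining JSON at search time
KV_MODE = json
TIME_PREFIX = "time":"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%9NZ
```

The trade-off of stripping the header is that the syslog priority and the header hostname are lost from _raw, so keep the header instead (and use `spath` at search time) if you need those fields.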
I am using Splunk Enterprise version 9.2.0.1 (upgraded from 9.0.5). Before the upgrade, the Splunk deployment server was working well. After Splunk DS was upgraded to version 9.2.0.1, we saw issues with the client's server classes. Client name: EC2AMAZ-XXXXX
1. Client in the DS before the upgrade (9.0.5): server classes UF_input_WIN, UF_output
2. Client in the DS after the upgrade (9.2.0.1): server classes UF_input_Linux, UF_output
The server class "UF_input_Linux" only filters by machine type Linux (see section 3 below), so I do not know why this server class is applied to this Windows client.
3. "UF_input_Linux" server class configuration
4. "UF_input_WIN" server class configuration
The client is listed in the match list on the UF_input_WIN server class. Is this a bug? The machine-type filter does not seem to work correctly. I did not change anything in the server classes or apps when upgrading Splunk DS. Has anyone seen this issue before?
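For comparison, the machine-type restriction on a server class lives in serverclass.conf as machineTypesFilter; a pair of classes like the ones described would look roughly like this (stanza names follow the post, the filter values and patterns are assumptions):

```
# serverclass.conf on the deployment server
[serverClass:UF_input_WIN]
machineTypesFilter = windows-x64
whitelist.0 = EC2AMAZ-*

[serverClass:UF_input_Linux]
machineTypesFilter = linux-x86_64
whitelist.0 = *
```

After any change, `splunk reload deploy-server` re-reads the configuration; comparing the machine type the client actually reports (visible under Forwarder Management or via `splunk list deploy-clients` on the DS) against the filter values is a good first check before concluding it is a bug.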
We have a server running Splunk Enterprise version 7.3.4. However, I couldn't find a version of Splunk DB Connect compatible with that version on Splunkbase. Could I get a Splunk DB Connect installation file that is compatible with Splunk Enterprise 7.3.4?
One problem at a time. Your original question was about a free-hand search without matching a specific field name. It is perhaps best to close this one and post another question about extracting free-hand strings based on lookup values, as these are very different search techniques. You will need to explain your lookup AND your event data more specifically than the mock values tenant1, tenant2, tenant3, and xxx. In particular, what does the appended "xxx" signify? How would the values appear in the event data? (Anonymize, but be specific enough for volunteers without intimate knowledge of your data to be helpful.)
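For reference, the usual technique for lookup-driven free-text matching (a sketch with hypothetical lookup and field names) is a subsearch that renames the lookup field to `query`, which makes Splunk treat its values as raw search terms rather than field=value pairs:

```
index=your_index
    [| inputlookup tenants.csv
     | fields tenant
     | rename tenant AS query]
```

The subsearch expands to an OR of the lookup values, so each event only needs to contain one of the strings anywhere in _raw.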
Running the search below yields ut_domain as ".com" instead of "somethin.shop". It seems that if the subdomain contains a valid TLD string (e.g. ".com"), then ut_domain is not parsed correctly. A domain like "somethingbad.shop" is parsed correctly because ".shop" is recognized as a TLD.

| makeresults
| eval domain_full = "something.com.somethin.shop"
| eval list="*"
| `ut_parse(domain_full, list)`

Is this a bug? If so, how can we report it? Is there any workaround you can think of while waiting for a bug fix?
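Until the behavior is fixed upstream, one workaround (a sketch that only handles single-label suffixes like .shop, and would mis-handle multi-label ones like .co.uk) is to extract the last two labels yourself with rex:

```
| makeresults
| eval domain_full = "something.com.somethin.shop"
| rex field=domain_full "(?<my_domain>[^.]+\.[^.]+)$"
```

Here `my_domain` is a hypothetical field name; for full correctness across all public suffixes you would still need a suffix-list-aware parser like ut_parse.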