Sorry for not being clearer; I need help with props.conf attributes and a regex to match the event break.
Yes, a change of the data format can cause incompatibilities with earlier data. That's true. The issue with your data in general (possibly not in the presented example) is - as I said - that you have separate arrays which Splunk can parse into separate multivalued fields which are not related to one another. If you are absolutely sure that both of those multivalued fields have the same cardinality and are related 1-1 with one another, you can try to join them using the mvzip() function, then do mvexpand and split those values back to get corresponding pairs. One caveat though - since the values get merged into a single value, if they contain the delimiter you choose for mvzipping, it's going to get ugly when you try to split them again. So it's possible but pretty ugly (and works only under some strong assumptions).
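For illustration, a minimal SPL sketch of the mvzip/mvexpand approach described above. The field names `users` and `emails` and the `|` delimiter are placeholders, not taken from the original data:

```
| eval pairs=mvzip(users, emails, "|")
| mvexpand pairs
| eval user=mvindex(split(pairs, "|"), 0),
       email=mvindex(split(pairs, "|"), 1)
```

After mvexpand, each event carries exactly one value pair, which split() then separates back into two single-valued fields. This is where the caveat above bites: if a value itself contains `|`, the split produces the wrong pieces.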
Do you need help with how to configure props.conf, or with where to configure it?
Yes, the files are getting fully overwritten. I checked the input status and no issues were found.
About the number of files - yes, I figured as much. It was supposed to be a little joke to lighten the mood a bit. Maybe a missed one. Nevermind. "What It Does: This setting includes the file's last modification time in the checksum calculation." - No, it does not. It includes the literal "DATETIME" string in the CRC calculation (which doesn't change the situation much). The only possible "dynamic" setting specified in the spec file for inputs.conf is the <SOURCE> setting, which is substituted with each file's path. Other than that, the strings are constant literals. Are the files updated or fully rewritten? As usual with any problems ingesting files, the first debugging steps are to run splunk list monitor and splunk list inputstatus and see if there's something unusual about those files.
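For reference, a sketch of the debugging steps and the <SOURCE> setting mentioned above. The monitor path is a made-up example, not from this thread:

```
# From $SPLUNK_HOME/bin on the forwarder monitoring the files:
./splunk list monitor
./splunk list inputstatus

# inputs.conf - crcSalt = <SOURCE> mixes each file's full path
# into the CRC seed, so same-content files at different paths
# are still treated as distinct:
[monitor:///var/log/myapp/*.log]
crcSalt = <SOURCE>
```

The list commands show which files the monitor input has matched and each file's read position, which usually makes "Splunk is ignoring my file" problems visible quickly.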
Depends on your environment. If you have an all-in-one installation, the easiest method would be to go to Settings -> Indexes.
Hello, I need urgent help. I am using the REST API Modular Input and the problem is that I am not able to set the parameter for event breaking. Below is the sample log:

{ "User" : [
  { "record_id" : "2", "email_address" : "dsfsdf@dfdf.net", "email_address_id" : "", "email_type" : "", "email_creation_date" : "", "email_last_update_date" : "2024-08-23T05:28:43.091+00:00", "user_id" : "54216542", "username" : "Audit.Test1", "suspended" : false, "person_id" : "", "credentials_email_sent" : "", "user_guid" : "21SD6F546S2SD5F46", "user_creation_date" : "2024-08-23T05:28:42.000+00:00", "user_last_update_date" : "2024-08-23T05:28:44.000+00:00" },
  { "record_id" : "3", "email_address" : "XDCFSD@dfdf.net", "email_address_id" : "", "email_type" : "", "email_creation_date" : "", "email_last_update_date" : "2024-08-28T06:42:43.736+00:00", "user_id" : "300000019394603", "username" : "Assessment.Integration", "suspended" : false, "person_id" : "", "credentials_email_sent" : "", "user_guid" : "21SD6F546S2SD5F46545SDS45S", "user_creation_date" : "2024-08-28T06:42:43.000+00:00", "user_last_update_date" : "2024-08-28T06:42:47.000+00:00" },
  { "record_id" : "1", "email_address" : "dfds@dfwsfe.com", "email_address_id" : "", "email_type" : "", "email_creation_date" : "", "email_last_update_date" : "2024-08-06T13:27:34.085+00:00", "user_id" : "5612156498213", "username" : "dfsv", "suspended" : false, "person_id" : "56121564963", "credentials_email_sent" : "", "user_guid" : "D564FSD2F8WEGV216S", "user_creation_date" : "2024-08-06T13:29:00.000+00:00", "user_last_update_date" : "2024-08-06T13:29:47.224+00:00" }
]}
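One commonly used approach for splitting a JSON array like this into one event per element is to break on the boundary between array elements with LINE_BREAKER and strip the wrapper with SEDCMD. This is a sketch only, not tested against this exact feed; the sourcetype name is a placeholder, and timestamp extraction (TIME_PREFIX / TIME_FORMAT) would still need to be configured for the date fields in the records:

```
# props.conf (on the instance that parses this sourcetype)
[my_rest_users]
SHOULD_LINEMERGE = false
# Break between "}," and the next record's opening brace:
LINE_BREAKER = \}(,)\s*\{\s*"record_id"
# Strip the {"User":[ header from the first event
# and the ]} footer from the last one:
SEDCMD-strip_header = s/^\{\s*"User"\s*:\s*\[\s*//
SEDCMD-strip_footer = s/\]\s*\}\s*$//
```

Note that after breaking, the first and last events keep a dangling `{` / `}` unless the SEDCMDs are tuned to your exact whitespace, so this needs testing against a real response.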
Please execute and share the output:

splunk btool list authentication --debug
Hi @Alex_Rus, what's the result of running this from cmd:

dir C:\MyFolder\MyFolder1\*

If you get no results, maybe the path isn't correct, or maybe there's another issue: could the data be identical to the data from another input? If they are the same, even if from a different file, Splunk by default doesn't index a log twice. Ciao. Giuseppe
Please set testmode=true in your collect command and post the outcome.
The problem is that data from the hosts where logs are written to a mounted disk does not arrive in Splunk.
Hi, thanks for your help. I tried the following configuration in my transforms.conf:

[remove_logoff]
INGEST_EVAL = queue=if(match(_raw,"EventCode=4634") AND match(_raw,"Security\sID:[\s]+.*\$"), "nullQueue", queue)

props.conf:

[WinEventLog]
TRANSFORMS-remove_computer_logoff = remove_logoff

But after I run the query, I still get the unwanted logs. I also ran the check at search time to verify the regexes, and everything seems fine:

index=* sourcetype=WinEventLog | eval result=if(match(_raw,"EventCode=4634") AND match(_raw,"Security\sID:[\s]+.*\$"), "Filter", "No need to filter this log") | stats count by host, result

Am I missing something?

P.S. I cannot do a blacklist directly on the hosts.
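When a filter like this matches at search time but events still get indexed, two usual suspects are (a) the transform is deployed somewhere other than the first heavy forwarder or indexer that parses the data (index-time transforms do not run on a universal forwarder by default), and (b) the props stanza does not match the sourcetype as it exists at parse time (e.g. the events may arrive as WinEventLog:Security rather than WinEventLog, depending on the input). For comparison, a hedged sketch of the classic REGEX/DEST_KEY form of the same filter, which behaves equivalently to the INGEST_EVAL version above:

```
# transforms.conf (on the indexer / first heavy forwarder)
[remove_logoff]
# Match 4634 events whose Security ID ends in $ (computer accounts);
# [\s\S]* spans line breaks inside the multiline event:
REGEX = EventCode=4634[\s\S]*Security\sID:\s+\S+\$
DEST_KEY = queue
FORMAT = nullQueue

# props.conf
[WinEventLog]
TRANSFORMS-remove_computer_logoff = remove_logoff
```

Either form requires a restart (or deploy) of the parsing tier to take effect, and only applies to data ingested after the change.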
Hi @Alex_Rus , What's the problem? you can have two different stanzas for your two different inputs with the same other parameters. Ciao. Giuseppe  
Yes, it is a mistyping; in my inputs.conf I got it right.
Hi @Hiroshi, the issue should be solved, but the URL has changed: https://splunk.my.site.com Ciao. Giuseppe
Hi @Alex_Rus, I don't know if it's a mistyping, but you have to use backslashes in Windows paths:

[monitor://C:\MyFolder\MyFolder1\*]
disabled = 0
index = MyIndex1
sourcetype = MySourcetype1

[monitor://C:\Program Files\Microsoft\Exchange Server\...\*]
disabled = 0
index = MyIndex1
sourcetype = MySourcetype1

Ciao. Giuseppe
Hi, richgalloway! Thank you for your answer.  I wrote this information in response to the previous question from Giuseppe.
Hi, Giuseppe! Thank you for your answer. Let me explain the situation. The application is configured to collect logs from four hosts. On two of them the data is collected in the internal storage C:\Program Files\Microsoft\Exchange Server\... and the data comes from these hosts correctly. On the other two hosts the data is collected in a folder that is moved to a separate disk, C:\MyFolder\MyFolder1\*. My stanzas look like:

[monitor://C:/MyFolder\MyFolder1/*]
disabled = 0
index = MyIndex1
sourcetype = MySourcetype1

[monitor://C:/Program Files/Microsoft/Exchange Server/.../*]
disabled = 0
index = MyIndex1
sourcetype = MySourcetype1#
@KendallW

INFO ThruputProcessor [2963 parsing] - Current data throughput (5125 kb/s) has reached maxKBps. As a result, data forwarding may be throttled. Consider increasing the value of maxKBps in limits.conf.

We will try increasing the limits.
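For reference, the setting named in that warning lives in limits.conf on the forwarder that logs it; a minimal sketch (the value is an example choice, where 0 means unlimited):

```
# limits.conf (on the forwarder emitting the ThruputProcessor warning)
[thruput]
# Maximum forwarding throughput in KB per second; 0 disables the cap.
maxKBps = 0
```

Raising or removing the cap trades forwarder bandwidth usage for lower ingestion latency, so on shared links a finite value above the observed throughput is often preferable to 0.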
@Hiroshi We are able to access the partner support portal now. Please check. Go to the partner portal at https://splunk.my.site.com/partner/s/ and open "My Cases". Karma points are appreciated!