Hi Everyone, I was recently looking at the Splunk internal logs and noticed a field called component with two values: 1. TailingProcessor 2. WatchedFile. For example:

INFO WatchedFile [3338437 tailreader0] - Will use tracking rule=modtime for file='/path/.conf
INFO TailingProcessor [3338433 MainTailingThread] - Adding watch on path: /path

Please help me understand what these logs are saying. Thanks
Hello, I have an issue with the creation of a new index. I want to create a new index to receive logs from NPS servers. First, I created a new app for NPS under deployment-apps on the master server, but I created the new folder for the app and edited everything as the root user. I assigned the app to a new server class on the master server. I installed the UF on the NPS servers and they are successfully connected to the deployment server, and I added them to the new server class. Now I am getting an error that index=radius (defined in the new app) does not exist. So I went to master-apps on the master server; the previously defined indexes are present in directory X (other than _cluster), so I added the radius index to the local file of directory X. I validated the config and pushed it, and all the indexes in the indexer cluster have the same config, but the radius index is not present. I did all the modifications as the root user. Can anyone advise me on the issue? Thank you.
How do I find memory utilization at the service level for processes?
Hi, We have a platform where a lot of dashboards are populated by Splunk searches via the Splunk API. All the queries run fine and return results in 30-40 seconds, but there is one query that takes around 5 minutes. The query is not complex and runs within seconds on the Splunk search heads; I also used Postman to check how long it takes, and it was a matter of seconds there as well, so I believe there is no issue on the query side. I checked the Splunk logs for the particular SID that took 5 minutes to run and see the ERROR below:

ERROR SHCRepJob [18757 SHPPushExecutorWorker-0] - failed job=SHPRepJob peer="splunk-search head", guid="xxx" aid=xxx, tgtPeer="splunk-search head ", tgtGuid="xxx", tgtRP=xxx, useSSL=false tgt_hp=xx:8089 tgt_guid=xx replicate?output_mode=json, error=500 - Failed to trigger replication (artifact='xxx Job id') (err='event=SHPSlave::replicateArtifactTo invalid status=alive to be a source for replication')

Any idea about this error? I also see it on the SIDs of other queries that run fine, so I am not sure what exactly is causing the delay.
Hi, I have created an app in the Splunk Cloud Platform through Manage Apps -> Create Apps. The app is created, but searching for events in the index shows the error below: Search process did not exit cleanly, exit_code=111, description="exited with error: Application does not exist: app_name". What could be the issue? Thanks in advance.
Hi, We need to ingest only those events that start with any of the strings below (please note: starts with, not contains): create, drop, login, logout, alter, delete. For example: "login success for user Peter" "create action success for user Martin" We have edited transforms.conf as below:

[setnull] REGEX = . DEST_KEY = queue FORMAT = nullQueue

[setparsing] REGEX = login|logout!|create|drop|alter|DELETE DEST_KEY = queue FORMAT = indexQueue

But this is allowing all events that contain the above strings, while our requirement is that the event should start with them. Could you please help us with a PCRE regex for these conditions in transforms.conf?
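For reference, "starts with" in PCRE is usually expressed by anchoring the alternation with `^` and a non-capturing group, along the lines of `^(?:create|drop|login|logout|alter|delete)`. A quick Python sketch (Python's `re` is close enough to PCRE for this construct; event strings taken from the post) of the difference between the unanchored and anchored patterns:

```python
import re

# Unanchored: matches the keywords anywhere in the event (the behaviour described above).
contains = re.compile(r"login|logout|create|drop|alter|delete", re.IGNORECASE)
# Anchored: matches only when the event *starts* with one of the keywords.
starts_with = re.compile(r"^(?:login|logout|create|drop|alter|delete)", re.IGNORECASE)

events = [
    "login success for user Peter",
    "create action success for user Martin",
    "user Peter attempted login",  # contains "login" but does not start with it
]

kept = [e for e in events if starts_with.search(e)]
print(kept)
```

The third event is matched by `contains` but not by `starts_with`, which is exactly the distinction the requirement calls for.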
Hello everyone, In the result of my search I got the following (the last command was stats values(list) as list, values(standard) as standard by host; the fields list and standard are multivalue):

host    list    standard
        5       1
        1       2
        2       3
        3       4

I need to compare the fields "list" and "standard" and make a field "result" containing the lacking, redundant, and passing records. Lacking means a record present in standard but not in list; redundant means present in list but not in standard; passing means present in both. So for this example the result must be: Passing: 1 2 3 Lacking: 4 Redundant: 5
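For anyone reading along, the comparison being asked for is plain set logic; a minimal Python sketch (values taken from the example above) of the lacking/redundant/passing split:

```python
# Multivalue fields from the example, per host.
list_vals = {5, 1, 2, 3}      # values(list)
standard_vals = {1, 2, 3, 4}  # values(standard)

passing = sorted(list_vals & standard_vals)    # in both list and standard
lacking = sorted(standard_vals - list_vals)    # in standard but not in list
redundant = sorted(list_vals - standard_vals)  # in list but not in standard

print(passing, lacking, redundant)
```

In SPL the same idea would typically be built with eval/mvmap or mvexpand plus a comparison, but the set arithmetic above is the core of it.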
Greetings Splunk Community, This will be one of several community/Answers posts regarding SSL overall and how we do it in Splunk. It is one author's explanation of digital certificates and their trust relationships.

Disclaimer: I do not claim to be the foremost expert on digital certificates. I hope that explaining it this way will make the concept easier for everyone to grasp and will help with any level of troubleshooting where digital certificates are involved.

1. It all begins with asymmetric encryption

To start the conversation on digital certificates, one must be comfortable with the basics of encryption. For a quick recap, encryption mainly relies on a key. Encryption key - a string of bits (data) generated specifically to scramble and unscramble data. You use a key to produce encrypted text (ciphertext), or to decrypt what you encrypted before. There are primarily two types of encryption used for data transfer:
- Symmetric - your key is the same on both ends and you encrypt and decrypt data the same way. Easiest to use and provides the best performance (throughput and resources used).
- Asymmetric - you have a pair of two keys: key 1 is used for encrypting data and key 2 for decrypting. Hope I did not lose you so far?

2. Asymmetric is AWESOME! But why not use it everywhere?

So asymmetric encryption is awesome, but it is terribly resource-greedy and carries a huge overhead. Why? When you encrypt with it, the output is much bigger than what you put in, so at the current stage of technology it makes no sense to use it for large, bulky data transfers. Instead we use asymmetric encryption a lot, but for two very different use cases: When you need to provide data that anyone can read but no one can change. How? Easy: give everyone the part of your asymmetric key pair that they can use for decryption, but do not give out the encryption key.
That way anyone can decrypt what you show them, but no one can re-encrypt the data to pretend to be you. In short, that is what you may hear referred to as a "digital signature". The other way round: you give everyone your key for encrypting data, and keep your key for decrypting it very private. What is that used for? Easy: for establishing a secure connection using a symmetric key. TL;DR: send everyone your public key. Now they can generate their own symmetric key and send it to you encrypted with your public key, and no one else can decrypt it, because only you have the matching key.

3. And that brings us to digital certificates and trust

What we have learned so far brings us right to certificates. Digital certificates are a combination of both things discussed above. They contain the public (encryption) key of the server you are connecting to, AND, ideally, they are digitally signed by someone you trusted beforehand (or not). Important note: digital certificates do not necessarily have to be trusted. In a browser you would then get a red warning and can just say "proceed anyway"; in Splunk we configure: sslVerifyServerCert = <boolean>, which is false by default. This appears in several configuration files depending on the connection. Examples: the global setting in the server.conf [sslConfig] stanza; the forwarder-to-deployment-server connection in deploymentclient.conf; the forwarder-to-indexer connection in outputs.conf. In a browser you see this for a site that is not trusted:

But what is trust, then? That is a very good question. Every piece of software in the world that uses digital certificates has a certain collection of certificates that is either built in or was configured by admins. Allow me to introduce two more concepts: the trust store and self-signed certificates. Trust store - a collection of certificates, containing asymmetric public keys, which you or your administrators have configured as coming from trusted sources.
Most of the time these are issued by servers to themselves, so they are signed by themselves, hence the term self-signed. The moment you add a certificate to a trust store, it means that whatever the holder of this asymmetric key has digitally signed (still remember that part?) you now trust to be true and good for you - that they are not going to harm you. You get this kind of green indicator (or lock) for web sites that are good/trusted:

To see that configured in conventional operating systems: Windows has a trust store (Certificate Manager) at certmgr.msc, and Apple macOS has a trust store as well. And? You guessed it: in Splunk we also configure something of a trust store, BUT for each component separately. The most common place is server.conf:

[sslConfig]
sslRootCAPath = <path>
* Full path to the root CA (Certificate Authority) certificate store on the operating system.
* The <path> must refer to a PEM (Privacy-Enhanced Mail) format file containing one or more root CA certificates concatenated together.
* Required for Common Criteria.
* This setting is valid on Windows machines only if you have not set 'sslRootCAPathHonoredOnWindows' to "false".
* No default.

I will cover the Splunk way in a bit more detail in chapter 5 of this article.

4. Tying it all up in trust chains

Now that you are familiar with the concept of certs and where we store the so-called "trusted" certs, let's explore further and look at trust relationships. Because each trust store is kept individually, each machine decides whom it trusts absolutely independently. And that is the thing you must always remember: each machine builds its own trust chain, trusts independently, and checks certificates for trust only when it receives them from someone. Now we go further: trust chains, and how we check whether something is trusted.
Each machine running some kind of SSL connection, or any other use of certificates, will have its own certificate: the identity certificate. In Splunk the identity certificate is the one you configure as serverCert in server.conf:

serverCert = <path>
* The full path to the PEM (Privacy-Enhanced Mail) format server certificate file.

So when we connect to any server, say over HTTPS, we will see that machine's identity certificate, AND it will be digitally signed by someone we trust. It looks like this: Subject - the machine that holds the cert (the server you are connecting to); Certificate Authority - who signed the certificate. Now the most interesting part, to tie it all up: The computer (for example a machine running a Splunk forwarder) tries to connect to a server (a deployment server, for example, or an indexer). The server you are connecting to introduces itself with a certificate. IMPORTANT: the originating machine (the forwarder in this example) now builds its trust chain. Yes, it is like that, no matter what: to trust or not to trust is decided on what we call the client side of the connection, that is, the machine that originates the connection. We now have the server's identity certificate - that is what it is usually called, since that is how the server identifies itself. There are two ways to check its trust, depending on the software. The difficult way, matching AKI to SKI (Authority Key Identifier to Subject Key Identifier), I will cover in a separate article. The easy (lazy) way: check who issued the certificate by name - look in the machine's trust store for the issuer field of the certificate. It really is that simple: you check by the name of the issuer, and that is it, like so:

5. The Splunk way of trust in trust chains

Now that you know all of the above, let's get into Splunk. Splunk takes its certs in PEM format. PEM (originally "Privacy Enhanced Mail") is the most common format for X.509 certificates, CSRs, and cryptographic keys. If you open your PEM certificate in any text editor you are going to see something like this:

-----BEGIN CERTIFICATE-----
MIIDejCCAmICCQCNHBN8tj/FwzANBgkqhkiG9w0BAQsFADB/MQswCQYDVQQGEwJV
UzELMAkGA1UECAwCQ0ExFjAUBgNVBAcMDVNhbiBGcmFuY2lzY28xDzANBgNVBAoM
BlNwbHVuazEXMBUGA1UEAwwOU3BsdW5rQ29tbW9uQ0ExITAfBgkqhkiG9w0BCQEW
EnN1cHBvcnRAc3BsdW5rLmNvbTAeFw0xNzAxMzAyMDI2NTRaFw0yNzAxMjgyMDI2
NTRaMH8xCzAJBgNVBAYTAlVTMQswCQYDVQQIDAJDQTEWMBQGA1UEBwwNU2FuIEZy
YW5jaXNjbzEPMA0GA1UECgwGU3BsdW5rMRcwFQYDVQQDDA5TcGx1bmtDb21tb25D
QTEhMB8GCSqGSIb3DQEJARYSc3VwcG9ydEBzcGx1bmsuY29tMIIBIjANBgkqhkiG
9w0BAQEFAAOCAQ8AMIIBCgKCAQEAzB9ltVEGk73QvPlxXtA0qMW/SLDQlQMFJ/C/
tXRVJdQsmcW4WsaETteeWZh8AgozO1LqOa3I6UmrWLcv4LmUAh/T3iZWXzHLIqFN
WLSVU+2g0Xkn43xSgQEPSvEK1NqZRZv1SWvx3+oGHgu03AZrqTj0HyLujqUDARFX
sRvBPW/VfDkomHj9b8IuK3qOUwQtIOUr+oKx1tM1J7VNN5NflLw9NdHtlfblw0Ys
5xI5Qxu3rcCxkKQuwz9KRe4iijOIRMAKX28pbakxU9Nk38Ac3PNadgIk0s7R829k
980sqGWkd06+C17OxgjpQbvLOR20FtmQybttUsXGR7Bp07YStwIDAQABMA0GCSqG
SIb3DQEBCwUAA4IBAQCxhQd6KXP2VzK2cwAqdK74bGwl5WnvsyqdPWkdANiKksr4
ZybJZNfdfRso3fA2oK1R8i5Ca8LK3V/UuAsXvG6/ikJtWsJ9jf+eYLou8lS6NVJO
xDN/gxPcHrhToGqi1wfPwDQrNVofZcuQNklcdgZ1+XVuotfTCOXHrRoNmZX+HgkY
gEtPG+r1VwSFowfYqyFXQ5CUeRa3JB7/ObF15WfGUYplbd3wQz/M3PLNKLvz5a1z
LMNXDwN5Pvyb2epyO8LPJu4dGTB4jOGpYLUjG1UUqJo9Oa6D99rv6sId+8qjERtl
ZZc1oaC0PKSzBmq+TpbR27B8Zra3gpoA+gavdRZj
-----END CERTIFICATE-----

This is a text representation of a cert. What interests us is that it has a beginning and an end. And that's just one! For a trust chain you need more than one, all the more so when you have several different certificates that all need to be trusted. In Splunk we rely on the certificates being present in the single PEM file mentioned in the configurations above, in server.conf. As of the time of this article there are several other settings available depending on the purpose, for inputs and for the deployment client, but the fallback is always sslRootCAPath. How do we do it?
It's that simple! Just take your certs in text format and copy-paste them into a single file. Important: include the BEGIN and END lines. Like so:

-----BEGIN CERTIFICATE-----
... (certificate for your server)...
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
... (the intermediate certificate)...
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
... (the certificate authority certificate)...
-----END CERTIFICATE-----

Splunk's official documentation is here: How to prepare TLS certificates for use with the Splunk platform - Splunk Documentation. You can do this in any text editor (e.g. Notepad, Notepad++, nano, vim, etc.); just keep adding certificates to the same file as needed. That creates the trust store we talked about before: a single file where you store the chains to trust. You can also use the easy *nix way:

cat myServerCertificate.pem myIntermediateServer.pem myCACertificate.pem > Combined.pem

This simply reads all three files and redirects the combined output to another file (> sends output to a file instead of the screen). Note: as of the time of this article, the certs can be pasted in any order and still work. If you read the file with a plain openssl x509 command it will show you just one cert. To view all certificates in one PEM file, use an openssl command like the example below; you can add the -text parameter if you want to see more, such as the expiration dates on the certs:

openssl crl2pkcs7 -nocrl -certfile Combined.pem | openssl pkcs7 -print_certs -noout
subject=/C=NL/postalCode=5705 CN/L=City/street=Example 20/O=Foobar B.V./OU=ICT/OU=Wildcard SSL/CN=*.example.com
issuer=/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Organization Validation Secure Server CA
subject=/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Organization Validation Secure Server CA
issuer=/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Certification Authority
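A quick way to sanity-check a combined bundle without openssl is to split on the BEGIN/END markers and count what you get. A small Python sketch (the bundle contents here are dummy placeholders, not real certificates):

```python
# Split a combined PEM bundle into its individual certificate blocks.
BEGIN = "-----BEGIN CERTIFICATE-----"
END = "-----END CERTIFICATE-----"

def split_pem_bundle(text):
    """Return each certificate block, including its BEGIN/END lines."""
    certs = []
    for chunk in text.split(BEGIN)[1:]:
        body = chunk.split(END)[0]
        certs.append(BEGIN + body + END)
    return certs

# Dummy three-cert chain standing in for server + intermediate + CA.
bundle = (BEGIN + "\nAAA\n" + END + "\n" +
          BEGIN + "\nBBB\n" + END + "\n" +
          BEGIN + "\nCCC\n" + END + "\n")
print(len(split_pem_bundle(bundle)))  # three certs in the chain
```

If the count does not match the number of certificates you concatenated, a BEGIN or END line was probably lost in the copy-paste.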
Hi all, I am getting data in via an API (using the Add-on Builder), but I am having trouble creating a regex that splits it into a better format rather than one big event. Here is an example of the event:

"@odata.context": "https://example-app-env.aa01.aaa.aaaa-ad/odata/$metadata#Jobs", "@odata.count": 111, "value": [ { "Key": "aaa1a111-aa11-11aa-a11a-11a1aa11a111", "StartTime": "2023-01-20T14:08:34.607Z", "EndTime": "2023-01-20T14:08:49.517Z", "State": "Successful", "JobPriority": "Normal", "Source": "Agent", "SourceType": "Agent", "BatchExecutionKey": "aaa1a111-aa11-11aa-a11a-11a1aa11a111", "Info": "Job completed", "CreationTime": "2023-01-20T14:08:34.607Z", "StartingScheduleId": null, "ReleaseName": "RobotProdLogin_DEV", "Type": "Attended", "InputArguments": "", "OutputArguments": "{}", "HostMachineName": "AAAAAAAA11111", "HasMediaRecorded": false, "PersistenceId": null, "ResumeVersion": null, "StopStrategy": null, "RuntimeType": "Development", "RequiresUserInteraction": true, "ReleaseVersionId": 1111, "EntryPointPath": null, "OrganizationUnitId": 1, "OrganizationUnitFullyQualifiedName": "Default", "Reference": "", "ProcessType": "Process", "ProfilingOptions": null, "ResumeOnSameContext": false, "LocalSystemAccount": "AAAAAA01\\AAA11AA", "OrchestratorUserIdentity": null, "Id": 00000 }, { "Key": "aaa1a111-aa11-11aa-a11a-11a1aa11a111", "StartTime": "2023-01-20T14:08:34.607Z", "EndTime": "2023-01-20T14:08:49.517Z", "State": "Successful", "JobPriority": "Normal", "Source": "Agent", "SourceType": "Agent", "BatchExecutionKey": "aaa1a111-aa11-11aa-a11a-11a1aa11a111", "Info": "Job completed", "CreationTime": "2023-01-20T14:08:34.607Z", "StartingScheduleId": null, "ReleaseName": "RobotProdLogin_DEV", "Type": "Attended", "InputArguments": "", "OutputArguments": "{}", "HostMachineName": "AAAAAAAA11111", "HasMediaRecorded": false, "PersistenceId": null, "ResumeVersion": null, "StopStrategy": null, "RuntimeType": "Development", "RequiresUserInteraction": true, "ReleaseVersionId": 1111, "EntryPointPath": null, "OrganizationUnitId": 1, "OrganizationUnitFullyQualifiedName": "Default", "Reference": "", "ProcessType": "Process", "ProfilingOptions": null, "ResumeOnSameContext": false, "LocalSystemAccount": "AAAAAA01\\AAA11AA", "OrchestratorUserIdentity": null, "Id": 00000 },

Here is how I want it to look.

Event 1:

"@odata.context": "https://example-app-env.aa01.aaa.aaaa-ad/odata/$metadata#Jobs", "@odata.count": 111, "value": [ {

Event 2:

{ "Key": "aaa1a111-aa11-11aa-a11a-11a1aa11a111", "StartTime": "2023-01-20T14:08:34.607Z", "EndTime": "2023-01-20T14:08:49.517Z", "State": "Successful", "JobPriority": "Normal", "Source": "Agent", "SourceType": "Agent", "BatchExecutionKey": "aaa1a111-aa11-11aa-a11a-11a1aa11a111", "Info": "Job completed", "CreationTime": "2023-01-20T14:08:34.607Z", "StartingScheduleId": null, "ReleaseName": "RobotProdLogin_DEV", "Type": "Attended", "InputArguments": "", "OutputArguments": "{}", "HostMachineName": "AAAAAAAA11111", "HasMediaRecorded": false, "PersistenceId": null, "ResumeVersion": null, "StopStrategy": null, "RuntimeType": "Development", "RequiresUserInteraction": true, "ReleaseVersionId": 1111, "EntryPointPath": null, "OrganizationUnitId": 1, "OrganizationUnitFullyQualifiedName": "Default", "Reference": "", "ProcessType": "Process", "ProfilingOptions": null, "ResumeOnSameContext": false, "LocalSystemAccount": "AAAAAA01\\AAA11AA", "OrchestratorUserIdentity": null, "Id": 00000 },

Can you help?
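If the goal is event breaking at ingest time, this kind of split is usually done with a LINE_BREAKER in props.conf whose regex sits on the boundary between JSON records (something along the lines of `\}(,)\s*\{`, though the exact sourcetype settings here are an assumption). The boundary regex itself can be sanity-checked in Python on a trimmed sample:

```python
import re

# Boundary between JSON records: "}," followed by whitespace and "{".
# In props.conf this would correspond to a LINE_BREAKER capture on that gap.
boundary = re.compile(r"\},\s*\{")

# Trimmed stand-in for the raw payload above.
raw = '"value": [ { "Key": "a", "State": "Successful" }, { "Key": "b", "State": "Failed" },'
parts = boundary.split(raw)
print(len(parts))
```

Splitting the trimmed sample on that boundary yields two chunks, one per record, which is the shape the "Event 2" example above asks for.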
Good morning Support, I am writing to you regarding a problem. I am a member of the Phantom Community. I downloaded the Phantom .ova, i.e. OVA image 5.2.1.78411, and installed it in a VMware ESXi environment, but I am having problems: the services and ports, and consequently the applications, do not come up. Could you help me understand what the problem is?
Hi folks, I have a field alias for all my sourcetypes:

[default]
FIELDALIAS-cliente = index AS client

but I want to exclude some sourcetypes; for example, I don't want this field alias for stash or internal logs. Is it possible?
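One common pattern (a sketch only - the stanza names below are hypothetical examples, not real sourcetypes): since a [default] stanza in props.conf applies everywhere, the alias can instead be scoped to only the sourcetypes that should have it, leaving stash and internal logs untouched.

```
# props.conf - scope the alias per sourcetype instead of [default]
[my_sourcetype_a]
FIELDALIAS-cliente = index AS client

[my_sourcetype_b]
FIELDALIAS-cliente = index AS client
```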
Hi everyone, I have a column called "SCRN_NM" (name of screen) and I only want to extract the English data, not the non-English data, using a Splunk search. I have searched around, but none of the examples work in my situation. Please share your knowledge. Thanks!
Hi, I have installed the UF on one of the drives (the E: drive) on a Windows server. I want to fetch logs from another drive (a network drive). This drive is present on the same server, and the server has access to it. I have placed a monitoring stanza in splunk_home/etc/system/local on the E: drive specifying the path of the logs on the network drive, but I am not seeing any of those logs getting indexed in Splunk. Can you suggest how I can fetch logs from another drive when the Splunk UF is installed on the E: drive? P.S. I cannot change the Splunk UF location from the E: drive because of compliance. Thanks
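For monitoring network locations, the usual guidance is to reference the share by UNC path rather than a mapped drive letter (drive mappings are per-user and typically invisible to the UF's service account), and to run the UF service as an account with read access to the share. A sketch of the inputs.conf stanza - the server name, share, index, and sourcetype below are all hypothetical placeholders:

```
[monitor://\\fileserver\logshare\app\logs]
index = my_index
sourcetype = my_sourcetype
disabled = 0
```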
Hi, I took over a Splunk cluster with Splunk in c:\program files\splunk, which produces plenty of problems due to long-path restrictions. Activating long-path support does not work for Splunk, so I have to reinstall all instances in c:\splunk. But when trying to uninstall I get errors due to missing permissions. I have admin privileges and installed a test cluster with all-new instances, but I can't uninstall Splunk for some reason. Of course, I stopped Splunk before trying to uninstall it. Can anyone tell me why even a freshly installed instance can't be uninstalled? I tried with 9.0.3 and 9.0.4. Kind regards
Hi SMEs, I have a unique requirement: I need one of my extracted fields, actual_time, to be mapped to the _time field. When searching the past 30 days of data, I also get older data when looking at the actual_time field. I think that if I map actual_time to _time, or achieve this by some other means, I should get the expected outcome. Thanks in advance.
How do I deploy Splunk Connect for Syslog (SC4S)? I have only seen the deployment covered in the official quickstart guide, https://splunk.github.io/splunk-connect-for-syslog/main/gettingstarted/quickstart_guide/. Is there more detailed documentation illustrating the specific configuration for collecting logs?
Hi all, I want to replace random substrings in a path: C:\Users\sjfklsj\Appdata\.... -> C:\Users\---\Appdata\.... C:\Users\aegdfedg\Appdata\.... -> C:\Users\---\Appdata\.... etc. So I want to remove the random username from the path. Thank you!
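The substitution described above can be expressed as a single regex replace: capture the `C:\Users\` prefix, match the following path segment (everything up to the next backslash), and keep only the prefix. A Python sketch of the pattern (in SPL the same substitution would typically be done with `rex mode=sed`, with the backslashes escaped for SPL):

```python
import re

def mask_username(path):
    # Replace the directory segment right after C:\Users\ with ---
    # (?i) makes the drive/prefix match case-insensitive.
    return re.sub(r"(?i)(C:\\Users\\)[^\\]+", r"\1---", path)

print(mask_username(r"C:\Users\sjfklsj\Appdata\Local\tmp"))  # C:\Users\---\Appdata\Local\tmp
```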
Hi, I have been working in Splunk for 1 year and I want to take the Architect certification. Please let me know the steps to do so. Thanks,
Hi, I am trying to extract data from one of the columns in a lookup file. The regex expression works in a regex tool, and I want to use it with the rex command in Splunk. Regex expression: ^.*(?= \[) Example: I want to extract the highlighted bold data below, which is present in the CSV lookup file. TAX PLATFORM [12998] CPOI [0639]

| inputlookup Meta.csv | rex field=Application "^.*(?<name>= \[)"
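For comparison, the lookahead has to stay outside the capture group for this to work; as written above, `(?<name>= \[)` turns the lookahead into a named group that tries to literally match "= [". The difference can be sketched in Python, which uses the same lookahead syntax (sample value taken from the post):

```python
import re

value = "TAX PLATFORM [12998] CPOI [0639]"

# Working form: lazily capture up to the first " [" using a lookahead.
# Python spells named groups (?P<name>...); Splunk's rex uses (?<name>...).
m = re.search(r"^(?P<name>.*?)(?= \[)", value)
print(m.group("name"))  # TAX PLATFORM
```

The equivalent rex would then be along the lines of `| rex field=Application "^(?<name>.*?)(?= \[)"`.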
I have a SHC with 3 search heads on Enterprise. When I add a user, field, or dashboard in the GUI, everything replicates fine to the other SHC members. But anything added in the 'Settings' -> 'Data inputs' menu is not replicated to the other members. Any ideas? Thanks