I solved this by making a new search head cluster with the same machines and the same names. When I ran the command, everything went fine: splunk edit cluster-config -mode searchhead -manager_uri https://10.152.31.202:8089 -secret newsecret123 -auth login:password   The initial problem was that I had installed the deployer on the manager node. When I came to install the Enterprise Security instance, it needed to be installed on the deployer for some reason. Now everything works as intended, I hope.
@gcusello   I have one indexer, and inside it I have created one index. I can't fetch data of that index from the search head, but I can fetch it from the indexer. Thanks
Hi @RAVISHANKAR , can you access other indexes or not? Ciao. Giuseppe
It works! Thank you very much MuS!
@gcusello - do we need to check anything further?
Can you post what you ended up with and accept the answer that helped as the solution? Even if it's your own (I believe you can do that). Glad to hear you got where you needed to go!
@gcusello - yes, this is done and it is showing status as up and replication was successful. Thanks
@gcusello - could you please explain in a bit more detail: "configured Distributed Search in Settings, configuring the Indexers for searching" - on the indexer or on the search head? Thanks
Hi guys, I have a set of data in the following format: This is a manually exported list, and my requirements are as follows:
- Objective: I need to identify hosts that haven't connected to the server for a long time and track the daily changes in these numbers.
- Method: Since I need daily statistics, I must perform the import daily. However, without any configuration changes, Splunk defaults to using "Last Communication" as "_time", which is not what I want. I need "_time" to reflect the date of the import. This way, I can track changes in the count of "Last" records within each day's imported data.
I can't use folder or file monitoring for this because it only adds new data, so my only options are to use oneshot or to perform the import via the Web interface. Is my approach correct? If not, what other methods could be used to handle this?
I can use splunk oneshot to upload the file to the Splunk indexer, but I couldn't adjust the date to the import day or a specific day. For example, I used the command:
splunk add oneshot D:\upload.csv -index indexdemo
I want the job to run automatically, so I don't want to change any content of the file. How can I do this?
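If the goal is simply to stamp each imported event with the time of import, one possible approach (a sketch, not confirmed by this thread) is a dedicated sourcetype whose props.conf ignores any timestamps inside the file: DATETIME_CONFIG = CURRENT makes Splunk assign the current system time as _time at index time. The sourcetype name daily_export is an assumption:

```
# props.conf on the indexer (sourcetype name "daily_export" is hypothetical)
[daily_export]
# Ignore timestamps inside the file; use the moment of import as _time
DATETIME_CONFIG = CURRENT
# Parse the file as CSV with a header row
INDEXED_EXTRACTIONS = csv
```

The daily job could then run splunk add oneshot D:\upload.csv -index indexdemo -sourcetype daily_export from the OS task scheduler, with no changes to the file itself.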
Hi @RAVISHANKAR , did you configure Distributed Search in Settings, configuring the Indexers for searching? Ciao. Giuseppe
Hi @Real_captain , these are the options to define colours of the areas:
<option name="charting.legend.labels">[YES,NO,UND]</option>
<option name="charting.seriesColors">[0xff3f31,0x0dc681,0xe1dfdf]</option>
You have to insert the values in the first option and the colours in the second one. Ciao. Giuseppe
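For context, these options sit inside the chart element of a Simple XML dashboard panel; a minimal sketch (the search query here is an assumption, not taken from the original dashboard):

```
<panel>
  <chart>
    <search>
      <query>index=main | timechart count by answer</query>
    </search>
    <option name="charting.chart">area</option>
    <option name="charting.legend.labels">[YES,NO,UND]</option>
    <option name="charting.seriesColors">[0xff3f31,0x0dc681,0xe1dfdf]</option>
  </chart>
</panel>
```

Each colour in charting.seriesColors is applied to the series at the same position in charting.legend.labels.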
I hope you did the following configuration to connect the search head with the indexer. If not, then do it as mentioned below, else verify the configuration.
Configure the Indexer as a Search Peer
1. Log in to the Splunk web interface on your search head.
2. Go to Settings > Distributed Search > Search Peers.
3. Click Add New to add a new search peer (indexer).
4. Enter the management port (usually 8089) and the hostname or IP address of the indexer.
5. If required, enter the username and password of the indexer to establish the connection.
6. Click Save to add the indexer as a search peer.
------ If you find this solution helpful, please consider accepting it and awarding karma points !!
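The same peer can also be added from the search head's CLI with splunk add search-server; a hedged sketch, with hostname and credentials as placeholders:

```
# Run on the search head; host and credentials below are placeholders
splunk add search-server https://idx01.example.com:8089 \
    -auth admin:changeme \
    -remoteUsername admin -remotePassword remotepass
```

After adding the peer, the indexer should appear under Settings > Distributed Search > Search Peers with a healthy status.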
Please show your search, including how you are using earliest and latest, and explain how it is not working for you.
Hello, I have configured an index on an indexer, and when I try to fetch data from that index on the search head I am not getting any data. When I search the same index on the indexer I can get the data, but not from the search head. Could you please advise what configuration needs to be checked on my search head and indexer? Note - it's not a clustered setup.   Thanks
This YouTube video on Search Optimization in Splunk is highly useful https://www.youtube.com/watch?v=U3A1zxag_lc ------ If you find this solution helpful, please consider accepting it and awarding karma points !!  
Hi All, We are using the earliest and latest time modifiers in a search in our Splunk test environment and they work fine, but in the production environment earliest and latest are not working in the SPL query for some reason. Can you please help me with alternative commands and provide a solution, i.e. why earliest and latest are not working in the production environment?   Thanks, Srinivasulu S
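For reference, a minimal sketch of how earliest and latest are normally used as inline time modifiers (the index and time values here are assumptions, not from the question); if this shape behaves differently in production, common suspects are the time range picker overriding the inline values, role-based search restrictions, or quoting inside a subsearch:

```
index=_internal sourcetype=splunkd earliest=-24h@h latest=now
| stats count by host
```

Posting the exact production SPL and any error message would help narrow down which of these applies.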
Try this : <your_search>|rex field=source "\/audit\/logs\/(?<environment>[^\/]*)\/(?<hostname>[^-]*)\-(?<component>[^-]*)\-(?<filename>.*$)" ------ If you find this solution helpful, please consider accepting it and awarding karma points !!
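To sanity-check the regex without touching real data, it can be run against a synthetic event with makeresults (a sketch, not from the thread):

```
| makeresults
| eval source="/audit/logs/QTEST/qtestw-core_server4-core_server4.log"
| rex field=source "\/audit\/logs\/(?<environment>[^\/]*)\/(?<hostname>[^-]*)\-(?<component>[^-]*)\-(?<filename>.*$)"
| table environment hostname component filename
```

With this input the fields come out as environment=QTEST, hostname=qtestw, component=core_server4, filename=core_server4.log.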
Hi @karthi2809 , you can use this regex: | rex field=source "^\/\w+\/\w+\/(?<environment>\w+)\/\w+-(?<component>[^-]+)-(?<filename>.*)" you can test this regex at https://regex101.com/r/0VJvAw/1 Ciao. Giuseppe
Additionally setting the sourcetype in props.conf fulfilled my requirement. Thanks
How to extract fields from the below source? /audit/logs/QTEST/qtestw-core_server4-core_server4.log I need to extract: QTEST as environment, qtestw as hostname, core_server4 as component, core_server4.log as filename