All Posts


@SN1  If you're moving the entire instance (including historical data, configs, and users) from one machine to another, see: Migrate a Splunk Enterprise instance from one physical machine to another | Splunk Docs

If you want to move only specific historical data (e.g., cold/frozen buckets) to another instance, see: Restore archived indexed data | Splunk Docs. You can copy bucket files into the thaweddb directory of the target index on the new instance. This is ideal for selective historical data recovery.
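As a sketch of that bucket-level approach: the target index on the new instance needs a thawedPath defined in indexes.conf before you copy buckets into it. The index name and paths below are placeholders, not from the thread.

```
# indexes.conf on the target instance -- "myindex" is a placeholder name
[myindex]
homePath   = $SPLUNK_DB/myindex/db
coldPath   = $SPLUNK_DB/myindex/colddb
thawedPath = $SPLUNK_DB/myindex/thaweddb
```

Copied buckets go under the thaweddb directory; after that, a restart (or a `splunk rebuild` on each bucket, for archived data) makes them searchable.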
Hi @SN1, let me understand: you have two standalone Splunk servers and you want to send the data of one index from the second to the first, is that correct?

If this is your requirement, the first question should be: why? But anyway, I need two more pieces of information for your solution:
- is there another Heavy Forwarder forwarding these logs?
- do you want to forward all the data, or only the data of one index?

If the logs pass through another full Splunk instance (Heavy Forwarder), you have to work on it; otherwise, on ServerB. You have to create a fork following the instructions at https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/forward-data-to-third-party-systems

If you want to forward all logs, you can configure forwarding and receiving [Settings > Forwarding and Receiving > Forwarding] with the option "Index and forward"; in this way you forward all logs while maintaining a local copy of them. For more information, see https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/forward-data-to-third-party-systems#forward-all-data-0

If instead you want to forward only a subset of data, you have to use the configurations at https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/forward-data-to-third-party-systems#forward-a-subset-of-data-0

Ciao. Giuseppe
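For the "Index and forward" option, the equivalent outputs.conf on the sending instance would look roughly like the sketch below. The group name, hostname, and port are placeholders, not taken from the thread.

```
# outputs.conf on ServerB -- keep a local indexed copy and forward everything
[indexAndForward]
index = true

[tcpout]
defaultGroup = serverA

[tcpout:serverA]
server = serverA.example.com:9997
```

The receiving instance must have a matching receiving port enabled (Settings > Forwarding and Receiving > Receiving, or inputs.conf [splunktcp://9997]).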
Hi @Paul_Szoke  It looks like the developers of the app no longer list it on their site at all; I've searched high and low! It could be that they no longer maintain it but may be able to provide an earlier version. I think the best action here would be to contact them (https://www.rojointegrations.com/contact) for more info and to see if they can provide the app.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Hi @SN1

Can you confirm: is this historic data that has already been indexed, or new data that is currently being received?

If you are currently receiving data into A and want to send it to B, then check out https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/forward-data-to-third-party-systems#forward-all-data-0

If you are looking to move old indexes from A to B, the easiest way is to copy the buckets from one system to the other. If they are both standalone instances with unique GUIDs, this should be fine; just make sure you define the indexes in indexes.conf.

Check out https://help.splunk.com/en/splunk-enterprise/get-started/install-and-upgrade/9.4/upgrade-or-migrate-splunk-enterprise/migrate-a-splunk-enterprise-instance-from-one-physical-machine-to-another for more information on how to migrate from one to the other.
Hello, I have an index (A) on an indexer and another index (B) on a search head (we are making it standalone). I want to send data from index A to index B. How do I proceed? I have admin rights.
Apigee API Management Monitoring App for Splunk | Splunkbase

The "Visit Site" link gives this message: "404 Sorry this page is not available. The link you followed may be broken, or the page may have been removed."

Are there any alternative ways to download it?
Hi @Praz_123 @kiran_panchavat

You need to set MAX_DAYS_AGO=5000 and MAX_TIMESTAMP_LOOKAHEAD=20.
@kiran_panchavat  I got the answer, thanks; I used the props.conf below. @livehybrid thanks for your help.

[<SOURCETYPE NAME>]
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
TIME_PREFIX="ds":\s"
MAX_TIMESTAMP_LOOKAHEAD=20
MAX_DAYS_AGO=5000
@livehybrid  Thanks for your help, it worked finally!
@Praz_123  Select a source type; I can see it's showing the default. ***** Ignore this *****
@kiran_panchavat  Same issue again: whether I ingest it as .json or as .txt, neither gives me results.
Hi @Praz_123

I think the issue here could be that in the original data I had for my example, the dates are in 2023; however, in this example the data is in 2012. In props.conf there is a MAX_DAYS_AGO setting which defaults to 2000 days, which lands some time in 2019. If the date you want to extract is earlier than that, you need to increase MAX_DAYS_AGO. Try setting MAX_DAYS_AGO=5000:

[yourSourceType]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20
MAX_DAYS_AGO=5000

If this doesn't work, then please show the error by hovering over the error icon.
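The arithmetic behind that can be sketched in a few lines; the specific dates below are hypothetical, chosen only to show why a 2012 timestamp falls outside the 2000-day default but inside MAX_DAYS_AGO=5000.

```python
from datetime import date

MAX_DAYS_AGO_DEFAULT = 2000   # Splunk's props.conf default

# Hypothetical dates: an event stamped in 2012, indexed on a day in 2025
event_day = date(2012, 1, 1)
index_day = date(2025, 6, 1)

age_days = (index_day - event_day).days
print(age_days)                          # 4900
print(age_days > MAX_DAYS_AGO_DEFAULT)   # True  -> rejected under the default
print(age_days <= 5000)                  # True  -> accepted with MAX_DAYS_AGO=5000
```

When the extracted timestamp is rejected as too old, Splunk falls back to another time source, which is why the events appear with the wrong time rather than erroring out.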
@Praz_123  Check the data you uploaded: it should be in .json format, not .txt format.

[jsontest]
CHARSET=UTF-8
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=true
category=Custom
pulldown_type=true
@bowesmana and @PrewinThomas give you two different approaches. I will put a different spin on Prewin27's append method. (BTW, there should be no need to sort by _time after timechart.)

To avoid searching the same data multiple times, I use map. In the following example, I simplify the interval split by restricting the total search window to -1d@d .. -0d@d.

| tstats count where index=_internal earliest=-1d@d latest=-0d@d
| addinfo ``` just to extract boundaries ```
| eval point1 = relative_time(info_min_time, "+7h"), point2 = relative_time(info_min_time, "+17h")
| eval interval = mvappend(json_object("earliest", info_min_time, "latest", point1), json_object("earliest", point1, "latest", point2), json_object("earliest", point2, "latest", info_max_time))
| mvexpand interval
| spath input=interval
| eval span = if(earliest == point1, "10m", "1h") ``` the above uses prior knowledge about point1 and point2 ```
| map search="search index=_internal earliest=$earliest$ latest=$latest$ | timechart span=$span$ count"

Obviously, if your search window is not one 24-hour period, the interval split becomes more complex. But the same logic can apply to any window.
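The interval split the SPL builds can be sketched outside Splunk; the midnight start date below is an arbitrary assumption standing in for info_min_time.

```python
from datetime import datetime, timedelta

# Assumed 24-hour window starting at an arbitrary midnight
info_min_time = datetime(2024, 1, 1)                # earliest=-1d@d
info_max_time = info_min_time + timedelta(days=1)   # latest=-0d@d
point1 = info_min_time + timedelta(hours=7)         # relative_time(.., "+7h")
point2 = info_min_time + timedelta(hours=17)        # relative_time(.., "+17h")

# One (earliest, latest, span) triple per map iteration; only the middle
# interval gets the finer 10m span, matching if(earliest == point1, "10m", "1h")
intervals = [
    (info_min_time, point1, "1h"),
    (point1, point2, "10m"),
    (point2, info_max_time, "1h"),
]
for earliest, latest, span in intervals:
    print(earliest, latest, span)
```

The three triples are contiguous and cover the whole window, so the three timechart results concatenate cleanly without a final sort.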
Hmm, that is odd. It might be worth checking for any custom distsearch.conf settings on your production environment which might be blocking things. Could you run btool against distsearch and look for anything that is in local?

$SPLUNK_HOME/bin/splunk cmd btool distsearch list --debug
Hi @Praz_123

Did the time extraction I provided in the previous thread not work for you for some reason?

TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20

The reason for your error is the extra "." you have in your TIME_PREFIX, which is causing it to skip the first character of the year. You also need to specify MAX_TIMESTAMP_LOOKAHEAD. Below is my previous response in case you missed it.

@livehybrid wrote:
[yourSourceType]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20
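How those three settings interact can be illustrated with a plain regex sketch; this is a simplified model of Splunk's extraction, not its actual implementation, using one event from the sample data in this thread.

```python
import re
from datetime import datetime

# A single event from the sample data in this thread
raw = '{ "ds": "2023-01-01T01:00:00", "y": 25727, "anomaly": false }'

# TIME_PREFIX="ds":\s"  -- regex that locates the start of the timestamp
m = re.search(r'"ds":\s"', raw)

# MAX_TIMESTAMP_LOOKAHEAD=20 -- read at most 20 chars past the prefix
window = raw[m.end():m.end() + 20]

# TIME_FORMAT=%Y-%m-%dT%H:%M:%S -- consumes 19 characters of that window
ts = datetime.strptime(window[:19], "%Y-%m-%dT%H:%M:%S")
print(ts)  # 2023-01-01 01:00:00
```

An extra "." in the prefix regex would shift m.end() one character to the right, so the window would start at "023-..." and the %Y parse would fail, which matches the error described above.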
@kiran_panchavat  Not working for me. Are you using different event breaks and a different timestamp? I used the props.conf below:

[<SOURCETYPE NAME>]
CHARSET=AUTO
SHOULD_LINEMERGE=true
LINE_BREAKER=([\r\n]+)
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
TIME_PREFIX= "ds":\s*"
@Praz_123  Check this  
@kiran_panchavat  The sample data would be like:

{
  "version": "200",
  "predictions": [
    {
      "ds": "2023-01-01T01:00:00",
      "y": 25727,
      "yhat_lower": 23595.643771045987,
      "yhat_upper": 26531.786203915904,
      "marginal_upper": 26838.980030149163,
      "marginal_lower": 23183.715141246714,
      "anomaly": false
    },
    {
      "ds": "2023-01-01T02:00:00",
      "y": 24710,
      "yhat_lower": 21984.478022195697,
      "yhat_upper": 24966.416390280523,
      "marginal_upper": 25457.020250925423,
      "marginal_lower": 21744.743048120385,
      "anomaly": false
    },
    {
      "ds": "2023-01-01T03:00:00",
      "y": 23908,
      "yhat_lower": 21181.498740796877,
      "yhat_upper": 24172.09825724038,
      "marginal_upper": 24449.705257711226,
      "marginal_lower": 20726.645610860345,
      "anomaly": false
    },
@Praz_123  Let me try in my lab and get back to you shortly.