@kiran_panchavat I got the answer, thanks - I used the props.conf below. @livehybrid thanks for your help.

[<SOURCETYPE NAME>]
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
TIME_PREFIX="ds":\s"
MAX_TIMESTAMP_LOOKAHEAD=20
MAX_DAYS_AGO=5000
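Once the data is indexed with this sourcetype, a quick way to confirm the timestamp extraction worked is to compare _time against the ds value in each event. A minimal sketch, assuming a placeholder index name yourindex and the sourcetype above:

index=yourindex sourcetype="<SOURCETYPE NAME>"
| rex "\"ds\":\s\"(?<raw_ds>[^\"]+)" ``` pull the ds string out of the raw event ```
| eval extracted_time=strftime(_time, "%Y-%m-%dT%H:%M:%S")
| eval match=if(raw_ds == extracted_time, "OK", "MISMATCH")
| table raw_ds extracted_time match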
@livehybrid Thanks for your help, it finally worked!
@Praz_123 Select the source type - I can see it's showing the default. ***** Ignore this *****
@kiran_panchavat Same issue again. Whether I ingest it as .json or as .txt, neither gives me results.
Hi @Praz_123

I think the issue here could be that in the original data I had for my example, the date was in 2023, whereas in this example the data is from 2012. In props.conf there is a MAX_DAYS_AGO setting which defaults to 2000 days - which is some time in 2019. If the date you want to extract is earlier than that, you need to increase MAX_DAYS_AGO. Try setting MAX_DAYS_AGO=5000:

[yourSourceType]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20
MAX_DAYS_AGO=5000

If this doesn't work then please show the error by hovering over the error icon.
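To see why a 2012 timestamp falls outside the default, you can do the arithmetic in a throwaway search. A minimal sketch - the ds value is just an example taken from the data in this thread:

| makeresults
| eval ds="2012-01-01T00:00:00"
| eval event_time=strptime(ds, "%Y-%m-%dT%H:%M:%S")
| eval days_ago=round((now() - event_time) / 86400) ``` age of the event in days ```
| eval verdict=if(days_ago > 2000, "outside default - raise MAX_DAYS_AGO", "within default")
| table ds days_ago verdict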
@Praz_123 Check the data you uploaded - it should be in .json format, not .txt format.

[jsontest]
CHARSET=UTF-8
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=true
category=Custom
pulldown_type=true
@bowesmana and @PrewinThomas give you two different approaches. I will put a different spin on Prewin27's append method. (BTW, there should be no need to sort by _time after timechart.)

To avoid searching the same data multiple times, I use map. In the following example, I simplify the interval split by restricting the total search window to -1d@d - -0d@d.

| tstats count where index=_internal earliest=-1d@d latest=-0d@d
| addinfo ``` just to extract boundaries ```
| eval point1 = relative_time(info_min_time, "+7h"), point2 = relative_time(info_min_time, "+17h")
| eval interval = mvappend(json_object("earliest", info_min_time, "latest", point1), json_object("earliest", point1, "latest", point2), json_object("earliest", point2, "latest", info_max_time))
| mvexpand interval
| spath input=interval
| eval span = if(earliest == point1, "10m", "1h") ``` the above uses prior knowledge about point1 and point2 ```
| map search="search index=_internal earliest=$earliest$ latest=$latest$ | timechart span=$span$ count"

Obviously if your search window is not one 24-hour period, the interval split becomes more complex. But the same logic can apply to any window.
Hmm, that is odd. It might be worth checking for any custom distsearch.conf settings in your production environment which might be blocking things. Please can you run a btool against distsearch and look for anything which is in local?

$SPLUNK_HOME/bin/splunk cmd btool distsearch list --debug
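If you prefer checking from Splunk Web rather than the CLI, a REST search can list the distsearch stanzas and which app each comes from. A minimal sketch, assuming your role is allowed to read the configs endpoint (the fields shown are just what I'd look at first):

| rest /services/configs/conf-distsearch splunk_server=local
| table title eai:acl.app

Anything defined outside the system defaults is worth a closer look.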
Hi @Praz_123

Did the time extraction I provided in the previous thread not work for you for some reason?

TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20

The reason for your error is the extra "." you have in your TIME_PREFIX, which is causing it to skip the first character of the year. You also need to specify MAX_TIMESTAMP_LOOKAHEAD. Below is my previous response in case you missed it.

@livehybrid wrote:
[yourSourceType]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20
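If you want to see the effect of that stray "." without re-indexing, you can reproduce it with rex in a throwaway search. A minimal sketch - the capture names are just illustrative:

| makeresults
| eval _raw="\"ds\": \"2023-01-01T01:00:00\""
| rex "\"ds\":\s\"(?<with_correct_prefix>[^\"]+)" ``` prefix as it should be ```
| rex "\"ds\":\s\".(?<with_extra_dot>[^\"]+)" ``` prefix with the extra dot ```
| table with_correct_prefix with_extra_dot

The second capture comes back as 023-01-01T01:00:00, which is exactly the truncated year the timestamp parser was choking on.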
@kiran_panchavat Not working for me. Are you using different event breaking and timestamp settings? I used the props.conf below.

[<SOURCETYPE NAME>]
CHARSET=AUTO
SHOULD_LINEMERGE=true
LINE_BREAKER=([\r\n]+)
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
TIME_PREFIX= "ds":\s*"
@Praz_123  Check this  
@kiran_panchavat The data would be like:

{
  "version": "200",
  "predictions": [
    {
      "ds": "2023-01-01T01:00:00",
      "y": 25727,
      "yhat_lower": 23595.643771045987,
      "yhat_upper": 26531.786203915904,
      "marginal_upper": 26838.980030149163,
      "marginal_lower": 23183.715141246714,
      "anomaly": false
    },
    {
      "ds": "2023-01-01T02:00:00",
      "y": 24710,
      "yhat_lower": 21984.478022195697,
      "yhat_upper": 24966.416390280523,
      "marginal_upper": 25457.020250925423,
      "marginal_lower": 21744.743048120385,
      "anomaly": false
    },
    {
      "ds": "2023-01-01T03:00:00",
      "y": 23908,
      "yhat_lower": 21181.498740796877,
      "yhat_upper": 24172.09825724038,
      "marginal_upper": 24449.705257711226,
      "marginal_lower": 20726.645610860345,
      "anomaly": false
    },
@Praz_123  Let me try in my lab and get back to you shortly. 
@kiran_panchavat I am getting an error. I need the same date and time extraction, but while using the TIME_FORMAT and TIME_PREFIX I am getting the error below.

Below is my props.conf:

[_json]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
TIME_PREFIX=\[|ds\"\:\s\".
@Praz_123 Try this pattern:

\b\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.000)?\b
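A minimal sketch of that pattern in use with rex, against a sample event from this thread (the ts capture name is just illustrative):

| makeresults
| eval _raw="{ \"ds\": \"2023-01-01T01:00:00\", \"y\": 25727 }"
| rex "(?<ts>\b\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.000)?\b)" ``` grab the ISO timestamp ```
| table ts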
You can technically achieve this through post-processing of the timechart data. All you do is create your timechart with the smaller span, then add up the 6 x 10-minute blocks outside your time range and remove the unnecessary ones.

Here's an example using streamstats/eventstats - there are probably other ways, but this works:

index=_audit
| timechart span=10m count
| eval t=strftime(_time, "%H")
| streamstats window=6 sum(eval(if(t>=7 AND t<19, null(), count))) as hourly by t
| eventstats max(hourly) as hourly_max min(hourly) as hourly_min by t
| where hourly=hourly_min OR isnull(hourly)
| eval hourly=hourly_max
| fields - hourly* t

You could make it simpler depending on your total search time range. You will see the X axis will not change, but you will only have hourly data points in the 19-07 hours.
Need to write a regex for the time and the event, the same as given in the image below.
@wjrbrady The Splunk timechart command's span argument must be a fixed value per search execution - you cannot dynamically change the span within a single timechart based on the hour of the day. However, you can achieve similar logic using a combination of eval, bin, and append.

Eg, using append:

search ... earliest=@d latest=now
| eval hour=strftime(_time,"%H")
| where hour > 7 AND hour < 19
| timechart span=10m sum(count) as count
| append
    [ search ... earliest=@d latest=now
    | eval hour=strftime(_time,"%H")
    | where hour <= 7 OR hour >= 19
    | timechart span=1h sum(count) as count ]
| sort _time

Also, if you want a single timeline with custom buckets, you can create your own time buckets using eval and bin - see the sketch below.
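Here is a minimal sketch of that custom-bucket idea, assuming raw events in a placeholder index called yourindex. Each event gets both a 10-minute and a 1-hour bucket from bin, and eval picks which one applies based on the hour:

index=yourindex earliest=@d latest=now
| bin span=10m _time as b_small ``` fine-grained bucket ```
| bin span=1h _time as b_large ``` coarse bucket ```
| eval hour=tonumber(strftime(_time, "%H"))
| eval bucket=if(hour >= 7 AND hour < 19, b_small, b_large) ``` 10m during 07-19, hourly otherwise ```
| stats count by bucket
| rename bucket as _time
| sort _time

One pass over the data and no append, at the cost of doing the bucketing yourself rather than letting timechart draw the axis.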
Hello @PickleRick, I have tried implementing that, but the timestamp in the lookup is not correctly passed into the search, and I am a bit confused. Whatever timestamp value the lookup has, it is passed as a different value in the search, which feels strange. Please let me know if I have missed anything. Thanks!
@wipark

Check for replication quarantine or bundle issues
Large or problematic files (e.g., big CSV lookups) can cause replication to fail or be quarantined. Review metrics.log and splunkd.log on all SHC members for replication errors or warnings.

Test a manual change
Make a simple change to a standard file (e.g. props.conf) via the UI or REST API and see if it replicates. If standard files replicate but your custom file does not, it's likely a file location or inclusion issue.

If the cluster is out of sync, force a resync if required, eg:

splunk resync shcluster-replicated-config
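To review those logs across members without shelling into each one, a search over _internal can surface replication problems. A minimal sketch - the component filters are common SHC/configuration-replication component names, but treat them as assumptions and widen the filter if nothing matches:

index=_internal sourcetype=splunkd log_level IN (WARN, ERROR)
    (component=ConfReplication* OR component=SHC* OR "replication")
| stats count latest(_time) as last_seen by host component ``` which members are complaining, and about what ```
| convert ctime(last_seen)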