Hi, thanks very much for this great answer. This worked very well. Cheers
@mchoudhary You can try the below:

| tstats summariesonly=false dc(Message_Log.msg.header.message-id) as Blocked from datamodel=pps_ondemand where (Message_Log.filter.routeDirection="inbound") AND (Message_Log.filter.disposition="discard" OR Message_Log.filter.disposition="reject" OR Message_Log.filter.quarantine.folder="Spam*") earliest=-6mon@mon latest=now by _time
| eval Source="Email"
| eval MonthNum=strftime(_time, "%Y-%m"), MonthName=strftime(_time, "%b")
| stats sum(Blocked) as Blocked by Source MonthNum MonthName
| eventstats sum(Blocked) as Total by Source
| appendpipe [ stats values(Total) as Blocked by Source | eval MonthNum="9999-99", MonthName="Total" ]
| sort MonthNum
| eval Month=MonthName
| table Source Month Blocked

Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a kudos. Thanks!
The main question here is whether you want to send new data or move existing data.
Be aware though that not all aggregation functions are further aggregatable. For example - sum or max/min can be aggregated from smaller spans into a correct overall value but avg cannot.
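For instance, an average of hourly averages weights every hour equally regardless of how many events each hour holds. A safe pattern (the index and field names here are illustrative, not from this thread) is to carry sum and count through the smaller spans and divide only at the end:

```spl
index=web sourcetype=access_combined
| bin _time span=1h
| stats sum(response_time) as s, count as c by _time
| stats sum(s) as total_s, sum(c) as total_c
| eval overall_avg = round(total_s / total_c, 2)
```

An `avg(response_time)` over the hourly rows would instead average the per-hour averages, which only matches the true overall average when every hour has the same event count.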
Ok. Not knowing your use case, here are some general tips:

1) Don't overwrite _time unless you're absolutely sure what you're doing. If you must overwrite the timestamp with another value extracted or calculated from your event, there's a high chance your source onboarding wasn't done correctly.

2) As @gcusello already pointed out - typically the best way of handling timestamps is using the unix epoch-based value, not a strftime'd string representation.

These are general rules and sometimes there are border cases when you need to do otherwise. But here comes another painful truth:

3) Be wary of timezones.
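On that last point, timezone problems are usually best fixed at onboarding time too. A minimal props.conf sketch, assuming a sourcetype whose timestamps carry no timezone offset (the sourcetype name and timezone below are placeholders, not from this thread):

```ini
# props.conf on the parsing tier (indexer or heavy forwarder)
[my_custom_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
# The events carry no offset, so state the source's timezone explicitly;
# otherwise Splunk falls back to the parsing host's local timezone.
TZ = UTC
```

With an explicit TZ, the same log file parses to the same epoch values no matter which indexer ingests it.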
Instead of computing month before xyseries, it's better to carry _time into xyseries and use transpose to get your final layout. Unlike xyseries, transpose preserves row order into column order. But then, given that you only have one prescribed "source", I wonder if xyseries and streamstats are a waste. How about:

| tstats summariesonly=false dc(Message_Log.msg.header.message-id) as Blocked from datamodel=pps_ondemand where (Message_Log.filter.routeDirection="inbound") AND (Message_Log.filter.disposition="discard" OR Message_Log.filter.disposition="reject" OR Message_Log.filter.quarantine.folder="Spam*") earliest=-6mon@mon latest=now by _time span=1mon@mon
| eval Month=strftime(_time, "%b")
| transpose header_field=Month column_name=Source
| eval Source = "Email"
| fillnull value=0
| addtotals

Here, I removed the first stats sum because by using span=1mon@mon in tstats, that calculation is already done. I also removed eventstats and streamstats because the total on each row is more easily computed with addtotals.
Hi @smanojkumar, in general, to compare timestamps it's always better to transform both of them into epoch time format (using the strptime function of the eval command). Ciao. Giuseppe
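As a quick illustration of this (the field names and format string are made up for the example), both sides of the comparison end up as plain epoch numbers, so arithmetic and comparisons behave correctly:

```spl
| eval start_epoch = strptime(start_date, "%Y-%m-%d %H:%M:%S")
| eval end_epoch   = strptime(end_date, "%Y-%m-%d %H:%M:%S")
| eval duration_s  = end_epoch - start_epoch
| eval is_late     = if(end_epoch > start_epoch + 3600, "yes", "no")
```

Comparing the original string representations instead would only work by accident when the format happens to sort lexicographically.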
@SN1 If you're moving the entire instance (including historical data, configs, and users) from one machine to another: Migrate a Splunk Enterprise instance from one physical machine to another | Splunk Docs

Restore archived indexed data (bucket-level transfer): if you want to move specific historical data (e.g., cold/frozen buckets) to another instance: Restore archived indexed data | Splunk Docs. You can copy bucket files into the thaweddb directory of the target index on the new instance. This is ideal for selective historical data recovery.
Hi @SN1,
let me understand: you have two standalone Splunk servers and you want to send the data of an index from the second to the first, is that correct?

If this is your requirement, the first question should be: why? But anyway, I need two more pieces of information for your solution:

is there another Heavy Forwarder forwarding these logs?
do you want to forward all the data or only the data of one index?

If the logs pass through another full Splunk instance (a Heavy Forwarder), you have to work on it, otherwise on ServerB. You have to create a fork following the instructions at https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/forward-data-to-third-party-systems

If you want to forward all logs, you can configure forwarding and receiving [Settings > Forwarding and Receiving > Forwarding] with the option "Index and forward"; in this way you forward all logs while maintaining a local copy of them. For more information see https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/forward-data-to-third-party-systems#forward-all-data-0

If instead you want to forward only a subset of data, you have to use the configurations at https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/forward-data-to-third-party-systems#forward-a-subset-of-data-0

Ciao. Giuseppe
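As a rough sketch of the "Index and forward" setup just described, assuming ServerA listens on the conventional receiving port 9997 (the host name is a placeholder):

```ini
# outputs.conf on ServerB (the sending instance)
[indexAndForward]
# keep a local copy of everything that is forwarded
index = true

[tcpout]
defaultGroup = to_serverA

[tcpout:to_serverA]
server = serverA.example.com:9997
```

On ServerA you would enable receiving on that port (Settings > Forwarding and receiving > Receive data) and make sure the target index is defined in indexes.conf.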
Hi @Paul_Szoke It looks like the developers of the app no longer list it on their site at all - I've searched high and low! It could be that they no longer maintain it but may be able to provide an earlier version. I think the best action here would be to contact them (https://www.rojointegrations.com/contact) for more info and to see if they can provide the app.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Hi @SN1 Can you confirm - is this historic data that has already been indexed, or new data which is being received currently?

If you are currently receiving data into A and want to send it to B, then check out https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/forward-data-to-third-party-systems#forward-all-data-0

If you are looking to move old indexes from A to B, then the easiest way is to copy the buckets from one system to the other; if they are both standalone instances with unique GUIDs then this should be fine - just make sure you define the indexes in indexes.conf.

Check out https://help.splunk.com/en/splunk-enterprise/get-started/install-and-upgrade/9.4/upgrade-or-migrate-splunk-enterprise/migrate-a-splunk-enterprise-instance-from-one-physical-machine-to-another for more information on how to migrate from one to the other.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Hello, I have an index (A) on an indexer and another index (B) on a search head (we are making it standalone). I want to send data from index A to B. How do I proceed? I have admin rights.
Apigee API Management Monitoring App for Splunk | Splunkbase - the "Visit Site" link gives this message: "404 Sorry this page is not available. The link you followed may be broken, or the page may have been removed." Are there any alternative ways to download?
Hi @Praz_123 @kiran_panchavat You need to set MAX_DAYS_AGO=5000 and MAX_TIMESTAMP_LOOKAHEAD=20.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
@kiran_panchavat I got the answer, thanks - I used the props.conf below. @livehybrid thanks for your help!

[<SOURCETYPE NAME>]
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
TIME_PREFIX="ds":\s"
MAX_TIMESTAMP_LOOKAHEAD=20
MAX_DAYS_AGO=5000
@livehybrid Thanks for your help - it finally worked!
@Praz_123 Select the source type - I can see it's currently showing the default.
@kiran_panchavat Same issue again - whether I upload it as .json or as .txt, neither gives me results.
Hi @Praz_123 I think the issue here could be that in the original data I had for my example, the date was in 2023; however, in this example the data is in 2012. In props.conf there is a MAX_DAYS_AGO setting which defaults to 2000 days - which lands some time in 2019 - and if the date you want to extract is prior to that, then you need to increase MAX_DAYS_AGO! Try setting MAX_DAYS_AGO=5000:

[yourSourceType]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20
MAX_DAYS_AGO=5000

If this doesn't work then please show the error by hovering over the error icon.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
@Praz_123 Check the data which you uploaded - it should be in .json format, not .txt format.

[jsontest]
CHARSET=UTF-8
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=true
category=Custom
pulldown_type=true