All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi everyone! I wrote a search query to get the blocked count of emails for the last 6 months. Below is my query:

| tstats summariesonly=false dc(Message_Log.msg.header.message-id) as Blocked from datamodel=pps_ondemand where (Message_Log.filter.routeDirection="inbound") AND (Message_Log.filter.disposition="discard" OR Message_Log.filter.disposition="reject" OR Message_Log.filter.quarantine.folder="Spam*") earliest=-6mon@mon latest=now by _time
| eval Source="Email"
| eval Month=strftime(_time, "%b")
| stats sum(Blocked) as Blocked by Source Month
| eventstats sum(Blocked) as Total by Source
| appendpipe [ stats values(Total) as Blocked by Source | eval Month="Total" ]
| xyseries Source Month Blocked
| fillnull value=0

Its output is close to what I want; the only issue is that the Month field is sorted alphabetically rather than chronologically. I intend to sort it chronologically. I also tried the query below to achieve the desired output, but no luck:

| eval MonthNum=strftime(_time, "%Y-%m"), MonthName=strftime(_time, "%b")
| stats sum(Blocked) as Blocked by Source MonthNum MonthName
| eventstats sum(Blocked) as Total by Source
| appendpipe [ stats values(Total) as Blocked by Source | eval MonthNum="9999-99", MonthName="Total" ]
| sort MonthNum
| eval Month=MonthName
| table Source Month Blocked

Could someone please help here? Thanks in advance.
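The underlying fix is to sort on a numeric key while displaying the month name — in SPL that means sorting by a "%Y-%m"-style key before building the final table. The same idea in a small Python sketch (illustrative only; the month values are made up):

```python
from datetime import datetime

# Hypothetical stand-in for the Month values the search produces
months = ["Apr", "Aug", "Feb", "Jan", "Jul", "Jun", "Mar", "May"]

# Alphabetical sort -- the unwanted behaviour seen in the table
alphabetical = sorted(months)

# Chronological sort: key on the month number parsed back from "%b"
chronological = sorted(months, key=lambda m: datetime.strptime(m, "%b").month)

print(chronological)  # ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug']
```

The point is that the display label and the sort key are two different fields; sorting the label directly can only ever give alphabetical order.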
Hi @ralphsteen

There is some free veteran training over at https://workplus.splunk.com/veterans as part of the WorkPlus+ scheme, so you may be able to use this to get onto the Enterprise Security (ES) training. However, if it's specifically CompTIA Security+ you're after, you might need to contact them through their site to see why a cost is showing against the training.

Did this answer help you? If so, please consider: adding karma to show it was useful; marking it as the solution if it resolved your issue; commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
Is there a special log-in for the Veterans Workforce Program? Am I currently signed in as a regular user? I signed up for the Veterans Workforce Program a while back and thought I got a confirmation, but now I can't find it. Under that program, is there a free course for Splunk Enterprise Security? When I find it under this login, there is a price for that course. That course is pre-approved by CompTIA for PDUs to renew my Security X, which is why I want to take it. Any help would be appreciated. Ralph P Steen Jr
Hi @berrybob

When testing with curl, were you using the same Pod address as configured in DSDL, or going directly to the Pod IP? Are you able to hit port 5000 on the container host and reach the API within the Pod?
[yourSourceType]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20
Okay @Praz_123, let's try again!

[yourSourceType]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\S\s\n]+"predictions":\s\[\s*)|}(\s*\,\s*){|([\s\n\r]*\][\s\n\r]*}[\s\n\r]*)
NO_BINARY_CHECK=true
TIME_PREFIX="ds":\s"
TIME_FORMAT=%Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=20
Still waiting for the answer.
When importing playbooks from the Splunk Research repository (https://research.splunk.com/playbooks/), the imported playbooks appear with "Input" status and cannot be activated through the standard interface. Additionally, attempts to delete these inactive playbooks result in errors or incomplete deletion. My questions:
1. Is there a recommended way to import and activate them? (I understand they still need configuration, such as an API.)
2. Why can't I delete them from the playbook list even though I am logged in with an admin-privileged account?
@livehybrid I'm able to break down the events, but I still can't extract the date-time information; I'm getting an error.
Ah sorry about that! Leave it with me, just working on it locally to check.
I will keep it short: we found a solution to the errors. Just restarting the indexer or search head that throws the errors, or making a new search-peer connection, won't help. We shut down the whole Splunk farm: indexers, search heads, licence server, deployment server, etc. Once all servers are off, start them again. Everything resumed working fine without errors.
@livehybrid Now it came in as one event.
Hi @Praz_123

Under Advanced, try setting LINE_BREAKER to:

"predictions"\s*:\s*\[|}\s*,\s*{|}\s*\]?
Thanks, Kiran, for the support.
While adding the data into Splunk, I am getting an error. The data looks like this:

{
  "version": "200",
  "predictions": [
    {
      "ds": "2023-01-01T01:00:00",
      "y": 25727,
      "yhat_lower": 23595.643771045987,
      "yhat_upper": 26531.786203915904,
      "marginal_upper": 26838.980030149163,
      "marginal_lower": 23183.715141246714,
      "anomaly": false
    },
    {
      "ds": "2023-01-01T02:00:00",
      "y": 24710,
      "yhat_lower": 21984.478022195697,
      "yhat_upper": 24966.416390280523,
      "marginal_upper": 25457.020250925423,
      "marginal_lower": 21744.743048120385,
      "anomaly": false
    },
    {
      "ds": "2023-01-01T03:00:00",
      "y": 23908,
      "yhat_lower": 21181.498740796877,
      "yhat_upper": 24172.09825724038,
      "marginal_upper": 24449.705257711226,
      "marginal_lower": 20726.645610860345,
      "anomaly": false
    },
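For context, the shape of the ingestion problem can be sketched outside Splunk: the goal is for each element of the predictions array to become a separate event, which is what the LINE_BREAKER suggestions in this thread aim to do at index time. A minimal Python illustration (the values here are made up and the sample is truncated to two records):

```python
import json

# Illustrative sample mirroring the structure in the post
raw = """{
  "version": "200",
  "predictions": [
    {"ds": "2023-01-01T01:00:00", "y": 25727, "anomaly": false},
    {"ds": "2023-01-01T02:00:00", "y": 24710, "anomaly": false}
  ]
}"""

doc = json.loads(raw)

# Split the predictions array so each element becomes its own event --
# the same outcome an index-time LINE_BREAKER on the array boundaries
# is trying to achieve
events = [json.dumps(p) for p in doc["predictions"]]
print(len(events))  # 2
```

Each resulting event then carries its own "ds" timestamp, which is why the sourcetype pairs the line breaking with TIME_PREFIX="ds":\s".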
Thank you for the response. I tried your solution but still get results for only one day. I wonder whether this line may cause the unwanted one-day results:

status latest(test) as tests latest(_time) as _time

Maybe I shouldn't use the 'latest' aggregation function for 'test' and '_time'? But I don't know how else to pass these values to the 'timechart' function.
Hi @Karthickb2308

As others have mentioned, there aren't currently any Splunkbase apps that write back to ManageEngine ITSM for CMDB synchronization and automated ticket creation from Enterprise Security alerts. However, you can achieve this in a couple of ways:

1. Custom app: use the ManageEngine API (https://www.manageengine.com/products/service-desk/sdpod-v3-api/SDPOD-V3-API.html) to build a custom app with the Splunk UCC Framework. UCC is a great way to start building inputs (to import your CMDB data) and modular alert actions (to raise incidents from Enterprise Security). Also see https://dev.splunk.com/enterprise/docs/devtools/python/sdk-python/howtousesplunkpython/howtocreatemodpy/ for more background on creating inputs.

2. Use the REST API Modular Input add-on to call the same ManageEngine API from within SPL: scheduled searches can use the app's "curl" command against ManageEngine's REST API to fetch CMDB data. You could create a macro that writes incidents with the same command and call it at the end of searches where you would normally fire an alert action. Note: the "curl" command doesn't actually use curl, so not every parameter is supported; it uses Python requests under the hood (see https://www.baboonbones.com/php/markdown.php?document=rest/README.md).

Hopefully one of these two options helps you move forward with your ManageEngine-to-Splunk integration. Please let me know if you have any questions.
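As a rough starting point for the custom-app route, here is a hedged sketch of building a ServiceDesk Plus Cloud "create request" call from a Splunk alert action. The endpoint path (/api/v3/requests), the input_data form field, and the authtoken header are assumptions based on ManageEngine's v3 API documentation; verify all of them against your own instance before use.

```python
import json
import urllib.parse
import urllib.request

def build_create_request(base_url: str, token: str,
                         subject: str, description: str):
    """Build (but do not send) a ServiceDesk Plus v3 create-request call.

    All field names below are assumptions to verify against your instance.
    """
    payload = {"request": {"subject": subject, "description": description}}
    # The v3 API expects the JSON payload form-encoded under "input_data"
    body = urllib.parse.urlencode({"input_data": json.dumps(payload)})
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/api/v3/requests",
        data=body.encode(),
        headers={
            "authtoken": token,  # assumption: your auth scheme may differ
            "Content-Type": "application/x-www-form-urlencoded",
        },
        method="POST",
    )

# Hypothetical usage from an alert action (no network call is made here)
req = build_create_request(
    "https://sdpondemand.example.com", "YOUR_TOKEN",
    "ES notable: suspicious login", "Raised automatically from Splunk ES",
)
print(req.get_method(), req.full_url)
```

In a real modular alert action you would send this with urllib.request.urlopen (or requests) and handle the response code, but the request-building shape is the part that varies least across the two integration options above.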
To clarify, there are two distinct aspects to your requirements:

1. If the date of the event matches one in the lookup, do not send an alert, no matter what the search result is.
2. On days that do not match any date in the lookup, send an alert if the search result is 0 or greater than 1.

If this is true, the event count must come before the date match, or together with it:

index=xxxxxx
| eval HDate=strftime(_time,"%Y-%m-%d")
| lookup Date_Test.csv HDate output HDate as match
| stats count values(match) as match by HDate
| where isnull(match) AND count != 1

The by HDate clause validates the event date in case the search crosses calendar dates.
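The alert condition itself is simple enough to sketch outside SPL: suppress on dates listed in the lookup, otherwise alert when the day's event count is anything other than exactly 1. The dates and counts below are invented for illustration:

```python
# Stand-in for the dates held in Date_Test.csv
suppress_dates = {"2024-01-01", "2024-12-25"}

# Hypothetical per-day event counts from the search
daily_counts = {
    "2024-01-01": 0,  # matches lookup -> never alert
    "2024-01-02": 1,  # exactly one event -> no alert
    "2024-01-03": 3,  # too many events -> alert
    "2024-01-04": 0,  # zero events -> alert
}

# Mirrors: | where isnull(match) AND count != 1
alerts = sorted(d for d, c in daily_counts.items()
                if d not in suppress_dates and c != 1)
print(alerts)  # ['2024-01-03', '2024-01-04']
```

Note that a "zero events" day only appears if the search produces a row for it; in SPL that may require filling in missing days (e.g. with timechart or appendpipe) before the where clause.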
@Karthickb2308

There is no one-click integration for CMDB or ticketing, but the REST API and Splunk alert actions make it achievable. Use the ServiceDesk Plus Splunk app for supported ticket actions (if you have Splunk SOAR), or build your own with Python/REST. For the CMDB, use exports or the API to sync data into Splunk for enrichment and correlation. A simple alternative: if you can't use the API, configure Splunk to send alert emails to ManageEngine's ticket-creation email address (less flexible, but simple).
Thanks @PrewinThomas. Do you have a sample custom response handler that outputs both the status code and the body?