All Posts

Hi, this is what did the trick for me - save it as "dashboard-carousel.js":

(function() {
    // List of dashboard URLs to cycle through
    const urls = [
        "http://10.......",
        "http://10.......",
        "http://10......."
    ];

    // Time interval for cycling (in milliseconds)
    const interval = 10000; // 10 seconds

    // Get the current index from the URL query parameter (default to 0)
    const urlParams = new URLSearchParams(window.location.search);
    let currentIndex = parseInt(urlParams.get("index")) || 0;

    // Function to redirect to the next dashboard
    function cycleDashboards() {
        // Increment the index for the next cycle
        currentIndex = (currentIndex + 1) % urls.length;
        // Redirect to the next URL with the updated index as a query parameter
        window.location.href = `${urls[currentIndex]}?index=${currentIndex}`;
    }

    // Start the cycling process after the specified interval
    setTimeout(cycleDashboards, interval);
})();

Then reference it like this in your dashboard XML:

<dashboard version="1.0" hideChrome="true" script="dashboard-carousel.js">

I'm not familiar with these add-ons so I'm not sure how your process works. If you're indeed receiving data on the HEC input, it's up to you on the source side to export only a subset of your events. That's usually the most effective approach, because it's better not to send the data at all than to send it, receive it, and then filter it out, wasting resources on events you don't need and don't want. If you cannot do that, the document I pointed you to describes how you can filter your events.

@Jayanthan  In this case, as mentioned before, you can use Ingest Actions. It allows you to filter, mask, or route events before they are indexed. For reference, check this out: https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Using_ingest_actions_in_Splunk_Enterprise - go to Settings > Data > Ingest Actions and create a ruleset. Alternatively, you can use the traditional method on your Heavy Forwarder by configuring props.conf and transforms.conf.

Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving karma. Thanks!

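To illustrate the props/transforms route, here is a minimal sketch for dropping unwanted events on a Heavy Forwarder before they are indexed. The sourcetype name (my_cloud:events) and the regex are placeholders for whatever your add-on actually produces, so treat this as a starting point rather than a drop-in config:

# props.conf
[my_cloud:events]
TRANSFORMS-drop_noise = drop_debug_events

# transforms.conf
[drop_debug_events]
REGEX = (?i)"severity"\s*:\s*"(debug|informational)"
DEST_KEY = queue
FORMAT = nullQueue

Events matching the REGEX are routed to the nullQueue, so they never reach the indexers and don't count against your license; everything else passes through unchanged. Restart (or reload) the Heavy Forwarder after deploying the change.
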
Hi @RanjiRaje  Rather than trying to piggyback on someone else's issue, please start a new topic in Answers with details of your specific use case, what you have tried so far, and the issues you are facing.

Hi @livehybrid  I am currently using the "Splunk Add-on for Microsoft Cloud Services" and the "Splunk Add-on for AWS", using HEC to collect logs from the cloud.

My problem here is that when I connect applications hosted in the cloud to Splunk Enterprise using add-ons, I get a lot of unwanted logs. Ingesting them drives up license utilization, and continuously filtering them increases server overhead, which introduces lag in the Splunk system as a whole. I wanted to know if there is any method to filter the logs coming from the cloud before ingesting them into Splunk, so that Splunk's processing load and license usage are reduced.

Hi @Jayanthan  The Ingest Actions and props/transforms options are both suitable for Splunk Enterprise as well as Splunk Cloud. The article @PickleRick posted gives a good overview of how to filter data using props/transforms. Ingest Actions gives a more UI-friendly approach to very similar concepts if you are less familiar with props/transforms. Check out https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Sampling_data_with_ingest_actions_for_data_reduction which is quite a good overview.

How are you currently getting your data? Is it sent from cloud apps to Splunk via HEC/UF/HF, or are you pulling the data in with a specific app like the AWS TA?

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing

Hi, I am facing the same issue. Did you find any solution?

@SN1  You can create an incoming webhook in Microsoft Teams and use the Splunk webhook alert action.

Creating an incoming webhook in Teams: https://learn.microsoft.com/en-us/microsoftteams/platform/webhooks-and-connectors/how-to/add-incoming-webhook?tabs=newteams%2Cdotnet
Splunk webhook alert action: https://docs.splunk.com/Documentation/Splunk/9.4.2/Alert/Webhooks
You can also check out this Splunk app: https://splunkbase.splunk.com/app/5335

Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving karma. Thanks!

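As a rough illustration of wiring this up in savedsearches.conf, here is a minimal sketch of a scheduled alert that fires the built-in webhook action; the stanza name, the search, and the webhook URL are placeholders you would replace with your own:

[Teams webhook alert example]
search = index=main log_level=ERROR
cron_schedule = */15 * * * *
dispatch.earliest_time = -15m
dispatch.latest_time = now
enableSched = 1
counttype = number of events
relation = greater than
quantity = 0
actions = webhook
action.webhook = 1
action.webhook.param.url = https://example.webhook.office.com/webhookb2/your-webhook-id

Bear in mind that the built-in webhook action posts Splunk's own JSON payload, while Teams incoming webhooks expect their card format, so the Splunkbase app linked above is often the smoother route if you want nicely formatted messages.
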
Hi @SN1  Have you already got the Splunk Alerts for Microsoft Teams app installed that we discussed on a previous thread? There is some good info at https://splunk.github.io/splunk-alerts-for-microsoft-teams/Configuration/ covering how to set this up.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing

Here is another version which removes the limitations from append; this might be a more efficient search:

(index=_internal sourcetype=splunkd_access source=*splunkd_access.log method=POST useragent IN (axios*, curl*, python-requests*, splunk-sdk-python*, node*) NOT user IN (splunk-system-user, "-")) OR (index=_audit "info=completed" "action=search" NOT user IN (splunk-system-user, "-"))
| rex field=uri_path ".*\/search(\/v2)?\/jobs\/(?<extracted_search_id>[^\/]+)"
| eval extracted_search_id = "'" . extracted_search_id . "'"
| eval search_id = coalesce(search_id, extracted_search_id)
| where isnotnull(search_id) AND !like(search_id, "'export'")
| stats first(_time) as _time, values(host) as host, first(clientip) as clientip, first(search) as search, first(user) as user, first(useragent) as useragent by search_id
| table _time host clientip user useragent search_id search

Note - I also tweaked the regex so you can extract the search_id from clients hitting the v2 endpoints.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing

Hi @Na_Kang_Lim  join is very rarely the way to go with this; you could try the following, which uses append to join them together, although even this has limitations (50,000 events I believe) - I will put together a version without append too...

index=_internal sourcetype=splunkd_access source=*splunkd_access.log method=POST useragent IN (axios*, curl*, python-requests*, splunk-sdk-python*, node*) NOT user IN (splunk-system-user, "-")
| rex field=uri_path ".*\/search(\/v2)?\/jobs\/(?<search_id>[^\/]+)"
| eval search_id = "'" . search_id . "'"
| where isnotnull(search_id) AND !like(search_id, "'export'")
| append [search index=_audit "info=completed" "action=search" NOT user IN (splunk-system-user, "-")]
| stats first(_time) as _time, values(host) as host, first(clientip) as clientip, first(search) as search, first(user) as user, first(useragent) as useragent by search_id

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing

I want to send alerts to an MS Teams channel - how do I do it?

I want to get the `search` of the requests that were made to the search head via the API (not the UI), for example with the Splunk Python SDK. My idea is to parse the _internal log to get the search_id, then join it with the _audit log to get the search. So here is my SPL:

index=_internal sourcetype=splunkd_access source=*splunkd_access.log method=POST useragent IN (axios*, curl*, python-requests*, splunk-sdk-python*, node-fetch*) NOT user IN (splunk-system-user, "-")
| rex field=uri_path ".*/search/jobs/(?<search_id>[^/]+)"
| eval search_id = "'" . search_id . "'"
| where isnotnull(search_id) AND !like(search_id, "'export'")
| join search_id [ search index=_audit action=search info=granted | fields search_id search ]
| table _time host clientip user useragent search_id search

However, in this query the `search` column returned nothing, even though the search_id column has the correct value in the form `'<search_id>'`. If I take one of those `'<search_id>'` values and run a query like:

index=_audit action=search info=granted search_id="'<search_id>'"
| table _time search

I can get the corresponding search. Somehow my `join` command is not working.

Don't go for the 10 beta unless you're gonna do beta testing. Just spin up a normal 9.x environment and try there. Unless you're testing for compatibility between the app and Splunk 10... Anyway, as far as I remember, there was a beta license file somewhere around the place where you downloaded the installer from. You must use that file, not your "general" license. And I think there is no free/trial license tier with the beta release at this time.

There is a whole chapter in the docs on that:
https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/9.4/perform-advanced-configuration/route-and-filter-data
But that's only relevant if you can't configure your inputs (and with cloud services I suppose you'd be using pull-mode API inputs or something similar) to give you only a subset of your logs.

Hi @TestAdminHorst  There is a good explanation around this, and a workaround, at https://splunk.my.site.com/customer/s/article/The-Splunk-Add-on-for-Office-365-is-not-collecting-any-logs-for-certain-inputs-that-use-the-Microsoft-Graph-API-and-the-USGovGCCHigh-endpoint which is worth looking at. Ultimately you might have more success by sending the logs from 365 to an Azure Event Hub and then ingesting them that way.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.

Hi @PrewinThomas , Thanks for the reply. But the solutions you suggested, though they may help in filtering logs, work only on the Splunk Cloud Platform. I wanted a solution where we can filter the logs from applications hosted on AWS or Azure and ingest them into Splunk Enterprise / Splunk Enterprise Security.

Thank you for your detailed response @acharlieh! I am sad to hear that, but I understand, and after all the reading I did I had also assumed that this was indeed not currently supported. I will follow your recommendation and open a feature request; hopefully it won't take too long.

Hi @yash7172  Can you confirm that you have installed the beta license for Splunk 10? Non-beta licenses do not work on beta versions.

Was this a fresh install or an upgrade from a non-beta version?

Please can you share a screenshot of the license warning or the exact log messages? I'm fairly sure this isn't specifically related to the SentinelOne app, as I don't think it has its own license.

What does the license usage page show? What is the expiry and ingest limit on the license? I think earlier versions of the beta license expired at the end of June, but I haven't checked for a while to see if this was extended!

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
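
While gathering that information, a couple of searches may help check ingest volume and license details. This is a sketch from memory rather than a definitive recipe - run it on (or against) the license manager and adjust as needed.

Daily ingest volume charged against the license:
index=_internal source=*license_usage.log type=Usage
| eval GB = round(b/1024/1024/1024, 3)
| timechart span=1d sum(GB) AS daily_ingest_GB

License details, including quota and expiry, via the REST endpoint (field names may vary slightly between versions):
| rest /services/licenser/licenses
| table label type quota expiration_time status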