All Posts

Forget you ever heard about Search Filters.  They usually cause more problems than they solve. TRANSFORMS are index-time operations so they will mask data for everyone. What you want is Field Filters.  They automatically mask fields in search results based on user roles.  See https://docs.splunk.com/Documentation/Splunk/9.4.2/Security/searchfieldfilters for more information.
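For illustration, a minimal sketch of what a field filter definition might look like in authorize.conf - the role name, field names, and replacement strings below are all made up, and the exact setting syntax should be checked against the docs linked above:

# authorize.conf (hypothetical example - verify syntax against the field filter docs)
[role_restricteduser]
# mask the email field's value with a literal string in this role's search results
fieldFilter-email = "xxx@xxx.xxx"
# mask the extracted token field the same way
fieldFilter-token = "xxx.xxx.xxx"

On Splunk Cloud you don't edit .conf files directly; field filters can be managed from Splunk Web, which is part of what makes them a better fit than TRANSFORMS on a Splunk-managed stack.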
No license is needed for a standalone server that only searches thawed data since there is no ingest.
Hello folks, We use Splunk Cloud Platform (managed by Splunk) for our logging system. We want to implement role-based search filtering to mask JWT tokens and emails in the logs for certain users.

Example roles: User, RestrictedUser. Both roles have access to the same index: main. Users can query as normal, but if a RestrictedUser searches the logs then they should get the logs with the token and email data masked.

Documentation/community posts/Gemini recommended adding regex for filtering in transforms.conf and updating some other .conf files like so:

# transforms.conf
[redact_jwt_searchtime]
REGEX = (token=([A-Za-z0-9-]+\.[A-Za-z0-9-]+\.[A-Za-z0-9-_]+))
FORMAT = token=xxx.xxx.xxx
SOURCE_KEY = _raw

[redact_email_searchtime]
REGEX = ([A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,})
FORMAT = xxx@xxx.xxx
SOURCE_KEY = _raw

# props.conf
[*]
TRANSFORMS-redact_for_search = redact_jwt_searchtime, redact_email_searchtime

# authorize.conf
[test_masked_data]
srchFilter = search_filters = redact_for_search

then creating an app and uploading it on the cloud platform. Since the platform is managed by Splunk, I'm not sure if that would be sufficient or even work.

Anyone have suggestions on the best way to apply role-based search filters when on Splunk Cloud rather than on premise?
Hello all. Is the Nutanix TA (version 2.5.0) compatible with Splunk 9.3.4+? It is listed as such on Splunkbase (https://splunkbase.splunk.com/app/3103), but when I attempted to upgrade I got: Unable to initialize modular input "abc" defined in the app "TA-nutanix": Introspecting scheme=nutanix_health: script running failed (PID 2607550 exited with code 1).
I don't see a Splunkbase add-on for Airtable. Is this a private app for Splunk Enterprise? It would be nice to have something that works for Splunk Cloud.
Thanks for the replies. I will clarify. Management wants me to test thawing old data so it is searchable (near term) or can be moved to cloud possibly later this year. DDSS and DDAA will be part of the discussion a bit down the road, but for now I need to test/verify thawing from frozen. We are going to retire our on-prem infrastructure at some point. The thawed data does not have to go to our production cluster, so a standalone single-instance Splunk server would work. If I stand up a new single-instance server, is there any licensing I need to worry about if I'm just using it to thaw frozen data?
It's not clear how this relates to cloud migrations.

If you sign up for Splunk Cloud's Dynamic Data Self Storage (DDSS) service, then data archived in the cloud is the same as data archived on-prem. You must thaw the data then stand up indexers to process it.

If you sign up for Splunk Cloud's Dynamic Data Active Archive (DDAA) service, then you use the GUI to tell Splunk what data to restore for you and it becomes searchable for a limited time (30 days, IIRC). External data cannot be added to DDAA.

Either way, there's no need to migrate currently-frozen data to the cloud.
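For reference, thawing is a copy-and-rebuild exercise on whichever indexer holds the thaweddb directory. A minimal sketch, assuming an index called main and a bucket directory name that is purely illustrative:

# copy the frozen bucket into the index's thaweddb directory
cp -r /archive/frozen/main/db_1388368387_1388298388_3 $SPLUNK_DB/main/thaweddb/

# rebuild the bucket's index and metadata files
splunk rebuild $SPLUNK_DB/main/thaweddb/db_1388368387_1388298388_3

# restart so the thawed bucket becomes searchable
splunk restart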
The HOST field worked! Thanks @new_splunker. I'm trying to ingest a webhook into the Splunk Cloud trial instance. I'm getting an SSL certification error - "failed to verify the legitimacy of the server and therefore could not establish a secure connection to it". Were there any other settings you did to establish the webhook connection?
Ah okay, I'm sorry, I'm not too familiar with the app, but hopefully someone else on here might have experience with it. Have you seen the "Details" tab on https://splunkbase.splunk.com/app/5365, which has some setup instructions?
I have installed this one but I've not been able to get it working. I'm using the same proxy as with the Splunk Add-on for Microsoft Office 365. I've put in an incorrect secret key, but I don't get any kind of error like I do with the Splunk Add-on for Microsoft Office 365.
Hi @mikefg

I take it you just need to thaw the data so it can be copied to your Splunk Cloud instance? Is PS doing this work? If so, they might have a preference as to where this data is or how it's accessed as part of the wider migration piece (there may be other bits of info I'm unaware of), e.g. is this an online SmartStore migration, or a data copy?

However - personally (and without knowing what I don't know!) I would go with creating an instance connected to your old storage array. You actually only need a standalone Splunk instance to thaw out data, and if you don't need to search it until it's moved to Splunk Cloud then you shouldn't need to scale it out too much - unless you really have a lot to thaw out. Once it is thawed it will be in a format which can be used with existing processes for migrating to Splunk Cloud.
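If it helps, the thaw location is set per index with thawedPath in indexes.conf, so a standalone instance can keep hot/cold buckets local while thawing onto the secondary array. A rough sketch - the mount point and index name are placeholders:

# indexes.conf on the standalone thaw instance (paths are illustrative)
[main]
homePath   = $SPLUNK_DB/main/db
coldPath   = $SPLUNK_DB/main/colddb
# thaw directly onto the secondary storage array
thawedPath = /mnt/secondary_array/splunk/main/thaweddb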
Hi, I have created a playbook and am trying to run it from an event, but the playbook does not populate when I click on "run playbook". What is it that I am doing wrong?
Hi @anlePRH

Are you already producing the table you shared in your original post, or is that what you are wanting to get to? You should be able to use the following after your rex:

| stats list(SourceIP) as IPs, count as Count by Subnet
I have created a playbook and am trying to run it from an event I have configured, but when I click on "run playbook" my playbook does not show in the list. What is it that I am missing?
Hi @vishalduttauk

Have you seen the Microsoft O365 Email Add-on for Splunk? Its description includes "The Microsoft® O365® Email Add-on for Splunk® ingests O365 emails via Microsoft's Graph API.", so I think this might give you the email content that you need! Check it out and let me know if you need any further help!
This project is to test for a potential on-prem to cloud migration. I need to thaw several terabytes of frozen Splunk data. It has been frozen over the past several years from an indexer cluster to offline repos. The storage array where my existing indexer cluster resides doesn't have enough disk space to bring it all back. I have a secondary storage array that I can use that has plenty of space, but I can't move my existing cluster.

I need help understanding/deciding:

1. Should I build new indexers on the secondary array, add them to the existing cluster, and thaw data to them?
2. Should I build a new cluster with new indexers on the secondary array and thaw the data there?
3. Maybe it's easiest to just build one new standalone indexer on the secondary array and thaw all data to this one new standalone indexer?

The data will need to be searchable/exportable. I have only one search head (no search head cluster).
Hi there,

We have an on-prem Exchange mailbox which we monitor via the Exchange logs. We pick out key words from the subject line to trigger alerts.

Our mailbox is moving into Exchange Online, so I've been working with our Azure team and managed to integrate Splunk Enterprise (on-prem) with a test online mailbox. So far I am ingesting generic information about the mailbox via the Splunk Add-on for Microsoft Office 365 - information like Issue Warning Quota (Byte), Prohibit Send Quota (Byte), and Prohibit Send/Receive Quota. The two inputs I've created are Message Trace and Mailbox (which ingests the mailbox data above).

What I want to do is ingest the emails themselves - the key information like subject, the body (if possible), from address, and to address. Is this possible using this add-on?
Hello, This is installed directly on the Splunk Cloud instance. I just started using Splunk about a week ago. To my knowledge, I don't have CLI access to modify any files. I also don't see why I would need to, as there is no mention of a need to in the instructions. They seem to have built everything you would need into the app configuration pages, such as fields to input the API key and whatnot.

I also found the thread you mentioned, but it seems no one was able to come up with a solution then either.
Hello, I only have this one app from S1 installed on the indexer/search head, which is in Splunk Cloud.
It would be helpful to know what you've tried already and how those efforts failed to meet expectations. Perhaps this will help.

| rex field=SourceIP "(?<Subnet>\d+\.\d+\.\d+)\."
| stats count as Count, values(SourceIP) as IPs by Subnet