All Posts


Hi @chenfan, as I said, upgrade your platform using the upgrade path described in the documentation and don't skip any steps! Then you can upgrade your apps. Ciao. Giuseppe
Hi @Keith_NZ, first of all, in addition to the screenshots, please also add the code and a sample of your logs in text format using the "Add/Edit Code sample" button. Then, if you are extracting from _raw, you don't need to declare it explicitly with the field option. Finally, your first rex expression is almost correct: you have to declare the format of the field (e.g. if it's numeric you have to add \d or something similar), and then declare the pattern that identifies the string to extract as a field. For example, to extract the postCode you could use: rex "postCode\\\":\\\"(?<postCode>\d+)" In this specific case, beware of backslashes: to use them in Splunk you have to add an additional backslash. The last one, | rex field=_raw reg_str, isn't correct because it isn't a field extraction. Ciao. Giuseppe
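For reference, here is a minimal runnable sketch of that kind of extraction; the sample event and the postCode value are made up for illustration, so adjust the escaping to match how quotes and backslashes actually appear in your _raw:
| makeresults
| eval _raw = "{\"postCode\":\"2041\",\"status\":\"ok\"}"
| rex "postCode\\\":\\\"(?<postCode>\d+)"
| table postCode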
Hi @gcusello, thank you for your reply! Can I upgrade the platform from version 7.x.x to version 9.3.x and then upgrade all of the Apps/Add-ons to their latest versions? Will this have an impact on my system?
Hi All, I am new to Power BI. My question is: how do we integrate Splunk with Power BI? Is there an official guide or manual from Splunk on how to configure this integration on both sides? Cheers, Zamir
Let me describe the exact symptoms. Splunk Enterprise is currently running as two separate tiers: a search head server and an indexer server. The server environment is as follows.
OS version: CentOS 7
Splunk version: 9.0.4
RAM: 256 GB
swap: 16 GB
On average I'm only using about 5% of memory, but swap usage is at 100%.
As several people have urged you, please post a complete sample event, not screen cutouts. You can sanitize the sample any way you like, but keep quotation marks, commas, curly brackets, and square brackets in their exact places. Meanwhile, the cutouts give me enough info to determine that part of the event is JSON. Here is an experiment for you:
| rex "^[^{]+(?<only_json>.+})"
| spath input=only_json
See if more fields get extracted.
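To illustrate the idea, here is a minimal runnable sketch against a made-up event with some leading text followed by a JSON payload (the timestamp, host, and JSON fields are invented for the example):
| makeresults
| eval _raw = "2024-05-01 10:00:00 host1 {\"status\":\"ok\",\"code\":200}"
| rex "^[^{]+(?<only_json>.+})"
| spath input=only_json
The rex captures everything from the first { through the final } into only_json, and spath then extracts status and code as search-time fields.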
On the deployer, right, and then push it to the SHs? And where can I configure this?
@Karthikeya If a Heavy Forwarder (HF) is not available, install it on the search head.
Thanks, I changed the config and that resolved the problem. Best regards.
We have a deployment server which receives data from UFs, and we have a cluster manager, a deployer, and SHs. Where should we install and configure this add-on: on the DS, the deployer, or the SHs? Please confirm, I am confused. We don't have an HF at the moment. Normally, where do we need to configure data inputs?
Hi All, in an SPL2 Ingest Pipeline I want to assemble a regular expression and then use it in a rex command, but I am having trouble. For example, this simple test where I specify the regex as a text string on the rex command works: But this version doesn't: Any idea what I am doing wrong? Thanks
The search command cannot search for '*'. The '=' character also is a challenge. You can, however, use regex to filter on these and other "special" characters.
| eval msxxxt="*Action=GexxxxdledxxxxReport Duration=853*"
| regex "="
| rex "Duration=(?<Duration>\d+)"
| timechart span=1h avg(Duration) AS avg_response by msxxxt
Hello isoutamo, thank you for the links; a lot of useful info. I am not an expert in the area of PKI certificates etc., I have a basic understanding only. The term leaf certificate was new to me. Ptrsnk
Hi, I am new to the Ingest Processor and have had some success, but I am having an issue with the rex command, so I have created a very simple example copied from the manual here: https://docs.splunk.com/Documentation/SCS/current/SearchReference/RexCommandExamples#2._Regular_expressions_with_character_classes But I am getting this error: Any ideas why? Thanks
Hi @whar_garbl
I think what you have done with "CHECK_METHOD" in props.conf should work.
[source::<yoursource>]
CHECK_METHOD = modtime
However, you may also need to set the crcSalt in inputs.conf:
[monitor://<path>]
crcSalt = <SOURCE>
Here are a few other useful links which might also help!
https://community.splunk.com/t5/Getting-Data-In/Ingesting-file-data/td-p/81645
https://community.splunk.com/t5/Knowledge-Management/Modtime-is-newer-than-stored-will-reread-file-with-9-x-x/td-p/677930 << Beware of this possible bug
Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards
Will
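To verify whether Splunk is actually re-reading the file after a change like the one above, a search along these lines against the internal logs can help (the exact wording of the log message varies by version, so treat this as a sketch and adjust the path to your own source):
index=_internal sourcetype=splunkd ("will reread" OR "will re-read") "<yoursource>"
If matching events appear after the file is modified, the CHECK_METHOD / crcSalt change is taking effect.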
Further to my last message - this is a great blog post on getting started with UCC, so it is well worth checking out: https://www.splunk.com/en_us/blog/customers/managing-splunk-add-ons-with-ucc-framework.html Let us know how you get on and if you have any further questions. Will
Hi @dolj
If there isn't already a Splunkbase app for the API you want to work with, then you may be best off using the Splunk Universal Configuration Console (UCC) framework to build yourself a custom app. This has had much more development recently than Add-on Builder and is easier to manage going forward. Here is a sample app which might give some insight into how it works; it is taken from a .conf talk I did in 2023 on creating a simple API app: https://github.com/livehybrid/conf23-dev1091b/ Also have a look at the UCC docs (https://splunk.github.io/addonfactory-ucc-generator/) for more information and to get started. Please let me know how you get on and consider adding karma to this or any other answer if it has helped. Regards Will
Hi @Namdev
Please could you confirm which user the Splunk Forwarder is running as? Is it splunkfwd, splunk, or something else?
Please could you show a screenshot of the permissions on the /opt/log files in question?
Did you run anything like this against the directory to give Splunk access?
setfacl -R -m u:splunkfwd:r-x /opt/log
Are there any logs in splunkd.log relating to these files?
Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards
Will
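As a starting point for that last question, a search along these lines should surface anything the forwarder logged about that path, assuming its _internal logs are being forwarded (the host filter is a placeholder, so adjust it to your forwarder's name; this is only a sketch):
index=_internal sourcetype=splunkd host=<your_forwarder> "/opt/log" (ERROR OR WARN)
Warnings about insufficient permissions from the file-tailing components would confirm it is a file-access problem rather than a config issue.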
Hi @DaveyJones  Please could you provide the search you ended up using so I can look into this further for you? Thanks Will
Hi @cbyrd
Given that the 400 error is coming from the Google API, I'd start off by checking for config issues on the Google side.
Check API permissions: Ensure that the Google Workspace service account you're using has the necessary permissions to access user data. The service account should have the "Directory API" enabled and the appropriate scopes granted, such as https://www.googleapis.com/auth/admin.directory.user.readonly.
Verify API scopes: Double-check that the OAuth 2.0 scopes configured for the service account include the necessary permissions. You might need to add or adjust scopes in the Google Cloud Console.
Customer ID: Ensure that the customer parameter in the API request is correct. It should be the unique ID of your Google Workspace account. You can find this ID in the Admin console under Account settings.
View type: The viewType parameter can be either admin_view or domain_public. Make sure that the view type you are using is appropriate for your use case and that the account has the necessary permissions to access the data with that view type.
API quotas and limits: Check if you are hitting any API quotas or limits. Google APIs have usage limits, and exceeding them can result in errors.
Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards
Will