All Posts

Hi, I am trying to mask some passwords but I cannot figure out the proper props.conf (ha!) for it. It works on the fly but not when I try to set it in props.conf. This is my mask on the fly, basically just replacing the password with some characters:

rex mode=sed field=ms_Mcs_AdmPwd "s/ms_Mcs_AdmPwd=(\w+)/###\2/g"

and this is the raw data from sourcetype: ActiveDirectory

Additional Details:
msLAPS-PasswordExpirationTime=133579223312233231
ms-Mcs-AdmPwd=RlT34@iw4dasdasd

How would I do this in props.conf or transforms.conf?

Oliver
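For index-time masking, the usual approach is a SEDCMD in props.conf on the indexer or heavy forwarder. A minimal sketch, assuming the sourcetype is ActiveDirectory; note the raw data uses hyphens (ms-Mcs-AdmPwd) while the on-the-fly rex used underscores, so the pattern here targets the hyphenated form and may need adjusting:

```
# props.conf -- sketch, assumes sourcetype ActiveDirectory
[ActiveDirectory]
SEDCMD-mask_admpwd = s/ms-Mcs-AdmPwd=\S+/ms-Mcs-AdmPwd=####/g
```

This rewrites the event before indexing, so the cleartext password never lands on disk; a TRANSFORMS/transforms.conf pair would also work but SEDCMD is simpler for a plain substitution.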
The Splunk readiness app cannot determine if the Mission Control app is Python compatible.
There are a few ways to do that.  I like to use rex. | rex field=software "cpe:\/a:(?<software_vendor>[^:]+):(?<software_product>[^:]+):(?<software_version>.*)"  
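Outside Splunk, the same extraction logic can be sanity-checked with Python's re module. A quick sketch; the sample cpe string is illustrative, and Python does not need the escaped slash that SPL's quoted regex uses:

```python
import re

# Same capture groups as the rex above, as Python named groups
pattern = re.compile(
    r"cpe:/a:(?P<software_vendor>[^:]+):(?P<software_product>[^:]+):(?P<software_version>.*)"
)

# Hypothetical sample value from the software{} field
sample = "cpe:/a:vendor:product:version"

m = pattern.match(sample)
parts = (
    m.group("software_vendor"),
    m.group("software_product"),
    m.group("software_version"),
)
print(parts)  # ('vendor', 'product', 'version')
```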
@AKG11 "One host could have multiple process running on it. Some times even same process running on multiple host" - Yes, I would expect that, and that would work fine with the example I provided you. You can make the service process-centric if you want and scale the service tree by process, i.e. filter by process and split by process_name_host_name, and similarly a separate garbage collector service. You can even have both, i.e. host and process GC entities in one service, and filter by an entity of either in the KPI. But to be honest that sounds a bit wonky to me, but you know your services best.

/Seb
Currently, I have a search that returns the following:

Search: index=index1 sourcetype=sourcetype1 | table host, software{}

host          software
hostname      cpe:/a:vendor:product:version
              cpe:/a:vendor:product:version
              cpe:/a:vendor:product:version
              (one row per installed package)
hostname      cpe:/a:vendor:product:version
              ...

Here, multiple software entries are tied to one hostname, all under a single multivalue field called software{}. What I am looking for is a way to split the software field into three fields by extracting the vendor, the product, and the version separately, to return:

host          software_vendor    software_product    software_version
hostname      vendor             product             version
              vendor             product             version
              ...
hostname      vendor             product             version
              ...

Does anyone have any ideas?
Is there a way to use Splunk to find out if Wireshark is installed on any of the systems? Is there a query for this?
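If software-inventory data is already being indexed, a search along these lines could surface Wireshark installs. This is only a sketch: the index, sourcetype, and field names are assumptions and depend entirely on what inventory source (e.g. an endpoint or asset add-on) is feeding Splunk:

```
index=index1 sourcetype=sourcetype1 software{}="*wireshark*"
| stats values(software{}) AS installed_software BY host
```

Without an inventory data source, Splunk has nothing to search; collecting installed-software data from the endpoints is the prerequisite.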
Hi @mythili, good for you, see you next time! Ciao and happy splunking. Giuseppe. P.S.: Karma Points are appreciated by all the contributors.
For Splunk UF, it is a local hard drive For client application, it is a network drive
@srauhala_splunk Thanks for the response.

Q. "You could have multiple different strategies for this. All do however sound like they are host specific. 'process name' will be related to a host, and so does the garbage collection, right?"

A. That's not the scenario. One host could have multiple processes running on it. Sometimes even the same process runs on multiple hosts. In that case we have to use the combination of process and host as the entity. As there are multiple processes on one host, there are multiple GCs on the same host. We want to have a process-based service because one service could be dependent on another service.
What is the goal/target of the app?  If it's for internal use then you can ignore the failures.  If it's for Splunk Cloud then you must resolve the web.conf error.  Splunk Support may be able to help you with that, but be prepared for them to say what you want to do is not possible.
Use the frozenTimePeriodInSecs setting in indexes.conf to control how long data lives in the index. [myindex] frozenTimePeriodInSecs = 604800  
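As a quick check on the value above, 604800 seconds is exactly one week. Note that frozenTimePeriodInSecs applies per bucket: a bucket is frozen (deleted, by default) once its newest event exceeds the period, so removal can lag the exact one-week mark slightly.

```python
# One week expressed in seconds, matching frozenTimePeriodInSecs above
seconds_per_week = 7 * 24 * 60 * 60
print(seconds_per_week)  # 604800
```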
Hi @AKG11

* KPI which uses host as entity
* KPI "service Up" which basically checks that the service is up; in this case the entity is "process name"
* KPI for garbage collection, which also has a different entity

You could have multiple different strategies for this. They do, however, all sound host specific: "process name" will be related to a host, and so does the garbage collection, right?

I would make the KPI searches for "service Up" and "Garbage collection" expose the host for every result. Use the filter-to-entities in the service by host, and use the split-by on a pseudo-entity, for example process_name or garbage_collection_name. The result will be that all KPIs are filtered to the entity (host) of the service and split by different entities per KPI.

Note this will be a bit wonky if you have multiple hosts per service, for example host1, host2, host3. In that case, to be able to distinguish a process on one host from the same process on another, you would need to create a combination of host and process, etc., to split the data by, i.e.:

| eval my_process_entity = process_name."-".host

Filter the KPI by host, and split the KPI by my_process_entity.

If this is a widely used use case, also consider creating real entities and entity types for it, to be able to create entity dashboards.

Hope this gives some ideas!

Kind regards,
Seb
Hi @Alxender_smith,

I'm a Community Moderator in the Splunk Community. This question was posted 2 years ago, so it might not get the attention you need for your question to be answered. We recommend that you post a new question so that your issue can get the visibility it deserves. To increase your chances of getting help from the community, follow these guidelines in the Splunk Answers User Manual when creating your post.

Thank you!
@newguy2024 - Try username, instead of email. Check your username on Splunk.com.  
Hi, I wanted to import your add-on into the Add-on Builder so I can view the checkpoint logic and replicate it for my app, but I'm getting the following error when I try to import the project: "The add-on project could not be imported because a problem occurred while extracting project file." Is there another way to view the checkpoint logic? Thanks, Toma
Could someone provide a solution for this problem? Any assistance would be greatly appreciated.
We are currently indexing big log files (~1 GB in size) into our Splunk indexer using the Splunk Universal Forwarder. All the log data is stored in a single index. We want to make sure the log data is deleted one week after the date it was indexed. Is there a way to achieve this?
Hi @asncari

I just started going through the Splunk UI Toolkit and was able to resolve the issue with the following method: an update needs to be made in the webpack config.

My setup:
node -v: v21.7.1
npm -v: 10.5.0
yarn -v: 1.22.22

Do the following:

1. Install querystring-es3 and querystring using npm:

npm i querystring-es3
npm i querystring

2. Update the webpack.config.js file (MyTodoList\packages\react-to-do-list\webpack.config.js):

const path = require('path');
const { merge: webpackMerge } = require('webpack-merge');
const baseComponentConfig = require('@splunk/webpack-configs/component.config').default;

module.exports = webpackMerge(baseComponentConfig, {
    entry: {
        ReactToDoList: path.join(__dirname, 'src/ReactToDoList.jsx'),
    },
    output: {
        path: path.join(__dirname),
    },
    resolve: {
        fallback: {
            querystring: require.resolve('querystring-es3'),
        },
    },
});

3. Re-run the setup steps, let it build successfully, and re-link any modules. Then head into react-to-do-list and execute the yarn start:demo command. Your build should succeed, and if you navigate to localhost:8080 you should see the React app.

Note: I hit a socket error (ERR_SOCKET_BAD_PORT NaN) for port 8080. I updated the build.js file (/packages/react-to-do-list/bin/build.js) to enforce use of port 8080:

demo: () => shell.exec('.\\node_modules\\.bin\\webpack serve --config .\\demo\\webpack.standalone.config.js --port 8080'),

If the reply helps, karma would be appreciated.
Hi, I want to embed a dashboard in my own webpage. First, I found the "EDFS" app, but after installing it and following the steps, I didn't see the "EDFS" option in the input. When I include this in the HTML file, it responds with "refused to connect":

<iframe src="https://127.0.0.1:9999" seamless frameborder="no" scrolling="no" width="1200" height="2500"></iframe>

Also, if I add "trustedIP=127.0.0.1" in the server.conf file, then when I open Splunk Web using "127.0.0.1:8000" it shows an "Unauthorized" error. Additionally, I tried adding "x_frame_options_sameorigin = 0" and "enable_insecure_login = true" in the web.conf file, and including this in the HTML file:

<iframe src="http://splunkhost/account/insecurelogin?username=viewonlyuser&password=viewonly&return_to=/app/search/dashboardname" seamless frameborder="no" scrolling="no" width="1200" height="2500"></iframe>

It shows the Splunk Web login page with the error message "No cookie support detected. Check your browser configuration." If I try to log in with the username and password, it still doesn't work and shows a "Server error" message. If I use Firefox's private window to open the HTML file, it skips the login page and displays "Unauthorized." Is there a way to solve these issues, or an alternative method to display the dashboard on an external webpage? Thanks in advance.
Dear Karma,

We tried the suggested option. Can you please guide us on where to update the file, as we suspect the issue is the location where we are writing the regex. Currently, we have updated the windows folder on the deployment server and the /etc/system/local/ directory at the HF level.

Thanks,
Suraj