All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Does the AppDynamics Machine Agent support Windows 10? I can see the message "Machine agent started". Under Servers, I can see the processes running on my system, along with PIDs, for the machine where my Machine Agent is hosted. However, I am not getting %CPU, disk, or memory related metrics. When I try to access them from the Metric Browser, it says there is no data to display. Please suggest a fix.
Hello, I have this search:

index="report"
| stats count(Category__Names_of_Patches) as totalNumberOfPatches by Computer_Name
| eval exposure_level = case(
    totalNumberOfPatches >= 1 AND totalNumberOfPatches <= 5, "Low Exposure",
    totalNumberOfPatches >= 6 AND totalNumberOfPatches <= 9, "Medium Exposure",
    totalNumberOfPatches >= 10, "High Exposure",
    totalNumberOfPatches == 0, "Compliant",
    1=1, "<not reported>")

I want to create a pie chart with one slice per exposure_level and give each slice its own color. How can I do it? Thanks
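One way to get one slice per exposure level is to finish the search with a second `stats` that counts computers per level; slice colors can then be pinned per label with `charting.fieldColors` in Simple XML. A minimal sketch — the hex colors are arbitrary choices, and the option behavior should be verified on your Splunk version:

```
index="report"
| stats count(Category__Names_of_Patches) as totalNumberOfPatches by Computer_Name
| eval exposure_level = case(
    totalNumberOfPatches >= 10, "High Exposure",
    totalNumberOfPatches >= 6, "Medium Exposure",
    totalNumberOfPatches >= 1, "Low Exposure",
    totalNumberOfPatches == 0, "Compliant",
    1=1, "<not reported>")
| stats count as computers by exposure_level
```

Then, in the panel's XML:

```
<option name="charting.chart">pie</option>
<option name="charting.fieldColors">{"High Exposure": 0xDC4E41, "Medium Exposure": 0xF8BE34, "Low Exposure": 0x53A051, "Compliant": 0x006D9C}</option>
```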
Hello, I am a Masterschool student trying to install Splunk on my VM and it doesn't work. Can anyone help? Thank you.
How can I capture users aged 59+ accessing their accounts on a daily basis in AppDynamics? Can this be done using an information point, or is there another method to calculate and get this data?
I have installed the free version of Splunk Enterprise 9.1 on my local system, and I need a few log files from my S3 bucket sent to Splunk. I have set up the Splunk Add-on for AWS. In the app, under Configuration, I created an account with the access key ID and secret access key, then created an input specifying the account name, bucket name, and indexing details. After creating the input, when I search my index and sourcetype, I cannot find the logs from S3. I waited more than half an hour and tried again, but no luck. As this is the first time I am setting up the AWS add-on, I am not sure where the issue lies. Could anyone please help me with this?
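The AWS add-on logs its own input activity to the _internal index, which is usually the quickest way to see whether the S3 input is actually polling and what errors it hits. A hedged starting point — the exact log source names vary by add-on version:

```
index=_internal source=*splunk_ta_aws* (ERROR OR WARN)
| sort - _time
```

If nothing shows up at all, the input may never have started; if it shows credential or permission errors, the fix is on the AWS side (bucket policy / IAM).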
Hi, we have been informed about a high-severity vulnerability (CVE-2023-46214) impacting Splunk Enterprise (RCE through insecure XML parsing). Are we affected, given that we are on Splunk Cloud version 9.0.2303.201? Thanks.
Hi all, I have two multiselect dropdowns, where the second depends on the first. The first dropdown has groups and the second has subgroups. I am having trouble appending the subgroup value to its respective group. For example, suppose the groups are a, b, and c, and only c has subgroups x and y. I want to append the subgroups as c_x and c_y and pass them to the query. I tried adding a suffix in the dropdown itself, but when the tokens are selected in any order, the suffix is added to the whole token: if I select b, c, a, the subgroup is appended as b,c,a_x and b,c,a_y. Any suggestions on how to correctly append each subgroup to its respective group and use it in the query?
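In a multiselect, `prefix`/`suffix` wrap the whole token once, which is why the suffix lands only after the last selected value. One approach that sidesteps this is to build the combined group_subgroup string in the search that populates the second dropdown and use it as the option's value, so each selected option already carries its own group. A sketch with hypothetical field names (`group`, `subgroup`):

```
<input type="multiselect" token="subgroup_tok">
  <label>Sub Group</label>
  <fieldForLabel>subgroup</fieldForLabel>
  <fieldForValue>subgroup_value</fieldForValue>
  <search>
    <query>index=test group IN ($group_tok$)
| eval subgroup_value = group . "_" . subgroup
| stats count by subgroup subgroup_value</query>
  </search>
  <delimiter>,</delimiter>
</input>
```

Selecting x and y under c then yields the token value `c_x,c_y` regardless of selection order.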
25/10/2023: 6000
31/10/2023: 0
6/11/2023: 2500
6/11/2023: 500
12/11/2023: -7800
16/11/2023: 500

I have this table and I'm trying to create a line chart of the running total: it starts at 6000, runs as a flat line until it hits 6/11/2023, at which point it steps straight up (at 90 degrees) to 8500, and so on — stepping up at 90 degrees for positive values and down at 90 degrees for negative values, always keeping the running total. Thanks,
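The running total itself can be computed with `streamstats` once the dates are parsed and sorted. A sketch, assuming the two columns are extracted as fields named `date` and `amount`:

```
| eval _time = strptime(date, "%d/%m/%Y")
| sort 0 _time
| streamstats sum(amount) as running_total
| table _time running_total
```

Classic line charts interpolate diagonally between points, so for the 90-degree step look you may need to emit an extra point just before each change (same running total, new timestamp) or use a visualization that supports step rendering.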
Hi, I am fairly new to AppDynamics and a bit puzzled by some behaviour in Node.js transaction snapshots. Could anyone explain the following? An HTTP request comes into a Node.js application, which makes another HTTP request to an external service. All the calls are async and there is no specific correlation set up. I expect one outbound request for each inbound request; however, I sometimes see many outbound calls. Is this because AppD is just sampling the process at the time of the snapshot and showing all outbound calls occurring at that time? Many thanks, H
I've got a new deployment of 9.1.1, upgraded from a prior version (I can't remember which off the top of my head). I am running Windows 2019, if that is relevant. When I log in I get the following message:

Failed to upgrade KV Store to the latest version. KV Store is running an old version, service(36). Resolve upgrade errors and try to upgrade KV Store to the latest version again. Learn more. 11/20/2023, 12:04:48 PM

If I shut down splunkd and then run splunk.exe migrate migrate-kvstore -v, I get the following error:

[App Key Value Store migration] Starting migrate-kvstore.
Started standalone KVStore update, start_time="2023-11-20 12:00:29".
failed to add license to stack enterprise, err - stack already has this license, cannot add again
[App Key Value Store migration] Checking if migration is needed. Upgrade type 1. This can take up to 600 seconds.
2023-11-20T17:00:30.187Z W CONTROL [main] net.ssl.sslCipherConfig is deprecated. It will be removed in a future release.
2023-11-20T17:00:30.193Z F CONTROL [main] Failed global initialization: InvalidSSLConfiguration: CertAddCertificateContextToStore Failed The object or property already exists.
mongod exited abnormally (exit code 1, status: exited with code 1) - look at mongod.log to investigate.
KV Store process terminated abnormally (exit code 1, status exited with code 1). See mongod.log and splunkd.log for details.
WARN: [App Key Value Store migration] Service(40) terminated before the service availability check could complete. Exit code 1, waited for 0 seconds.
App Key Value Store migration failed, check the migration log for details. After you have addressed the cause of the service failure, run the migration again, otherwise App Key Value Store won't function.

No entries are ever posted to mongod.log. Just to verify, I cleared out the var/log/splunk directory; after moving the folder and re-running the command, the folders are regenerated, but the mongod.log file is never created.
Any advice on how to get the KV Store to migrate?
Has anyone been successful in logging command execution events on RedHat and having them sent to Splunk via rsyslog? The logs get written to tty, but they are not making their way to our HF. We can easily log all auditd and system events, but nothing for command execution.
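One common way to capture command executions on RHEL is audit rules on the execve syscall, with the audisp syslog plugin copying audit events into syslog so a normal rsyslog rule can forward them to the HF. A sketch — file paths and plugin locations differ between RHEL 7 and 8+, so treat these as assumptions to verify:

```
# /etc/audit/rules.d/cmdlog.rules -- record every program execution
-a always,exit -F arch=b64 -S execve -k cmdlog
-a always,exit -F arch=b32 -S execve -k cmdlog

# /etc/audit/plugins.d/syslog.conf -- copy audit events to syslog
# (on older RHEL this file lives under /etc/audisp/plugins.d/)
active = yes
```

After `augenrules --load` and an auditd restart, EXECVE records should appear in syslog, where the existing rsyslog forwarding to the HF picks them up.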
Hello, why does a long base search not work for a dropdown list? For example, when the base search with id="StudentName" is long enough to trigger "Request-URI Too Long", the dropdown does not populate, but the same base search works just fine for a pie chart. Please help. Thank you so much.

<search id="StudentName">
  <query>index=test</query>
  <earliest>$time_token.earliest$</earliest>
  <latest>$time_token.latest$</latest>
</search>
<input type="dropdown" token="StudentTok">
  <label>Student Name</label>
  <fieldForLabel>studentname</fieldForLabel>
  <fieldForValue>studentname</fieldForValue>
  <search base="StudentName">
    <query>| head 10</query>
  </search>
</input>
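A non-transforming base search hauls raw events (and their full field set) into every post-process consumer, which is heavier than what a chart's server-side transformation needs and is a common culprit when dropdown population fails on large bases. One commonly suggested workaround, sketched here under that assumption, is to keep the base search lean and make the dropdown's post-process transforming:

```
<search id="StudentName">
  <query>index=test | fields studentname</query>
  <earliest>$time_token.earliest$</earliest>
  <latest>$time_token.latest$</latest>
</search>
...
<search base="StudentName">
  <query>| stats count by studentname | head 10</query>
</search>
```

If the query string itself is what exceeds the URI limit, shortening it via a macro or saved search is another route.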
How do I count the number of unique recipients of each unique email attachment? The same user could receive the same attachment in multiple emails. Should I use the dedup command?
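dedup isn't needed here: `stats dc()` counts each distinct recipient once per attachment, no matter how many emails carried it. A sketch with hypothetical index and field names (`recipient`, `attachment_name`):

```
index=mail
| stats dc(recipient) as unique_recipients by attachment_name
```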
How do I count the number of emails from a search, but only include recipients that received ten or more emails?
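Counting per recipient and then filtering with `where` handles the threshold. A sketch, again with hypothetical index and field names:

```
index=mail
| stats count as emails by recipient
| where emails >= 10
```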
Hi, I am using an external lookup to run a Python script that makes an API call and writes the results to sys.stdout via csv.DictWriter. Around 1,250 rows are written to the console, but only the first 100 rows are shown in Splunk. How can I raise this 100-row limit on external lookups? Thank you and have a nice day. Best,
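The 100-row cutoff matches the default of `max_matches` for lookup stanzas (100 when no time field is configured), so raising it in the lookup's transforms.conf stanza is the usual fix. A sketch — the stanza name, script name, and fields below are placeholders:

```
# transforms.conf
[my_external_lookup]
external_cmd = my_lookup.py input_field output_field
fields_list = input_field, output_field
max_matches = 2000
```

A restart (or debug/refresh) is needed for the change to take effect.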
Hi all, I have to ingest logs from securelog. I am able to collect and parse the Linux logs, but I have an issue parsing the Windows logs: how can I connect the winlogbeat format to Splunk_TA_Windows so events are parsed correctly? The winlogbeat event format differs from normal Windows logs, so Splunk_TA_Windows fails to parse them. Is there a connector, or must I manually transform the winlogbeat logs into the format Splunk_TA_Windows expects? Thank you for your help. Ciao, Giuseppe
Hi all, I am trying to write a search that compares two different sources. First, I created a lookup to catch rules hitting my search. In the background, my alert runs and appends results to this CSV lookup file. The lookup file also has a field called Explanation. What I am trying to build is a search that updates a row whenever anything changes in the raw data. There is an important point, however: if the raw data behind a lookup row has not changed, the row should keep its Explanation value unchanged; otherwise, the old row should be deleted. Thank you
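One way to get this behavior is to rebuild the lookup from the fresh alert results on every run, carrying the old Explanation over by key: rows still present in the raw data keep their note, and rows that disappeared are simply not written back. A sketch, assuming the alert search produces one row per key field `rule_id` and the file is `my_rules.csv` (both hypothetical names):

```
<your alert search, producing one row per rule_id>
| lookup my_rules.csv rule_id OUTPUT Explanation
| fillnull value="" Explanation
| outputlookup my_rules.csv
```

Because `outputlookup` replaces the file, stale rows are dropped automatically, while the `lookup` step preserves Explanation for rows that survive.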
Hi, I have this method: public ActionResult MethodEZ(EZDokumentJson dokument), which returns this JsonResult:

{
  "Data1": "",
  "Data2": null,
  "Data3": null,
  "Data4": null,
  "DokumentId": "dvsd-5dsafd-55555-1111-afdfas"
}

I would like to ask for help collecting the DokumentId from the JsonResult. Here are my last attempts at a data-collector getter chain, which don't work:

ToString().Split(string/"DokumentId": ).[1].Split(string/,).[0]
toString().split("DokumentId").[1].split(\,).[0]

Thanks.
Dear team, I installed Enterprise Security on the search head and downloaded Splunk_TA_ForIndexer from the ES general settings. Now I am stuck on the UF technology add-on: where can I find it? There is no option in the ES interface, and I can't find it on the Splunkbase portal. I tried multiple search keywords on Splunkbase with no luck.