All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello, We identify a failed request by gathering data from 3 different logs. I need to group by userSesnId, and if these specific logs appear in my list, it defines a certain failure. I would like to count each failure using these logs. I would greatly appreciate your help with writing this search query. I hope this makes sense. Thank you. I would like to use the information from these logs, grouped by userSesnId:

Log #1:
msgDtlTxt: [Qxxx] - the quote is declined.
msgTxt: quote creation failed.
polNbr: Qxxx

Log #2:
httpStatusCd: 400

Log #3:
msgTxt: Request.

They all share the same userSesnId: 10e30ad92e844d

So my results should look something like this:

polNbr | msgDtlTxt         | msgTxt                | httpStatusCd | count
Qxxx   | Validation: UWing | quote creation failed | 400          | 1
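A hedged sketch of one way to approach this, using the field names from the logs above (the index name is an assumption, and the match strings may need adjusting to your actual events):

```
index=your_index userSesnId=*
| stats values(polNbr) AS polNbr,
        values(msgDtlTxt) AS msgDtlTxt,
        values(msgTxt) AS msgTxt,
        values(httpStatusCd) AS httpStatusCd
        by userSesnId
| search msgTxt="quote creation failed" httpStatusCd=400
| stats count by polNbr, msgDtlTxt, msgTxt, httpStatusCd
```

The first stats collapses all three logs of a session into one row keyed by userSesnId; the search then keeps only sessions that contain both failure signals, and the final stats counts the distinct failure combinations.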
I'm having the same requirement.  Maybe one way would be to run it from an external program using API calls to kick off the searches and wait for them to complete?
Hi @Denise.Perrotta, Thanks for asking your question on the Community. Were you able to find new information or a solution you could share here as a reply? If you still need help, you can contact AppDynamics Support: How to contact AppDynamics Support and manage existing cases with Cisco Support Case Manager (SCM) 
Hi @Mukesh.Prasad, Since this post is over a year old, it may not get a reply. Since it's been a few days, did you happen to find a solution yourself you can share? If you still need help, you can contact AppD Support: How to contact AppDynamics Support and manage existing cases with Cisco Support Case Manager (SCM) 
Hi @ranaveer.mariyagu, While this is an older post, it still may be true as it's the same error. https://community.appdynamics.com/t5/Java-Java-Agent-Installation-JVM-and-Controller-Installation/DB-Collector-error-ORA-00942-table-or-view-does-not-exist/m-p/32353 They reference an AppD Docs page which is this one: https://docs.appdynamics.com/appd/24.x/24.10/en/database-visibility/add-database-collectors/configure-oracle-collectors
try this:

index=_audit action=search is_realtime=1
| eval search_type=case(
    search_id LIKE "scheduler%", "Scheduled Search",
    search_id LIKE "rt_scheduler%", "Real-Time Scheduled Search",
    search_id LIKE "dashboard%", "Dashboard",
    search_id LIKE "adhoc%", "Ad-hoc Search",
    1=1, "Ad-hoc Search")
| eval human_readable_time=strftime(_time, "%Y-%m-%d %H:%M:%S")
| stats count by user, search_type, human_readable_time
| rename human_readable_time AS "Time", user AS "User", search_type AS "Search Type", count AS "Search Count"
| sort - "Time"
Using join in the query is impacting performance. What is an alternative way to write the above query without using join?
Aside from the obvious (replace the join command), what do you mean by "optimize"?  What is the problem and what are the goals?  What does the data look like and what should the results be?  How many events are being processed? The join command appears to add no value.  It matches (only) the request_correlation_id field from the subsearch to the same field from the main search.  No other fields from the subsearch are included so why bother with the join?
If I execute the query below for a selected time range of about 20 hours, it takes a long time, and roughly 272,000 events are being processed. How can I simplify this query so it returns results in 15 to 20 seconds?

index=asvservices authenticateByRedirectFinish (*)
| join request_correlation_id
    [ search index=asvservices stepup_validate ("isMatchFound\\\":true")
      | spath "policy_metadata_policy_name"
      | search "policy_metadata_policy_name"=stepup_validate
      | fields "request_correlation_id" ]
| spath "metadata_endpoint_service_name"
| spath "protocol_response_detail"
| search "metadata_endpoint_service_name"=authenticateByRedirectFinish
| rename "protocol_response_detail" as response
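A hedged sketch of a join-free rewrite, assuming both event types live in index=asvservices and share request_correlation_id (field names are taken from the query above; the exact filter strings may need adjusting):

```
index=asvservices (authenticateByRedirectFinish OR (stepup_validate "isMatchFound\\\":true"))
| spath "policy_metadata_policy_name"
| spath "metadata_endpoint_service_name"
| spath "protocol_response_detail"
| eval is_validated=if('policy_metadata_policy_name'="stepup_validate", 1, 0)
| stats max(is_validated) AS validated,
        values(protocol_response_detail) AS response
        by request_correlation_id
| where validated=1
```

Pulling both event types in one base search and correlating with stats avoids the subsearch limits and the per-row matching cost of join, which is usually the biggest win on searches of this size.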
A few servers are hosted in a private VPC that is not connected to the organisation's IT network. How can we onboard those Linux hosts?
Can you also do conditional formatting for a line or column chart, or just a table? I have a column chart where I would like to highlight one of the columns (series) based on value. I don't think Dashboard Studio allows this... IIRC the classic dashboard does allow it. Unfortunately this is a major drawback to using Dashboard Studio :(. I hope they fix it.
Okay, in this case, please share the search that renders the datamodel. Perhaps you can do the replace there to make sure there are no double quotes returned on Climate.air_temp field.
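A hedged sketch of such a replace, assuming the stray characters are literal double quotes in the field value (the surrounding datamodel search is not shown, so only the relevant pipe stage is sketched here):

```
| eval air_temp=replace('Climate.air_temp', "\"", "")
```

Single quotes around 'Climate.air_temp' are needed because the field name contains a dot; the escaped "\"" matches a literal double-quote character.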
Thank you so much @PickleRick, it's very useful information.
For forwarding Fortinet logs to Splunk we also have to specify the forwarding port, and there is no option to set the port in the GUI. We can use the following CLI commands to add the Splunk server IP with a custom forwarding port:

config log syslogd2 setting
    set status enable
    set server 10.10.10.10
    set port 2222
end

Use the example above to forward traffic to port 2222. Regards,
Hi @PickleRick, Thank you for your response. I checked the developer tools, and it did show extra spaces in the browser. PC support asked me: "if this is a browser issue, then why did the issue also occur in a different browser?" So, is there no chance it's a Splunk profile issue? I already cleared my cache and it's the same issue. The next step is that I will test it on a different PC, used by a Splunk user that doesn't have this issue.
OK. So you're not "upgrading" but moving your installation from a RHEL 6.10 box to a 9.x one. The way to go is:

1) Install and configure the new server.

2) Migrate Splunk in the same version as you already have. See https://docs.splunk.com/Documentation/Splunk/8.2.0/Installation/MigrateaSplunkinstance One thing - if your installation was RPM-based, then before moving the contents of the old Splunk installation, install a fresh RPM (still the original 8.2 version!) before overwriting it with the actual production files, so that the RPM database is properly populated. Unfortunately, if you're moving from one server to another (with a new name, new IP and so on) you have to go through the configs and fix the things that point to the old name, IP and possibly certificate. And we can't know beforehand all the configuration items you have to change.

3) After you have your 8.2 instance working in the new place, perform the upgrade to the desired version using the official upgrade path (you can't upgrade directly from 8.2 to 9.3).
Hi Thanks for answering. I tried the search you provided with no luck.  
@PickleRick  Thank you for the input!
Hello ITWhiperer,  It is a typo, I just edited the original post. Thank you