All Posts



Sorry, mate, but the level of completeness of your description is comparable to "ok, I replaced the flat tyre but now I cannot put the car in gear - can it be a problem with the battery?". We have no idea what your setup looks like, what hosts you have, or what configs. How can we know what's wrong?
As luck would have it, this works:

| inputlookup servers | dedup host | sort host | table host host

However, this does not:

| inputlookup servers where environment = $token$ | dedup host | sort host | table host host

If I replace $token$ w/ "DEV" (which is what is in the table) it works. I know the $token$ has the value - out of scope maybe?
I have several apps set up to segregate our various products. I've added icons to the apps. My issue is that the icon is being placed over the app name; it should be placed next to the app name. For example, the Search and Reporting app has the white arrow on a green background to the left of the app name. How do I get the icon to be placed to the left of the app name?
The app itself seems to be downloadable and usable on on-prem Splunk Enterprise, but it sends data to an offsite service which does the AI processing work.

Src: https://docs.splunk.com/Documentation/AIAssistant/1.0.3/User/AboutAIAssistant

"Where Splunk AI Assistant for SPL runs: Splunk AI Assistant for SPL runs as a separate component of Splunk Cloud Platform (SCP) which is not metered like searches are against data indexed by Splunk. For version 1.0.0 and higher the SPL generated by the assistant requires a separate step to Open in Search. Searches executed in the Search app work like any other Splunk search, and consume SVC resources accordingly. Splunk AI Assistant for SPL runs on AI Service, a multi-tenant, cloud service, hosted in Splunk Cloud Platform. This AI Service makes GPUs available for generating responses to customer prompts. All the AI compute is offloaded to AI Service and no AI compute is running on the customer's search head."

I don't know whether Splunk plans to release a fully on-prem version. Splunk used to provide a preview version which was fully on-prem, but that program is no longer available. https://www.splunk.com/en_us/blog/platform/flatten-the-spl-learning-curve-introducing-splunk-ai-assistant-for-spl.html
Is there a question here?
One thing to consider is that if/when your HEC receiver crashes, you will lose those events unless you have configured indexer acknowledgment and your HEC sender/client has implemented it! The first part is an easy step, but the second part isn't! Also, when you are using a load balancer in front of multiple HEC nodes, you will get some duplicate events from time to time.
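To make that failure mode concrete, here is a minimal Python sketch; this is not Splunk code, and the receiver and sender classes are hypothetical stand-ins. It shows why at-least-once retry duplicates events when the client cannot see acknowledgments: the receiver persists the event, crashes before replying, and the client resends the same event.

```python
# Illustrative only: why retrying without acknowledgment visibility
# can duplicate events at the receiver.

class FlakyReceiver:
    def __init__(self, fail_first_n_acks):
        self.stored = []              # events actually persisted
        self.fail_acks = fail_first_n_acks

    def receive(self, event):
        self.stored.append(event)     # persisted BEFORE the ack goes out
        if self.fail_acks > 0:
            self.fail_acks -= 1
            raise ConnectionError("crashed before acknowledging")

def send_with_retries(receiver, event, max_attempts=3):
    """At-least-once delivery: retry until the receiver acknowledges."""
    for _ in range(max_attempts):
        try:
            receiver.receive(event)
            return True               # acknowledged
        except ConnectionError:
            continue                  # no ack seen, so resend the same event
    return False

receiver = FlakyReceiver(fail_first_n_acks=1)
send_with_retries(receiver, {"event": "login"})
print(len(receiver.stored))  # 2 -- the same event was persisted twice
```

The dedup has to happen somewhere: either end-to-end acknowledgment with idempotent delivery on the sender side, or accepting occasional duplicates downstream.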
Under Settings, there is an option to manage Lookups; it is there that you will find Lookup definitions - add a new one specifying the CSV lookup file you want to define.
As a test, I first created some credit card numbers using a Python script. I placed the script, along with inputs and props, on the search head. I only placed props on the indexers. The following SEDCMD will mask the 1st and 3rd sets of 4 digits. The two capture groups (2nd and 4th sets of 4 digits) will not be masked.

props.conf:

[cc_generator]
SEDCMD-maskcc = s/\d{4}-(\d{4})-\d{4}-(\d{4})/xxxx-\1-xxxx-\2/g

inputs.conf:

[script://./bin/my_cc_generator.py]
interval = */30 * * * *
sourcetype = cc_generator
disabled = 0
index = mypython

Output: xxxx-9874-xxxx-9484
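The SEDCMD above uses sed substitution syntax; the same regex can be sanity-checked offline with Python's re module (the card number below is made up). Groups 1 and 2 capture the 2nd and 4th sets of digits and are kept; the uncaptured sets are replaced:

```python
import re

# Same substitution as the SEDCMD: keep the captured groups, mask the rest.
pattern = r"\d{4}-(\d{4})-\d{4}-(\d{4})"
masked = re.sub(pattern, r"xxxx-\1-xxxx-\2", "4111-9874-5678-9484")
print(masked)  # xxxx-9874-xxxx-9484
```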
Hi @ITWhisperer - yes, you are correct, that field is populated with subnet values. The lookup file is like this:

cidr        provider  area     zone  region
1.1.1.1/24  Unit 1    Finance  2     US
2.2.2.2/27  Unit 2    HR       16    UK

I am unsure how to go about creating a lookup definition with the advanced setting for match type CIDR(cidr)?
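For intuition about what the CIDR(cidr) match type does conceptually, here is an illustrative Python sketch using the standard ipaddress module; the rows mirror the lookup above. Note strict=False is needed because entries like 1.1.1.1/24 have host bits set:

```python
import ipaddress

# Rows as in the lookup file above (abbreviated).
rows = [
    {"cidr": "1.1.1.1/24", "provider": "Unit 1", "area": "Finance"},
    {"cidr": "2.2.2.2/27", "provider": "Unit 2", "area": "HR"},
]

def cidr_lookup(ip, rows):
    """Return the first row whose subnet contains the IP, else None."""
    addr = ipaddress.ip_address(ip)
    for row in rows:
        # strict=False normalizes e.g. 1.1.1.1/24 to the network 1.1.1.0/24
        if addr in ipaddress.ip_network(row["cidr"], strict=False):
            return row
    return None

print(cidr_lookup("1.1.1.42", rows)["area"])  # Finance
```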
Hi, Can you try using service.request.count as your signal (filter by sf_error:true and any other relevant filters) and see if that works?  
Yes, you can create searches using the REST API in Splunk Cloud. Here are the basic steps:

1. Get a session key: authenticate with Splunk to get a session key.
2. Create a search job: use the /services/search/jobs endpoint to create a search job. You'll need to send a POST request with your search query in the body.
3. Check search status: use the search ID (sid) returned from the previous step to check the status of your search job.

Here's a simple example using curl:

curl -k -u username:password https://<splunk-cloud-url>/services/search/jobs -d search="search index=_internal | head 10"

This command will create a search job that retrieves the first 10 events from the _internal index.
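As a rough offline illustration of the same request, here is a Python sketch. Nothing is sent: the URL is the same placeholder as in the curl example, and the XML is an abbreviated, made-up stand-in for a typical job-creation response, from which the sid is extracted:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

# The body the curl command POSTs (form-encoded search string).
base_url = "https://<splunk-cloud-url>"   # placeholder, as in the curl example
endpoint = base_url + "/services/search/jobs"
body = urlencode({"search": "search index=_internal | head 10"})

# Abbreviated, hypothetical job-creation response; parse out the sid,
# which later requests use to poll /services/search/jobs/<sid>.
sample_response = "<response><sid>1700000000.123</sid></response>"
sid = ET.fromstring(sample_response).findtext("sid")
print(sid)  # 1700000000.123
```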
Yes, it is.
I am trying to remove the year from the time labels on the area chart without messing up the chart's format. I've tried fieldformat, but that would mess up the chart when the new year happens. Any help would be great.
| eval description=if('app'=="linux", host. "-" .alert_type',  'app'==windows, host. "-" .severity, "false")

You didn't nest the second IF statement:

| eval description=if('app'=="linux", host."-".alert_type, if('app'=="windows", host."-".severity, "false"))
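The nesting requirement may be easier to see outside SPL; here is a Python sketch of the same branching, with field names taken from the question. The second condition has to live in the first if's "else", exactly like the nested if() above:

```python
# Same logic as the nested eval if(): two conditions, one fallback.
def description(app, host, alert_type, severity):
    if app == "linux":
        return host + "-" + alert_type
    elif app == "windows":
        return host + "-" + severity
    else:
        return "false"

print(description("linux", "web01", "cpu_high", "critical"))  # web01-cpu_high
```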
I am trying to create a new field called "description" that contains values from two other existing fields.

If field "app" is equal to linux, then I want to combine existing fields "host" and "alert_type".
If field "app" is equal to windows, then I want to combine existing field values "host" and "severity".
If app equals anything else, I want the value to be false.

Below is the eval I have, but it's not working:

| eval description=if('app'=="linux", host. "-" .alert_type', 'app'==windows, host. "-" .severity, "false")
Hi Ryan, Unfortunately, the uninstall-smart-agent instructions did not work.  I need to remove the dead/inactive Smart Agent from the controller Agent Management--->Agents--->Smart Agents section.  Thanks!
Hi, can anybody help with this problem, please?

source1: lookup tab (lookup.csv)

att1  att2  att3
F1    1100  12.09.2024
F2    1100  23.04.2024
F3    1100  15.06.2024
F4    1100  16.03.2024

att1 is also in index=myindex. I want to have, in a table for all att1 from lookup.csv, the count of all events from

index=myindex att1=$att1$ AND earliest=strptime($att3$, "%d.%m.%Y")

output:

att1  count(from myindex)  att2  att3
F1    count                1100  12.09.2024
F2    count                1100  23.04.2024
F3    count                1100  15.06.2024
F4    count                1100  16.03.2024
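Outside SPL, the desired result can be sketched as follows. This is only an illustration of the computation with made-up events, not a Splunk answer: for each att1 in the lookup, count the index events with that att1 occurring on or after the row's att3 date.

```python
from datetime import datetime

# Lookup rows as in lookup.csv (abbreviated).
lookup = [
    {"att1": "F1", "att2": "1100", "att3": "12.09.2024"},
    {"att1": "F2", "att2": "1100", "att3": "23.04.2024"},
]

# Hypothetical events from index=myindex.
events = [
    {"att1": "F1", "time": datetime(2024, 9, 15)},
    {"att1": "F1", "time": datetime(2024, 9, 1)},   # before att3: excluded
    {"att1": "F2", "time": datetime(2024, 5, 1)},
]

def counts(lookup, events):
    out = []
    for row in lookup:
        earliest = datetime.strptime(row["att3"], "%d.%m.%Y")
        n = sum(1 for e in events
                if e["att1"] == row["att1"] and e["time"] >= earliest)
        out.append({**row, "count": n})
    return out

for r in counts(lookup, events):
    print(r["att1"], r["count"])
```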
@shalomsuresh These solutions don't work if there are more than 3 with the same score, e.g. if "f" had a score of 73 as well.
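One way to see the tie problem concretely: a Python sketch with hypothetical scores that keeps every item tied at the cutoff, instead of cutting off at an arbitrary three rows.

```python
# Hypothetical scores; "c" and "f" are tied at 73.
scores = {"a": 95, "b": 88, "c": 73, "d": 60, "e": 51, "f": 73}

def top_n_with_ties(scores, n=3):
    """Keep every item whose score is among the top n DISTINCT scores."""
    top_scores = sorted(set(scores.values()), reverse=True)[:n]
    return {k: v for k, v in scores.items() if v in top_scores}

print(top_n_with_ties(scores))  # includes both "c" and "f"
```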
Hello @Kathryn.Green, I was told you should be having a conversation soon with AppDynamics about your questions here, as they have reached out privately.
Is there a way to get Service Endpoint values (response time, load, errors) into Analytics so it can be queried? I have multiple custom service endpoints that are looking at the performance of api calls from a specific customer.  They are calls like createCart and placeOrder etc. Is there a way for me to get the values like load, response time, and error counts for these service endpoints, in Analytics? I know I can get those metrics for business transactions, but these service endpoints are subsets within the BTs.  I don't want to have to create a custom BT for each of these custom service endpoints if I can avoid that. Thanks, Greg