All Posts


Hello @Lidiane.Wiesner, @Terence.Chen had some follow-up questions to help find a solution to your issue. If this is still a problem, reply with that information to keep the conversation going.
I am using Splunk Enterprise version 9.2.1 and installed IT Essentials Learn, but I am getting an error fetching use case families. Is ITSI a prerequisite for ITE? I installed the app using the GUI.
Joining the two searches would require some common field to join on. Since none exists in your example, you'll need to either add an identifier to all related logs at the source, or get creative with a solution based on time, which could get finicky.

For example, in your sample data you have 3 events. The transaction ID in event 1 occurs 2 seconds before the error log. If there can be more than one concurrent transaction, then there doesn't appear to be a way to be certain that the correct transaction ID will be found that corresponds to the error. e.g.:

240614 04:35:50 Algorithm: Al10: <=== Recv'd TRN: AAA (TQ_HOST -> TQ_HOST)
240614 04:35:51 Algorithm: Al10: <=== Recv'd TRN: BBB (TQ_HOST -> TQ_HOST)
240614 04:35:52 Algorithm: TSXXX hs_handle_base_rqst_msg: Error Executing CompareRBSrules Procedure.
240614 04:35:52 Algorithm: TSXXX hs_handle_base_rqst_msg: Details of ABC error ReSubResult:-1,FinalStatus:H,ErrorCode:-1,chLogMsg:SQL CODE IS -1 AND SQLERRM IS ORA-00001: unique constraint (INSTANCE.IDX_TS_UAT_ABC_ROW_ID) violated,LogDiscription:

In this case, does the error belong to transaction AAA or BBB?

The second issue is how much time can elapse between the "Recv'd TRN" log and any possible error. Without a field linking these logs, you'll have to use some fixed time range to try to bring the logs together. Too short, and you'll fail to find the transaction ID; too long, and you might find multiple IDs (leading to the issue mentioned above).

IF you can assume that logs are synchronous, and there is no interleaving of transactions, then something like this should work:

index=test_index source=/test/instance
| sort _time
| rex field=_raw "<=== Recv'd TRN:\s+(?<transaction_id>\w+)"
| eval failure=if(like(_raw, "%ORA-00001%"), 1, 0)
| filldown transaction_id
| where failure=1
| table transaction_id, failure, _raw
Adding a wildcard to a 1000+ row lookup table was a pain, but that seems to resolve the issue I was having. It's a good lesson as well. Thank you and everyone for your recommendations!!
In fact, https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/custominputs/ states: "In a distributed deployment, the location where a user installs a custom data input depends on their Splunk Cloud Platform Experience (Classic or Victoria). In Classic Experience, custom data inputs run on the Inputs Data Manager (IDM). If you deploy an app with a custom data input to the search head or indexer, the input does not run on these components. In Victoria Experience, custom data inputs run on the search head and don't require the IDM."
Thank you! This pointed me in the right direction! It turned out that the issue was that the token was somehow picking up the nat_source_address field as well.    
Thank you very much for your comment and for sharing the source code! It has helped me out. I am not very well versed in XML, HTML, web design, etc., but this is bringing back some memories and I'm starting to get more accustomed to it again. Ken
Thanks @gcusello for your response. From the doc, I read that for the Splunk Classic experience, it is recommended to install the TA on the IDM, whereas for Splunk Victoria, it is recommended to install the TA on the search head. I like the second approach; I might as well strip the KV store logic out of the TA and place it in an app, so that whether it is on-prem or cloud, there shouldn't be an issue updating KV store data, since the app is installed on the search head and that would take care of updating the KV store. Does this sound reasonable?
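For reference, once the collection lives in an app on the search head, it can be updated through the KV store REST endpoints there rather than from inside the TA. A minimal sketch (the hostname, credentials, app name "my_app", and collection name "my_collection" are all placeholders, not values from this thread):

```shell
# Insert a record into a KV store collection hosted on the search head.
# App name, collection name, host, and credentials below are hypothetical.
curl -k -u admin:changeme \
    "https://sh.example.com:8089/servicesNS/nobody/my_app/storage/collections/data/my_collection" \
    -H "Content-Type: application/json" \
    -d '{"field1": "value1"}'
```

The same endpoint supports GET to read records and DELETE to clear them, which keeps all KV store maintenance on the search head regardless of Classic vs. Victoria.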
Your data that's already ingested needs to be made CIM compliant. It might be worth spending some time getting your head around the CIM concepts; after this you can look at developing correlation rules.

https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Complying_with_the_Splunk_Common_Information_model
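As a rough illustration of what CIM compliance looks like in configuration (the sourcetype and field names below are hypothetical, and the target data model is assumed to be Authentication), the work is mostly aliasing your raw fields to CIM field names and tagging the events:

```
# props.conf -- map raw field names to CIM names (hypothetical sourcetype)
[my:custom:sourcetype]
FIELDALIAS-cim_user = userName AS user
FIELDALIAS-cim_src  = sourceIp AS src
EVAL-action = if(result=="ok", "success", "failure")

# eventtypes.conf -- define an eventtype covering these events
[my_custom_authentication]
search = sourcetype=my:custom:sourcetype

# tags.conf -- tag the eventtype so the Authentication data model picks it up
[eventtype=my_custom_authentication]
authentication = enabled
```

Once tagged, the events should appear in the corresponding data model, which is what the out-of-the-box correlation searches run against.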
+1! « $Request failed with the following error code: 400 » Tx!
This did not work. Thank you.
Hi, basically I had an HTML button in a panel, next to a text box. When clicked, the button was supposed to run a search which added the text entered in the panel to a lookup. The problem is, the only way I could get the button to do this was to use the full URL of the search, opening in a new tab. When I tried the JavaScript approach, I was getting messages about running potentially unsafe scripts, but the original method worked so I stuck with that. I just want to know if it's possible to use an HTML button to run a search without opening a new tab. I have tried various ways but haven't had any success. Thanks
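For what it's worth, the usual way to do this without a new tab is SplunkJS: a SearchManager with autostart disabled, started from the button's click handler. A minimal sketch, assuming a dashboard-level JS file, an HTML button with id "run-btn", and a text-box token — the token name and lookup file here are hypothetical:

```javascript
require([
    "splunkjs/mvc",
    "splunkjs/mvc/searchmanager",
    "splunkjs/mvc/simplexml/ready!"
], function (mvc, SearchManager) {
    // Search that appends the text-box value to a lookup; runs only on demand
    var updateSearch = new SearchManager({
        id: "lookup-update-search",
        search: mvc.tokenSafe(
            "| makeresults | eval value=$my_token|s$ | outputlookup append=true my_lookup.csv"
        ),
        autostart: false
    });

    // Wire the HTML button to kick off the search in place, no new tab
    document.getElementById("run-btn").addEventListener("click", function () {
        updateSearch.startSearch();
    });
});
```

Because the search runs inside the dashboard's own page, nothing is opened or displayed unless you also bind a results view to the manager.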
Holy Cow! Per that document, I tried enabling: mc_auto_config = enabled ...and it removed all my indexers from the cluster. Good times. I think I'll just learn to live without those volume dashbo... See more...
Holy Cow! Per that document, I tried enabling: mc_auto_config = enabled ...and it removed all my indexers from the cluster. Good times. I think I'll just learn to live without those volume dashboards, wouldn't be the first time I had to ignore missing functionality. Pro tip:  don't bother taking the advanced troubleshooting class at Splunk.conf -- didn't prepare me for anything useful...
Hi there, I am trying to get some data from MS Defender into a Splunk query. My original KQL query in Azure contains "| join kind=inner" to join the DeviceProcess and DeviceRegistry tables. The Splunk app I am using: https://splunkbase.splunk.com/app/5518. So basically I'd like to join DeviceProcess and DeviceRegistry events from an advanced hunting query (| advhunt) in Splunk SPL. Is there a suitable Splunk query for this kind of purpose?
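In SPL, the closest analogue to KQL's "join kind=inner" is the join command with a subsearch. A rough sketch, assuming the advhunt command can return each table separately and that the events share a DeviceId field — both the query arguments and the join field here are assumptions, not verified against the app:

```
| advhunt query="DeviceProcessEvents"
| join type=inner DeviceId
    [| advhunt query="DeviceRegistryEvents"]
```

Note that SPL's join has subsearch result limits, so for large result sets a stats-based correlation (append the two searches, then "stats values(*) by DeviceId") is often the more robust pattern.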
@dtburrows3 Would you be able to share your code or a snippet of the relevant function calls? I am trying to create a similar expansion command but have not yet been able to locate the appropriate functions to use in the Splunk Python SDK.
What sort of button are you using? Do you mean the Submit button? Please provide more details of what you tried and how it failed.
Hi, sorry for the lack of reply. I'm afraid it didn't work for me; I ended up having to use the button to open a search in a new tab. Is there actually a way to run a search from a button without displaying the results?
Ah, gotcha. Yes, it's configured, setup is correct, server roles are set, and I use it often for various things. I can see data in pretty much every other dashboard, and even "Index Detail: Deployment" *does* show some volume information, as does "Indexes and Volumes: Deployment". When "opening in search" pretty much any panel in the monitoring console, I can see the query and the macro or REST call it uses, but in these volume detail pages they are all "undefined". The info reported on the individual indexes is correct; I use it to trim and set limits on various indexes. In "Volume Detail: Instance" I do see each indexer populated in the dropdown, but the Volume (token) is empty. To recap: all my dashboards in the Monitoring Console on my management server have data except for Volume Detail: Instance and Volume Detail: Deployment. Honestly, if the management server/console is NOT configured correctly, then I'm not sure what to fix. I know this doesn't always mean a Good Thing, but I have been using Splunk since v3.x.
Hey, your message made me realize that some information is missing from my question. The IBM i data is in JSON format and is ingested via the HTTP Event Collector. I didn't understand what I had to do to make Splunk understand the data and perform the correlation.
Hello @Maxime,

By default, Splunk tries to parse the data ingested from whatever log source has been onboarded. However, there's no guarantee that Splunk will be able to understand the log source completely and provide you with the fields. There are lots of apps and add-ons available on Splunkbase for exactly this purpose (to collect and parse the data). However, if you do not find an associated app/add-on, you can write the sourcetype configuration as per your requirements, and you should then be able to get the necessary fields.

Also, if the data is generated in a structured format (JSON, XML, CSV, etc.), Splunk has parsing for those by default. In that case, you'll be able to visualize the data directly. You can find the relevant documentation links below:

- https://docs.splunk.com/Documentation/Splunk/latest/Data/WhatSplunkdoeswithyourdata
- https://docs.splunk.com/Documentation/Splunk/latest/Data/Overviewofeventprocessing
- https://docs.splunk.com/Documentation/Splunk/latest/Data/Createsourcetypes
- https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Configuring_new_source_types

Thanks, Tejas.

--- If the above solution helps, an upvote is appreciated.
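Following up on the sourcetype suggestion: for JSON events arriving via HEC, a minimal sketch of the search-time configuration might look like the following (the sourcetype name "ibmi:json" is hypothetical, chosen for this IBM i example):

```
# props.conf -- minimal sourcetype for JSON events arriving via HEC.
# Events sent to the HEC /event endpoint skip index-time structured parsing,
# so search-time JSON extraction via KV_MODE is used instead.
[ibmi:json]
KV_MODE = json
SHOULD_LINEMERGE = false
```

With KV_MODE = json, the nested JSON keys become search-time fields automatically, which is what you would then correlate on in your searches.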