All Posts

Hello! My trial has reached its end; how can I activate the Lite version? Thank you.
Just in case anyone stumbles upon this thread: I just finished trying to get the OpsGenie App for Splunk working and integrated, but they stopped fully supporting that app back in Splunk version 7.1; anything newer and this won't work. https://jira.atlassian.com/browse/OPSGENIE-1178
@tlmayes Sorry for the misunderstanding, it is an unsupported add-on.
I realize this, since it clearly states it was created by "Splunk Works". My question was not directed at Splunk, since this is a "community" board (and Splunk Works is a pseudonym for "Splunk, but we don't support it"). Finally, the "download link" is marked "Download Restricted", and when selected it responds with "Request Access, Splunk Employees only".
Hi, I just installed the TA-tenable add-on and was going to configure it; however, when I get to the account configuration, it does not matter which account type I use, I always get "Error in processing the request". Has anyone seen this before? If so, what is the fix?
@danielbb Warm and cold buckets can be copied safely while Splunk is running, so you don't necessarily need to stop Splunk to perform the cold data migration. Refer to the link below: https://community.splunk.com/t5/Deployment-Architecture/How-to-migrate-data-of-cold-and-thawed-path-to-different/m-p/580124
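As a rough sketch of where that migration ends up (the index name and new volume path below are hypothetical), the index stanza in indexes.conf points coldPath at the new location once the buckets have been copied:

[my_index]
homePath = $SPLUNK_DB/my_index/db
coldPath = /new_cold_volume/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb

Restart the indexer after changing the paths so the new coldPath is picked up.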
@fabrizioalleva Connecting Splunk DB Connect to an on-premises MongoDB instance requires some additional steps compared to using MongoDB Atlas. To connect to your on-premises MongoDB, you'll need the MongoDB JDBC driver. You can download it from https://unityjdbc.com/download.php?type=mongodb . Make sure to obtain the mongodb_unityjdbc_full.jar from the installation folder. Then you can configure a JDBC connection as described here: https://unityjdbc.com/mongojdbc/setup/mongodb_jdbc_splunk_dbconnect_v3.pdf I hope this helps. If any reply helps you, you could add your upvote/karma points to that reply. Thanks.
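As a very rough sketch only (the host, port, and database name are placeholders, and the exact JDBC URL scheme is an assumption - follow the setup PDF above for the format the Unity driver actually expects), the pieces you end up supplying are roughly:

# copy the driver into DB Connect's drivers folder, then restart Splunk
$SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers/mongodb_unityjdbc_full.jar

# JDBC URL entered when creating the new connection (placeholder values)
jdbc:mongodb://mongo-host.example.com:27017/mydatabase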
OK, so extract it and then filter the events: | where isnotnull(Memory)
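Putting the two pieces from this thread together, the full pipeline would look roughly like this (replace the first line with your own base search):

<your base search>
| spath input=REQUEST output=Memory path=body.equipment{}.memory
| where isnotnull(Memory)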
@danielmtz Check the link below for more information: https://community.splunk.com/t5/Splunk-Dev/Developer-License-request-form-not-working/m-p/669697 I hope this helps. If any reply helps you, you could add your upvote/karma points to that reply. Thanks.
@verizonrap2017 I'm not sure what you're looking for; are you looking for Splunk components or the default indexes in Splunk? Please use the links provided below for reference.
https://docs.splunk.com/Documentation/Splunk/9.2.1/Indexer/Aboutmanagingindexes
https://docs.splunk.com/Documentation/Splunk/9.2.1/Capacity/ComponentsofaSplunkEnterprisedeployment
I hope this helps. If any reply helps you, you could add your upvote/karma points to that reply. Thanks.
@tlmayes The Insights App for Splunk (IA4S) at https://splunkbase.splunk.com/app/7186 is a developer app, not a Splunk-supported app. Check out the link below for further details. https://dev.splunk.com/enterprise/docs/releaseapps/splunkbase/optionsforsubmittingcontent/#Support-content
Coming in here, years later, to document this for anyone who comes across it: the PagerDuty app requires that the user who owns the alert have, at a minimum, the list_storage_passwords capability. Almost certainly that was the issue here.
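For reference, a minimal authorize.conf sketch for granting that capability (the role name is hypothetical; add the capability to whichever role the alert owner actually has, or do the same thing under Settings > Roles in Splunk Web):

[role_alert_owner]
list_storage_passwords = enabled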
Hi @Gustavo.Marconi. Thank you for sharing the solution! 
Hi, this is the path to the element:
| spath input=REQUEST output=Memory path=body.equipment{}.memory
Also, this memory is not available in every REQUEST, so I want only the events that have "memory" in the REQUEST. I don't want to display the events without memory.
What is the path to this element?
The regex and spath commands can be used to extract fields, but since JSON data can change, it's easier to use INDEXED_EXTRACTIONS = JSON or KV_MODE = json. If no events are getting auto-extracted, then it sounds like your sourcetype may not be applied. There are some steps/investigations on your part to undertake. Check at the inputs level that the data is getting assigned the sourcetype you have set in your TA's props.conf - verify this. (The data must be coming in from a JSON file or an HEC-type input somewhere.) Once you know the correct sourcetype, ensure that KV_MODE = json has been applied along with other settings such as the ones below. Note: setting both INDEXED_EXTRACTIONS = JSON and KV_MODE = json for the same sourcetype causes the Splunk software to extract the JSON fields twice: once at index time and again at search time - I advise you not to do this; stick to KV_MODE = json for now. Analyse the data and work out some of the settings (known as the "magic 6") for props.conf, such as in the example below. Tip - ideally you should always place new data into a test index, get the props working, and then move it into production once it's all working as expected.

Example props:

[my:json:data:sourcetype]
KV_MODE = json
# Tune the below to make Splunk more efficient
MAX_TIMESTAMP_LOOKAHEAD = (how far into the event to look for the timestamp)
SHOULD_LINEMERGE = false
TIME_PREFIX = (regex matching the text immediately before the timestamp)
TIME_FORMAT = (match your timestamp format - example: %Y-%m-%d %H:%M:%S%z)
TRUNCATE = 10000 (leave as default, may need tuning)
LINE_BREAKER = (regex to work out where to break the line)

Apply the above to your TA based on your specific data, deploy, test and adjust as required. Also, there may already be a props TA on Splunkbase if this is a common data source - have you checked that?
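One quick way to confirm the sourcetype is actually being applied (the index and sourcetype names below are the hypothetical ones from the example; substitute your own) is a simple search against the test index:

index=test_json_index sourcetype=my:json:data:sourcetype
| head 10

If that returns nothing, revisit the inputs and props before worrying about field extraction.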
This might be a completely different issue. You're not talking about searching directly from the raw data, but about some fancy operations and the final result of a more complicated search, which doesn't necessarily mean that the ingested data is bad. Try running the subsearch as a separate search and see if it returns any/proper results. Also take note of how long the search takes and how many results it returns. Since you're using join with a subsearch, it's quite probable that this is the culprit here - join is usually best avoided, especially if used with a search for indexed events, and especially if it's run over a relatively long period.
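As a generic sketch only (the index, sourcetype, and field names are made up, since the original search isn't shown here), the usual join-free pattern is to search both datasets at once and let stats do the correlation:

(index=idx_a sourcetype=st_a) OR (index=idx_b sourcetype=st_b)
| stats values(field_from_a) as field_from_a values(field_from_b) as field_from_b by common_id
| where isnotnull(field_from_a) AND isnotnull(field_from_b)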
Hi @m92, I added earliest and latest because you have _time in your searches, but you can ignore them. Ciao. Giuseppe
Hi @HugheJass, use the field in your conditions like this:
status.errorCode=*
status.errorCode=0
status.errorCode!=0
or
'status.errorCode'=*
'status.errorCode'=0
'status.errorCode'!=0
Ciao. Giuseppe
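For example, inside eval or where the single quotes around the field name are needed because it contains a dot (the surrounding search is only a sketch):

... | eval errorState=case('status.errorCode'==0, "OK", 'status.errorCode'!=0, "Error", isnull('status.errorCode'), "Missing")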
Hi @Amadou, the main issue in developing a Splunk search is knowing what to search for; then you can use SPL to search for the rules that you defined based on your knowledge of the technology to monitor. I don't know what technology you are monitoring; as I said in my sample, if you are using Windows, EventCode=4625 means a login failure. So what are the conditions that you need to search for? If you need to search for a value in a field (e.g. EventCode=4625), you can use this field; if you need to search for a string (e.g. "login successful"), you can search for this string. Did you try following the Splunk Search Tutorial (https://docs.splunk.com/Documentation/SplunkCloud/8.1.0/SearchTutorial/WelcometotheSearchTutorial) to be guided in the use of SPL? Ciao. Giuseppe
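To make that concrete, a minimal sketch building on the Windows example above (the index name is an assumption, and the exact field names depend on your Windows add-on's extractions):

index=wineventlog EventCode=4625
| stats count by host
| sort -count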