All Posts

Thanks, this is helping! I can see now that there are indeed separate events indexed in Splunk's own data format. Now... how can I ensure that the specific information within the events is used in a Splunk search? For example, one of the pieces of information within each event is the name of a parent group. How can I ensure that, when I run a search, it will look into these events and match my results with the corresponding parent group? Thank you for your patience, and please bear with me while I try to work this out!
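If the parent group ends up extracted as a search-time field, filtering on it is just a matter of naming it in the search. A minimal sketch, assuming a hypothetical index my_index and field name parent_group (your actual index and field names will differ):

```
index=my_index parent_group="Engineering"
| stats count BY parent_group
```

You can check which fields Splunk actually extracted by running the bare search and looking at the fields sidebar in Search & Reporting.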
Does anyone know of a list of component codes and their meanings, at least for _internal and _audit? I have asked instructors and Splunk directly with no help so far.
Thank you for your formatting advice. This is my first detailed post as I dive into Splunk dashboards, so I will keep that in mind moving forward. The token works fine in general with a wildcard or 0, so I didn't add more detail on how it's used because I didn't think that part needed troubleshooting. The data is there in my results. I figured there is a simple syntax issue that's stopping me from filtering it properly. I'm pulling login events from Azure AD. The field I'm working with here is status.errorCode. I'm using two tokens - UserID and errorCode.

index="mscloud" userPrincipalName="$UserID$" status.errorCode="$errorCode$"
| spath userPrincipalName
| search userPrincipalName="*@company.com"
| spath status.errorCode
| search status.errorCode="*"
| sort -_time
| table _time createdDateTime userPrincipalName appDisplayName status.errorCode status.failureReason status.additionalDetails clientAppUsed conditionalAccessStatus
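For what it's worth, the intermediate spath/search pairs can often be collapsed when the JSON fields are already auto-extracted. A hedged sketch of a simpler, roughly equivalent query (assuming automatic extraction of userPrincipalName and status.errorCode, as the original query implies):

```
index="mscloud" userPrincipalName="$UserID$" userPrincipalName="*@company.com" "status.errorCode"="$errorCode$"
| sort -_time
| table _time createdDateTime userPrincipalName appDisplayName status.errorCode status.failureReason status.additionalDetails clientAppUsed conditionalAccessStatus
```

Note the quotes around "status.errorCode" on the left of the comparison - dotted field names are safer quoted.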
I have no idea what "Splunk explorer" you're talking about. Honestly. But it doesn't work the way you think it does. If it's ingested into an index, it's split into separate events and indexed into Splunk's own data format. There is no "csv file" of the data anymore on Splunk's side. Assuming you're indeed talking about indexed data, not the lookups.
I did verify it by comparing the inputs.conf and outputs.conf files. They are exactly the same. The files in etc/system/local (because that's where splunk add monitor creates entries, as far as I remember) might be identical, but you may be inheriting some settings from other configs. That's why I asked about btool. Run splunk btool inputs list --debug and splunk btool outputs list --debug to see the effective config on both forwarders. That's the first thing to check. Another thing is to verify the props/transforms to see if they - for example - match only a specific subset of data, one that events coming from one host match but events from the other don't. It's hard to advise something specific without knowing your config and your data.
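The btool comparison above can be captured and diffed. A rough sketch, assuming you run the same command on each forwarder and copy the output files to one machine (the /tmp paths are illustrative):

```
# On each forwarder, capture the effective (merged) config:
splunk btool inputs list --debug > /tmp/inputs_fwd1.txt    # run on Forwarder1
splunk btool inputs list --debug > /tmp/inputs_fwd2.txt    # run on Forwarder2

# Then compare the two captures on one host:
diff /tmp/inputs_fwd1.txt /tmp/inputs_fwd2.txt
```

The --debug flag also prints which .conf file each effective setting comes from, which is exactly what you need to spot inherited settings.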
Thank you for the tip to add the field into my condition, but this produces no results. Leaving in * and 0 shows those results, but even putting "my_field=0" shows none.
Hi PickleRick, thank you for your answer. I know where the file is and I can open it from Splunk explorer. This indexed file is used in one of my searches, but unfortunately the search has recently stopped providing correct information. When investigating the issue, I discovered that the data pulled from the indexed file is missing values in some columns (which are crucial), and therefore the search results are incorrect. When I open the .csv file directly from its location, the values in all columns are correct. I wanted to open the .csv file in Splunk search to see what it looks like, but if this is not possible, I will have to find another way of working this out.
Hi @HugheJass,
try adding the field to use in the condition, e.g.:
All: my_field=*
Successful: my_field=0
Failed: my_field!=0
Ciao.
Giuseppe
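One common way to wire this up in Simple XML is to make each dropdown choice carry a whole search fragment rather than a bare value, then drop the token straight into the search. A hedged sketch with hypothetical token and field names (substitute your real field for my_field):

```
<input type="dropdown" token="status_filter">
  <label>Status</label>
  <choice value="my_field=*">All</choice>
  <choice value="my_field=0">Successful</choice>
  <choice value="my_field!=0">Failed</choice>
  <default>my_field=*</default>
</input>
```

The panel search then uses the token as-is, e.g. index=my_index $status_filter$, which sidesteps the problem that a plain value token cannot express the != case.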
There are two main commands for lookups:
| inputlookup my_lookup
| lookup my_lookup - (this is mainly used for enrichment)
So start with the | inputlookup my_lookup command. If you can't see it, it's most likely due to permissions, or the definition has not been set. The lookup is a knowledge object and requires permissions, so it could be private or shared, or you may have to go to the app it's running under. So check this under Splunk GUI > Settings > Lookups: check the lookup table files for the file, and then check under definitions. Once you have the definition or csv name, try that in the | inputlookup command.
This assumes you have created the lookup file and it has permissions.
By the way, the forwarder is a Universal one. I created the config using the splunk commands on the command line, based on Forwarder1. I did verify it by comparing the inputs.conf and outputs.conf files. They are exactly the same. I just changed the host name. The data from Forwarder2 does get sent to the indexers correctly, but it just goes into the wrong index. That's the only problem I have. Both indexers have a props.conf with a stanza named after the sourcetype, and a TRANSFORMS-routetoindex which points to a stanza in a transforms.conf. The sourcetype is exactly the same on both forwarders. Not sure if this will give you a clue to the cause of the problem. Thanks.
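For reference, index routing of the kind described usually looks roughly like this on the indexers - a sketch with hypothetical stanza and index names, not your actual config:

```
# props.conf
[my_sourcetype]
TRANSFORMS-routetoindex = route_to_myindex

# transforms.conf
[route_to_myindex]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = my_target_index
```

If the REGEX in your real transform matches something that differs between the two hosts' events (rather than matching everything, as the `.` above does), that alone would explain events from one forwarder landing in a different index.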
Got it - it was a typo. We used the tokens correctly ($timepicker.earliest$ and $timepicker.latest$), but the data in the dashboard panel does not match what I see when I open the same search directly. May I know what the issue is here?
There is no such thing as "a .csv file saved in SPLUNK, which I believe is indexed". A CSV can be used as a lookup, or its contents might have been ingested and indexed, but then you need to know how and where it was indexed so that you can look for data from it.
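To illustrate the two cases - hedged sketches with hypothetical names (my_file.csv, my_index); your actual lookup, index, and source values will differ:

```
If the CSV is a lookup:
| inputlookup my_file.csv

If its contents were ingested into an index:
index=my_index source="*my_file.csv"
```

The first returns the lookup table as-is; the second searches the indexed copy of the events, where the original source path is typically preserved in the source field.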
1. If possible, avoid using screenshots. Paste your code into a preformatted paragraph or a code block - it's much easier to read and respond to that way.
2. Unless I'm blind, you don't show how you're using this token.
Hello, I have a really basic question. I have a .csv file saved in SPLUNK, which I believe is indexed - this is not an output of a search but a file fed into Splunk from another source. I want to be able to open the file in Splunk search. Can you please advise what command I should use in Splunk search to be able to see the content of the .csv? Thank you.
There can be multiple reasons. You're mentioning a HF - do you mean that your events go UF -> HF -> indexer(s)? Are you getting _any_ events from this UF? (especially the UF's own logs in the _internal index) Are you getting any other events through that HF?
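A quick check for the UF's own internal logs might look like this - a sketch assuming a hypothetical host name my_uf_host:

```
index=_internal host=my_uf_host source=*splunkd.log*
| stats count BY sourcetype
```

If this returns nothing, the forwarder's connection to the indexing tier is likely broken; if it returns events but your monitored data is missing, the problem is more likely in the inputs or routing config.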
I've been trying to get a new Developer License for more than a week and getting the same error message. I've also sent an email to devinfo@splunk.com but have not heard back yet.  Is there any other way of getting a Developer License? Error: Developer License Request Error An error occurred while requesting a developer license. Please try again. If this error continues to occur, contact devinfo@splunk.com for assistance.    
@apietsch I want to onboard SaaS application data to Splunk. What is the process? I think the first step would be to integrate the SaaS application's add-on with Splunk. That's the integration I'm talking about.
1. Well, if you have a valid contract with Splunk, you're entitled to support. The support portal is here -> https://splunk.my.site.com/customer/s/ (but as far as I remember, you need to have an account associated with a valid active support contract, so not just anyone can request support on behalf of your organization; I might be wrong here, though - you need to verify that).
2. Since the 7.x line has been unsupported for some years now, it's hard to find a compatibility matrix for such an old indexer and a new forwarder. It generally should work, but it's definitely not a supported configuration (at the moment the only supported indexer versions are 9.x). But as long as both ends can negotiate a supported S2S protocol version, they should be relatively fine.
_How_ did you verify the configs? btool?
@PickleRick Thank you so much for answering my questions over such a long period of time. Thanks to you, I understand what was confusing about the data model. Reading the docs again, I realized I had been thinking in a different direction. Thank you.
<earliest>timepicker.earliest</earliest>
<latest>timepicker.latest</latest>
This shows you are not using the tokens correctly.
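For comparison, token references in Simple XML need the surrounding $ signs, so the corrected form would presumably be:

```
<earliest>$timepicker.earliest$</earliest>
<latest>$timepicker.latest$</latest>
```

Without the $ delimiters, the literal strings timepicker.earliest and timepicker.latest are passed as time bounds instead of the time picker's values.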