All Topics

In v9.1, Palo Alto Networks updated their log format for GlobalProtect, and the data model has also changed, so the RWI dashboard no longer sees these logs. Also, can you add Region/State to the VPN dashboard? I added it to mine and it's really helpful in the US, where every state has a Lincoln.
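Not part of the original request, just a hedged SPL sketch of how Region/State can be added to a panel with iplocation, assuming the GlobalProtect events carry a public client IP in a field named src_ip (the index, sourcetype, and field names are guesses):

```
index=pan_logs sourcetype=pan:globalprotect
| iplocation src_ip
| stats count by Country Region City
```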
I have a case where logs are not shown in Splunk search. The index name in the inputs file has mixed case, but the actual index name is all lower case. Will this cause the logs to not be ingested? I also note the sourcetype case is wrong too, so are any/all of these fields case sensitive?

Actual index name: "target_index"

[monitor:///file/path/logfile.log]
index = Target_Index
sourcetype = mBAS_log
disabled = false

Thanks for the help.
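Not from the original post, but a hedged sketch of the corrected stanza, assuming the index was created as the all-lowercase target_index. Index names are matched case-sensitively, so a mismatch typically means the events are routed to a nonexistent index and dropped (unless a last-chance index is configured); sourcetype is just a label, so its case affects searching rather than ingestion:

```
[monitor:///file/path/logfile.log]
index = target_index
sourcetype = mBAS_log
disabled = false
```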
Hi, the search I have returns two events. One event has the field patches{}.name (patches that are to be installed). The other has policies{}.packages{}.name (patches that failed to install). My search is as follows:

index=main sourcetype=_json id=712803
| rename policies{}.packages{}.name AS "Failed to install", patches{}.name AS "Patches to be installed"
| table name, "Patches to be installed", "Failed to install"

And this returns the following (one row per event, with the other column empty):

name          | Patches to be installed           | Failed to install
LP-USER-01096 |                                   | Google Chrome, Microsoft OneDrive
LP-USER-01096 | Microsoft OneDrive, Google Chrome |

But what I really want is the following:

name          | Patches to be installed           | Failed to install
LP-USER-01096 | Google Chrome, Microsoft OneDrive | Google Chrome, Microsoft OneDrive

Is there a way I can consolidate these results onto one row so it looks like the above?
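Not from the original post, but a hedged sketch of one way to merge the two events: aggregate with stats by the shared name field so the values from both events land on one row (assuming name is present on both events):

```
index=main sourcetype=_json id=712803
| rename policies{}.packages{}.name AS "Failed to install", patches{}.name AS "Patches to be installed"
| stats values("Patches to be installed") AS "Patches to be installed", values("Failed to install") AS "Failed to install" by name
```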
Hi @gcusello, I recently faced an issue where my old query used to work but is no longer working. Old query:

|dbquery wmsewprd "select * from sys_code_type where rec_type='C'"

What we are seeing now is that the above query works only when we explicitly use the schema name (wmsew) along with the table name (sys_code_type), as below. New query:

|dbquery wmsewprd "select * from wmsew.sys_code_type where rec_type='C'"

I contacted Splunk support and they advised me to use dbxquery instead of dbquery, i.e. to migrate every search that uses the "dbquery" command to use "dbxquery" instead. Can you please guide me on how to migrate from dbquery to dbxquery? Regards, Rahul
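Not from the thread, but a hedged sketch of the equivalent dbxquery call (DB Connect v3 syntax; it assumes the connection keeps the same wmsewprd name after migration):

```
| dbxquery connection=wmsewprd query="select * from wmsew.sys_code_type where rec_type='C'"
```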
Hello team, hope you are okay! I have a question about the Fortinet FortiGate Add-On for Splunk, which is available on Splunkbase: https://splunkbase.splunk.com/app/2846/#/details. I am deploying a distributed Splunk Enterprise infrastructure with a heavy forwarder, an indexer, and a search head. I don't know exactly on which instance I should install the add-on. Is it the search head? Should I add the data input on the heavy forwarder instance? I haven't found a clear procedure for the installation and configuration. I also have to implement a BOSS of the SOC environment (the datasets are already available on GitHub). Thanks in advance; waiting for your response.
Hey, I have one sourcetype named "my_sourcetype". Since I would like to integrate with Splunk ES, I need to map my field values to the expected values. Each data model (eventtype) is defined in the eventtypes.conf file based on my product name. For example:

[Data_Loss_Prevention]
search = product="*DLP*"

[Malware]
search = product="*Malware*"

Since one field (for example, action) may have different expected values for each data model, in my props.conf file I need to map the value to the expected value based on the correct data model. Since eventtype evaluation happens after EVAL in the search-time process, I cannot use the eventtype field, and I have no way to identify which data model it is (just to mention that I do not want to copy the logic of eventtypes.conf into props.conf, to avoid redundancy). Can you think of a better way to do this?
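A hedged alternative, not from the original post: instead of duplicating the eventtype logic in props.conf, an automatic lookup with wildcard matching on product can normalize action in one place. All names and values below are placeholders:

```
# props.conf (placeholder sourcetype)
[my_sourcetype]
LOOKUP-normalize_action = action_normalization product vendor_action OUTPUTNEW action

# transforms.conf
[action_normalization]
filename = action_normalization.csv
match_type = WILDCARD(product)

# action_normalization.csv (illustrative rows)
# product,vendor_action,action
# *DLP*,block,blocked
# *Malware*,quarantine,blocked
```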
I'm trying to get a list of all fields in an index and, oddly enough, fields are missing with the two methods below. Is anyone else experiencing this issue? These are the queries where I've confirmed I'm missing fields. FYI, I've included the where clause to exclude uninteresting fields and internal fields (index, source, time, date, etc.).

index=myapp sourcetype=myapp | stats dc() as * | transpose | rename "row 1" as values | where values > 1

index=myapp sourcetype=myapp | fieldsummary | fields field count distinct_count values | where distinct_count > 1

These two queries return 77 fields for me, but when I run a query that includes a table command with a manually entered list of all the fields I think are there, followed by either stats or fieldsummary, I get 88 fields. Example of a query returning more fields (in this case 87):

index=myapp sourcetype=myapp | fields f1 f2 f3 f4 f5.. f200 | fieldsummary | fields field | where values > 1
Hi all, I followed Ian's blog (https://blog.arcusdata.io/splunk-mltk-to-predict-kb-articles) and it is a nice blog. What I am missing is how to search (via makeresults) against the model with a description like "Unsupported Java version". When performing a search with this text against the model, I expect "KB0020147" (or another KB number that fits better according to the model) to be returned as the result. I suspect the search string looks like:

| makeresults
| eval description="Unsupported Java version"
| apply <ModelName> as Predicted_KB

But I think this won't work, because the description needs to be prepared first to fit the model inputs (PC_1, PC_2 and PC_3), which are numeric. Does anyone have an idea of what the search/makeresults string would look like? Thanks. Regards, Madere
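A hedged guess at the missing piece, not from the thread: in MLTK each fit step is saved as its own model, so the raw description has to pass through the same text-preparation models (e.g. the TFIDF and PCA steps from the blog) before the classifier is applied. The model names below are placeholders for whatever names were used with "... into ..." during training:

```
| makeresults
| eval description="Unsupported Java version"
| apply TFIDF_Description_Model
| apply PCA_Model
| apply KB_Classifier_Model as Predicted_KB
```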
We are ingesting AWS data through a heavy forwarder and I am seeing duplicate values for each field, as shown in the screenshot. A few of the fields show a correct single value, but most of the fields have double values. I have added the settings below in props.conf, but no luck.

KV_MODE = none
AUTO_KV_JSON = false
INDEXED_EXTRACTIONS = json
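Not from the original post, but a hedged sketch of the usual split for this symptom: duplicated values often mean the JSON is extracted twice, once at index time and once at search time. INDEXED_EXTRACTIONS must live on the first full Splunk instance that parses the data (here the heavy forwarder), while the settings that suppress the second, search-time extraction belong on the search head. The sourcetype name is a placeholder:

```
# props.conf on the heavy forwarder (index-time parsing)
[aws:my_sourcetype]
INDEXED_EXTRACTIONS = json

# props.conf on the search head (suppress search-time re-extraction)
[aws:my_sourcetype]
KV_MODE = none
AUTO_KV_JSON = false
```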
I am getting my result table from my JSON log as shown below, but I want the result on line number 10 to be like the below:

1.0.9
feature-Tibco Communicator
Secrets Secure
commons

splunk dashboard
Afternoon all, I have an XML dataset that I am struggling to extract fields from. What I need is for the <Key> value to become the field name and the <Value> to become the value of that field. For example: BLAdets.Bladetsmeta.FIELD_1="this is the value of field 1":

<BLAdets>
<Bladetsmeta>
<Metadata><Key>FIELD_1</Key><Label>FIELD 1 test</Label><Value>this is the value of field 1</Value></Metadata>
<Metadata><Key>FIELD_2</Key><Label>FIELD 2 test</Label><Value>this is the value of field 2</Value></Metadata>
<Metadata><Key>FIELD_3</Key><Label>FIELD 3 test</Label><Value>this is the value of field 3</Value></Metadata>
</Bladetsmeta>
</BLAdets>

I have tried xmlkv, but it creates a Key field with the value FIELD_1. Any ideas would be much appreciated. Thanks.
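A hedged sketch, not from the original post: one search-time approach is to pull the Key/Value pairs out with rex, zip them together, and promote each pair to a field. Note that mvexpand splits the event into one row per pair, which may or may not suit the use case:

```
| rex max_match=0 "<Key>(?<xml_key>[^<]+)</Key><Label>[^<]*</Label><Value>(?<xml_value>[^<]+)</Value>"
| eval pair=mvzip(xml_key, xml_value, "=")
| mvexpand pair
| rex field=pair "^(?<key_name>[^=]+)=(?<key_value>.*)"
| eval {key_name}=key_value
```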
Hello,

I use the search below in order to generate an alert if disk size is > 20:

search = `diskspace` \
| fields host FreeSpaceKB \
| eval host=upper(host) \
| eval time = strftime(_time, "%m/%d/%Y %H:%M") \
| eval FreeSpace = FreeSpaceKB/1024 \
| eval FreeSpace = round(FreeSpace/1024,1) \
| stats latest(time) as time latest(FreeSpace) as FreeSpace by host \
| where FreeSpace >= 20 \
| table host

In the alert message I need to display the host concerned by the alert, so I put: "The $host$ encountered a disk size issue", but the host is not displayed. The same thing happens in the subject of the alert: "Splunk disk size alert for the $host$". What is the problem, please?
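A hedged suggestion, not from the original post: in alert actions, search result fields are referenced through the result. token namespace (a bare $host$ works for dashboard tokens, not alert messages), and only the first result row's value is substituted. Something like:

```
subject = Splunk disk size alert for $result.host$
message = The $result.host$ encountered a disk space issue
```

If several hosts can trigger at once, configuring the alert to trigger once per result is one way to get each host its own message.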
I am currently trying to set up a no-reply Office 365 SMTP email address. When I test this and send an email using the following:

index=_internal source=*python.log | head 1 | sendemail to=email@domain.com sendresults=true server=smtp.office365.com:587

I get the following error:

command="sendemail", (554, '5.2.0 STOREDRV.Submission.Exception:SendAsDeniedException.MapiExceptionSendAsDenied; Failed to process message due to a permanent exception with message Cannot submit message.

Any suggestions or troubleshooting steps would be greatly appreciated. Thanks, Koroshi
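A hedged guess at the cause, not from the thread: Office 365 raises SendAsDenied when the From address does not match (or is not authorized to send as) the mailbox that authenticated against smtp.office365.com. Setting from= to the authenticated no-reply mailbox, with TLS on, is worth trying (the addresses are placeholders):

```
index=_internal source=*python.log
| head 1
| sendemail to="email@domain.com" from="noreply@domain.com" server="smtp.office365.com:587" use_tls=1 sendresults=true
```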
Hello everyone. Has anyone actually tried using a FUSE filesystem for their frozen path? Are there any issues that might be of concern? Thank you.
I created the 4 link lists below, and each corresponds to one status. I want to place each button next to its status. My current code is structured this way, but the link lists are automatically placed beside each other. Is there a way to arrange them vertically using CSS only?

<input>...</input>
<viz>...</viz>
<input>...</input>
<viz>...</viz>
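A hedged sketch, assuming this is a classic Simple XML dashboard (the selector is a guess and may need to target the specific input ids): inputs float side by side by default, so forcing them into block layout stacks them vertically. The CSS can be injected from a hidden HTML panel:

```
<row depends="$alwaysHideCSS$">
  <panel>
    <html>
      <style>
        .fieldset .input {
          float: none !important;
          display: block !important;
        }
      </style>
    </html>
  </panel>
</row>
```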
Can someone help me convert a time in the format h:mm:ss.nnn, stored as a string (for example 0:00:00.041), into seconds? The answer for 0:00:00.041 should be 0 seconds. Examples:
1. 0:00:00.041 is 0 seconds.
2. 0:00:00.500 is 0.5 seconds.
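Not from the original post, but a minimal SPL sketch of the conversion, assuming the string lives in a hypothetical field named duration: split on ":" and weight the pieces, then round to match the examples above (0:00:00.041 -> 0, 0:00:00.500 -> 0.5):

```
| makeresults
| eval duration="0:00:00.041"
| eval p=split(duration, ":")
| eval seconds=tonumber(mvindex(p,0))*3600 + tonumber(mvindex(p,1))*60 + tonumber(mvindex(p,2))
| eval seconds=round(seconds, 1)
```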
Why am I getting "Encountered the following error while trying to save: An object with name=prices_lookup already exists" when, if I search by that name, it is not anywhere?
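A hedged way to check, not from the original post: the object may exist with private sharing under another user or app, which the UI search would not list. A REST search over the lookup definitions (and, if needed, the data/lookup-table-files endpoint) should show it regardless of sharing:

```
| rest /servicesNS/-/-/data/transforms/lookups splunk_server=local
| search title=prices_lookup
| table title eai:acl.app eai:acl.owner eai:acl.sharing
```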
Has anyone encountered this error before while trying to create an app manifest? It only happens for apps that do not have a version (i.e. some of the Splunk OOTB apps):

slim generate-manifest: [ERROR] The combination of group and name from the [id] stanza of app.conf (launcher) must equal the name of the app folder (.)

Basically, I'm running /opt/splunk/bin/slim generate-manifest . > app.manifest in /opt/splunk/etc/apps/launcher, and currently my app.conf looks like this:

[install]
is_configured = true
allows_disable = false

[ui]
label = Home
is_visible = true

[package]
id = launcher

[id]
version = 0.0.0
name = launcher

Is there something I'm missing here? This is also happening with the following apps:

user-prefs
SplunkLightForwarder
SplunkForwarder
splunk_internal_metrics
legacy
learned

The only common theme between these apps is that they do not actually have a version number. We are not seeing this issue with other Splunk apps that have versioning. I'm trying to create an app manifest since that is one of the prerequisites for packaging an app for an AppInspect scan, as we are migrating a number of our Splunk apps to Splunk Cloud. Here are the steps: https://docs.splunk.com/Documentation/SplunkCloud/8.0.2006/DevApp/Deployingtheapp (I'm specifically referencing the "Generate the app manifest" section). Now, if the apps listed above will end up shipping with our Splunk Cloud instance and won't require us to send them to Splunk Cloud Services for vetting, then we'll proceed with excluding them from our Splunk Cloud migration package.
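A hedged workaround guess, not from the thread: the error text compares the [id] name against the app folder name, and invoking slim with "." as the source makes that folder name ".". Running it against the real folder name from the parent directory may avoid the mismatch:

```
cd /opt/splunk/etc/apps
/opt/splunk/bin/slim generate-manifest launcher > launcher/app.manifest
```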
Good evening, I am in the process of scheduling a test for a Splunk certification with Pearson VUE and would like to know how I can locate my Splunk ID. Thank you, that would be a great help.
I have a search query, but I need help displaying the failed scans of the IPs or devices. Which field should I use for that particular search?