All Topics
I have my search results, but I want to filter by OS only. I was able to get all the results but need to filter them down to Windows Server OSes. What am I missing? Current search: index="myindex" "eventcode=NUMBER"
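One possible approach, assuming the events carry an extracted OS field (the field name `OS` below is an assumption; check what your events actually contain), and noting that quoting "eventcode=NUMBER" searches for that literal string rather than filtering on a field:

```spl
index="myindex" EventCode=NUMBER OS="*Windows Server*"
```

An unquoted field=value pair filters on the extracted field; the quoted form only matches the raw text.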
Hello all, I have configured Event Hub and am getting old data. Previously, when retention on Azure was 7 days, I used to see data from the current date -7; after I reduced the retention, it now shows data from the current date -1. I checked all the configs but could not find any clue. Splunk version is 7.2.9, and the app used to configure Event Hub is the Microsoft Azure Add On for Azure.
Up until recently, the InfoSec app VPN dashboards were populating just fine. Recently, however, they stopped, and upon further investigation it seems that data from our Cisco ASAs is not being extracted correctly. After some troubleshooting, I backed up the Splunk_TA_cisco-asa folder and reinstalled the add-on from Splunkbase. One of the fields that is not being extracted is message_id; going into the transforms and copying the regex for one of the message_id fields, I was able to match events against it, but for some reason Splunk is not extracting it. The issue continues even after a fresh install of the add-on.
I have just installed the Microsoft Graph Security API Add-on and set up the application/access on the Azure end. However, when I go into the configuration tab to add a new account, I just see a loading spinner and don't see any button to "Add" an account config. Has anyone encountered the same? If so, please share how you resolved it.
We see lots of email alerts, and they come from a wide variety of places. I want to understand them better. So... I thought I would have Splunk monitor the offending email directory on my own machine (.emlx files). Splunk doesn't automatically break events by file, though. So then I tried to configure a custom sourcetype using regex: I captured the beginning and ending lines, and the sourcetype works for a single email file. But when I tried to monitor the entire directory using my sourcetype, no data was ingested at all. Why is this so hard? Splunk already knows that there are different sources; do I want Splunk to stop event breaking altogether? I'll try that... Thanks in advance!
Hi, any help with this would be appreciated!

rex field=msg.message "loc=(?<place>\d+)" | search place="16" | stats count by place

The extracted field is place. The specific place I am searching for is "16". Is there a more efficient way to search for a specific place before calling stats?
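One common optimization, sketched here under the assumption that the raw events literally contain the text "loc=16": add that string to the base search so non-matching events are discarded at the indexers before rex runs, keeping the field filter as a safety net.

```spl
index=myindex "loc=16"
| rex field=msg.message "loc=(?<place>\d+)"
| search place=16
| stats count by place
```

The index name is a placeholder; the literal-string filter is usually the biggest win, since it prunes events before any field extraction happens.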
I am completing the process of installing the credentials package on our universal forwarders to send data to the cloud, as we did with our application service servers, and I ran into an issue on the majority of the machines, where I would get the following error:

/opt/splunkforwarder/bin/./splunk install app /tmp/splunkclouduf.spl
This command [POST /services/apps/local/] needs splunkd to be up, and splunkd is down.

Splunkd was definitely running; I even restarted it for good measure. Thirty percent of the machines executed the command fine, prompted me for UF credentials, and confirmed the installation was completed. Is this something you've run into before?
My logs are that kind : <July 13, 2020 10:55:02,572 PM CDT> So i used TIME_FORMAT=%b %d, %Y %H:%M:%S, %3N%p%z But it is not parsing and showing me error that " could not use strptime to parse time... See more...
My logs look like this: <July 13, 2020 10:55:02,572 PM CDT>. So I used TIME_FORMAT=%b %d, %Y %H:%M:%S, %3N%p%z, but it is not parsing, and it shows me the error "could not use strptime to parse timestamp from "July 13, 2020 10:52:03,907 PM CDT>". Please let me know how to solve this issue.
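Since the hour is on a 12-hour clock (10:55 PM), %H likely needs to be %I, and the extra space before %3N doesn't match the event. A props.conf sketch, with the sourcetype name as a placeholder:

```ini
[your_sourcetype]
TIME_PREFIX = <
TIME_FORMAT = %b %d, %Y %I:%M:%S,%3N %p %Z
MAX_TIMESTAMP_LOOKAHEAD = 40
```

Note that %Z with abbreviations like CDT can be ambiguous; setting TZ explicitly for the sourcetype may still be needed.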
How can I compare information from two different hosts? For example, on one host I have the name, number, and phone calls. On another I have name, number, phone calls, and location. Can I compare the two sets of information with some logic? Example: if phone calls (host1) = 0 and phone calls (host2) = 0, then status = ok. My attempted logic:

index=zzzz host=yyyy OR host=xxxxx
| table name, phone_call, location
| eval description=case(phone_call(host1) == 0 AND phone_call(host2) == 0, "ok")
| table description

Thanks, and sorry for my bad English.
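A sketch of one way to line the two hosts up side by side, assuming phone_call is an extracted numeric field and yyyy/xxxxx are the two hostnames (all names here are placeholders):

```spl
index=zzzz (host=yyyy OR host=xxxxx)
| chart latest(phone_call) AS calls over name by host
| eval description=case('yyyy'=0 AND 'xxxxx'=0, "ok", true(), "check")
| table name description
```

chart ... over name by host produces one column per host, so the eval can compare them on the same row; single quotes in eval reference those generated field names.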
Hi team, I am using the AppDynamics Pro trial version. If I add a license to this account, does the old data remain as it is, or will it be overwritten?
Team, I need help building a dashboard. WH.csv content:

XXX
YYY

I want to search two different sources but use the same value from the inputlookup. Existing query:

| inputlookup WH.csv
| table ware_house
| map search="search index=wh source=$ware_house$_WH_OVERVIEW | head 1 | stats list(Routes) AS ROUTE list(source) AS WH | appendcols [ search index=wh source=$ware_house$_WH_SHIPPING | head 5 | stats list(LabelsCreated) AS LabelsCreated by LabelType | stats sum(LabelsCreated) AS SUMMARY ]"

Issue: the second search is not getting the variable $ware_house$, so it does not return any results. As soon as the base search works, I would like to add it to the dashboard.
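Token substitution inside nested subsearches under map can be fragile. One alternative sketch that avoids map entirely, deriving the warehouse name from source with an eval (the replace pattern is an assumption based on the source naming shown above, and this flattened version won't reproduce the head limits):

```spl
index=wh (source=*_WH_OVERVIEW OR source=*_WH_SHIPPING)
| eval ware_house=replace(source, "_(WH_OVERVIEW|WH_SHIPPING)$", "")
| stats list(Routes) AS ROUTE sum(LabelsCreated) AS SUMMARY by ware_house
```

Keeping everything in one pipeline also makes the search easier to embed in a dashboard than a map-based one.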
Hi @LukeMurphey, is it possible to install the Network Toolkit app on a UF? We have too many servers that we want to run a ping test against, but with Splunk SaaS it won't be possible to do so from the cloud. Thanks, R
Hi Splunkers, my search makes a table with data and an error message, but the error message includes around 50 sentences, and I want to show all of them in the table. As you can imagine, the table gets too long and my dashboard looks ugly. So I want to show only the first 3 sentences with an arrow or something: when the user clicks the arrow, all of the sentences for the error are shown, but before clicking you can see only 3 sentences. Can I do this in Splunk? Thanks for the help!
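A sketch of the preview part, assuming the field is called error_message and sentences end with periods (both are assumptions):

```spl
... | eval preview=mvjoin(mvindex(split(error_message, "."), 0, 2), ".")
```

For the click-to-expand behavior, one option is a dashboard drilldown that sets a token from the clicked row and a second panel (or table row expansion) that shows the full error_message for that row.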
Assume I have a simple search that lists in a table the email addresses of those who recently sent an email: index=email | table sender. The email index does not have a field that identifies the country the sender address is from; however, it is known that the listed sender addresses are from many different countries. If I have a lookup that contains all the email addresses located in the US, in the format:

email,country
address1@mail.com,US
address2@mail.com,US
...

How can I filter my search results to contain only sender email addresses located in the US (based on the lookup), while also adding a field to the table that shows US?
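One sketch, assuming the lookup file is uploaded as us_senders.csv with the email and country columns shown above (the file name is a placeholder):

```spl
index=email
| lookup us_senders.csv email AS sender OUTPUT country
| where country="US"
| table sender country
```

Events with no match in the lookup get a null country and are dropped by the where clause, which is exactly the filtering asked for.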
What I'd like to be able to do is, when the user clicks on iPhone (there are 2 of them, which is what presents the problem), also include the OS in the search so as to uniquely identify which iPhone. I know how to configure a token to grab "iPhone", but what I don't know is how to grab the OS value at the same time. Also, is it best to put all the info in 1 token or in 2 tokens? Or am I going about this completely wrong?
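In Simple XML, a single click can set several tokens using $row.<column>$ values. A sketch, assuming the table has columns named device and OS (both names are placeholders for whatever your panel actually shows):

```xml
<drilldown>
  <set token="device_tok">$row.device$</set>
  <set token="os_tok">$row.OS$</set>
</drilldown>
```

Two tokens tend to keep the downstream search more readable than packing both values into one token, e.g. device="$device_tok$" OS="$os_tok$".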
I have added the global account settings in the "Data input parameters" and also enabled the basic authentication in the "data input definition. The test succeeded. The issue is that when I insta... See more...
I have added the global account settings in the "Data input parameters" and also enabled the basic authentication in the "data input definition. The test succeeded. The issue is that when I install this app on another instance of Splunk it does not popup for user name and password. any idea why?
Hello, can someone recommend the necessary steps to correctly monitor a .csv on the Heavy Forwarder instance and be able to search the generated index from the Search Head? It sounds easy, but I can't do it. I have a deployment server on another instance, and I suppose I should deploy the configuration from there, but I can't connect it. Any help is welcome. Thank you.
Good afternoon. I'm a regular reader, but this is my first time writing, so I'll introduce myself: I'm Luis and I work with Splunk. Searches, alerts, and that kind of thing don't give me problems, and usually everything works out fine, although there are things that are very confusing to me. When I started with only one instance on my PC to learn, everything was wonderful. However, with my client's Splunk, things change because it is multi-instance. We have 1 Search Head, 1 Deployment Server / Cluster Master, 1 UF, 1 HF, and a cluster with two indexers.

My doubt comes down to the following: what would be the right way to deploy an app or a configuration? Since we have so many instances, and the files are duplicated and tripled in some cases, it seems a mess to me to know which one takes precedence over the others, which tasks to do in the graphical environment and which through configuration files, and when to restart or not. To top it all off, now I fail to apply the bundle actions after trying to deploy an application that reads from an API.

Would someone please explain the hierarchy to me so I can understand which files "command" over others? How could I solve the problem of the bundle actions if the results don't describe what happens? Should I copy each of the files by hand and replicate them on the other instances? In which cases? For a custom app to work properly, do I need to copy it to the Search Head, HF, and DS-CM? Excuse the pile of questions, but I have already tried on my own and I can't figure it out. Thank you very much.
Hi all, can we add multiple dynamic under-label values as a token and show them in a trellis single value? Regards, Manish
I am unable to get additional columns from a CSV I have referenced in an SPL query I have written. The CSV has numerous columns, but I want to add additional columns based on my search. From a data view, my index has details about devices, for example 3 fields (user, status, machinename). My query searches for status="in use" and then tries to match this up to the CSV in question. The output I am after is, on a match, to show the detail from the CSV as required (i.e. columns labelled firstname, lastname, type).

My query is:

index=devices status="in use" | rename prim_user as userid | search userid="*" [| inputlookup mycatalog.csv | search type="valued" | fields userid] | table userid type machinename

When I run this query, a table is produced, but the "type" column is not returned from the CSV. I tried editing my search to be:

index=devices status="in use" | rename prim_user as userid | search userid="*" [| inputlookup mycatalog.csv | search type="valued" | fields userid, type] | table userid type machinename

This, however, returns a null value. I then tried to use a lookup, but this fails too:

index=devices status="in use" | rename prim_user as userid | search userid="*" [| inputlookup mycatalog.csv | search type="valued" | fields userid] | lookup mycatalog.csv prim_user as userid OUTPUT userid, type | table userid type machinename

This returns an error from my lookup advising "could not find all of the specified lookup fields in the lookup table", even though they are there. Any help appreciated.
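One thing to check in the lookup variant: "prim_user as userid" tells lookup to read the CSV column prim_user using the event field userid, so it fails if the CSV's key column is actually named userid. A sketch, assuming the CSV's key column is userid (if it really is prim_user in the CSV, keep the "prim_user as userid" mapping and drop userid from OUTPUT):

```spl
index=devices status="in use"
| rename prim_user as userid
| lookup mycatalog.csv userid OUTPUT type firstname lastname
| search type="valued"
| table userid type machinename
```

Listing only the fields to bring back in OUTPUT also avoids overwriting the join key, which including userid in the OUTPUT clause can do.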