All Topics

Anyone have access to the latest TA_Microsoft-Sharepoint? We have 0.2.0, which has issues with field extractions, and before fixing it up we want to get the latest. It is no longer available on Splunkbase (https://splunkbase.splunk.com/app/1908/ returns "404 Error: Page not found"), and there is no GitHub or other location via Google, etc.
We have a lot of operational data that comes into Splunk, and under certain conditions this can indicate a service impact. These conditions then trigger alert actions which update our Remedy and NMS tools. My question is: how do other Splunkers handle this? I've typically run on a cron schedule every 1 min and looked back by 1 min. Is this the correct way to do it? I've run into some issues where events get dropped because index processing takes a bit more time, and they then get picked up by the next scheduled alert. I really want to utilize the data in Splunk to update help desk tickets, triage service outages, and all of that cool integration stuff, but I want to make sure I'm writing my alerts properly.
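A common pattern for this (a sketch, not from the post; the index, sourcetype, and field names are hypothetical) is to delay the search window by a minute or two so events that are slow to index are still inside the window when the alert runs:

```
index=ops_index sourcetype=ops_events earliest=-3m@m latest=-2m@m
| stats count by host, condition
| where count > 0
```

Scheduled every minute on cron, each run still covers exactly one minute of data, but two minutes behind wall-clock time. The offset should be tuned to the typical indexing lag, which can be measured by comparing `_indextime` to `_time`.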
Hello SMEs, seeking a helping hand. I got stuck while putting an EVAL-<field-name> into props.conf using the case function: it is not working at all, while the same expression works in the search bar in the GUI. Any suggestion would be highly appreciated.   EVAL-XYZ = case(src=="AAA", field1, src=="BBB", field2, src=="CCC", field3)
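For reference, a calculated field in props.conf looks like the sketch below (the stanza name is hypothetical). Two common reasons the same expression works in the search bar but not in props.conf: the stanza doesn't match the events' sourcetype, or src/field1/field2/field3 are themselves calculated fields — calculated fields cannot reference other calculated fields, only indexed or extracted ones.

```
# props.conf on the search head (hypothetical sourcetype stanza)
[my_sourcetype]
EVAL-XYZ = case(src=="AAA", field1, src=="BBB", field2, src=="CCC", field3)
```

The file also has to be deployed to the search head (calculated fields are search-time), and a restart or debug/refresh is needed for the change to take effect.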
Hi, I have a field named operating_system. It can contain multiple values, examples being "Windows 10", "Windows Server 2016", "Windows 7" and others. I want to set a field named os_group based on the value of operating_system. So if the value of operating_system matches "Windows Server*", I want to set os_group to "Server". If the value of operating_system is "Windows 7", "Windows 8", or "Windows 10", I want to set os_group to "Workstation". I was trying to do it using separate lines as below, but can I combine these into one eval function where I can specify multiple values to filter on? I will need to use wildcards. | eval os_group=if(like(operating_system, "%Server%", "Server", os_group) | eval os_group=if(like(operating_system, "%Windows 7%", "Workstation", os_group) Thanks
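The separate if() lines can be combined into a single case() (a sketch; case() returns the first matching clause, so the more specific Server test goes first, and the `IN` operator inside eval requires Splunk 7.3 or later):

```
| eval os_group=case(
    like(operating_system, "Windows Server%"), "Server",
    operating_system IN ("Windows 7", "Windows 8", "Windows 10"), "Workstation",
    true(), os_group)
```

In like(), `%` is the wildcard (equivalent to `*` in a search-time match); the final `true(), os_group` clause preserves any existing os_group value when nothing matches.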
I have a KV store collection that is populated. I have a lookup definition pointing to the KV store. If I use the KV store lookup definition in a search, I get matching results and everything works as expected.     index=* source=jello | lookup kvstore_lookup ip as srcip outputnew city as src_city   However, if I move that into an automatic lookup it does not work. Before using the KV store I was using a CSV lookup and the automatic lookups were working fine. The CSV grew to 122 MB, so I populated a KV store with the below.     | inputlookup old_csv_lookup | outputlookup kvstore_lookup   Permissions on the automatic lookups are global, everyone read, admin write. I can see in the search log that it's calling the automatic lookup ("Will use Lookup: Lookup-......") but the fields that are supposed to be added in from the lookup don't populate. Also, I am using matchtype=CIDR for this lookup definition. Any ideas why the automatic lookup is not working now that it's using the KV store?
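For comparison, an automatic lookup wired to the same definition would look like this in props.conf (the stanza name is hypothetical):

```
# props.conf (hypothetical sourcetype/source stanza)
[jello_sourcetype]
LOOKUP-src_city = kvstore_lookup ip AS srcip OUTPUTNEW city AS src_city
```

Two things worth checking, offered as suggestions rather than a confirmed diagnosis: the lookup definition and the KV store collection themselves (not just the automatic lookup) must also be shared at the right scope, since a definition private to one app can make the automatic lookup resolve but return nothing; and for matchtype=CIDR, the match field must be listed under the advanced options of the lookup definition, exactly as it was for the CSV version.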
Hello, I've looked around questions and answers but I cannot find the one I need. I'm selecting "Previous week" in the time range picker when searching, and Splunk is starting the week on Sunday. What do I need to change to make it start the week on Monday for all users? I want to change it in the time range picker itself, without making people add earliest/latest or similar to the search. I'm sure it's something easy, but I cannot find it. Thank you in advance!
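The `@w1` snap-to modifier snaps to Monday (`@w` / `@w0` is Sunday), so a Monday-based previous week is `earliest=-1w@w1 latest=@w1`. To expose that to all users in the time range picker, a times.conf entry can be added in a globally shared app (the stanza name below is hypothetical):

```
# times.conf
[previous_week_monday]
label = Previous week (Mon-Sun)
earliest_time = -1w@w1
latest_time = @w1
```

This adds a new preset rather than changing the built-in "Previous week", which is generally the safer route.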
Hi, I've been struggling with the horseshoe visualisation for a couple of days. I have a specific scenario I would like to present but am not sure if it can be done. I have a table with 2 rows and 3 fields per row: source, count, status. Example: source,count,status sourceA,200,0 (not breached) sourceB,100,0 (not breached) sourceC,100,1 (breached) I would like to show one horseshoe per source (trellis) where the dial shows the count but the color is based on the status. So, for instance, sources B and C would look similar (count-wise) but the color would be different, as one is breached and the other is not... Is that possible? I've seen some posts about showing a different value as a token, but this doesn't work because 1) I need to work with multiple sources and 2) each row may have its own threshold... Thanks!
What a clear idea about the storage of the data.
There is no "install apps from file" button for Splunk Cloud, whereas it's available for Splunk on-prem.
Hi, I'm trying to update the Splunk UF on a machine, but when running the MSI installer I get "The specified account already exists" and then the MSI fails to install. I've googled some generic failures around this, but none of the fixes have worked so far. Has anyone experienced this, or can anyone suggest how to troubleshoot it? Thanks.
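A generic next step (a suggestion, not from the thread) is to capture a verbose MSI log, which usually names the component or account that triggers the error; the installer filename below is a placeholder:

```
msiexec /i splunkforwarder-<version>-x64-release.msi /L*v uf_install.log
```

"The specified account already exists" often points at leftovers from a previous install, such as an existing SplunkForwarder service or the local service account it created, so the log is worth searching for the account name it stumbled on.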
I am using Splunk Enterprise version 8.0.5.1. Consider an index with half a million events being generated every day. There are three fields in the index I am particularly interested in:

sourcetype has 20 different values, but I am interested in one sourcetype that accounts for 10,000 events per day.
objectid is populated on each event, and there are multiple events for the same objectid.
versionid can be one of two values, 1 or 2 - objects move from 1 to 2, but never back to 1.

For the query period, there should be no objectids with a versionid of 1. For those objectids that have an event with versionid 1, I want to know when they changed to a 2. The problem I have is that there are so many 2s in the index that querying all of them just to then join to the 1s takes forever and generates a job of over 1 GB. So what I'd really like to do is query the 1s first, and then feed that list into a subsequent search that only finds the rows whose objectid is in the results of the first search for the 1s. If I do the following, it just takes forever...

index=myindex sourcetype=mysourcetype versionid=1
| reverse
| table _time objectid
| dedup objectid
| join objectid
    [ search index=myindex sourcetype=mysourcetype versionid=2
      | reverse
      | eval fixed=_time
      | table fixed objectid
      | dedup objectid ]
| table _time objectid fixed

What I want to do is reference the results of the first query in an IN statement inside the second, but I can't find a way to do that. If I could create a dashboard with a base query, and a panel that uses the results of that base query in an IN statement, that might work, but at the moment I am stuck. I know that Splunk is not SQL, but to make it a bit clearer what I am trying to achieve:

SELECT * FROM MYINDEX
WHERE OBJECTID IN (SELECT OBJECTID FROM MYINDEX WHERE VERSIONID=1)
AND VERSIONID=2

i.e. it evaluates the versionid=1 objectids first, and then the outer query only returns rows that match those ids.
When I do this manually with a small number of IDs and put them in an explicit IN clause, it runs very quickly. Any suggestions would be much appreciated.
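SPL can express the SQL-style IN with a plain subsearch: when a subsearch ends with `| fields objectid`, its results are expanded into `(objectid=v1 OR objectid=v2 OR ...)` in the outer search, exactly like the explicit IN clause that runs quickly. A sketch (subject to subsearch limits, by default 10,000 results and a runtime cap, so the distinct versionid=1 list needs to fit):

```
index=myindex sourcetype=mysourcetype versionid=2
    [ search index=myindex sourcetype=mysourcetype versionid=1
      | dedup objectid
      | fields objectid ]
| stats min(_time) as fixed by objectid
```

This also removes the join: for each matching objectid, stats takes the earliest versionid=2 event time as the change time, which matches the reverse/dedup logic in the original query.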
Hello All, we have 20 indexers and 5 HFs in our environment. The HFs are forwarding their data to the indexers. I need to find out which servers are sending logs to the HFs. Is there any SPL query I can use for this? Thanks
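The HFs' own metrics already record inbound forwarder connections, so one way to list the senders (a sketch; replace the host filter with your actual HF hostnames) is:

```
index=_internal source=*metrics.log* group=tcpin_connections host=<your_HF_hosts>
| stats count by host, hostname, sourceIp, fwdType
```

Here `host` is the HF receiving the data, while `hostname` and `sourceIp` identify the sending server and `fwdType` shows the forwarder type of the sender.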
Hello everyone, could you please point me in the right direction? I'm trying to get a universal forwarder to talk to my Splunk instance (single instance). I've set the deployment server correctly on the forwarder (done and checked multiple times; the same settings work with other forwarders).

On the forwarder: Debian 10, with Splunk UF 8.0.4, 8.1.2 or 8.0.5. tcptraceroute to my-manager.fr:8089 => ok; telnet => opens a connection; curl => empty reply from server. In splunkd.log: DC:DeploymentClient - channel=tenantService/handshake Will retry sending handshake message to DS; err=not_connected

On the manager: Debian 9, Splunk 8.0.4.1. In index=_internal I've got: HttpListener - Socket error from xx.xx.xx.xx:39944 while idling: Read Timeout, where xx.xx.xx.xx is the IP of the forwarder.

I have logs on both machines, with DEBUG strategically placed in log.cfg. I still don't get it; I don't even understand what is wrong. Any idea? Thanks in advance, Regards, Ema
Hello, I have created a dashboard using the Splunk® Dashboards App. Multiple area and line charts were created, but none of them shows the time on the X axis. I can change the X axis title to "Time", but the axis does not show the time spans, so I have to guess when an event was triggered. Only if I use a drilldown and jump into the search app can I see the time bar. No specific options were used that would hide the X axis. An example code of an area chart is the following:   { "type": "viz.area", "options": { "axisTitleX.text": "Time", "legent.placement": "bottom" }, "dataSources": { "primary": "ds_EffgdHH7" } }   Any idea if I am missing something? Thanks in advance. Chris
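In Dashboard Studio the X axis only renders as a time axis if the data source actually returns a `_time` column, which usually means the search should end in timechart rather than stats/chart. A sketch of a matching data source definition (the query is hypothetical; only the data source id comes from the snippet above):

```
"dataSources": {
    "ds_EffgdHH7": {
        "type": "ds.search",
        "options": {
            "query": "index=my_index | timechart span=5m count"
        }
    }
}
```

As an aside, `legent.placement` in the posted options is a typo for `legend.placement`, though that would only affect the legend, not the axis.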
Hi Community, how do I combine where and eval? Available fields are "Gear" and "Torque_Crankshaft". Described in my human brain / language: eval "Torque_Wheel" = (where ("Gear"=1 == ("Torque_Crankshaft"*5.0000)), ("Gear"=2 == ("Torque_Crankshaft"*3.2000")), ("Gear"=3 == ("Torque_Crankshaft"*2.1429)))
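In SPL this doesn't need `where` at all; `case()` inside a single eval covers the per-gear conditions (a sketch; the `true(), null()` fallback for unexpected gears is an assumption):

```
| eval Torque_Wheel=case(
    Gear == 1, Torque_Crankshaft * 5.0000,
    Gear == 2, Torque_Crankshaft * 3.2000,
    Gear == 3, Torque_Crankshaft * 2.1429,
    true(), null())
```

`where` is a filtering command that drops events; `eval` with `case()` is the right tool when the goal is to compute a new field whose formula depends on another field's value.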
I am new to Splunk. I have got a task to do: it's a kind of home network security monitoring, and for that I have to send the router data to Splunk for analysis. Any suggestions about how to do this? Thanks in advance.
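Most home routers can ship their logs via syslog, and Splunk can listen for that directly; a minimal inputs.conf sketch (the port and index name are assumptions — the router's syslog setting would then point at the Splunk host on that port):

```
# inputs.conf
[udp://514]
sourcetype = syslog
index = network
```

On Linux, binding port 514 requires elevated privileges, so a high port such as `udp://5514` is often easier; the index named here must also exist before data arrives.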
Hi Folks, we have a use case where we need to send OS logs from Chef/Puppet instead of a UF to the Splunk indexer. Can we do that? Has anyone done it before? Please confirm ASAP.
I need to search for a string composed of the month-year in Italian. Example: "March-2021". If I enter "March-2021" in the search, everything works, but if I use the eval variable (meseanno) or the strcat variable (completo), it doesn't work. I have:

|eval anno = strftime(_time,"%Y")
| eval mesi=strftime(_time,"%m")
| eval mese=case( mesi=="01","Gennaio-", mesi=="02","Febbraio-", mesi=="03","Marzo-", mesi=="04","Aprile-", mesi=="05","Maggio-", mesi=="06","Giugno-", mesi=="07","Luglio-", mesi=="08","Agosto-", mesi=="09","Settembre-", mesi=="10","Ottobre-", mesi=="11","Novembre-", mesi=="12","Dicembre-", true(), "INV")
|eval meseanno= mese.anno
|strcat mese anno completo
|search AMBITO = meseanno

So it doesn't work if I use |search AMBITO = meseanno, but |search AMBITO = "March-2021" works. Can you help me understand how to search for a concatenated string? Tks Bye Antonio
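The likely culprit is that `search` treats the right-hand side as a literal string, so `| search AMBITO = meseanno` looks for the text "meseanno" rather than the field's value; `where` evaluates both sides as expressions, which is what a field-to-field comparison needs:

```
| eval meseanno = mese . anno
| where AMBITO == meseanno
```

It is also worth double-checking the case() month strings for consistent capitalization and trailing hyphens, since any mismatch there would make the concatenated value differ from AMBITO.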
Hi, our client has the following (kind of) query running on a schedule. It may or may not find events, depending on the current situation:     <base search> | where isnotnull(joblist) | dedup joblist | map search="| dbxquery connection=con_A query=\"select a, b, c from xx where x='AAA'\" |appendcols [| dbxquery connection=con_A query=\"select (select max([rows]) from sys.partitions with (nolock) where object_id=object_id('dbo.$joblist$')) as rowCnt,sum(len(cast(xmlrecord as varchar(max)))) as sum from $joblist$ (nolock)\"]" | <rest of query>       This works fine when the base search finds events and joblist is defined. BUT when the base search cannot find any events, the query/schedule fails with the error: Error in 'map': Did not find value for required attribute 'joblist'. I have tried to find answers, but couldn't get any ideas on how to skip the rest of the query, starting from map, when there is no event. Any help / ideas appreciated! https://community.splunk.com/t5/Splunk-Search/How-to-write-a-search-where-if-a-specific-value-for-FIELD1-is/m-p/146634 This didn't work, nor did those which proposed to use fillnull. r. Ismo
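One community-style workaround (an assumption, untested against this environment) is to append a sentinel row when the base search comes back empty, so `map` always receives a `joblist` value, and then discard the sentinel's output afterwards. `SENTINEL_TABLE` below is hypothetical, and the sentinel value must be something the SQL can tolerate (e.g. an existing empty table), since dbxquery will still run once against it:

```
<base search>
| where isnotnull(joblist)
| dedup joblist
| appendpipe
    [ stats count
      | where count == 0
      | eval joblist = "SENTINEL_TABLE" ]
| map search="| dbxquery connection=con_A query=\"...original SQL using $joblist$...\"
    | eval joblist=\"$joblist$\""
| where joblist != "SENTINEL_TABLE"
```

The `eval joblist=\"$joblist$\"` inside the map carries the token through so the final `where` can drop the sentinel's rows.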
Query 1: index="*" earliest=-1mon@mon latest=@mon | stats count
Output: 25,419,925,723

Query 2: index="*" earliest=-2mon@mon latest=-1mon@mon | stats count as Twomonthsbeforecount | appendcols [ search index="*" earliest=-1mon@mon latest=@mon | stats count as Onemonthbeforecount ] | eval Difference=Onemonthbeforecount-Twomonthsbeforecount | table Difference Onemonthbeforecount Twomonthsbeforecount
Output: Difference=-26541517755, Onemonthbeforecount=169524875, Twomonthsbeforecount=26711042630

Query 1's output should match Query 2's Onemonthbeforecount column, but why does it differ? Am I missing something?
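The appendcols subsearch is the likely cause: subsearches are subject to limits (a result cap and auto-finalization after a runtime limit), so a month spanning tens of billions of events gets silently truncated, which would explain Onemonthbeforecount coming out far too small. One way to avoid the subsearch entirely (a sketch) is a single search bucketed by month:

```
index="*" earliest=-2mon@mon latest=@mon
| bin _time span=1mon
| stats count by _time
| sort 0 + _time
| delta count as Difference
```

Each row then carries one month's count, and `delta` gives the current count minus the previous month's count on the second row (Difference is empty on the first row, which has no predecessor).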