All Topics

I have a CSV that I am monitoring. The CSV has lots of fields and my extraction works appropriately. What I have noticed is that, depending on the item in the CSV, a field either has a value or not, and this appears to be common to fields that all share the same prefix. An example of the data set:

comp_domain, comp_cputype, comp_department, last_logon_date, Enabled, Name

If I run the following SPL, then for all the fields EXCEPT comp_*, Splunk will populate the empty field with my value:

index=foo | fillnull value="Nothing"

So using the above fields:

field            value
comp_domain      (empty)
comp_cputype     (empty)
comp_department  (empty)
last_logon_date  Nothing
Enabled          Nothing
Name             Nothing

If I run an eval to look for null on one of the fields (e.g. comp_domain), I get the same result:

index=foo | eval job=if(isnull(comp_domain),"Nothing here",comp_domain)

field            value
comp_domain      (empty)

The same happens for any field prefixed with "comp_", but it works fine for fields that don't have the prefix.
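A possible explanation worth checking (an assumption, not confirmed by the post): the comp_* fields may be extracted as empty strings rather than true nulls, and both fillnull and isnull() treat "" as a present value. A minimal SPL sketch that covers both cases:

```spl
index=foo
| foreach comp_*
    [ eval <<FIELD>> = if(isnull('<<FIELD>>') OR '<<FIELD>>'=="", "Nothing", '<<FIELD>>') ]
```

If the comp_* columns then read "Nothing", the values were empty strings all along, which would explain why fillnull skipped them.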
A question about field configuration: could someone show me how to set up field extractions for the following logs?

[Log examples]
(1) IPアドレス[001.001.001.001, 002.002.002.002]:ユーザエージェント[Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36]
(2) ログイン結果[NG]:IPアドレス[003.003.003.003, 004.004.004.004, 005.005.005.005]:セッションID[xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx]:ユーザエージェント[Mozilla/5.0 (Windows NT 6.3; rv:36.0) Gecko/20100101 Firefox/36.0]:パスワード[123123123]

[Questions]
1. Is it possible to vary the field name based on the label that precedes the brackets? For example, when the label before [] is "IPアドレス" (IP address), I would like a field named e.g. "IPaddr" holding the value "001.001.001.001, 002.002.002.002"; when the label is "ユーザエージェント" (user agent), a field named e.g. "UserAgent" holding the value "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36". Can this be done?
2. When the layout differs from record to record, what kind of configuration should I use?

Thank you in advance.
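For question 1, Splunk search-time extractions can take the field name itself from a capture group via FORMAT = $1::$2. A hedged sketch with illustrative stanza and sourcetype names (mapping the captured Japanese labels onto names like IPaddr/UserAgent would still need aliases or a lookup on top of this):

```ini
# transforms.conf -- stanza name is illustrative
[bracket_pairs]
# capture "label[value]" pairs; $1 becomes the field name, $2 its value
REGEX = ([^:\[\]]+)\[([^\]]+)\]
FORMAT = $1::$2
MV_ADD = true

# props.conf
[my_sourcetype]
REPORT-bracket_pairs = bracket_pairs
```

Because the regex matches each label[value] pair independently, this also covers question 2: a record only produces fields for the labels it actually contains, so differing layouts need no extra configuration.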
I have a CSV file where the header contains the time of each subset of data. I need Splunk to split the columns into different event times, to be referenced as _time.

user_ID  6/24/2019  6/17/2019  6/10/2019
3        40.34      40.5       44.53
4        36.99      38.64      42.86
5        0          0          0

For instance, user_ID 3 logged in for 40.34 hours during the week of 6/24/2019, 40.5 hours during the week of 6/17/2019, and so on. The only thing that comes to mind is creating separate CSV files for each week, but I believe there is a better way. I have searched, but nothing lines up with what I'm running into. The closest was this one, but it didn't help: https://community.splunk.com/t5/All-Apps-and-Add-ons/How-can-I-use-the-time-column-name-of-CSV-as-the-index/td-p/456189

Thank you for your time helping me.
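One way to avoid per-week CSV files, assuming the file can be read as a lookup with its header row intact (the lookup name is illustrative): untable rotates the date columns into rows, after which the former column header is an ordinary field that strptime can parse.

```spl
| inputlookup weekly_logins.csv
| untable user_ID week hours
| eval _time = strptime(week, "%m/%d/%Y")
| table _time user_ID hours
```

Each user then gets one row per week, with _time set from the original column header.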
Hi, I am asking this question because we have forwarders that, for some reason, cannot be upgraded to Win10 or to an OS that accommodates 7.x onwards. Is it possible to receive the events (possibly a csv file) from a 6.x forwarder going through a 7.x instance to an 8.x indexer? If yes, what are the possible implications? Thank you.
Can I draw a line chart in Splunk that has time on both the x and y axes? For example:

Data\Time  7:00am  7:15am  7:30am  7:45am
ETA        2:47pm  2:45pm  3:00pm  3:20pm
ETD        5:00pm  5:00pm  5:35pm  6:00pm

For the above data, I want a line chart where the x-axis shows the time from the column header (7:00am, 7:15am, etc.) and the y-axis shows the time for the plotted ETA and ETD lines.
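Chart y-axes in Splunk are numeric, so one workaround (a sketch under that assumption, with field names taken from the example table) is to convert the clock times to minutes since midnight and plot those:

```spl
index=foo
| eval ETA_min = (strptime(ETA, "%I:%M%p") - relative_time(strptime(ETA, "%I:%M%p"), "@d")) / 60
| eval ETD_min = (strptime(ETD, "%I:%M%p") - relative_time(strptime(ETD, "%I:%M%p"), "@d")) / 60
| timechart span=15m latest(ETA_min) AS ETA latest(ETD_min) AS ETD
```

The y-axis then reads in minutes (2:47pm becomes 887); rendering those numbers back as clock-time labels would need custom axis formatting in the dashboard.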
We are trying to ingest logs for events from different network appliances, such as F5 load balancers. Can you please tell us whether we should be logging them to a syslog server and ingesting them from there, or collecting them with Splunk listening directly on a UDP port?
I have a report scheduled to run every 5 minutes (*/5 .....). This report gathers summary data from 6 minutes ago to 1 minute ago, like this (I've removed the index and search criteria, etc., as they aren't germane):

index=<index> <search_criteria> earliest=-6m@m latest=-1m@m ....

When I run this with a "collect" with testmode=true, I get exactly what I'd expect: if the current time is 18:00, I get data for 17:54, 17:55, 17:56, 17:57, and 17:58. When the same query runs as a report with testmode=false, though, the offset times are apparently shifted by 5 minutes and I get data for 17:49, 17:50, 17:51, 17:52, and 17:53.

Is there something in the configuration that would alter report offsets by 5 minutes? Something I'm missing about how reports run? Running interactively vs. running in a report is clearly offsetting things by 5 minutes. I've used this query to verify the times in my report query:

index=<summary index> source="summary_revenue_by_minute"
| convert ctime(info_min_time) AS minTime
| convert ctime(info_max_time) AS maxTime
| convert ctime(info_search_time) AS searchTime
| sort by -_time
| table _time minTime maxTime searchTime _raw

I'm scratching my head and reading docs to no avail. Thoughts?
Hi, Every few minutes the dashboard shows me this error: Dag Execution Exception: Search has been cancelled. Search auto-canceled. I tried to change the dispatch.auto_cancel = 0, but the error ... See more...
Hi, every few minutes the dashboard shows me this error: "Dag Execution Exception: Search has been cancelled. Search auto-canceled." I tried changing dispatch.auto_cancel = 0, but the error returned. Please help me find a solution. If it helps, this search is a 30-second real-time search with a 1-minute refresh. Thanks.
I can't seem to find an inputs configuration for ServiceNow's RITM / Requested Item table (sc_req_item). Is that the correct table name, or am I not looking in the right place?
Event 1: Ticket_no = username*, id = 111
Event 2: Ticket_no = TKT123, id = 0

Is there any way to merge these 2 events to get stats as: Ticket_no = TKT123, id = 111?
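Assuming some field actually links the two events (session_id below is hypothetical; without a shared key there is nothing to merge on), a stats sketch that keeps the non-zero id and the real ticket number:

```spl
index=foo
| stats max(id) AS id values(Ticket_no) AS Ticket_no BY session_id
| eval Ticket_no = mvfilter(match(Ticket_no, "^TKT"))
```

max(id) picks 111 over 0, and mvfilter keeps only the TKT-prefixed value from the merged multivalue field.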
I have a lookup table which contains a varying low value and a high value for many rows, along with the desired value I wish to grab. For example (these are just dummy values, as the actual CSV is much larger):

Low   High  DesiredValue
1000  1499  550
1500  1599  220
1600  1999  700

I work through my search input and come to a number value I want to test against the low and high range (Low < myNumber AND High > myNumber) to then pull the desired value. The typical input for "myNumber" is a multivalue field, containing typically three or more values. I've found I have more success breaking this up with mvexpand and rejoining it later with stats; I'm just noting this in case it provides any value for the answer. So my input before I need the value may look like this:

Field1  Field2  Field3  Field4    myNumber        Field5
Blah    Bleh    Meh     Whatever  1200 1400 1520  Stuff
Just    Dummy   Data    Here      1510 1625 1780  Still

or, if I break it apart with mvexpand, like this:

Field1  Field2  Field3  Field4    myNumber  Field5
Blah    Bleh    Meh     Whatever  1200      Stuff
Blah    Bleh    Meh     Whatever  1400      Stuff
Blah    Bleh    Meh     Whatever  1520      Stuff
Just    Dummy   Data    Here      1510      Still
Just    Dummy   Data    Here      1625      Still
Just    Dummy   Data    Here      1780      Still

I want to add the DesiredValue to my table based on whether myNumber falls between the Low and High. The number never matches the Low or High exactly, but will always fall inside one of the ranges. So my final output should look like this:

Field1  Field2  Field3  Field4    myNumber        Field5  DesiredValue
Blah    Bleh    Meh     Whatever  1200 1400 1520  Stuff   550 550 220
Just    Dummy   Data    Here      1510 1625 1780  Still   220 700 700

I can't seem to figure out how to go about this. I have no problem breaking apart the multivalue and rejoining it; I just can't figure out how to do a lookup that falls within two fields. Routes I've tried:

- If I use inputlookup, I can use the where command to filter out values just fine. However, if I try mapping the same search, map search="| inputlookup search here | where low < $myNumber$ AND high > $myNumber$" doesn't work, so I can't seem to find a way to link the two together.
- join doesn't allow passing of vars, and there isn't a shared field to join on, so no love there.
- The lookup command has no evaluations built in that I can find; it seems to only work for exact matches.

I appreciate any assistance I can get.
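One route the post doesn't list: a cartesian join on a dummy key, then a range filter. This is only a sketch (the lookup name ranges.csv is illustrative) and assumes the lookup is small enough to stay under subsearch row limits:

```spl
index=foo
| mvexpand myNumber
| eval joiner=1
| join joiner max=0 [| inputlookup ranges.csv | eval joiner=1]
| where myNumber >= Low AND myNumber <= High
| fields - joiner Low High
| stats list(myNumber) AS myNumber list(DesiredValue) AS DesiredValue
        BY Field1 Field2 Field3 Field4 Field5
```

join with max=0 pairs every expanded row with every lookup row, the where clause keeps only the matching range, and the final stats with list() rebuilds the multivalue columns in their original order.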
Hello, I need help extracting the "hostname" value into a separate field from the following string:

ABC1234: VPN Tunneling: Session started for user with IPv4 address 10.10.10.10, hostname jsmith-1234s
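A minimal rex sketch for this, assuming the value always follows the literal "hostname " and runs to the next whitespace (index and filter terms are illustrative):

```spl
index=vpn "VPN Tunneling"
| rex "hostname\s+(?<hostname>\S+)"
| table hostname
```

On the sample string this would yield hostname=jsmith-1234s.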
Hello, how can I change the host name displayed in Splunk without changing /etc/hostname in Linux? I changed it in system/local/inputs.conf under [default] and in local/server.conf under [general], but nothing has helped. I am trying to achieve SAML integration with Splunk for SSO, and I am getting an error because my Linux host name starts with numbers. Please help. Thanks in advance.
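For reference, these are the two settings that usually control this (the hostname value is illustrative). Both require a Splunk restart, and the inputs.conf host value only applies to newly indexed events, not ones already on disk:

```ini
# $SPLUNK_HOME/etc/system/local/inputs.conf
[default]
host = myhost-renamed

# $SPLUNK_HOME/etc/system/local/server.conf
[general]
serverName = myhost-renamed
```

For SAML specifically, serverName in server.conf is typically the value that matters, since it feeds the entity identification rather than the per-event host field.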
Hello, I am having problems trying to find duplicate entries within my Splunk KV store. Basically, I want to find duplicates based on a few fields, such as FQDN, CVE, and PORT. Then, once I have found duplicates, I want to output them in a table if their SOURCE field is different. The query I have so far is:

| inputlookup vul_kvstore
| stats count by fqdn, port, cve
| where count>1
| table fqdn, port, cve, source

The problem is that my table has no access to the source field, because the stats count line keeps only the fqdn, port, and cve data. How do I get access to the source field data? Maybe I just have to revise my original query so I don't lose that field, but so far nothing I've tried works. Hopefully someone can give me some advice to push me through this problem.

Thanks, Joe
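One way to keep source through the stats pass is to aggregate it instead of dropping it; a sketch that flags keys appearing more than once with more than one distinct source:

```spl
| inputlookup vul_kvstore
| stats count dc(source) AS source_count values(source) AS source BY fqdn port cve
| where count > 1 AND source_count > 1
| table fqdn port cve source
```

values(source) carries every source for a key into the result row, and dc(source) lets the where clause keep only keys whose duplicates actually came from different sources.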
Hello, I was tasked with setting up "Authentication Tokens" on a Splunk 7.0.2 instance. I can only seem to find documentation for versions 8.0 and later. Are authentication tokens not available in the version of Splunk being used? If they are available, could someone please direct me to documentation on how to set up this method of authentication? I understand this is a very old version of Splunk; it is installed on an air-gapped network, so updating is not very easy. Thank you!!
My report generates host, eventcode, time, and Message. However, the report in the Splunk email body is not formatted properly. How do I format the report email body so the report looks more effective and well formatted?
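For what it's worth, the email alert action has settings that control how results render in the body; a hedged savedsearches.conf sketch (the stanza name is illustrative), equivalent to choosing "Include results inline" as a table with HTML content in the UI:

```ini
# savedsearches.conf
[My Report]
action.email = 1
action.email.inline = 1
action.email.format = table
action.email.content_type = html
```

With content_type = html and format = table, the results arrive as a rendered HTML table rather than raw text, which is usually what "not formatted properly" turns out to mean.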
We have a single search head that continually fills up the dispatch directory, and I have to go in and clear it out manually. There's nothing special about this search head, and I've even gone so far as to take it out of our load balancer so that only scheduled searches are being run against it, yet the dispatch directory still fills back up. What are some areas I can look into to troubleshoot why a single search head would have such trouble clearing out its dispatch directory?
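One place to start (a sketch for triage, not a diagnosis): list the jobs currently sitting in dispatch with their TTLs, to see which class of search is accumulating and whether the TTLs are unexpectedly long:

```spl
| rest /services/search/jobs splunk_server=local
| table sid label author ttl dispatchState runDuration
| sort - ttl
```

Scheduled searches with a long dispatch.ttl, or saved searches whose artifacts are pinned by alert actions, tend to show up at the top of this list.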
Greetings, I want to explore using git repos to populate my Splunk configurations, but git clones all its files inside a central directory. My question: if I have $SPLUNK_HOME/etc/system/local/mygitrepo/server.conf, will Splunk recursively look inside mygitrepo/ to find the .conf files when Splunk restarts? Thanks
Our two heavy forwarders serve as intermediates for our hundreds of universal forwarders. I'm working on overriding/fixing the hostnames of 6 of them. Here's the issue: six of the UFs have hostnames ending in "-NEW". We want to remove that "-NEW" at the parsing/indexing phase (not the SH phase). Instead of working with the sysadmins of those 6 servers over a Zoom web conference to fix the hostname at the UF's system/local end, we decided to push a deployment app to the 2 intermediate heavy forwarders. The app contains these configs:

# transforms.conf
# Of course, I'm using Dragon Ball Z characters as server names so that I don't get fired
[hostname_overrider]
DEST_KEY = MetaData:Host
SOURCE_KEY = MetaData:Host
REGEX = host::^(GOKU|VEGITA|GOHAN|GOTEN|TRUNKS|BULMA)(?:\-NEW)$
FORMAT = host::$1

# props.conf
[default]
TRANSFORMS-hostname_overrider = hostname_overrider

My thinking is that the transforms and props configs will apply to all UFs but will only catch the 6 servers because of the specific REGEX pattern, overriding the hostname. This method is not working. Any thoughts on why, and which approach would be better?
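One thing worth checking (my reading, not confirmed in the thread): in the posted transform, the ^ anchor sits after the literal host::, and since ^ asserts the start of the SOURCE_KEY value, the pattern as written can never match. A hedged rewrite with the anchor moved to the front:

```ini
# transforms.conf -- anchor first, then the literal host:: prefix
[hostname_overrider]
DEST_KEY = MetaData:Host
SOURCE_KEY = MetaData:Host
REGEX = ^host::(GOKU|VEGITA|GOHAN|GOTEN|TRUNKS|BULMA)-NEW$
FORMAT = host::$1
```

The hyphen needs no escaping outside a character class. The [default] props stanza should still apply the transform broadly while only these six hostnames actually match.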
Hi, I am simply unable to find my Splunk Enterprise login page at all. Could I get some assistance with this, or a link to the login page? Thank you.