All Topics

Hey,

Right now I'm a bit muddled and not sure whether I'm chasing an overly complicated solution to a fairly simple problem.

The scenario: I have a set of historical stock market prices (Ticker, Close-price, Volume). I want to extract several pieces of information:

1. The %-change of a given day compared to the previous (work) day, and the respective volume. Additionally, the time frame should be defined relatively (only the date is fixed, the range flexible). Table that for the Top 10 with their respective volume that day.
2. Use the output from above to generate a similar set, but now for one day, one week, and one month after the given day, only for the Top 10 above, and combine that into one result table.

Desired outcome:

Ticker  %-Change -1d  Volume  %-Change +1d  %-Change +1w
AAA     5.0           10000   1.0           -8.0
BBB     3.0           50000   4.0           4.5
CCC     1.0           70000   -1.0          3.0

I think I solved part 1 with:

index=market_price Close!=""
    [ search latest="6/23/2020:00:00:00"
    | addinfo | head 1
    | eval latest=info_max_time+86400
    | eval earliest=info_max_time-86400
    | fields earliest,latest
    | format "(" "(" "" ")" "OR" ")" ]
| bin span=1d _time
| stats avg(Close) as Close avg(Volume) as Vol by _time Ticker
| streamstats global=f current=f first(Close) as p_close by Ticker
| eval delta=round(((Close-p_close)/p_close)*100,2)
| where delta!=""
| eval Amount_in_Mio=round((Close*Vol)/1000000,0)
| where Amount_in_Mio>2000
| table Ticker delta Amount_in_Mio
| sort - delta
| head 10

For part 2 I can get a set of results for the day after, but not based on the Top 10 from above:

index=market_price Close!=""
    [ search earliest="6/23/2020:00:00:00"
    | addinfo | head 1
    | eval latest=info_min_time+86400
    | eval earliest=info_min_time-86400
    | fields earliest,latest
    | format "(" "(" "" ")" "OR" ")" ]
| bin span=1d _time
| stats avg(Close) as Close avg(Volume) as Vol by _time Ticker
| streamstats global=f current=f first(Close) as p_close by Ticker
| eval delta=round(((Close-p_close)/p_close)*100,2)
| where delta!=""
| table Ticker delta
| sort - delta
| head 10

I think I would need to combine several subsearches for the relative times, the first Top 10, and the subsequent datasets. Not sure if I've moved myself into a dead end here, so any suggestions are welcome.

Thanks
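The two-step logic described above (rank tickers by their one-day change, then look up later changes for only those tickers) can be sketched outside Splunk in plain Python; the prices, dates, and ticker names below are hypothetical stand-ins for the index=market_price events:

```python
# Hypothetical close prices per ticker, keyed by date (YYYY-MM-DD).
prices = {
    "AAA": {"2020-06-22": 100.0, "2020-06-23": 105.0,
            "2020-06-24": 106.05, "2020-06-30": 96.6},
    "BBB": {"2020-06-22": 200.0, "2020-06-23": 206.0,
            "2020-06-24": 214.24, "2020-06-30": 215.27},
}

def pct_change(ticker, day_from, day_to):
    """Percent change of the close price between two dates, rounded to 2 dp."""
    p0, p1 = prices[ticker][day_from], prices[ticker][day_to]
    return round((p1 - p0) / p0 * 100, 2)

# Step 1: rank tickers by change from the previous work day, keep the Top 10.
base, prev = "2020-06-23", "2020-06-22"
top = sorted(prices, key=lambda t: pct_change(t, prev, base), reverse=True)[:10]

# Step 2: for those same tickers only, compute the later changes and combine
# everything into one result table.
result = [
    {"Ticker": t,
     "chg_-1d": pct_change(t, prev, base),
     "chg_+1d": pct_change(t, base, "2020-06-24"),
     "chg_+1w": pct_change(t, base, "2020-06-30")}
    for t in top
]
```

The key point the sketch makes: the Top-10 selection happens once, and the later time frames are computed only for that fixed ticker list, which is what the second SPL search would need to receive from the first.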
I'm looking for help on how to output the contents of my dashboard textbox to a kv lookup.  I'm hoping to display and collect the timestamp, user who created the note entry, and the notes in the panel below. As a bonus, I would also like to make the existing notes entries editable in my dashboard panel and capture the user who edited and timestamp in another column.
Hi,

I need to run a search, take the output, and pass it to a dbxquery search via the map command. When I run the query directly, without the outer search and the map, everything works:

| dbxquery query="EXEC [CTS2_Repository].[dbo].[C_sp_CallTransac] 'DE1', '2021-03-02 00:00:00', '2021-03-02 23:59:59'" connection=10VM_CI_BI_UAT

But when I transform it as below, I get the error "Search Factory: Unknown search command 'cts2'.":

| eval cust_id="DE1"
| map search=\" | dbxquery procedure=\\\"{{call [CTS2_Repository].[dbo].[C_sp_CallTransac](?,?,?)}}\\\" connection=\"10VM_CI_BI_UAT\" params=\\\"\\\"$cust_id$\\\", 2021-03-03 00:00:00, 2021-03-03 23:59:59\\\"\"

Any help is appreciated.
Dear Sir, I cannot install Splunk on Ubuntu. Best regards.
Hi all,

I need help comparing two fields (joining on two values) to build a table from another two fields.

CODE 1:

index=opennms "Cisco-WLC-AP-DOWN/AP*" | table AP_NAME, Time, Ticket_ID

OUTPUT 1:

AP_NAME  Time            Ticket_ID
AP6412   3/6/2021 19:11  INC00001
AP6412   3/6/2021 18:45  INC00002
AW       3/6/2021 17:08  INC00003
AE       3/6/2021 16:29  INC00004
AP6412   3/6/2021 15:15  INC00005
AR       3/6/2021 14:31  INC00006

CODE 2:

index=moogsoft_e2e | table AP_NAME, Time, downtime

OUTPUT 2:

AP_NAME  Time            downtime
AP6412   3/6/2021 19:11  4:18:55
AB       3/6/2021 18:02  1:21:25
AC       3/6/2021 17:08  1:23:45
AP6412   3/6/2021 10:12  7:45:23
AP6412   3/6/2021 15:15  2:21:34
AE       3/6/2021 14:31  8:12:23

Expected final output table:

AP_NAME  Time            Ticket_ID  downtime
AP6412   3/6/2021 19:11  INC00001   4:18:55
AP6412   3/6/2021 15:15  INC00005   2:21:34

I want rows where both AP_NAME and Time match, so that Ticket_ID and downtime line up.
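The expected output is an inner join on the composite key (AP_NAME, Time). In SPL this is commonly done with a stats ... by AP_NAME, Time over both indexes, but the matching logic itself can be sketched in plain Python, with the rows copied from the sample outputs above:

```python
tickets = [  # OUTPUT 1: (AP_NAME, Time, Ticket_ID)
    ("AP6412", "3/6/2021 19:11", "INC00001"),
    ("AP6412", "3/6/2021 18:45", "INC00002"),
    ("AW", "3/6/2021 17:08", "INC00003"),
    ("AE", "3/6/2021 16:29", "INC00004"),
    ("AP6412", "3/6/2021 15:15", "INC00005"),
    ("AR", "3/6/2021 14:31", "INC00006"),
]
downtimes = [  # OUTPUT 2: (AP_NAME, Time, downtime)
    ("AP6412", "3/6/2021 19:11", "4:18:55"),
    ("AB", "3/6/2021 18:02", "1:21:25"),
    ("AC", "3/6/2021 17:08", "1:23:45"),
    ("AP6412", "3/6/2021 10:12", "7:45:23"),
    ("AP6412", "3/6/2021 15:15", "2:21:34"),
    ("AE", "3/6/2021 14:31", "8:12:23"),
]

# Index downtimes by the composite key, then keep only the ticket rows whose
# (AP_NAME, Time) pair also appears there -- an inner join on two fields.
by_key = {(ap, t): dt for ap, t, dt in downtimes}
joined = [(ap, t, inc, by_key[(ap, t)])
          for ap, t, inc in tickets if (ap, t) in by_key]
```

Note that a match on AP_NAME alone (AE) or on Time alone (AW vs AC at 17:08) is excluded; only rows agreeing on both fields survive, which yields exactly the two expected rows.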
How do I update the Apps, Add-ons, or TAs in Splunk that need updating? I'm told it's not a good idea to connect Splunk to the Internet. So how do apps or add-ons get their latest lists, for example a list of dark sites on the web?
Q1: Is there a way to import a matrix into Splunk?
Q2: What SPL command gives me all values set to true and tells me the x, y coordinates? (The object here is to create a table.)
Q3: What would be the SPL command that provides the field name for the x, y that is true?

Here is an example of the matrix that I would like to import:

   a b c d e f
1  0 1 0 1 1 0
2  1 0 1 0 0 1
3  0 1 0 1 1 0
4  1 0 1 1 1 0
5  0 1 1 0 1 1
6  1 0 1 0 1 0
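For Q2/Q3, once the matrix is ingested (e.g. as a CSV), the task is to unpivot it into (column, row, value) triples and keep the true cells; in SPL the unpivot step is typically an untable followed by a where value=1. The coordinate logic itself can be sketched in plain Python using the example matrix, with the header letters as x and the leading row numbers as y:

```python
header = ["a", "b", "c", "d", "e", "f"]
matrix = [
    [0, 1, 0, 1, 1, 0],  # row 1
    [1, 0, 1, 0, 0, 1],  # row 2
    [0, 1, 0, 1, 1, 0],  # row 3
    [1, 0, 1, 1, 1, 0],  # row 4
    [0, 1, 1, 0, 1, 1],  # row 5
    [1, 0, 1, 0, 1, 0],  # row 6
]

# Every (column-name, row-number) coordinate whose cell is true (1).
true_cells = [(header[x], y + 1)
              for y, row in enumerate(matrix)
              for x, value in enumerate(row) if value]
```

Each tuple in true_cells is one row of the desired table: the field name (x) and the row number (y) of a true cell.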
Has anyone ingested f5 Silverline asm data? I've got the data from f5 Silverline via syslog, but wondering how I should ingest it into Splunk. Currently I have the add-on installed and using sourcetype=f5:bigip:syslog, but it doesn't recognize it further than that. It doesn't seem to be supported by the Splunk Add-on for F5 BIG-IP. Thanks
"BLUF: Looks like a TLS/cipher problem in addition to ca_bundle. I was able to connect without errors after specifying the ca_bundle file and explicitly specifying the TLS version and ciphers."

I then modified inputs.conf:

[SSL]
cipherSuite = ecdhe-rsa-aes-128-gcm-sha-256

In addition, I added the ca_bundle under $SPLUNK_HOME/etc/auth/. I am still getting an SSL error. Any idea how to get the input working?
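One point of reference (an observation, not a confirmed diagnosis of this error): the standard OpenSSL spelling of that suite is ECDHE-RSA-AES128-GCM-SHA256, with no hyphens inside AES128 or SHA256, and cipherSuite values are matched against OpenSSL's cipher list. A quick Python ssl sketch showing a client context pinned to that exact version and suite, which fails immediately if the spelling is not one OpenSSL recognizes:

```python
import ssl

# Build a client context pinned to TLS 1.2+ with one ECDHE-RSA suite.
# Note the OpenSSL spelling: AES128 and SHA256 have no internal hyphens;
# set_ciphers() raises SSLError if no cipher matches the given string.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.set_ciphers("ECDHE-RSA-AES128-GCM-SHA256")

names = [c["name"] for c in ctx.get_ciphers()]
```

If the hyphenated spelling from the inputs.conf above is passed to set_ciphers() instead, OpenSSL rejects it, which is consistent with (though not proof of) the continuing SSL error.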
I read that in 8.1.2 it's less painful to update HEC configs, no longer requiring a restart for CRUD operations. Should I keep my HEC on HWF or move it directly to indexers?
Hello Splunk Community,

I have read through seven pages of the Q&A and several sets of instructions on how to do this, but still can't seem to find what I need. I am trying to add 1 hr and 15 min to the event latestEvent. Can I get some guidance on how to add 1 hr and 15 min to a field? In this case the field is latestEvent.

(index="xyz" event=submission) OR (index="abc")
| eval latestEvent=case(event="submission", timeofsub)
| eval Scheduled_Ingestion_Time=relative_time(latestEvent, "+3615")
| stats latest(latestEvent) as latestEvent values(Scheduled_Ingestion_Time) as Scheduled_Ingestion_Time
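On the arithmetic itself: 1 hr 15 min is 4500 seconds (3600 + 900), not 3615, and since latestEvent-style values are epoch seconds, adding that offset directly works (in SPL, relative_time with a modifier like "+75m" expresses the same shift). A plain-Python check of the arithmetic, with a hypothetical timestamp:

```python
from datetime import datetime, timedelta

# 1 hr 15 min expressed in seconds: the offset the search actually needs.
offset = int(timedelta(hours=1, minutes=15).total_seconds())

# Epoch-style arithmetic on a sample timestamp (the date here is made up).
latest_event = datetime(2021, 3, 5, 10, 0, 0).timestamp()
scheduled = latest_event + offset
```

The common trap is writing "1 hour 15 minutes" as 3615 (as if minutes appended to 3600), when the correct total is 4500 seconds.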
Hey everyone, happy Friday to you all. I'm currently looking into upgrading our older Splunk 7.0 software to at least version 8.0 (if not higher), but I wanted to get some advice from users who've been through this before.

My biggest question/concern is the upgrade process itself. Reading the documentation makes it sound simple:
1. Unpack the new version of Splunk in the same directory as the original.
2. Let the migration script run.
3. Re-index our data.

Again... that seems a bit too easy. And I've read that most people have to upgrade along these lines:
* Go from version 7.0 to 7.1 to 7.2 to 7.3 to 8.0

Furthermore, ensuring App and TA compatibility is still a thing, something I'm working through listing out. But because of how involved this upgrade feels, I wanted to ask the community's advice on what I should expect, and whether there are any pain points I might not be aware of going forward.

Thanks - Titan
Hello Community,

I am trying to connect to an SAP database via DB Connect (3.4.2); the Splunk version is 8.0.0. Can anyone please suggest the right approach and configuration for connecting to an SAP database? I have already tried and referred to the links below, but am still not able to proceed with these solutions:
https://docs.splunk.com/Documentation/DBX/3.4.2/DeployDBX/databasespec
https://docs.splunk.com/Documentation/DBX/latest/DeployDBX/Installdatabasedrivers#Install_drivers_for_other_databases

Thanks, jbhojak
I'm running a simple transform to change the index from "tenable" to "tenable-dc" for one of my sourcetypes.

Props.conf:

[tenable:sc:vuln]
TRANSFORMS-~dcfilter = dcfilter

Transforms.conf:

[dcfilter]
REGEX = ([Dd][Cc]01)
FORMAT = $0-dc
DEST_KEY = _MetaData:Index

The problem I'm having is that the transform is not catching every event. I have 164 events that the filter should catch, but only 156 events are indexed in the new index (tenable-dc). If I run the following search command, it catches all 164 events:

index=tenable* sourcetype=tenable:sc:vuln | regex _raw = "([Dd][Cc]01)"

I can't find any similarities between the 8 "missed" events, or differences between those events and the 156 "captured" events. My first thought was that the regex was wrong, but the search-time regex works. Does anyone have any experience with index-time extractions missing events?
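One way to narrow down the 8 missed events is to replay the exact index-time regex over their raw text and inspect what fails to match. A debugging sketch in Python, where the sample events are invented and the last two illustrate near-miss variants the character class does not cover:

```python
import re

# The same pattern as the REGEX in transforms.conf.
pattern = re.compile(r"([Dd][Cc]01)")

# Hypothetical raw events: the last two contain variants the pattern
# does not match (a separator inside the name, a truncated number).
events = ["host=DC01 scan ok", "host=dc01 scan ok",
          "host=DC-01 scan ok", "host=DC1 scan ok"]

missed = [e for e in events if not pattern.search(e)]
```

The character class covers all four case combinations (DC01, dc01, Dc01, dC01), so if the real missed events do contain a literal match, the cause is more likely elsewhere in the pipeline (e.g. which Splunk instance applies the transform) than in the regex itself.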
I am getting the below error when looking at the splunkd.log file:

DC:DeploymentClient - channel=tenantService/handshake Will retry sending handshake message to DS; err=not_connected

Any suggestions, please?
Is there a way to fully automate the Phantom warm-standby flip? The current steps are manual and need to be automated.
I'm trying to install a universal forwarder on one of my systems. I originally tried the main Linux package from the link below, but then found out that my system isn't x86_64 - it's i686.
http://www.splunk.com/download/universalforwarder
Is there a UF that supports the i686 architecture? I don't mind an older release, if one exists.
Red Hat 7.
- Created a splunk user in Linux; added the user to the wheel group and sudoers.
- Installed the Splunk UF for Linux 7.3.7.1; all files chown splunk:splunk.
- Configured Splunk to run as the splunk user.

All successful - ps shows the splunk PIDs as the splunk user even if started from root.

As the splunk user, go to /var/log/:

-rw-r--r--. 1 root root 57814 Mar 5 11:17 tftpd.log
-rw-------. 1 root root 1810830 Mar 5 12:00 kern.log
-rw-------. 1 root root 1534866879 Mar 5 13:54 user.log
-rw-r--r--. 1 root root 411594197 Mar 5 15:45 mrtg.log
-rw-------. 1 root root 161108311 Mar 5 15:46 cron.log
-rw-------. 1 root root 38312402 Mar 5 15:46 secure.log
-rw-------. 1 root root 249091058 Mar 5 15:46 daemon.log
-rw-------. 1 root root 1578879854 Mar 5 15:47 messages.log

As the splunk user I can't tail -10 messages.log or secure.log - permission denied. If I sudo tail -10, then I can read these files.

I set up Splunk_TA_nix with a local/inputs.conf monitor for secure.log and messages.log, and get a Splunk UF "permission denied" error in splunkd.log. What do I need to do to make this work properly as the splunk user? I can't add sudo to the inputs.conf monitor command, so the splunk user needs read rights to these files in /var/log for TA_nix to work properly.

Suggestions welcome please... thanks in advance... Rich
I am having a similar issue to this thread, but my drilldown search still won't work (explanation below):
https://community.splunk.com/t5/Dashboards-Visualizations/In-a-dashboard-why-can-t-I-configure-a-drill-down-with-a-rex/m-p/405348

I have a panel on my dashboard with a custom drilldown and search. The search works perfectly when run on its own. However, the search string contains a "rex", and those don't play nice with drilldowns and XML:

|rex field=field1 "^(?<field2>[^ ]+)"

Apparently, according to the thread above, you need to wrap the data in "<![CDATA[ ... ]]>":

<link target="_blank"><![CDATA[ search?earliest=&latest=&q=|inputlookup = blah |rex field=field2 "^(?<field>[^ ]+)"|search continues....]]></link>

The drilldown executes and opens another tab, but the search stops at:

rex field=field2 "^(

I get an error saying "Unbalanced quotes." The search runs on its own, but not as a custom drilldown search wrapped in CDATA. Any ideas on how to get this search running with rex and no errors in a custom drilldown?

Thanks
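One possible culprit (an assumption about this setup, not a confirmed diagnosis) is that the double quotes in the rex clause need to be percent-encoded when the search travels as a q= URL parameter: a raw quote can truncate the parsed search exactly where the drilldown stops and produce an "Unbalanced quotes" error. The encoding step looks like this in Python:

```python
from urllib.parse import quote

# A rex clause as it appears inside the search string.
rex_clause = '| rex field=field2 "^(?<field>[^ ]+)"'

# Percent-encode it for use in a q= URL parameter; safe="" encodes
# everything reserved, including the double quotes and angle brackets
# that otherwise confuse the URL and XML parsers.
encoded = quote(rex_clause, safe="")
```

After encoding, no literal quote characters remain in the URL fragment, so neither the XML parser nor the search-string parser can mis-pair them.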
Hello All,

I am looking for assistance with upgrading a single Splunk Enterprise Windows server (no cluster) running v6.1.3, and several Splunk Universal Forwarders running v5.0.2 (Windows) and v6.2.6 (Linux). Based on my research, it looks like I would need to upgrade the Enterprise server and Universal Forwarders via the following path:

Current Version -> 6.5 -> 7.2 -> 8.1

I found the following article, which gives the impression that an upgrade from a version as old as 6.6.x is possible:
https://docs.splunk.com/Documentation/Splunk/8.1.2/Installation/HowtoupgradeSplunk#Upgrade_paths_to_version_8.1

However, I have been unable to locate any downloads for versions older than 7.2.0. What are my options here? Am I out of luck? Any assistance in this matter would be greatly appreciated.