All Topics

This is confusing me. On my Linux server the universal forwarder is installed, and the following inputs are specified in inputs.conf. Nothing more is added.

[monitor:///var/log/httpd/access_log]
sourcetype = access_combined
index = apache

[monitor:///var/log/httpd/error_log]
sourcetype = apache:error
index = apache

When I search for this Linux server in Splunk, far more sourcetypes come up than expected. The top 10 values are as follows. It is good to see access_combined and apache:error coming up, but why are the others coming up too? I did not specify them in inputs.conf!

access_combined 69,824 74.23%
ps 18,353 19.511%
bash_history 1,999 2.125%
Unix:UserAccounts 936 0.995%
cpu 870 0.925%
df 580 0.617%
usersWithLoginPrivs 360 0.383%
protocol 290 0.308%
Unix:Update 204 0.217%
apache:error 188 0.2%

By the way, I installed the Splunk App for Unix and the Splunk Add-on for Unix and Linux on my Splunk instance. But that should not account for the additional sourcetypes showing up, because as far as I know I would first have to specify the additional inputs (e.g. [monitor:///xxxx], sourcetype = cpu) in inputs.conf, which I have not done. Could anyone advise? Much appreciated.
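A likely lead, offered as an assumption to verify rather than a confirmed cause: ps, cpu, df, bash_history and the Unix:* sourcetypes are exactly what the scripted inputs shipped with the Splunk Add-on for Unix and Linux produce. If that add-on (Splunk_TA_nix) was ever deployed to the forwarder with its inputs enabled, those sourcetypes would appear without anything in your own inputs.conf. The stanzas to look for resemble:

```ini
# etc/apps/Splunk_TA_nix/local/inputs.conf on the forwarder
# (path assumes a default add-on install; interval value is illustrative)
[script://./bin/ps.sh]
disabled = 0
interval = 30
```

Running $SPLUNK_HOME/bin/splunk btool inputs list --debug on the forwarder shows every enabled input and which file it comes from.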
Hi Everyone, I'm looking for some Splunk Enterprise Security tips, maybe in the form of a cheat sheet. Specific topics of interest:

1. Recommended 'base apps' for ES, e.g.: CIM, ESCU, CIM Validator, Lookup File Editor, Knowledge Object Explorer, more?

2. Some sort of validator for apps/add-ons covering all required sourcetypes, plus info on which tier to install them on. E.g. for Azure: SH - app and add-on; HF - app and add-on.

3. And finally, ways to quickly validate logs. E.g.: use CIM Validator, pick a log source, match it to a data model, and verify the required fields exist. If it fails, and the sourcetype is supposed to be CIM compliant, verify you've installed the appropriate app/add-on on the SH and/or HF. Or use queries like this to validate your logs, based on a table that matches the required fields:

| datamodel Intrusion_Detection IDS_Attacks search
| dedup sourcetype
| rename IDS_Attacks.* as *
| table sourcetype action category dest signature src user vendor_product

I would greatly appreciate your feedback and better ways to validate an ES installation. Thanks.
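Alongside the |datamodel approach above, a tstats variant can be faster for spot-checking field coverage; a sketch, assuming the Intrusion_Detection data model from CIM:

```spl
| tstats count from datamodel=Intrusion_Detection.IDS_Attacks
    by sourcetype, IDS_Attacks.action, IDS_Attacks.vendor_product
| rename IDS_Attacks.* as *
```

Adding summariesonly=true restricts the search to accelerated data, which doubles as a check that acceleration is actually populating.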
We are on Splunk Enterprise 6.5 and would like to upgrade to 8.1.2 using the AWS Marketplace AMI. What are the important factors I need to keep in mind, and do you have any steps for migrating the data from 6.5 to 8.1.2? Greatly appreciated... thanks, Somi
I was using the MS Azure add-on for Splunk and am trying to switch to the Splunk Add-on for Microsoft Cloud Services. One thing I noticed is that the event hub input I was using appends multiple event hub events into the same Splunk event. I.e., instead of 8 events in Event Hub and 8 events in Splunk (which I saw with the MS Azure add-on for Splunk), I get 2 events, each with 4 body.records[].service_principal_name entries. The number of appended events is related to the number of partitions; however, this doesn't seem to work with 1 partition: I keep getting "can not find partition 0 of 0" when the event hub has 1 partition. The formatting is TERRIBLE, and it takes 30 seconds to render the first record in a search since _raw is so large. Any ideas what's going on here? Is this supposed to be by design?
We had been updating our Splunk monitoring definition files with the following procedure:

[Definition-file update procedure]
1. Update savedsearches.conf
2. Apply it with the reload command:
   curl -k -u admin https://localhost:8089/servicesNS/admin/search/saved/searchs/_reload

However, after upgrading Splunk (7.3.4 → 8.1.0), the definition files are no longer updated this way. Restarting the Splunk service does pick up the updated definitions. (On 7.3.4, no service restart was required.) The definition-file update is used by many tasks, and if a service restart becomes mandatory there would be many places to fix, so we would like to avoid restarting the service as much as possible. If anyone has run into the same issue or knows a workaround, please let us know.
Hello, in the Splunk GUI/interface I use the following commands to remove some unwanted data from being displayed:

| rex mode=sed field=_raw "s/ example: .+?( from |$)/ example: select from /g"
| rex mode=sed field=_raw "s/ in \(.+?\) / in (...) /g"

How would I apply this in props.conf on my forwarder (or is there a better option, i.e. transforms.conf)? I tried the following, but it did not seem to work for me:

[XX]
SEDCMD-first = s/ example: .+?( from |$)/ example: select from /g
SEDCMD-second = s/ in \(.+?\) / in (...) /g
force_local_processing = true
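One hedged suggestion rather than a confirmed fix: SEDCMD is applied at parse time, so on a universal forwarder it is normally ignored; the usual home for these settings is props.conf on the indexer or heavy forwarder (the first full parsing tier), under the event's actual sourcetype:

```ini
# props.conf on the indexer/heavy forwarder; replace the stanza
# name with the real sourcetype of the events to be masked
[your_sourcetype]
SEDCMD-first = s/ example: .+?( from |$)/ example: select from /g
SEDCMD-second = s/ in \(.+?\) / in (...) /g
```

If the masking truly must happen on the forwarder, that generally requires a heavy forwarder rather than a universal one.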
Under Lookups I see a few .csv-based lookups and a few lookup definitions. So where are the KV store-based or scripted lookups located, or are they created manually?
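For reference, KV store and scripted lookups have no .csv file to browse; they are defined in .conf files and then appear under Settings > Lookups > Lookup definitions. A sketch of what the definitions look like (all stanza and field names below are made up):

```ini
# collections.conf - declares the KV store collection
[my_collection]

# transforms.conf - exposes the collection as a lookup
[my_kvstore_lookup]
external_type = kvstore
collection = my_collection
fields_list = _key, user, status

# transforms.conf - a scripted lookup, backed by a script in the app's bin/
[my_scripted_lookup]
external_cmd = my_lookup_script.py host ip
fields_list = host, ip
```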
I am trying to get my Ubuntu instance on VirtualBox to ingest my Mac's system logs and performance data as remote logs. I already installed the Add-on for Unix and Linux, configured the forwarding host as 10.0.2.255:9997 on my Mac, and set the receiving port to 9997 on the Ubuntu instance. I edited the add-on's inputs.conf, enabled all the metrics, and put 'index = mac' on every one of them. I already added the index 'mac' for the admin on the Ubuntu instance. However, when I search 'index=mac' on the Ubuntu instance, there is no data. Is there something important that I am missing? Any help would be appreciated, as this is really important and will determine whether I get the opportunity. Many thanks!
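Two hedged things to double-check: 10.0.2.255 is typically the broadcast address of VirtualBox's default NAT subnet, and a NAT guest is not reachable from the host without a port-forwarding rule, so a host-only or bridged adapter (or a NAT forward of port 9997) is usually needed. The forwarding pieces themselves would look roughly like this (<ubuntu_ip> is a placeholder for the guest's actually reachable address):

```ini
# On the Mac (universal forwarder): etc/system/local/outputs.conf
[tcpout]
defaultGroup = ubuntu_indexer

[tcpout:ubuntu_indexer]
server = <ubuntu_ip>:9997

# On the Ubuntu instance: etc/system/local/inputs.conf
[splunktcp://9997]
disabled = 0
```

Searching index=_internal on the Ubuntu instance for the Mac's hostname is a quick way to confirm whether the forwarder is connecting at all.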
Hello, I'd like to have my account and all the data associated with it permanently deleted. I've searched the forums and my account dashboard for any indication of how to do so, and only found that the best way is through the support phone number, which does not exist in my country; I'm not willing to spend money on an international phone call to have my account deleted. I therefore request through this post that my account be deleted. If further information is required, please tell me. Thank you.
Hi All, I have a deployment app which successfully deployed to a server with the Splunk forwarder installed, but which fails to process the files I want. My /opt/splunkforwarder/etc/apps/TA-VPS_Inputs_Reconciliation/default/inputs.conf contains the following. There is no inputs.conf in ../local/

(sst)[splunk@myserver default]$ more inputs.conf
# Reconciliaiton Report csv written by Incentage
#
# For report csv files. Using batch input to ensure processed files are deleted
[batch:///opt/incentage/dev/components/data/jsonConvToReconciliation/reconciliationreports/*.csv]
move_policy = sinkhole
index = dev_reconciliation
sourcetype = reconciliation:report

[batch:///opt/incentage/uat/components1/data/jsonConvToReconciliation/reconciliationreports/*.csv]
move_policy = sinkhole
index = uat_reconciliation
sourcetype = reconciliation:report

The splunk user can see and delete the *.csv files under the input directory, but the files are not being processed.

(sst)[splunk@myserver default]$ ls -la /opt/incentage/dev/components/data/jsonConvToReconciliation/reconciliationreports/*.csv
-rw-rw-r--. 1 imsadm imsadm 1872 Mar 21 14:28 /opt/incentage/dev/components/data/jsonConvToReconciliation/reconciliationreports/2021032114284000005.csv
-rw-rw-r--. 1 imsadm imsadm 1872 Mar 21 15:21 /opt/incentage/dev/components/data/jsonConvToReconciliation/reconciliationreports/2021032114284000006.csv

I have no errors or warnings in splunkd.log, but I can see that the auto-deployment worked. I also tried changing from batch:// to monitor:// with no luck (I commented out the move_policy line). I have also tried changing the input path to not include the wildcard *.csv. Any ideas where to look?
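When splunkd.log shows nothing at the default level, the forwarder's own view of the files can still be queried; a couple of hedged diagnostics, assuming the forwarder's internal logs are reaching your indexers:

```spl
index=_internal host=myserver source=*splunkd.log*
    (component=BatchReader OR component=TailReader OR component=WatchedFile)
```

`$SPLUNK_HOME/bin/splunk list inputstatus` run on the forwarder also reports, per file, whether it was read, ignored, or blocked. One common gotcha with sinkhole batch inputs is that deleting the files requires write permission on the directory itself, not just on the files.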
I have two different drop-down menus and I want to hide and show the second drop-down menu (IP) based on the first option.

<input type="dropdown" token="listaCategoria">
  <label>Selecione a Categoria:</label>
  <choice value="*">Todos</choice>
  <fieldForLabel>Categoria</fieldForLabel>
  <fieldForValue>Categoria</fieldForValue>
  <search>
    <query>index=funcionou | table Categoria | dedup Categoria | sort Categoria</query>
    <earliest>0</earliest>
    <latest></latest>
  </search>
  <default>*</default>
  <initialValue>*</initialValue>
</input>
<input type="multiselect" token="multiselect" searchWhenChanged="true">
  <label>Selecione os endereços de IP</label>
  <prefix>IP IN (</prefix>
  <suffix>)</suffix>
  <delimiter>, </delimiter>
  <fieldForLabel>IP</fieldForLabel>
  <fieldForValue>IP</fieldForValue>
  <search>
    <query>index=funcionou | table IP | dedup IP</query>
    <earliest>0</earliest>
    <latest></latest>
  </search>
  <choice value="*">ALL</choice>
  <default>*</default>
</input>

Another detail: in the search, how should I refer to the two inputs? E.g.: Categoria=$listaCategoria$ OR $multiselect$
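A hedged sketch of one common pattern: drive a token from the first dropdown's <change> block and gate the second input with depends (the token name showIp and the condition logic are assumptions to adapt):

```xml
<input type="dropdown" token="listaCategoria">
  <!-- label, choices, search, and defaults as in the original input -->
  <change>
    <condition value="*">
      <unset token="showIp"></unset>
    </condition>
    <condition>
      <set token="showIp">true</set>
    </condition>
  </change>
</input>
<input type="multiselect" token="multiselect" searchWhenChanged="true" depends="$showIp$">
  <!-- label, prefix/suffix, choices, and search as in the original input -->
</input>
```

For the second question: in a panel search the two tokens can usually be combined as Categoria=$listaCategoria$ $multiselect$ — the multiselect already emits IP IN (...) via its prefix and suffix, so no extra operator is needed (an OR between them would change the logic).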
Hi, Splunkers: I have a customer who wants to reformat logs with Splunk every hour. I created a report that runs every hour and exports a CSV file, but the problem is that CSV uses "," as the delimiter, and the customer wants the export delimited with " " instead. Is there any way to export the search results directly in log format, or to replace the "," with " " in the results file? By the way, the data size is about 30 GB/hour, so I don't think post-processing the exported result with Python is a good option...
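One way to keep the work inside Splunk, sketched under assumptions (my_index, my_sourcetype, and fieldA/fieldB/fieldC are placeholders for the real search and field list): build a single space-delimited column and export only that column, so the CSV machinery has no commas to add.

```spl
index=my_index sourcetype=my_sourcetype
| eval raw_line = fieldA." ".fieldB." ".fieldC
| fields raw_line
```

Exporting this from the scheduled report yields one real column per row; at 30 GB/hour it may also be worth comparing outputcsv on the search head versus REST export for throughput.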
I have a table and a drilldown where I can select rows of the table. The problem is, once I click away, the selection is (visually) gone, so I can't tell which row is currently selected. How can I fix this?
Hi, for a scripted input I want the script to run only at a specific time, i.e. 7 AM EST or 8 AM EST. How do I achieve that? Below is my current setup; I want this to run at 8 AM EST instead of on a 60-second interval:

[script://./bin/scripts/massage.sh]
sourcetype = mysourcetype
index = myindex
interval = 60.0
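inputs.conf accepts a cron expression in interval, which is likely the simplest fix; note that the schedule fires in the Splunk server's local time zone, so "8 AM EST" holds only if the host's time zone is Eastern:

```ini
[script://./bin/scripts/massage.sh]
sourcetype = mysourcetype
index = myindex
# minute hour day-of-month month day-of-week
interval = 0 8 * * *
```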
Steps to use the included lookup tables that come with Splunk Enterprise and ES: I have over 100 lookup tables in Splunk Enterprise and about 100 lookup tables with ES. How ready are they to be used? What else do I have to do in order to put them to good use?
I have the fields below and I need the values not to have a comma, because I need to add them up. How can I change the "," to a "."? The fields are Valor and Frete. E.g.: Valor = "164,00"; after: Valor = 164.00

IP="189.80.213.213",Produto="Chuveiro Ducha Advanced Eletronica Turbo Lorenzetti",Valor="164,00",Categoria=Banho,Campanha="2",Vendeu="1",MetododeCompra="1",Bandeira="1",Transportadora="4",Frete="17,26",Time=2021/01/26 19:06:32.179"
IP="187.72.163.215",Produto="Home Sleep Painel Para Tv 32 E 42 Acapulco E Panama Simbal",Valor="159,90",Categoria=Sala,Campanha="2",Vendeu="1",MetododeCompra="1",Bandeira="1",Transportadora="3",Frete="16,31",Time=2021/01/26 19:06:31.779"
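Assuming Valor and Frete are already extracted as fields, a replace/tonumber pair in eval handles this; the stats line is just an example of summing afterwards:

```spl
| eval Valor = tonumber(replace(Valor, ",", ".")),
       Frete = tonumber(replace(Frete, ",", "."))
| stats sum(Valor) AS total_valor, sum(Frete) AS total_frete
```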
Hello you guys! I'm new to Splunk and I have a BIG question; thanks in advance to everyone who is willing to take on this challenge. My data: events that contain only two fields: 1) ID_CLIENT, and 2) a field named OP_CODE. The latter contains numbers that represent where on a webpage a customer is at the moment. For instance, the number 34 represents "candy products" and the number 18 represents "stuffed animals". What I want to do: I want to be able to count how many times an ID_CLIENT goes from OP_CODE=34 to OP_CODE=18 in a day, or in the last hour, etc. IF YOU CAN HELP ME WITH THIS I WILL BE FOREVER THANKFUL. I know Splunk has a command named transaction, but I'm having a hard time working around the accuracy of the results. Any help is SO FREAKING HIGHLY appreciated, thank you guys so so much!
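As an alternative to transaction (which can silently drop data when its memory limits are hit), a streamstats sketch that counts direct 34→18 transitions per client; index=my_index is a placeholder, and this assumes "goes from 34 to 18" means consecutive events for that client:

```spl
index=my_index OP_CODE=*
| sort 0 ID_CLIENT _time
| streamstats current=f window=1 global=f last(OP_CODE) AS prev_op by ID_CLIENT
| where prev_op=34 AND OP_CODE=18
| stats count AS transitions by ID_CLIENT
```

Restricting the time range to a day or the last hour is then handled by the time picker or earliest/latest on the base search.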
Hello everyone, I hope you are all well and safe! My data: two fields that contain IDs of clients of a tea shop; the fields are ID_SUGGAR and ID_DOUBLE. What I want to know: I want to be able to identify, with a function, which IDs are in BOTH ID_SUGGAR and ID_DOUBLE, and also which IDs are exclusive to (only present in) ID_SUGGAR, meaning these IDs are not in ID_DOUBLE. For example: ID_SUGGAR="5,1,45,78,100,200,300", ID_DOUBLE="5,1,45,78". My goal is to have a table or a field that will tell me that the IDs in ID_SUGGAR and NOT in ID_DOUBLE are 100, 200, 300. Thank you to anyone who can link some documentation about it. I love you all!
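With the example values above, one sketch using split/mvmap/mvfind (mvmap needs Splunk 8.0+; with real events, the eval line that sets the sample strings would be replaced by your actual fields):

```spl
| makeresults
| eval ID_SUGGAR="5,1,45,78,100,200,300", ID_DOUBLE="5,1,45,78"
| eval suggar = split(ID_SUGGAR, ","), double = split(ID_DOUBLE, ",")
| eval in_both = mvmap(suggar, if(isnotnull(mvfind(double, "^".suggar."$")), suggar, null()))
| eval only_suggar = mvmap(suggar, if(isnull(mvfind(double, "^".suggar."$")), suggar, null()))
| table in_both only_suggar
```

The anchored regex in mvfind keeps "5" from matching "45"; only_suggar holds the set difference, in_both the intersection.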
Hello everyone, I am trying to figure out how to implement a dashboard panel like the one above (which is the one on the Splunk official web page). I find it so beautiful and insightful, but I have no idea what its name is or where to find documentation about it. Thank you so much to anyone who can help me out!
Hi everyone, I have one requirement. I have multiple dashboards in which data is not coming in. I have set the default time range to the last 7 days, but the data is not there: for one dashboard the data is there only until 21st Feb, and for a second dashboard the data is there until 25th Feb. Is there any way in Splunk for the dashboard to show data from the dates that are available? E.g., if the data is there until 25th Feb, it should show the data from 25th Feb until now; if the data is available until 1st March, it should show the data from 1st March until now. My date range drop-down code:

<input type="time" token="field1" searchWhenChanged="true">
  <label>Date/Time</label>
  <default>
    <earliest>-7d@h</earliest>
    <latest>now</latest>
  </default>
</input>

Can someone guide me on this? Thanks in advance.
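One pattern that may help, sketched under assumptions (my_index is a placeholder, and driving a time input's token from a search result generally needs testing in your Splunk version): a dashboard-level search computes the newest event time and overrides the time input's earliest value with it.

```xml
<search>
  <query>index=my_index | stats max(_time) AS last_event</query>
  <earliest>-30d@d</earliest>
  <latest>now</latest>
  <done>
    <set token="form.field1.earliest">$result.last_event$</set>
  </done>
</search>
```

$result.last_event$ is an epoch timestamp, which a time input's earliest token accepts; the -30d lookback bounds how far back the probe search scans.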