All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Good evening, I am trying to configure two radio buttons. When the first radio button is clicked, I want it to show one CSV file (rendered as a table with fields and values across multiple columns) by itself in a dashboard panel. I need the same to happen with the second radio button for the other CSV file I need to include on this dashboard. The goal is to be able to switch back and forth between the two radio buttons to bring up the different CSV files without seeing both at once. I am having trouble configuring the radio buttons. Some say I need JavaScript, others say tokens. Can this be done, and which is the quickest way to get it done?
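This can typically be done with tokens alone, no JavaScript required. A minimal Simple XML sketch, assuming two hypothetical lookup files named file_a.csv and file_b.csv (substitute your own file names):

```xml
<form>
  <fieldset>
    <input type="radio" token="csv_choice">
      <label>Choose CSV</label>
      <choice value="file_a">File A</choice>
      <choice value="file_b">File B</choice>
      <default>file_a</default>
      <change>
        <!-- set one visibility token and unset the other on each click -->
        <condition value="file_a">
          <set token="show_a">true</set>
          <unset token="show_b"></unset>
        </condition>
        <condition value="file_b">
          <set token="show_b">true</set>
          <unset token="show_a"></unset>
        </condition>
      </change>
    </input>
  </fieldset>
  <row>
    <panel depends="$show_a$">
      <table><search><query>| inputlookup file_a.csv</query></search></table>
    </panel>
    <panel depends="$show_b$">
      <table><search><query>| inputlookup file_b.csv</query></search></table>
    </panel>
  </row>
</form>
```

Because each panel's depends attribute references a different token, only the panel matching the selected radio button is rendered at any one time.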
Hi, I am using Splunk to grab disk drive metrics, but oftentimes I am pulling metrics for drives I don't care about. I want to exclude these from some searches. I am able to do this with a series of NOT/OR clauses, as shown below, but I feel like there is an easier way with less syntax. How can I exclude instances G through L in one clause, without having to specify each instance as I do below? host=vor* NOT (host="vor-pxy-prd1*" instance=G: OR instance=H: OR instance=I: OR instance=J: OR instance=K: OR instance=L:)   Thanks for any ideas -
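One way to shorten this, assuming a Splunk version recent enough to support the IN operator in the search command:

```spl
host=vor* NOT (host="vor-pxy-prd1*" instance IN ("G:", "H:", "I:", "J:", "K:", "L:"))
```

Alternatively, a regular expression can cover the whole range in a single expression, e.g. appending `| regex instance!="^[G-L]:"` after the base search.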
We have 3 Palo Alto firewalls sending syslog data to a SolarWinds Kiwi syslog server. Kiwi writes the logs to disk, and the Splunk universal forwarder sends the logs to my Splunk environment. I have 2 issues which I can't seem to figure out (even after looking at various posts here that mention similar scenarios).
1)  The Palo Alto app states that there is only 1 firewall. When I look in the logs, that "firewall" is the Kiwi syslog server. I have tried adding a Host variable to the inputs (no such luck), having Kiwi forward the logs directly to Splunk (no such luck), and even having Kiwi send just the raw data. In every case Kiwi prepends its own date/time stamp and host value in front of the Palo Alto messages, and I'm not sure how to completely strip that information.
2)  The Palo Alto app sees the sourcetype as Pan:Logs, but it is not separating events into their respective sourcetypes, i.e. Pan:Firewall, Pan:System, Pan:traffic, etc. I have seen suggestions to use a transforms file and/or a props file, but I'm just too new to understand how to configure them properly.
Any help would be appreciated.
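As a sketch of the props.conf approach for issue 1: the regex below for the Kiwi prefix is an assumption; check your raw events to see exactly what Kiwi prepends before the PAN-OS message.

```ini
# props.conf on the indexer or heavy forwarder, applied to the
# sourcetype your forwarder assigns to the Kiwi-written files
[pan:log]
# Strip an assumed "YYYY-MM-DD HH:MM:SS <host>" Kiwi prefix so the
# events begin with the native PAN-OS format again
SEDCMD-strip_kiwi = s/^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\s+\S+\s+//
```

Once the events start with the native PAN-OS format, the Palo Alto Add-on's bundled props/transforms should be able to resolve the real firewall host and route events into the per-log sourcetypes, which would address issue 2 as well.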
Hello I have the following problem, I need to correlate the FRA-HOR- {Code} data with the string var_sub_fora_ {Code} (LOCAL), but I am not able to correlate the information in the same log: I tried this query, but need to finish:   index=teste "FRA-HOR-" | rex field=_raw "FRA-HOR-(?<cod>\d+)" | table _time cod   Log:   2020-10-06T14:34:01.730_I_I_032f03071971a490 [09:06] >>>>>>>>>>>>start interp() _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_list_FraseForaDeHorario(LOCAL) <- STRING[3,10]: "FRA-HOR-02" _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_01(LOCAL) <- STRING[2,133]: "Ol\xE1, nosso hor\xE1rio de atendimento \xE9 das 6:00 \xE0s 23hs de segunda \xE0 domingo. Retorne nesse per\xEDodo e converse com nossos especialistas." _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_02(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados." 2020-10-06T14:34:01.730_I_I_032f03071971a490 [09:06] >>>>>>>>>>>>start interp() _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_list_FraseForaDeHorario(LOCAL) <- STRING[3,10]: "FRA-HOR-05" _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_05(LOCAL) <- STRING[2,133]: "Ol\xE1, nosso hor\xE1rio de atendimento \xE9 das 6:00 \xE0s 23hs de segunda \xE0 domingo. Retorne nesse per\xEDodo e converse com nossos especialistas." _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_02(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados." 2020-10-06T14:34:01.730_I_I_032f03071971a490 [09:06] >>>>>>>>>>>>start interp() _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_list_FraseForaDeHorario(LOCAL) <- STRING[3,10]: "FRA-HOR-03" _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_05(LOCAL) <- STRING[2,133]: "Ol\xE1, nosso hor\xE1rio de atendimento \xE9 das 6:00 \xE0s 23hs de segunda \xE0 domingo. 
Retorne nesse per\xEDodo e converse com nossos especialistas." _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_03(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados." 2020-10-06T14:34:01.730_I_I_032f03071971a490 [09:06] >>>>>>>>>>>>start interp() _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_list_FraseForaDeHorario(LOCAL) <- STRING[3,10]: "FRA-HOR-04" _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_05(LOCAL) <- STRING[2,133]: "Ol\xE1, nosso hor\xE1rio de atendimento \xE9 das 6:00 \xE0s 23hs de segunda \xE0 domingo. Retorne nesse per\xEDodo e converse com nossos especialistas." _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_03(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados." _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_04(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados." _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_03(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados." 2020-10-06T14:34:01.730_I_I_032f03071971a490 [09:06] >>>>>>>>>>>>start interp() _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_list_FraseForaDeHorario(LOCAL) <- STRING[3,10]: "FRA-HOR-07" _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_07(LOCAL) <- STRING[2,133]: "Ol\xE1, nosso hor\xE1rio de atendimento \xE9 das 6:00 \xE0s 23hs de segunda \xE0 domingo. Retorne nesse per\xEDodo e converse com nossos especialistas." _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_03(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados." 
_I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_04(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados." _I_I_032f03071971a490 [09:04] ASSIGN: var_sub_fora_03(LOCAL) <- STRING[2,116]: "O nosso hor\xE1rio de atendimento \xE9 de Segunda a Sexta-feira das 08h00 \xE0s 19h00 (hor\xE1rio de Bras\xEDlia), exceto feriados."    
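A sketch extending the query above to pull both the code and the local variable assignments from the same event (the second rex pattern is an assumption based on the sample log):

```spl
index=teste "FRA-HOR-"
| rex field=_raw "FRA-HOR-(?<cod>\d+)"
| rex field=_raw max_match=0 "var_sub_fora_(?<cod_local>\d+)\(LOCAL\)"
| table _time cod cod_local
```

With max_match=0, cod_local becomes a multivalue field holding every var_sub_fora_{Code} in the event; a follow-up like `| eval matched=if(isnotnull(mvfind(cod_local, cod)), "yes", "no")` could then flag whether the FRA-HOR code has a matching local variable.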
I want to track the executed SQL Server queries, but I don't want to enable trace logging because it would impact SQL Server I/O and consume a lot of local disk space. So I don't have any SQL Server trace logs (*.trc files) stored on the server/DB. Is there any workaround, or a Splunk app that can track the executed SQL Server queries?
I know that someone may have asked this, but the truth is I did not find anything similar. I need to create a query for events that exceed 500 attempts in 1 hour; when this happens, an alert should be generated. index=* dest=200.22.22.22 dest_port=443 action=Accept |stats count by action src dest dest_port |where count > 120 |sort -count
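A sketch that counts per hour and keeps only the hours over the threshold (using the 500 from the description rather than the 120 in the draft query):

```spl
index=* dest=200.22.22.22 dest_port=443 action=Accept
| bin _time span=1h
| stats count by _time action src dest dest_port
| where count > 500
```

Saved as an alert on an hourly schedule with the trigger condition "number of results > 0", this would fire whenever any hour exceeds 500 attempts.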
Hi, I want to index a field name which contains square brackets. Below is the key-value pair format I have; Splunk is not extracting the values of keys that contain []. e.g. root[60]_level[5]=value Any suggestions?
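Automatic KV extraction tends to miss keys like this, but an explicit rex can escape the brackets. A sketch that captures the key and value generically (the \w+\[\d+\] shape is inferred from the single example, so adjust to your real keys):

```spl
| rex field=_raw "(?<bracket_key>\w+\[\d+\]_\w+\[\d+\])=(?<bracket_value>\S+)"
```

Square brackets are awkward inside Splunk field names themselves, so extracting the key into its own field, as above, is usually easier to work with downstream.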
Hey All, We are looking to upgrade from 7.3.7 to 8.0.6 or the most recent release in the 8.x code base. Besides potential app incompatibilities, does anyone have any gotchas or issues/concerns they have run into with an upgrade like this?   Thanks! Andrew
I am doing some Splunk training with Splunk Fundamentals 2. I need help locating the PDF guide for the Fundamentals 2 training that describes what data is ingested into Splunk for the course. I tried the tutorialdata.zip file, which I ingested into Splunk 8.0.6, but I am getting an error. Please point me in the right direction.
I am trying to figure out how to get data out of the event and into a field. I need to get all the data in brackets. Sample data:
Run,jump,fly[walk]dog/cat
File storage ,[app - run] development
I want the result put into a new field and look like this:
walk
app - run
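A sketch using rex with max_match=0 so every bracketed section in an event lands in one multivalue field:

```spl
| rex field=_raw max_match=0 "\[(?<extracted>[^\]]+)\]"
```

Against the sample data this should yield extracted values of "walk" and "app - run".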
Hi All,   I am in an interesting predicament: in the environment I work with, our traditional method of tagging DevOps hosts via UF (in props/transforms) will no longer suffice, as the team is moving to Kubernetes and using Splunk Connect to forward to the HEC on our SH. A few of our Splunk end users are questioning our ability to dynamically create tags, which they rely on heavily when creating custom reports and dashboards. Long story short, I am curious whether there is a way to assign a field value (i.e., altci) to a tag when a log is sent to the HEC. I believe this may be possible at the indexer level as data moves through the indexing pipeline, but I don't have any experience with that and I can't find any documentation that confirms it. I would appreciate any guidance on this matter. Thank you! Dan   Ideally, it would be something like: if a log has a field named altci, turn the field value into tag=altcivalue.
Hello, I am trying to get a chart that would tell me the % of calls that had an error. However, I can't find a metric for this. I was thinking maybe I could obtain this metric by combining the Calls per Minute and Errors per Minute metrics. For example, at 11:20am we had 79 errors out of 182 transactions, so I need the chart to show 43%. Is this possible in AppDynamics? Thanks for any help on this, Tom
Hello, Is it possible to configure a REST data input with a payload parameter (the --data section in the curl command below)? I am looking to configure the curl command below as a REST API input in the Splunk Add-on Builder. REST API call: curl  -XGET -H 'Authorization: Bearer <token>' -H 'Content-Type: application/json' https://<app_dns>/api/case --data '{"query": {"status": "Resolved"}}'
My search is pulling out events with the date embedded within the event, e.g.: [2020-10-05 07:23:08.308] ALL **** sending file= /u01/shared/pc/files/OUT/IF_148/Transferring/XXX_XXXXXX..XX_LUIS_312XXX_20201005_001.csv   [2020-10-05 13:40:17.101] ALL **** sending file= /u01/shared/pc/files/OUT/IF_148/Transferring/XXX_XXXXXX..XX_EDIW_312XXX_20200721_003.CSV The embedded dates are '20201005' and '20200721' respectively. Is there a way to search in Splunk for only the events with a certain date embedded within them, without having to manually alter the search each time?   (current search for ref: **** sending file= /u01/shared/pc/files/OUT/IF_148/Transferring/*312*)
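A sketch that extracts the embedded date into a field and compares it against a computed date rather than a hard-coded one (the filename pattern in the rex is inferred from the two samples):

```spl
**** sending file= /u01/shared/pc/files/OUT/IF_148/Transferring/*312*
| rex field=_raw "_(?<file_date>\d{8})_\d+\.(?i)csv"
| where file_date = strftime(now(), "%Y%m%d")
```

Swapping now() for an expression like relative_time(now(), "-1d@d") would select other days without editing the search by hand.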
Hello, we tried the VMware app from Splunk to monitor our environment, but we think the app is not detailed enough, so we are searching for an alternative for performance values. Does someone know an alternative? The ITSI app is very expensive, so we are hoping for a cheaper solution.
I have a set of devices that are identified by a very long 15-digit number. The first 8 digits are just a prefix which we would like to hide, displaying only the last 7 digits.   867723030939341 is an example   index=index1 sourcetype=devices | stats count by device....   Is there a way to remove the prefix?
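A sketch using substr to keep only the last 7 digits before aggregating (the field name device is taken from the draft search):

```spl
index=index1 sourcetype=devices
| eval device_short = substr(device, 9)
| stats count by device_short
```

For a 15-digit value, substr(device, 9) drops the first 8 characters, so 867723030939341 becomes 0939341.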
How can I find out the date and time of the last configuration change, and which configuration was applied, on a universal forwarder from its local log files? In which log file can I see this information? I need to troubleshoot a UF sending logs to my indexer.
A JSON file sent from an API gives me 24 hours of per-minute data, and I specified the default sourcetype _json when ingesting it. I was able to build a table in the form I wanted, but only 42 rows of data were displayed in a single field (it should be 24h x 60min = 1440 rows). Is it possible to raise the limit and display all 1440 rows?
Hi, Splunk folks, I would like to know why an indexer crashes very often in our cluster environment. What steps do I need to check? Does any config need to be checked or tuned? I am new to Splunk administration. Please advise.
Hi Community, after installing Splunk everything seemed to work, but when searching for Splunk apps I get the following error in the app browser: error connecting: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed - please check the output of the `openssl verify` command for the certificates involved; note that if certificate verification is enabled (requireClientCert or sslVerifyServerCert set to "true"), the CA certificate and the server certificate should not have the same Common Name.  So I understand that there is a certificate error, but I am not sure how to solve it.  Thanks in advance