All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


We have an on-prem Splunk deployment with 50TB/day ingest, 22PB storage, and long-term retention (1 year on most indexes). We use Kafka as our primary ingest method. We collect cloud logs in Azure using EventHub and in AWS using Kinesis, then use Kafka MirrorMaker to bring the cloud logs into on-prem Kafka. The ratio is currently 95% on-prem vs 5% cloud logs by volume. Inevitably this ratio will shift in favor of cloud logs, and the cost to bring cloud logs on-prem will rise: for Azure alone we estimate Microsoft charges $15k/month for 5TB/day of data egress. The size of our cloud pipe is also an issue. We are therefore considering an architecture that avoids backhauling cloud logs. Our stakeholders have made it clear they want a seamless experience with no logging into multiple platforms. Is your organization in a similar position, or have you overcome this problem?
Event 1: Product=shirt1 sku=123 sku=234
Event 2: Product=shirt2 sku=987 sku=789

index=store | rex field=_raw max_match=0 "sku\W(?P<sku>.*?)\," | stats count by _time, sku

Current output:
_time     sku  count
01-04-23  123  1
01-04-23  234  1
01-04-23  987  1
01-04-23  789  1

Output I'm looking for:
_time     count
01-04-23  4
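Assuming each event carries all of its skus, one sketch is to keep the poster's rex and sum the per-event multivalue count with mvcount() instead of splitting by sku:

```
index=store
| rex field=_raw max_match=0 "sku\W(?P<sku>.*?)\,"
| eval sku_count=mvcount(sku)
| stats sum(sku_count) AS count by _time
```

If the trailing comma in the rex pattern misses the last sku of an event, a pattern like "sku\W(?P<sku>\d+)" may capture all of them; that is an assumption about the data layout.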
Hey people,

Here is what I am trying to do: I have a pie chart generated from the following query:

...| table filterExecutionTime ddbWriteExecutionTime buildAndTearDownTime | transpose 0

The pie chart looks stunning, but the pain point is that to see the values I have to hover over the elements. Instead, I was thinking of making a legend panel that shows the names along with the values. I was able to create a legend panel, but I couldn't add the values to it. Here is how I did the legend panel:

<panel id="panel_legend_2">
  <table>
    <search base="errors2">
      <query>| fields column | rename column as "Legends"</query>
    </search>
    <format type="color" field="Legends">
      <colorPalette type="sharedList"></colorPalette>
      <scale type="sharedCategory"></scale>
    </format>
  </table>
</panel>

Could you please help me out here?
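One hedged sketch: after | transpose 0, the results normally contain a "column" field and a "row 1" field holding the value ("row 1" is transpose's default name, an assumption about this search), so the legend label can be built by concatenating the two in the base-search panel:

```
<search base="errors2">
  <query>| eval Legends=column.": ".'row 1' | table Legends</query>
</search>
```

The existing <format> block with the shared color palette can stay as-is around the new query.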
Hey people,

Here is what I am trying to do:
- I have two dashboards, dashboardA & dashboardB
- I am sending a token value from dashboardA -> dashboardB
- Inside dashboardB, I want to create a dynamic dropdown based on the token value

This is how my dropdown should look: let's say the token value is 10; I want to display choices from 1 to 10 in the dropdown.

My thinking was to do something like this:

<input type="dropdown" token="subrack_No">
  <choice value="*">All</choice>
  <search>
    <query>| makeresults | "some command which generates data from 1 to token value"</query>
  </search>
  <default>1</default>
</input>

Could you please help me here?
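A minimal sketch, assuming the token passed from dashboardA is available in dashboardB as $passed_tok$ (a hypothetical name): mvrange(1, N+1) generates the multivalue 1..N (the upper bound is exclusive, hence the +1), and mvexpand turns it into one row per choice:

```
<input type="dropdown" token="subrack_No">
  <choice value="*">All</choice>
  <search>
    <query>| makeresults
| eval n=mvrange(1, $passed_tok$ + 1)
| mvexpand n</query>
  </search>
  <fieldForLabel>n</fieldForLabel>
  <fieldForValue>n</fieldForValue>
  <default>1</default>
</input>
```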
Hi, Splunkers,

I have the following table drilldown where earliest and latest time work properly, but when I copied it to a chart drilldown it stopped working.

The table drilldown has the drilldown option value "row":
<option name="drilldown">row</option>
and the link for passing earliest and latest time uses row.StartDTM_epoch and row.EndDTM_epoch:
form.field2.earliest=$row.StartDTM_epoch$&form.field2.latest=$row.EndDTM_epoch$

My chart drilldown has the drilldown option value "all":
<option name="drilldown">all</option>
so I changed the link to:
form.field2.earliest=$all.StartDTM_epoch$&form.field2.latest=$all.EndDTM_epoch$
I'm not sure if all.StartDTM_epoch and all.EndDTM_epoch are causing the failure.

The following is the related working code for the table drilldown to pass earliest and latest time:

| eval StartDTM_epoch = relative_time(_time,"-20m")
| eval EndDTM_epoch = relative_time(_time,"+20m")
| eval TIME = strftime(_time, "%Y-%m-%d %H:%M:%S")
| table _time,sid,Type,AgentName,DN,FAddress,Segment,Function,Client,Product,SubFunction,SubFDetail,MKTGCT,CCType,VQ,TLCnt,AFRoute,StateCD,TargetSelected,AFStatus,CBOffered,CBRejected,AQT,EWT,EWTmin,PIQ,WT,LSInRange,LSPriority,LSRateS,PB,PCSS,PENT,PF,RONA,LANG,StartDTM_epoch,EndDTM_epoch</query>
<earliest>$field2.earliest$</earliest>
<latest>$field2.latest$</latest>
<sampleRatio>1</sampleRatio>
</search>
<option name="count">20</option>
<option name="dataOverlayMode">none</option>
<option name="drilldown">row</option>
<option name="percentagesRow">false</option>
<option name="refresh.display">progressbar</option>
<option name="rowNumbers">true</option>
<option name="totalsRow">false</option>
<option name="wrap">true</option>
<fields>["_time","sid","Type","AgentName","DN","FAddress","Segment","Function","Client","Product","SubFunction","SubFDetail","MKTGCT","CCType","VQ","TLCnt","AFRoute","StateCD","TargetSelected","AFStatus","CBOffered","CBRejected","AQT","EWT","EWTmin","PIQ","WT","LSInRange","LSPriority","LSRateS","PB","PCSS","PENT","PF","RONA","LANG"]</fields>
<drilldown>
  <condition match="$t_DrillDown$ = &quot;*&quot;">
    <link target="_blank">
      <![CDATA[ /app/optum_gvp/guciduuidsid_search_applied_rules_with_ors_log_kvp?form.Gucid_token_with2handlers=$click.value2$&form.field2.earliest=$row.StartDTM_epoch$&form.field2.latest=$row.EndDTM_epoch$ ]]>
    </link>

Thanks in advance,
Kevin
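For what it's worth, $row.*$ tokens are only populated for table drilldowns; charts set only $click.name$, $click.value$, $click.name2$ and $click.value2$, and there is no $all.*$ token namespace, which would explain the failure. If the chart's x-axis is _time, a hedged sketch is to derive the window from $click.value$ (an epoch on time charts) with <eval> drilldown tokens:

```
<drilldown>
  <eval token="field2.earliest">relative_time($click.value$, "-20m")</eval>
  <eval token="field2.latest">relative_time($click.value$, "+20m")</eval>
  <link target="_blank">/app/optum_gvp/guciduuidsid_search_applied_rules_with_ors_log_kvp?form.Gucid_token_with2handlers=$click.value2$&amp;form.field2.earliest=$field2.earliest$&amp;form.field2.latest=$field2.latest$</link>
</drilldown>
```

This assumes the clicked x-axis value is an epoch; if it is a formatted time string it would need strptime() first.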
I have a dashboard with a table with 6 headers. I would like to bold the text of the second, fourth, and fifth column headers. I searched around but could only find solutions for the first or last column headers (first-child, last-child), nothing for anything in between. If bolding isn't an option, I would also be open to simply coloring the background of those columns from the default gray to, say, blue, and making the column header text white. The idea is to make the 2nd, 4th, and 5th column headers visually stand out from the rest of the table. Example table: index second source fourth fifth count
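One hedged approach in Simple XML is a hidden HTML panel carrying CSS that targets header cells by position; the panel id (tbl_highlight) and the depends token are hypothetical names, and nth-child indexing is 1-based:

```
<panel depends="$alwaysHideCSS$">
  <html>
    <style>
      #tbl_highlight table thead th:nth-child(2),
      #tbl_highlight table thead th:nth-child(4),
      #tbl_highlight table thead th:nth-child(5) {
        font-weight: bold !important;
        background-color: #1e6cb3 !important;
        color: #ffffff !important;
      }
    </style>
  </html>
</panel>
```

The table panel itself would need id="tbl_highlight" for the selectors to match, and the exact DOM structure can vary between Splunk versions, so the selectors may need adjusting.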
For a table panel, I would like to create a format rule (<format>) of type="color" that sets a threshold based on another field's data. Here is a sample search (note: "streamstats" and "count_stream" are used just to generate sample data):

| makeresults count=3
| streamstats count AS count_stream
| eval count_total = count_stream * 10 ,count_metric = count_stream * count_stream * 2
| eval perc_metric = count_metric / count_total * 100
| table perc_metric count_metric, count_total

I would like to create color rules for the "count_metric" field/column, but base them on the "perc_metric" value. My goal is to leverage the percentage values in "perc_metric" and apply color to the cells in "count_metric", hiding the "perc_metric" field/column in the final output. Here's my Simple XML:

<dashboard>
  <label>test</label>
  <row>
    <panel>
      <table>
        <search>
          <query>| makeresults count=3
| streamstats count AS count_stream
| eval count_total = count_stream * 10 ,count_metric = count_stream * count_stream * 2
| eval perc_metric = count_metric / count_total * 100
| table perc_metric count_metric, count_total</query>
          <earliest>@d</earliest>
          <latest>now</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="count">50</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="refresh.display">progressbar</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
        <format type="color" field="perc_metric">
          <colorPalette type="list">[#DC4E41,#F1813F,#53A051]</colorPalette>
          <scale type="threshold">21,41</scale>
        </format>
      </table>
    </panel>
  </row>
</dashboard>
Hi, I have a trial account with Splunk Cloud, where I am doing a POC on sending API logs to a Splunk dashboard. For that I created an HEC token, which is working fine with Postman and the curl command. But when I actually try to use it within my React app, sending data by calling the API with a POST request, I get an error. I can understand it: the app is trying to access a different domain, and the Splunk instance is not accepting requests from other domains. I am not sure how to fix this; maybe there is some setting we can apply at the instance level to allow traffic from my specific domain?

Can someone please help?
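If the instance were self-managed, a hedged sketch would be to allow your app's origin on the HEC endpoint via the [http] stanza of inputs.conf (the origin shown is a placeholder). On Splunk Cloud you cannot edit inputs.conf yourself, so this would typically go through Splunk Support; alternatively, proxy the POST through your own backend so the browser never makes a cross-origin request:

```
# inputs.conf on the instance serving HEC (illustrative; not directly editable on Splunk Cloud)
[http]
crossOriginSharingPolicy = https://my-react-app.example.com
```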
Hello, when I try to create a new connection in DB Connect using connection type DB2, I get the error message "No Suitable Driver is Available", saying no compatible driver was found in the "drivers" directory. How would I get the driver installed to resolve this issue? Thank you so much in advance for your support.
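A hedged sketch of the usual fix, assuming a standalone DB Connect 3.x install: place the IBM DB2 JDBC driver jar (db2jcc4.jar, downloadable from IBM) into the app's drivers directory and reload the drivers:

```
# Illustrative paths; adjust $SPLUNK_HOME for your install
cp db2jcc4.jar $SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers/
# then reload drivers from DB Connect (Configuration > Settings > Drivers)
# or restart Splunk
```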
Hi everyone,

We have a problem when we want to graph a custom metric on a dashboard. We can create the metric, and we wait 24 hours for the dashboard to build the new graphs, but when we go to create them, the SaaS controller shows us only one of the metrics. The other one cannot be graphed because AppDynamics doesn't show it to us. What can we do to see all custom metrics on the dashboard?

Regards,
Henry Tellez
Hello and happy new year to all,

As the title says, I would like to get the list of servers that have connected over the last 14 days (LastLogon). I have tried several methods but nothing works. Here is my query:

index=msad SamAccountName=*$ VersionOS="Windows Server*"
| eval llt=strptime(LastLogon,"%d/%m/%Y %H:%M:%S")
| eval LastLogon2=strftime(llt, "%d/%m/%Y %H:%M:%S")
| rex field=SamAccountName mode=sed "s/\$//g"
| table Domain,SamAccountName,VersionOS,LastLogon2

Thanks
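A hedged tweak: since LastLogon is already parsed into an epoch as llt, the query can filter to the last 14 days with relative_time(); the dedup is optional and assumes one row per server is wanted:

```
index=msad SamAccountName=*$ VersionOS="Windows Server*"
| eval llt=strptime(LastLogon,"%d/%m/%Y %H:%M:%S")
| where llt >= relative_time(now(), "-14d@d")
| eval LastLogon2=strftime(llt, "%d/%m/%Y %H:%M:%S")
| rex field=SamAccountName mode=sed "s/\$//g"
| dedup SamAccountName
| table Domain,SamAccountName,VersionOS,LastLogon2
```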
Hello guys. I received this task at my job, and I still need money in my pocket, so please help me :)) I'm on a Network Operations team; maybe this will help you understand the following description better.

In a single Splunk search I have to connect two databases on different servers. One DB contains incidents: the incident ID, the start time of the incident (let's call it A), and the end time of the incident (B). The other DB contains call complaints: the timestamp of each call complaint (C). I need to find the number of call complaints for each incident: if C>=A AND C<=B, we count a call complaint for that incident, then move on to check the next C timestamp, and so on.

I have issues right from the start. I tried to connect the databases with this syntax:

| dbxquery query=[...] connection=A
| join
    [ dbxquery query=[...] connection=B ]

But when I use a table command to see the fields of interest (incident ID, A, B, C), the fields from the joined DB look the same on every line. Could you please help me with this?
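The symptom (joined fields identical on every line) is consistent with join having no join field. One hedged sketch for a time-range match is a deliberate cartesian product over a dummy key, then a where filter; the connection and column names here are placeholders, and join max=0 keeps all pairings, which is only sensible for modest row counts:

```
| dbxquery connection=incidents_conn query="SELECT incident_id, start_time AS A, end_time AS B FROM incidents"
| eval key=1
| join max=0 key
    [| dbxquery connection=calls_conn query="SELECT call_time AS C FROM complaints"
     | eval key=1]
| where C >= A AND C <= B
| stats count AS call_complaints by incident_id
```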
Hi All, I have got 2 set of logs, one of which has the Connector details and the other has got the error details if any connector gets failed.     First set: Log1: sink.http.apac.hk.dna.eap.adobedmp.derived.int.aamrtevents FAILED|mwgcb-csrla02u RUNNING|mwgcb-csrla02u NA Log2: sink.http.apac.au.dna.eap.adobedmp.derived.int.aamrtevents RUNNING|mwgcb-csrla02u RUNNING|mwgcb-csrla02u NA Log3: sink.http.apac.ph.dna.eap.adobedmp.derived.int.aamrtevents FAILED|mwgcb-csrla01u FAILED|mwgcb-csrla01u NA Log4: sink.http.apac.th.dna.eap.adobedmp.derived.int.aamrtevents FAILED|swgcb-csrla01u FAILED|mwgcb-csrla01u NA Here I used the below query to extract the required fields: ... | rex field=_raw "(\w+\.)(?P<Component>\w+)\." | rex field=_raw "(?P<Connector_Name>(\w+\.){3,12}\w+)\s" | rex field=_raw "(?P<Connector_Name>(\w+\-){3,12}\w+)\s" | rex field=_raw "(\w+\.){3,12}\w+\s(?P<Connector_State>\w+)\|" | rex field=_raw "(\w+\-){3,12}\w+\s(?P<Connector_State>\w+)\|" | rex field=_raw "(\w+\.){3,12}\w+\s\w+\|(?P<Worker_ID>\w+\-\w+)\s" | rex field=_raw "(\w+\-){3,12}\w+\s\w+\|(?P<Worker_ID>\w+\-\w+)\s" | rex field=_raw "(\w+\.){3,12}\w+\s\w+\|\w+\-\w+\s((\_KK\_){0,1})(?P<Task1_State>\w+)\|" | rex field=_raw "(\w+\-){3,12}\w+\s\w+\|\w+\-\w+\s((\_KK\_){0,1})(?P<Task1_State>\w+)\|" | rex field=_raw "(\w+\.){3,12}\w+\s\w+\|\w+\-\w+\s((\_KK\_){0,1})\w+\|(?P<Worker1_ID>\w+\-\w+)\s" | rex field=_raw "(\w+\-){3,12}\w+\s\w+\|\w+\-\w+\s((\_KK\_){0,1})\w+\|(?P<Worker1_ID>\w+\-\w+)\s" | replace "mwgcb-csrla01u_XX_" with "mwgcb-csrla01u" in Worker1_ID | replace "mwgcb-csrla02u_XX_" with "mwgcb-csrla02u" in Worker1_ID | rex field=_raw "(\w+\.){3,12}\w+\s\w+\|\w+\-\w+\s((\_KK\_){0,1})\w+\|\w+\-\w+\s((\_KK\_){0,1})(?P<Task2_State>\w+)" | rex field=_raw "(\w+\-){3,12}\w+\s\w+\|\w+\-\w+\s((\_KK\_){0,1})\w+\|\w+\-\w+\s((\_KK\_){0,1})(?P<Task2_State>\w+)" | replace "NA" with "Not_Available" in Task2_State | rex field=_raw 
"(\w+\.){3,12}\w+\s\w+\|\w+\-\w+\s((\_KK\_){0,1})\w+\|\w+\-\w+\s((\_KK\_){0,1})\w+\|(?P<Worker2_ID>\w+\-\w+)" | rex field=_raw "(\w+\-){3,12}\w+\s\w+\|\w+\-\w+\s((\_KK\_){0,1})\w+\|\w+\-\w+\s((\_KK\_){0,1})\w+\|(?P<Worker2_ID>\w+\-\w+)" | replace "mwgcb-csrla01u_XX_" with "mwgcb-csrla01u" in Worker2_ID | replace "mwgcb-csrla02u_XX_" with "mwgcb-csrla02u" in Worker2_ID | fillnull value="Not_Available" Task1_State,Worker1_ID,Task2_State,Worker2_ID Second set: Log1: } Log2: "type": "sink" Log3: "tasks": [], Log4: "name": "sink.http.apac.hk.dna.eap.adobedmp.derived.int.aamrtevents", Log5: }, Log6: "worker_id": "mwgcb-csrla02u.nam.nsroot.net:8074" Log7: "trace": "org.apache.kafka.common.errors.SslAuthenticationException: SSL handshake failed\nCaused by: javax.net.ssl.SSLProtocolException: Unexpected handshake message: server_hello\n\tat sun.security.ssl.Alert.createSSLException(Alert.java:129)\n\tat sun.security.ssl.Alert.createSSLException(Alert.java:117)\n\tat sun.security.ssl.TransportContext.fatal(TransportContext.java:356)\n\tat sun.security.ssl.TransportContext.fatal(TransportContext.java:312)\n\tat sun.security.ssl.TransportContext.fatal(TransportContext.java:303)\n\tat sun.security.ssl.HandshakeContext.dispatch(HandshakeContext.java:473)\n\tat sun.security.ssl.SSLEngineImpl$DelegatedTask$DelegatedAction.run(SSLEngineImpl.java:990)\n\tat sun.security.ssl.SSLEngineImpl$DelegatedTask$DelegatedAction.run(SSLEngineImpl.java:977)\n\tat java.security.AccessController.doPrivileged(Native Method)\n\tat sun.security.ssl.SSLEngineImpl$DelegatedTask.run(SSLEngineImpl.java:924)\n\tat org.apache.kafka.common.network.SslTransportLayer.runDelegatedTasks(SslTransportLayer.java:501)\n\tat org.apache.kafka.common.network.SslTransportLayer.handshakeUnwrap(SslTransportLayer.java:593)\n\tat org.apache.kafka.common.network.SslTransportLayer.doHandshake(SslTransportLayer.java:439)\n\tat org.apache.kafka.common.network.SslTransportLayer.handshake(SslTransportLayer.java:324)\n\tat 
org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:205)\n\tat org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:561)\n\tat org.apache.kafka.common.network.Selector.poll(Selector.java:498)\n\tat org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:575)\n\tat org.apache.kafka.clients.admin.KafkaAdminClient$AdminClientRunnable.processRequests(KafkaAdminClient.java:1358)\n\tat org.apache.kafka.clients.admin.KafkaAdminClient$AdminClientRunnable.run(KafkaAdminClient.java:1289)\n\tat java.lang.Thread.run(Thread.java:750)\n", Log8: "state": "FAILED", Log9: "connector": { Log10: { And then the set repeats for the next Connector. I tried to extract the required fields using below queries: Query1: ... | regex _raw="name" | rex field=_raw "name\"\:\s\"(?P<Connector>[^\"]+)\"\," Query2: ... | rex field=_raw "Caused\sby\:\s(?P<Exception>[^\:]+)\:\s" | search Exception!="org.apache.kafka.common.KafkaException" | rex field=_raw "Caused\sby\:\s([^\:]+)\:\s(?P<Error_Msg>(\w+\:\s){0,1}(\w+\s){1,15}\w+)"     Note that, both the set of logs have different source but same index. I want to create a table using both the set of logs such that along with the failed connector we can see the error details also. For e.g. Connector_Name Connector_State Worker_ID Task1_State Worker1_ID Task2_State Worker2_ID Exception Error_Msg sink.http.apac.hk.dna.eap.adobedmp.derived.int.aamrtevents FAILED mwgcb-csrla02u RUNNING mwgcb-csrla02u NA NA javax.net.ssl.SSLProtocolException Unexpected handshake message sink.http.apac.au.dna.eap.adobedmp.derived.int.aamrtevents RUNNING mwgcb-csrla02u RUNNING mwgcb-csrla02u NA NA NA NA   I need to Connector name from second set and compare with first set, and whichever connector has failed state, I need to add the Exception and Error_Msg there. Please help me to get my desired table. Your kind help is highly appreciated.   Thank you..!!
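Since both sets share the index but differ by source, one hedged sketch is to normalize the connector-name field and merge everything per connector with stats. The source names are placeholders, and this assumes the rex extractions above have already populated the fields in each event; because the second set is pretty-printed JSON split across events, the Connector, Exception and Error_Msg values may land in different events and need grouping first (e.g. with transaction):

```
index=your_index (source="connector_status_source" OR source="connector_error_source")
| eval Connector_Name=coalesce(Connector_Name, Connector)
| stats values(Connector_State) AS Connector_State values(Worker_ID) AS Worker_ID
        values(Task1_State) AS Task1_State values(Worker1_ID) AS Worker1_ID
        values(Task2_State) AS Task2_State values(Worker2_ID) AS Worker2_ID
        values(Exception) AS Exception values(Error_Msg) AS Error_Msg
  by Connector_Name
| fillnull value="NA" Exception Error_Msg
```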
Hi everyone, I'm very new here. I need support with extracting the field "safeframe.googlesyndication.com" from "ofc62fbe04078e8d3b0843298ad3421d.safeframe.googlesyndication.com" using a regex expression, or is there any other command I can use to remove the junk before the URL host? Thank you.
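Assuming the value lives in a field (urlhost here is a hypothetical name), one sketch strips everything up to and including the first dot; the alternative pattern keeps just the last three labels:

```
| rex field=urlhost "^[^.]+\.(?<domain>.+)$"

| rex field=urlhost "(?<domain>[^.]+\.[^.]+\.[^.]+)$"
```

Either should yield domain="safeframe.googlesyndication.com" for the example value.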
Splunk Enterprise 9.0.1. We upgraded DB Connect from 3.7.0 to 3.11.1 and started to notice that some inputs were not working. In _internal we also found an event that lists inputs requiring migration:

INFO c.s.d.s.m.s.DbInputToModularInputInstanceMigrator - DB Inputs requiring migration: [...]

The current documentation doesn't say anything about this migration: https://docs.splunk.com/Documentation/DBX/3.11.1/DeployDBX/MigratefromDBConnectv1

Is there any other documentation on how to migrate to the latest version?

UPDATE: We found that the cron expression validation changed between versions: '1/10 * * * *' was a valid expression in 3.7 but is no longer valid; we have to use '1-59/10 * * * *'. We also had to disable and then re-enable each input for it to start working again; changing the cron alone is not enough.
Hello,

Using Dashboard Studio (DS), and having an issue formatting a table with a single column based on its value. The field values look something like this:

CPU: 37.1% C
MEM: 60.2% W
NET: 3.6% N

Here is the pseudocode for selecting the background color of the cell based on its value:

If the value contains at least one "% C", the background color of the cell needs to be red.
Else if the value contains at least one "% W", the background color of the cell needs to be yellow.
Else if the value does not contain "% C" AND does not contain "% W", the background color of the cell needs to be green.
End if.

According to some web pages I read about DS, you cannot use inline CSS or JS functions. I don't believe I need JS for this simple use case. I read here, Selector and formatting functions, that I can use matchValue(). After resolving all the JSON errors (missing brackets and braces), there was no change in the color of the cell background.

Thanks in advance and God bless,
Genesius
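Since matchValue() does exact matching rather than substring matching, a common workaround is to compute a helper category in the search, e.g. | eval sev=case(match(value,"% C"),"crit", match(value,"% W"),"warn", true(),"ok"), and then color the visible column by the hidden sev column. A hedged sketch of the table stanza (the field names value/sev and the context key are assumptions):

```
"options": {
  "columnFormat": {
    "value": {
      "data": "> table | seriesByName(\"value\") | formatByType(valueColumnFormatEditorConfig)",
      "backgroundColor": "> table | seriesByName(\"sev\") | matchValue(sevBackgroundConfig)"
    }
  },
  "context": {
    "sevBackgroundConfig": [
      { "match": "crit", "value": "#DC4E41" },
      { "match": "warn", "value": "#F8BE34" },
      { "match": "ok", "value": "#53A051" }
    ]
  }
}
```

Whether "context" sits inside "options" or beside it varies by DS version, so compare against a stanza the visual editor generates.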
Hello,

Can someone please help me with the JavaScript for the criteria below?

Create a JavaScript that calls PagerDuty using a button, eventually passing parameters from the dashboard.

Thanks
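A hedged sketch using PagerDuty's public Events API v2 (https://events.pagerduty.com/v2/enqueue). The routing key, button id, and dashboard fields are placeholders you would swap in; the payload builder is separated from the network call so it can be reused and tested:

```javascript
// Build an Events API v2 "trigger" payload. routingKey comes from the
// Events API v2 integration on your PagerDuty service; customDetails
// would carry dashboard token values.
function buildPagerDutyEvent(routingKey, summary, source, customDetails) {
  return {
    routing_key: routingKey,
    event_action: "trigger",
    payload: {
      summary: summary,
      source: source,
      severity: "warning",
      custom_details: customDetails
    }
  };
}

// POST the event to the Events API v2 endpoint.
function sendToPagerDuty(event) {
  return fetch("https://events.pagerduty.com/v2/enqueue", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(event)
  }).then(function (r) { return r.json(); });
}

// Wiring to a dashboard button (ids and values are hypothetical):
// document.getElementById("pd_button").addEventListener("click", function () {
//   sendToPagerDuty(buildPagerDutyEvent("YOUR_ROUTING_KEY",
//     "Alert from dashboard", "splunk-dashboard", { host: "example-host" }));
// });
```

In Simple XML this file would be loaded via the dashboard's script attribute; in Dashboard Studio custom JS is not supported, so a workflow there would need a different integration path.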
Hi there,

I have a dashboard with bar charts. The value of a bar chart is in a format like "foo-10-bar". After a user clicks on the bar chart, it leads to another dashboard, with the drilldown generating a token with the field value, $field$="foo-10-bar" in this case. Is it possible to split the field value into three parts with "-" as a separator when the user clicks on the bar chart, so that in this case there would be three tokens: $FieldA$=foo, $FieldB$=10, $FieldC$=bar?

Thanks.
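A hedged Simple XML sketch using <eval> drilldown tokens with split() and mvindex() (the target dashboard path is a placeholder):

```
<drilldown>
  <eval token="FieldA">mvindex(split("$click.value$","-"),0)</eval>
  <eval token="FieldB">mvindex(split("$click.value$","-"),1)</eval>
  <eval token="FieldC">mvindex(split("$click.value$","-"),2)</eval>
  <link target="_blank">/app/search/dashboardB?form.FieldA=$FieldA$&amp;form.FieldB=$FieldB$&amp;form.FieldC=$FieldC$</link>
</drilldown>
```

For "foo-10-bar" this sets FieldA=foo, FieldB=10, FieldC=bar. Note it assumes values always contain exactly two hyphens; values like "foo-bar-baz-qux" would need a different rule.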
Hi there, I have a use case to query the internal and external IP addresses of hosts that have a UF installed. I am using the approach below and hoping for a better solution. Appreciate your help in advance!

For the external IP:
index=_internal group=tcpin_connections hostname=*
This provides sourceIp (the external IP).

For the internal IP:
index=_internal sourcetype=splunkd_access phonehome
| rex command to retrieve the internal IP from the string

Is this the correct approach? I was hoping for a single search to retrieve both IPs.
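One hedged sketch merging both into a single result set keyed by host; the phonehome rex pattern and field names (uri, hostname) are guesses about that data and should be adjusted to match what the two searches actually return:

```
index=_internal group=tcpin_connections hostname=*
| stats latest(sourceIp) AS external_ip by hostname
| append
    [ search index=_internal sourcetype=splunkd_access phonehome
      | rex field=uri "(?<internal_ip>\d{1,3}(?:\.\d{1,3}){3})"
      | stats latest(internal_ip) AS internal_ip by hostname ]
| stats values(external_ip) AS external_ip values(internal_ip) AS internal_ip by hostname
```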