All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, I have a performance issue with a query using a "join" command. The problem is that the first search uses a time picker set to the last 4 hours, while the search inside the join (type=outer) uses "earliest=-30d". Example of the query:

index="A" sourcetype="AB" source="C"
| eval launch_time=round(strptime(launch_time, "%Y-%m-%dT%H:%M:%S"),0)
| eval search_time=now()
| eval launched_since=round((search_time-launch_time)/86400,0)
| where launched_since > 7
| dedup id sortby -_time
| lookup all_ids account_id OUTPUT acc_name site
| search site=*
| join type=outer id
    [ search index="A" sourcetype="AC" source="D" earliest=-30d
    | lookup all_ids account_id OUTPUT acc_name site
    | search site=*
    | rename agentId as id
    | dedup rpg id
    | sort rpg
    | stats values(rpg_name) as pg by id acc_name site
    | eval Name=if(like(pg,"%name1/%"),"Name1","Name2")
    | table id title platform pg Name]
| table site acc_name id pg Name launched_since
| dedup acc_name id
| eval Name2=if(isnull(Name), "NULL", Name)
| stats count(id) as count by Name2

Is it possible to make this search more efficient? Thanks in advance!
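One common way to make this kind of query more efficient is to drop the join entirely and pull both sourcetypes in a single search with per-clause time modifiers, letting stats stitch the events together by id. A minimal sketch only, using the field names from the question (the eval/dedup logic from the original query would still need to be ported over):

```spl
((index="A" sourcetype="AB" source="C" earliest=-4h)
 OR (index="A" sourcetype="AC" source="D" earliest=-30d))
| eval id=coalesce(id, agentId)
| lookup all_ids account_id OUTPUT acc_name site
| search site=*
| stats values(rpg_name) as pg, max(launch_time) as launch_time by id, acc_name, site
```

stats generally scales much better than join here, since join runs the inner search separately and is subject to subsearch result limits.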
Is it normal to have colddb > db?

4.0K  thaweddb
20M   summary
4.2G  datamodel_summary
9.8G  db
59G   colddb

[root@xxxxxxx _internaldb]
We noticed that a host "1234" is no longer connecting to the DS. What does this mean? What would be the impact? How do we troubleshoot this? Thanks.
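As a first troubleshooting step, it may help to check whether the client is phoning home at all. A couple of standard CLI checks, run on the deployment server and on the forwarder respectively (output will vary by install):

```shell
# On the deployment server: list clients that have checked in
splunk list deploy-clients

# On forwarder "1234": confirm which DS it is configured to poll
splunk show deploy-poll
splunk btool deploymentclient list --debug
```

If "1234" is missing from the deploy-clients list, the usual suspects are the forwarder service being down, a changed targetUri, or network/firewall issues on the DS management port (8089 by default).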
Dear all! I am trying to use a dynamic value for my epsilon in the MLTK in Splunk:

map search="search index = cisco_prod | timechart span=1h count as logins_hour | timewrap w series=short | fields - logins_hour_s6 | table logins_hour_s* | transpose 0 | fit DBSCAN \"row *\" eps=$eps$"

This doesn't return anything if eps is a float, only when I first round the dynamic variable. However, if I run the same search with a static float value for eps instead of my variable, it returns the clustering I am looking for. Does anyone have an idea what's wrong with my dynamic search? Thanks!
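One thing worth checking is how the float is rendered when map substitutes $eps$, since the token is passed as a string. A hedged sketch that formats the value to a fixed number of decimal places before the map runs (the field name eps and the precision are assumptions):

```spl
... | eval eps=printf("%.3f", eps)
| map search="search index=cisco_prod | timechart span=1h count as logins_hour
    | timewrap w series=short | fields - logins_hour_s6
    | table logins_hour_s* | transpose 0
    | fit DBSCAN \"row *\" eps=$eps$"
```

Comparing the job inspector's expanded search string for the static and dynamic cases should show exactly what value fit DBSCAN actually received.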
I'm building an app where Splunk is receiving a large number of IDs and a property that I will need to sum over time. Let's take a simple example:

_time  id  amount
00:00  A   1
00:01  C   8
01:01  B   2
01:02  A   4

At 01:03, a user asks: "What is the sum of amount per id, for each id you've seen in the last 15 minutes?" Keep in mind this is a very limited example; in reality there will be millions of unique ids and a high event rate (hundreds of events per second). The expected answer is:

A  5
B  2

The first reflex is something like "stats sum(amount) earliest=0 by id [ |search id ]" over the last hour, but that isn't really scalable since, over time, the first part of the query has to sum more and more events. My second thought was to add an intermediary summary index that computes sum(amount) over a small period of time (15 min) and keeps the result. Yes, that accelerates things, but after a year or so I will still have performance/scalability issues. In the end, the only thing I need is to keep the last value of sum(amount) per id and continue counting from that value every time a new event arrives. That's why I'm wondering if we could use the KV store to keep counting: every time we see a record, we simply update it with:

key = id
value = value(id) + amount

Has anyone had a similar experience with the KV store, or faced a similar issue?
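The KV-store idea can be sketched in pure SPL with a KV-store-backed lookup: a scheduled search reads the previous totals, adds the deltas from the last interval, and writes the result back. This is only a sketch under assumptions — the lookup name running_totals and the index/field names are hypothetical, and the lookup (keyed on id) would need to be defined in collections.conf/transforms.conf first:

```spl
index=main sourcetype=my_events earliest=-15m@m latest=@m
| stats sum(amount) as delta by id
| lookup running_totals id OUTPUT total as prev_total
| eval total=coalesce(prev_total, 0) + delta
| table id, total
| outputlookup running_totals append=true key_field=id
```

With append=true and key_field, outputlookup upserts per id rather than replacing the whole collection, so the cost of each run stays proportional to the ids seen in the interval, not to total history.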
Hello Team, we are trying to add inputs and configuration in the Microsoft Azure Add-on for Splunk, but the pages are not opening; they have been buffering for a long time. Can you please suggest a solution and what else I need to do to get the pages to open? Please see the attachment. URL: https://prd-p-dbp3o.splunkcloud.com Regards, kartheek.
Hi, I followed the instructions to set up self-storage: https://docs.splunk.com/Documentation/SplunkCloud/7.2.7/Admin/DataSelfStorage?ref=hk I've confirmed Splunk created the test file in my S3 bucket, but it does not move my index data to the bucket — I don't see any files except the test one. Apart from configuring my index with "Self-storage" and selecting the "storage location", do I need to do something to trigger the move? Any suggestion is appreciated. Thanks.

My index settings: Earliest Event => "10 months ago", Latest Event => "a month ago", Status => "Enabled", Searchable time (days) => 7, Dynamic Data Storage => Self Storage, Self storage location => <my s3 bucket>
Hi, when we used to run the query index=symantec we would get results like: host = dev1pgs01, source = D:\Program Files (x86)\Symantec\Symantec Endpoint Protection Manager\data\dump\scm_system.tmp, sourcetype = sep12:scm_system. We were getting results until 29/08/2018, but now when we run the same query index=symantec it shows no results found. Our findings/actions so far:

1. The Splunk forwarder was not installed, so we installed it.
2. We checked the path D:\Program Files (x86)\Symantec\Symantec Endpoint Protection Manager\data\dump\ but the dump folder is empty; there is nothing like scm_system.tmp.
3. The sourcetype is also missing.

Please help me fix this issue.
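For reference, a monitor stanza of roughly this shape in inputs.conf on the (re)installed forwarder would be needed to pick the file up again — the index name and exact settings here are assumptions. Note that if scm_system.tmp is missing or empty, the root cause is on the SEPM side (its dump/logging settings), not in Splunk:

```ini
# inputs.conf on the SEPM host's universal forwarder (sketch)
[monitor://D:\Program Files (x86)\Symantec\Symantec Endpoint Protection Manager\data\dump\scm_system.tmp]
sourcetype = sep12:scm_system
index = symantec
disabled = 0
```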
I implemented the <colorPalette> tag with conditions for the success and failed scenarios, but it is not affecting the rendered table. visual- <format type="color" field="Stream"> <colorPalette type="expression">case(value="P2O","#51cfc8",value="O2A","#EAD1DC",value="U2C","#93C47D")</colorPalette> </format> <format type="color"> <colorPalette type="expression">case(value="Success","#428C43",value="Failed","#B72D0E",value="NA","#b5b8bd")</colorPalette> </format> source code- <dashboard theme="dark"> <label>Dummy123</label> <init> <unset token="tok_details" /> <unset token="tok_json" /> </init> <search id="API_Base_Search"> <query><![CDATA[index=devops-insights sourcetype="http:davinci_api_metrics" | rex mode=sed "s/\"//g" | rex "name[\:\s]+(?<API_Name>.*?)\s+\((?<Applications>.*?)\)" | rex "steps[\:\s]+\[\s+\{\s+result[\:\s]+\{\s+duration[\:\s]+\w+\,\s+status[\:\s]+(?<APIServiceCall>.*?)\s+\}\,\s+line[\:\s]+\w+\,\s+name[\:\s]+(?<Step1>.*?)\," | rex "keyword[\:\s]+Given\s+\}\,\s+\{\s+result[\:\s]+\{\s+duration[\:\s]+\w+\,\s+status[\:\s]+(?<APIRequest>.*?)\s+\}\,\s+line[\:\s]+\w+\,\s+name[\:\s]+(?<Step2>.*?)\," | rex "keyword[\:\s]+When\s+\}\,\s+\{\s+result[\:\s]+\{\s+duration[\:\s]+\w+\,\s+status[\:\s]+(?<APIResponse>.*?)\s+\}\," | rex "API URL is[\:\s]+(?<API_URL>.*?)\," | rex "API response header should have status code as\s\\\(?<Code>.*?)\\\\," | rex "correlationId\\\\:\\\(?<CorrelationId>.*?)\\\\," | table API_Name, Applications, ,API_URL, APIServiceCall,Step1, APIRequest, Step2, APIResponse, Code, CorrelationId, _time]]></query> <earliest>@d</earliest> <latest>now</latest> </search> <row> <panel> <html> <p /> <style>#user_input th[data-sort-key=IR_Number] { width: 12% !important; } #test th[data-sort-key=Category] { width: 8.3% !important; min-width: 180px; } #test th[data-sort-key=Environment] { width: 8.3% !important; } #test th[data-sort-key=IRNumber] { width: 8.3% !important; } #test th[data-sort-key="6AM to 10AM"] { width: 8.3% !important; } #test 
th[data-sort-key=Comments6AMto10AM] { width: 8.3% !important; } #test th[data-sort-key="10AM to 2PM"] { width: 8.3% !important; } #test th[data-sort-key=Comments10AMto2PM] { width: 8.3% !important; } #test th[data-sort-key="2PM to 6PM"] { width: 8.3% !important; } #test th[data-sort-key=Comments2PMto6PM] { width: 8.3% !important; } #test th[data-sort-key="6PM to 10PM"] { width: 8.3% !important; } #test th[data-sort-key=Comments6PMto10PM] { width: 8.3% !important; } #test th[data-sort-key=Shakeout_Date] { width: 8.3% !important; } #customWidth th[data-sort-key=count] { width: 20% !important; } #test tbody tr td[data-cell-index="0"]{ border-radius: 0px !important; text-align:left !important; } #test tbody tr td[data-cell-index=n]{ border-radius: 0px !important; text-align:left !important; } #test tbody tr td{ border-radius: 8px !important; text-align:left !important; border: 2px solid black; } #test thead tr th{ border-radius: 8px 8px 0px 0px !important; background-color: mediumvioletred !important; text-align:center !important; } #test table { margin:4px !important; border:2px; border-radius:5px 5px 0px 0px; min-width:99% !important }</style> </html> <table id="test"> <title>API Calls Summary</title> <search base="API_Base_Search"> <query>| eval Stream=case(like(API_Name,"Logistics Booking API"),"P2O",like(API_Name,"Tramas Feasbility Check API"),"P2O") | eval Scenario=case(like(API_Name,"Logistics Booking API"),"Add New Mobile",like(API_Name,"Tramas Feasbility Check API"),"Service Migration") | eval ReaponseAPI=case((like(APIServiceCall,"passed") AND like(APIRequest,"passed") AND like(APIResponse,"passed")),"Success",true(),"Failed") | fields Stream,Scenario,CorrelationId, API_Name, , API_URL, ReaponseAPI, _time | eval Hour=strftime(_time, "%H") | eval 08:00=if(Hour==08,ReaponseAPI,"") | eval 09:00=if(Hour==09,ReaponseAPI,"") | eval 10:00=if(Hour==10,ReaponseAPI,"") | eval 11:00=if(Hour==11,ReaponseAPI,"") | eval 12:00=if(Hour==12,ReaponseAPI,"") | eval 
13:00=if(Hour==13,ReaponseAPI,"") | eval 14:00=if(Hour==14,ReaponseAPI,"") | eval 15:00=if(Hour==15,ReaponseAPI,"") | eval 16:00=if(Hour==16,ReaponseAPI,"") | eval 17:00=if(Hour==17,ReaponseAPI,"") | eval 18:00=if(Hour==18,ReaponseAPI,"") | eval 19:00=if(Hour==19,ReaponseAPI,"") | eval 20:00=if(Hour==20,ReaponseAPI,"") | eval 21:00=if(Hour==21,ReaponseAPI,"") | stats values(Stream) as Stream, values(Scenario) as Scenario, values(08:00) as 08:00, values(09:00) as 09:00, values(10:00) as 10:00, values(11:00) as 11:00, values(12:00) as 12:00, values(13:00) as 13:00, values(14:00) as 14:00, values(15:00) as 15:00, values(16:00) as 16:00, values(17:00) as 17:00, values(18:00) as 18:00, values(19:00) as 19:00, values(20:00) as 20:00, values(21:00) as 21:00, values(22:00) as 22:00, by API_Name | fields Stream,Scenario, API_Name, 08:00,09:00,10:00,11:00,12:00,13:00,14:00,15:00,16:00,17:00,18:00,19:00,20:00,21:00,22:00</query> </search> <option name="count">100</option> <option name="dataOverlayMode">none</option> <option name="drilldown">cell</option> <option name="percentagesRow">false</option> <option name="rowNumbers">false</option> <option name="totalsRow">false</option> <option name="wrap">false</option> <format type="color" field="Stream"> <colorPalette type="expression">case(value="P2O","#51cfc8",value="O2A","#EAD1DC",value="U2C","#93C47D")</colorPalette> </format> <format type="color"> <colorPalette type="expression">case(value="Success","#428C43",value="Failed","#B72D0E",value="NA","#b5b8bd")</colorPalette> </format> <drilldown> <set token="tok_API_Name">$row.API_Name$</set> <set token="tok_hour">$click.name2$</set> <set token="tok_details">true</set> <unset token="tok_json" /> </drilldown> </table> </panel> </row> <row> <panel depends="$tok_details$"> <table> <title>Details of the selected $tok_API_Name$ Call</title> <search base="API_Base_Search"> <query>| search API_Name="$tok_API_Name$" | eval currentwindow="$tok_hour$".":00" | eval currenttime=now() | eval 
currenttime=strftime(currenttime, "%d-%m-%Y") | eval starttime_tok=currenttime+"T"+currentwindow | eval starttime_tok=strptime(starttime_tok,"%d-%m-%YT%H:%M:%S") | eval endtime_tok=starttime_tok+3600 | eval slot=if(_time&gt;starttime_tok AND _time&lt;endtime_tok,"Yes", "No") | where slot="Yes" | eval APIServiceCall=Step1." - ".APIServiceCall | eval APIRequest=Step2." - ".APIRequest | eval APIResponse=Code." - ".APIResponse | eval API_Call_Summary=APIServiceCall."#".APIRequest."#".APIResponse | eval API_Call_Summary=split(API_Call_Summary,"#") | fields API_Name, CorrelationId, Applications, API_URL, API_Call_Summary, _time</query> </search> <option name="count">100</option> <option name="dataOverlayMode">none</option> <option name="drilldown">cell</option> <option name="percentagesRow">false</option> <option name="rowNumbers">false</option> <option name="totalsRow">false</option> <option name="wrap">true</option> <format type="color" field="API_URL"> <colorPalette type="sharedList">["0xEAD1DC"]</colorPalette> <scale type="sharedCategory" /> </format> <format type="color" field="API_Name"> <colorPalette type="map" /> </format> <drilldown> <set token="tok_json">true</set> <set token="tok_CorId">$row.CorrelationId$</set> </drilldown> </table> </panel> </row> <row> <panel depends="$tok_json$"> <table> <title>Details of $tok_API_Name$ for CorrelationId $tok_CorId$</title> <search> <query>index=devops-insights sourcetype="http:davinci_api_metrics" | search $tok_CorId$ | table _raw | rename _raw as Event_Details</query> <earliest>$earliest$</earliest> <latest>$latest$</latest> </search> <option name="count">100</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="percentagesRow">false</option> <option name="refresh.display">progressbar</option> <option name="rowNumbers">false</option> <option name="totalsRow">false</option> <option name="wrap">true</option> </table> </panel> </row> </dashboard>  
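One thing to check in the dashboard above: the second <format type="color"> element has no field attribute, so the Success/Failed palette may never be applied to the hour columns at all. A hedged sketch, binding the palette explicitly to one column (it would need to be repeated for each hour column, and note the == comparison, which is the form Splunk's documented colorPalette expression examples use):

```xml
<format type="color" field="08:00">
  <colorPalette type="expression">case(value=="Success","#428C43",value=="Failed","#B72D0E",value=="NA","#b5b8bd")</colorPalette>
</format>
```

A second possibility worth ruling out: the stats values(...) aggregation produces multivalue cells that include empty strings, so the cell value may not equal "Success" exactly even when it displays that way.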
Hi, I am new to Splunk, so pardon me if I make any mistakes or ask simple questions. I need to extract data from XML files only when the XML date parameter is in the current date, and my date field (printed-Timestramp) is in this format: "2020-06-20T01:23:23.693-0700". I tried the query below; now I need to pass the XML parameter printed-Timestramp. Please correct me on the best way to get the result:

| makeresults
| eval substrng=strptime(substr("2020-06-20T01:23:23.693-0700",1,10),"%Y-%m-%d")
| eval compare=now()
| where compare<substrng
| fields + substrng, compare

Below is the reference from my XML file:
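One hedged way to compare the event's date with today is to parse the full timestamp (including milliseconds and timezone offset) and compare day strings. The field name printed-Timestramp is taken from the question as-is, and needs single quotes in eval because of the hyphen:

```spl
| eval event_day=strftime(strptime('printed-Timestramp', "%Y-%m-%dT%H:%M:%S.%3N%z"), "%Y-%m-%d")
| eval today=strftime(now(), "%Y-%m-%d")
| where event_day == today
```

Comparing formatted day strings avoids the pitfall in the original sketch, where compare<substrng tests a full epoch timestamp against midnight and so is almost never true for today's events.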
Hi, I am trying to connect to a database using DB Connect. When I choose the input type Batch, I am able to fetch from the database and see the logs. But when I choose Rising, with rising column access_timestamp, I get this error: org.postgresql.util.PSQLException: No value specified for parameter 1. Please let me know how to proceed further.
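For context, a rising-column input executes its SQL with a ? placeholder that DB Connect fills in with the stored checkpoint value, so this error usually means the query is missing the placeholder or no initial checkpoint value was set. A sketch of the expected query shape, with a hypothetical table name:

```sql
SELECT * FROM access_log
WHERE access_timestamp > ?
ORDER BY access_timestamp ASC
```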
Hi, I am trying to download Splunk for a Windows machine to practice the lab sessions in the Splunk Fundamentals tutorials. The download option is not working; it says the link is down or moved permanently. Please help.
Hello community, a question came up about how IP geodata information is provided. I came across the app https://splunkbase.splunk.com/app/3022/ Internet Service Provider (ISP), but I had difficulties setting it up so that it works correctly. The README recommends editing the geoipupdate conf, but the example specifies no password and no user: https://github.com/splunk/seckit_sa_geolocation/blob/master/package/default/inputs.conf Where do I request credentials for MaxMind? When the search starts, it throws an error about the script '/opt/splunk/etc/apps/SecKit_SA_geolocation/bin/SecKit_geo_lookup.py'.
Dear all, I am using a network monitoring sensor (a Linux machine) and have deployed a universal forwarder on it. What I am looking for is to ingest IPFIX data directly from the incoming interface on this sensor (eth0), or from a directory (file), and send this data to the indexer. Looking at the Splunk Stream documentation, I can't find a proper way to solve this problem. Looking forward to reading from you soon.
Hi, I am using Splunk to monitor our REST API calls. The search index=prod-* "WEBSERVICES CALL ENDED" gives me results, but I want to get only the results where time > 5000 ms, or get the slowest API responses by the time field. How can I do it?
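Assuming a numeric field named time (in ms) is already extracted from these events, both variants can be sketched as follows — the first filters on the threshold, the second returns the ten slowest calls:

```spl
index=prod-* "WEBSERVICES CALL ENDED"
| where time > 5000

index=prod-* "WEBSERVICES CALL ENDED"
| sort - time
| head 10
```

If time is not yet an extracted field, a rex stage would be needed first to pull it out of the raw event.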
I want to automate installing an app in a Splunk instance, and in trying that (which I did not get working), I came across this: I created a local Docker Splunk instance by running: docker run -d \ -p 8000:8000 \ -p 8089:8089 \ -p 8088:8088 \ -p 8191:8191 \ -p 9887:9887 \ -e "SPLUNK_START_ARGS=--accept-license" \ -e "SPLUNK_PASSWORD=PaSSWorD_FoR_SpLuNk" \ --name splunk \ splunk/splunk:latest First, I got my SplunkBase token: curl -k -XPOST https://splunkbase.splunk.com/api/account:login/ -d 'username=andycensys&password=MyPassword' <?xml version="1.0" encoding="utf-8"?> <feed xmlns="http://www.w3.org/2005/Atom"> <title>Authentication Token</title> <updated>2020-06-21T15:35:36.625401+00:00</updated> <id>MyToken</id> </feed> Having that, I then tried to have my Splunk instance install the Censys app: curl -k -XPOST \ -u admin:PaSSWorD_FoR_SpLuNk \ https://localhost:8089/services/apps/local/ \ -d name=censys \ -d filename=false \ -d auth=MyToken \ -d update=true That creates an app called "censys" but does not install the app. Opening the http://localhost:8000/en-US/manager/launcher/apps/local page to list the apps, the entry for the "censys" app indicates that it knows there is a new version available. It looks like: Name Folder name Version Update checking censys censys 1.0.0 Overwrite with 1.0.18 Yes That looks promising, sort of. However, clicking on the "Overwrite" link doesn't work. In web_service.log: 2020-06-21 15:55:45,082 INFO [5eef8301147f1cc8112210] error:321 - Masking the original 404 message: 'The path '/en-US/manager/appinstall/' was not found.' with 'Page not found!' for security reasons That makes me think that upgrades won't work in general, not just my attempt to install an app from the store.
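For what it's worth, the documented way to actually install (rather than merely register) an app through services/apps/local is to upload the package itself: download the .tgz from Splunkbase first, copy it into the container, and pass its server-side path with filename=true. A sketch under those assumptions (the package filename and /tmp path are hypothetical):

```shell
# after downloading the package from Splunkbase:
docker cp censys-app.tgz splunk:/tmp/censys-app.tgz
curl -k -u admin:PaSSWorD_FoR_SpLuNk \
  https://localhost:8089/services/apps/local \
  -d filename=true \
  -d name=/tmp/censys-app.tgz \
  -d update=true
```

This sidesteps the appinstall web page entirely, which is why the 404 on /manager/appinstall/ wouldn't matter for automation.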
What I want to do is pass a start/end time to a table from my line chart. On my line chart, if I click a time in the chart, it passes the clicked time perfectly. I'd like to pass that end time and then create a start time that is 5 days earlier, as tokens, to drill down to a time frame. If I use the drilldown editor and use an eval to set time-432000 (5DAYBEFORE), the eval doesn't work (I get "No results found"). If I convert my 5DAYBEFORE to human-readable and table it, it shows exactly the date I want to see, but if I use the token in the time picker, something goes wrong. I can't really see anything in the documentation to help with this example. I was hoping I could click twice and get earliest and latest and pass those two to my table. Is there an easy way to pass the drilldown time token (the clicked time) together with an eval'ed time to another panel as start/end time? My approach seems to create those times perfectly; it's just that the target table won't accept an eval to set time-432000 (5DAYBEFORE).
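In Simple XML, this can often be done by computing both tokens with <eval> elements inside the drilldown, since $click.value$ on a time axis is epoch seconds. A sketch (the token names are arbitrary), with the target panel consuming the tokens as its time range:

```xml
<drilldown>
  <eval token="tok_latest">$click.value$</eval>
  <eval token="tok_earliest">$click.value$ - 432000</eval>
</drilldown>

<!-- in the target table's search -->
<earliest>$tok_earliest$</earliest>
<latest>$tok_latest$</latest>
```

Setting the tokens on the target search directly, rather than on the shared time picker, avoids the time picker's expectation of relative-time strings.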
WARN CMSlave - Failed to register with cluster master reason: failed method=POST path=/services/cluster/master/peers/?output_mode=json master=clustermaster.domain.com:8089 rv=0 gotConnectionError=0 gotUnexpectedStatusCode=1 actual_response_code=500 expected_response_code=2xx status_line="Internal Server Error" socket_error="No error" remote_error=Cannot add peer=<IP> mgmtport=8089 (reason: bucket already added as clustered, peer attempted to add again as standalone. guid=F4204358-8FF9-4DD2-A09B-A0B51735559B bid= catch_all~747~F4204358-8FF9-4DD2-A09B-A0B51735559B). [ event=addPeer status=retrying 
Hi, can anyone explain what happens when we keep the association of a correlation search as none/blank? Thanks, Praveen
Hello, does somebody know of a ready-made app, or anything similar, to parse Dell iDRAC syslog messages?