All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

How do you associate a machine agent with an application?
How can I change the starting point? Current How can I express it like this?
This project is for our badge system. I currently have a dashboard panel that shows how many badges have been used in a day, giving us a ballpark figure for how many people are in the office. I am now working on showing how many people are on the different floors, which I have. The one feature I still need is for the count to change as a person moves from floor to floor. For example: if there are 2 people on the first floor and 5 people on the second floor, and one of the 5 people on the second floor badges in on the first floor, I want the dashboard to deduct 1 from the 5 and add that person to the latest badged floor, which in this case is the first floor. I can get a count but don't know how to link them. Any thoughts? Thanks
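A common SPL pattern for this kind of "latest location wins" count is to take each person's most recent badge event and then count people per floor. A minimal sketch, where the index and field names (badge_events, badge_id, floor) are assumptions to adapt to the real data:

```spl
index=badge_events earliest=@d
| stats latest(floor) as current_floor by badge_id
| stats count as people by current_floor
```

Because latest(floor) keeps only the newest event per badge_id, a person who badges from the second floor to the first is automatically removed from the second floor's count.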
Hello guys, I've checked version compatibility, but I want to make sure about this: our heavy forwarder is 7.1.2, and the indexer and the other components will be 7.3.x. Is this fully compatible?
Hi, does anyone know if there is a way for transaction with startswith/endswith to take the middle results? Example: I have transaction DESCRIPTION startswith=VALUE="RUN" endswith=VALUE="STOP". In my data there is RUN,STOP,RUN,RUN,RUN,STOP,RUN,STOP,STOP,RUN,STOP. The transaction command works for RUN,STOP, but if there is RUN,RUN,RUN,STOP it will only take the last RUN,STOP pair. Does anyone know a way to get the information from RUN,....,....,STOP, and also, for RUN,STOP,STOP, to get RUN,....,STOP? I hope you all understand what I mean.
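If transaction keeps collapsing consecutive RUNs, one alternative (a sketch, not tested against this data; the index and sourcetype are placeholders) is to build session IDs with streamstats, starting a new session on any RUN that follows a STOP or nothing:

```spl
index=your_index sourcetype=your_sourcetype
| sort 0 _time
| streamstats current=f last(VALUE) as prev_value
| eval new_session=if(VALUE="RUN" AND (isnull(prev_value) OR prev_value="STOP"), 1, 0)
| streamstats sum(new_session) as session_id
| stats earliest(_time) as start latest(_time) as end list(VALUE) as sequence by session_id
```

This groups RUN,RUN,RUN,STOP into one session starting at the first RUN, and keeps trailing STOPs (as in RUN,STOP,STOP) in the same session until the next RUN appears.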
ERROR NetVizAgentRequest - Fatal transport error while connecting to URL [http://127.0.0.1:xxxx/api/agentinfo?timestamp=0&agentType=APP_AGENT&agentVersion=4.5.19]: org.apache.http.conn.ConnectTimeoutException: Connect to 127.0.0.1:xxxx [/127.0.0.1] failed: connect timed out

I have been seeing this error in the agent log. Browser RUM data is OK via injection into the header of the HTML files. Application monitoring shows connected, but no data is received. Using the Java agent and Maven.
Hello Splunkers, can you help me build a Splunk search for the case below? I have firewall data coming into index=firewall, and I need to filter it based on results from my external lookups: the IP fields, as well as matching domain names from the indexed data.

index=firewall | lookup url.csv | fields url | lookup domain.csv | fields domain | ... etc. — any of the matching fields from the indexed data.
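One way to sketch the filtering (the field and column names here are assumptions — adjust them to whatever the lookups and events actually use): bring back a marker field from each lookup with OUTPUT, then keep only events where at least one marker matched:

```spl
index=firewall
| lookup url.csv url OUTPUT url AS url_match
| lookup domain.csv domain OUTPUT domain AS domain_match
| where isnotnull(url_match) OR isnotnull(domain_match)
```

This assumes url.csv and domain.csv are configured as lookup table files and that the indexed events carry url and domain fields.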
I'm not getting anything in my Config input, even though I know events are logging. I started searching _internal and found that if I watch splunk_ta_aws.modinputs.config, I can see the SQS processing. Whenever a message pops up in the queue, Splunk says it's not a Config record. Example:

2020-03-22 19:06:17,736 level=WARNING pid=11190 tid=MainThread logger=splunk_ta_aws.modinputs.config pos=__init__.py:_stream_events:344 | datainput="Config_0bd7b668-1857-4466-9977-1595a2745bf3" | message="Invalid notifications have been removed from SQS : {"Records":[{"eventVersion":"2.1","eventSource":"aws:s3","awsRegion":"us-east-2","eventTime":"2020-03-22T19:06:16.660Z","eventName":"ObjectCreated:Put","userIdentity":{"principalId":"AWS:XXXXXXXXXXXXXX"},"requestParameters":{"sourceIPAddress":"172.18.xx.xxx"},"responseElements":{"x-amz-request-id":"03FF16BD00681075","x-amz-id-2":"redacted"},"s3":{"s3SchemaVersion":"1.0","configurationId":"Put config notice","bucket":{"name":"correct bucket name for config data","ownerIdentity":{"principalId":"XXXXXXXXXXXXX"},"arn":"arn:aws:s3:::correct bucket name for config data"},"object":{"key":"aws-config/AWSLogs/436617320021/Config/ConfigWritabilityCheckFile","size":0,"eTag":"d41d8cd98f00b204e9800998ecf8427e","versionId":"M5RCgK646j6EJPn5MGNeMYhbzRbh8Bge","sequencer":"005E77B7289FBE2097"}}}]}"

I'm not sure how to diagnose from here!
Hi, we have started to experience a line-breaking issue with our CSV source. As a result, Splunk sometimes attempts to read a whole CSV file with 500+ lines as one event containing up to 256 lines. Then these errors occur, and Splunk starts reading the rest of the file correctly, one line per event:

AggregatorMiningProcessor - Changing breaking behavior for event stream because MAX_EVENTS (256) was exceeded without a single event break. Will set BREAK_ONLY_BEFORE_DATE to False, and unset any MUST_NOT_BREAK_BEFORE or MUST_NOT_BREAK_AFTER rules. Typically this will amount to treating this data as single-line only. - data_source="/tmp/tmp-in/tmp/d_tmp_storage_history.csv", data_host="host06", data_sourcetype="d_tmp_storage_history" host = hf_host source = /opt/splunk/var/log/splunk/splunkd.log sourcetype = splunkd

WARN AggregatorMiningProcessor - Breaking event because limit of 256 has been exceeded - data_source="/tmp/tmp-in/tmp/d_tmp_storage_history.csv", data_host="host06", data_sourcetype="d_tmp_storage_history"

The issue seems to have started a few days ago. No known changes were introduced on the Splunk side or on the side that runs the scripts generating the CSV files we monitor. Here is the props.conf we use on the Splunk Universal Forwarder on data_host="host06" that monitors data_source="/tmp/tmp-in/tmp/d_tmp_storage_history.csv":

[ d_tmp_storage_history ]
HEADER_FIELD_LINE_NUMBER = 1
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
INDEXED_EXTRACTIONS=csv
KV_MODE=none
DATETIME_CONFIG=CURRENT

We have checked the CSV file via Excel, Notepad++, and the vi editor for hidden characters (:set list), and with cat -v -t -e, to see if some special unusual character(s) pop up. We haven't found anything unusual. Any advice on which direction to look would be appreciated! Thank you
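One detail worth double-checking (it may just be how the config was pasted): the stanza name above contains surrounding spaces, and a props.conf stanza must match the sourcetype name exactly for its settings to apply. A sketch of the same stanza without the spaces:

```
# props.conf on the universal forwarder
# (the stanza name must match the sourcetype exactly)
[d_tmp_storage_history]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
KV_MODE = none
DATETIME_CONFIG = CURRENT
```

If the stanza was not matching, the file would fall back to default line-merging behavior, which could explain the intermittent 256-line events.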
Hi, I've created an index and want to extract fields from its data. Is that possible through the web interface, or should I edit a specific config file? Thanks.
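For context, both routes exist: the field extractor in Splunk Web (Settings > Fields > Field extractions) ultimately writes the same kind of search-time extraction you could add to props.conf by hand. A minimal hand-written sketch, with a hypothetical sourcetype and regex:

```
# props.conf (hypothetical sourcetype and pattern)
[my_sourcetype]
EXTRACT-user = user=(?<user>\S+)
```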
I can't imagine this is possible, but Splunk continuously surprises me, so I'll ask: is there any way to exclude results from the same host within plus or minus 2 seconds of a match (or N seconds/minutes)? For example, in the image below, I'd like to exclude the results above and below the match on the IP address 68.x.x.x. (This is just an example; I know I could reach my goal in this case by just showing IP matches and investigating any IPs not in a known-good IP lookup CSV.) Thanks
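One pattern seen on Splunk Answers for this (a sketch with hypothetical index and field names; the NOT semantics with inline time terms should be verified on your version): have a subsearch emit a ±2-second window per match and negate it:

```spl
index=web NOT [ search index=web src_ip="68.x.x.x"
    | eval earliest=_time-2, latest=_time+2
    | fields host earliest latest
    | format ]
```

Each subsearch row becomes a (host earliest latest) clause, so events from the same host within 2 seconds of a match would be excluded.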
Dears, I have a question, please: is there a way to get the number of records output from Splunk through Splunk DB Connect via the Oracle driver? Also, is there a way to insert only into the database, without doing an insert & update with each output?
Splunk's dashboards are global. In Japan, there is a blog and data, so I was able to make one (sorry, all the text is Japanese). Here is the GitHub link. Is there data and a visualization for other countries, made with Splunk?
Hi all, with all this work from home, I'm now pulling logs from the VPN equipment. Now leadership is asking to map the UserName to a business unit. Our Active Directory doesn't natively provide that, but it does give 'department'. I've built the lookup to map departments to BusinessUnit, but can't figure out the missing piece.

Lookup (Department, BusinessUnit):

11-000-*     GA Legal
11-1*        GA Security
11-2*        GA HR
11-3*        GA Internal Audit
11-5*        GA Procurement
13-2*        GA ITS
14-*         GA Accounting
15-104*      GA Publications
15-113-000   GA CFO
15-113-001   GA Intl Cntrl
15-180*      GA Treasurer
15-250*      GA Financial Planning
15-350*      GA Treasury
16-1*        GA Facilities
18-4*        Fusion
19-*         EMS
20-505*      Diazyme
51-001*      GA Uranium Res. Co.
6*           ASI
7*           SI

My current search:

eventtype=cisco-ise-passed-authentication Location="Location#All Locations#US#CA#Poway" NAS_Port_Type="Virtual"
| eval UserName=lower(UserName)
| stats dc(UserName) by UserName
| lookup adlookup sAMAccountName as UserName
| table UserName department
| lookup BusinessUnitLookup.csv department as Department OUTPUTNEW BusinessUnit
| stats dc(UserName) by BusinessUnit
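The usual missing piece with wildcard patterns like 11-000-* in a lookup column is that plain CSV lookups do exact matching; wildcards only work through a lookup definition with a WILDCARD match_type. A hedged sketch (the stanza and file names are assumptions):

```
# transforms.conf
[BusinessUnitLookup]
filename = BusinessUnitLookup.csv
match_type = WILDCARD(Department)
max_matches = 1
```

The search would then reference the definition name (| lookup BusinessUnitLookup Department as department OUTPUTNEW BusinessUnit) rather than the .csv file directly; in Splunk Web the same thing is done by creating a lookup definition and setting match_type under its advanced options.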
Hi, I am unable to start the controller service on both nodes. The service is disabled on both servers, and the MySQL service is running and active on both nodes. Thanks, Kishore
Hello everyone, I have a Python script that connects to Splunk, searches with the query I pass through the script, and fetches the results. The script is now working fine and fetching the Splunk results. Next I want to send these results to Prometheus; I have the Prometheus lib installed. Can anyone help me with docs or a script for this?
We are running Splunk v6.6, and I have tried just about every answer on these forums, but I cannot get anything to add to the "Selected Fields" on the left-hand side (beyond the stock defaults of Host, Source, Sourcetype). See the image: I'm trying to add "index" where I have the red line (which should also add it below each search result, i.e., where the 2nd red line is).

The change that makes the most sense (but is having no effect) is this one: add to the file C:\Program Files\Splunk\etc\users\admin\user-prefs\local\ui-prefs.conf:

[default]
display.events.fields = ["host","index","source","sourcetype"]

(from: https://answers.splunk.com/answers/634367/how-do-we-permanently-move-some-interesting-fields.html and from: https://docs.splunk.com/Documentation/Splunk/6.4.4/Admin/Ui-prefsconf)

And then restart Splunk (I am always restarting the Splunk service, via the Splunk web GUI, after each of these changes I try). Another setting I've tried is in C:\Program Files\Splunk\etc\apps\search\local\viewstates.conf, adding:

[flashtimeline:_current]
FieldPicker_0_6_1.fields = host,sourcetype,source,index

(from: https://answers.splunk.com/answers/185864/selected-fields-in-fields-side-bar.html)

However, none of these have any effect, i.e., I still always have the default Host, Source, Sourcetype. Any suggestions? Thanks!
Hi all, I have a table whose cell coloring is done based on a condition, so I have a JavaScript file that handles the color coding of the cells. I also need a specific color for the header of the table. How can I achieve this? If I use CSS, how can I pass two IDs to one table? Please help.
When I see FIELDALIAS-Status_as_Error_Code = Status AS Error_Code, does it mean that Status gets the value of Error_Code, or the other way around?
Building a dashboard dropdown. The following query works fine, and there are no duplicates in the result set; however, the dashboard filter shows the message "Duplicate values causing conflict". When I hover over this message, a popup shows the following:

Error in 'dbxquery command'. This command is not supported in realtime search.

The query is:

| dbxquery query="SELECT DISTINCT last_name FROM hr.employees" connection="ORCL_PDB1_CONN"

Using the Splunk DB Connect app. Thanks in advance for your comments/insights and assistance.