All Topics


Copied the DB Connect local folder and metadata folder from a production DB Connect server to a test server, but none of the inputs are showing up in the GUI. I verified that the db_inputs.conf file has the entire list of inputs, and that local.meta has all the db_inputs as well. The connections and identities show up and are visible, but the inputs are not. Is there something I'm missing here?
We have two controllers installed with HA: Primary and Secondary. A failover happened recently and the Controller is now running on the Secondary, so by role my Secondary node is acting as the Primary Controller. I want to replicate the data from the node in the Primary role to the node in the Secondary role. Where should we run ./replicate.sh: on the node acting in the Primary role, or on the node acting in the Secondary role? I tried to run the script on the node which is acting as secondary and get the error below.

mv: cannot move `/opt/AppDynamics/Controller/logs/replicate.log' to `/opt/AppDynamics/Controller/logs/replicate.log.2021-01-22.11:30:52': Read-only file system
I have a dashboard which has 11 rows, and each row has 4 panels. Of the 11 rows, 5 rows belong to one application and the other 6 rows belong to another application. I want to group the 5 rows under the name "Application A" and the 6 rows under the name "Application B", within the same dashboard.
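For questions like this one: classic Simple XML has no native "row group" element, so a common workaround is an HTML panel used as a section header above each group of rows. A minimal sketch; the panel contents and searches below are placeholders, not taken from the dashboard in question:

```xml
<dashboard>
  <label>Combined Dashboard</label>
  <!-- HTML panels used as section headers; Simple XML has no row-group tag -->
  <row>
    <panel>
      <html><h2>Application A</h2></html>
    </panel>
  </row>
  <row>
    <!-- first of the 5 Application A rows; its 4 panels go here -->
    <panel>
      <title>Panel A1</title>
      <single>
        <search><query>index=_internal | stats count</query></search>
      </single>
    </panel>
  </row>
  <!-- ...remaining Application A rows... -->
  <row>
    <panel>
      <html><h2>Application B</h2></html>
    </panel>
  </row>
  <!-- ...the 6 Application B rows... -->
</dashboard>
```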
@elliscj and I will be upgrading Splunk Enterprise from 7.1.3 to 8.1.1. The last time we did an upgrade, we had an onsite 'Splunker', and she took care of the upgrade for us. Does anyone have the upgrade path from 7.1.3 to 8.1.1? Any gotchas that we should be aware of? Any advice to give us?
The question is twofold.

Question 1: here is a sample log.

|>messageType|2020-02-2 14:01:55.995|094a786b-4d07-498c-9c26-685aa4119a8f|unique_id|dir|not_unique|time|trxn|<?XML data>|

messageType and dir are the interesting fields in Splunk. Here is my query:

index=sample_index source="source_1" dir=In messageType=Web | rex field=_raw "^(?:[^\|\n]*\|){8}(?P<transactions>[^\|]+)"

This query works for a single value, like trxn here, but how do I get two values, trxn and time? I am looking for a chart/table with avg(time) by trxn.

Question 2: the last part of the log above is XML data. Here is a sample:

<?xml version="1.0" encoding="utf-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <ns3:trxn xmlns:ns2="url1" xmlns:ns3="url2">
      <ResponseCode>OK</ResponseCode>
    </ns3:trxn>
  </soapenv:Body>
</soapenv:Envelope>

How can I get trxn and ResponseCode here?
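A possible approach for both parts, assuming the field positions match the sample (7th pipe-delimited field = time, 8th = trxn; adjust the repeat count if not): capture both fields in one rex, then use spath on the XML tail. A sketch, not a tested answer for this data:

```spl
index=sample_index source="source_1" dir=In messageType=Web
| rex field=_raw "^(?:[^\|\n]*\|){6}(?P<time>[^\|]+)\|(?P<trxn>[^\|]+)"
| rex field=_raw "(?<xml_payload><\?xml.*)"
| spath input=xml_payload output=ResponseCode path=soapenv:Envelope.soapenv:Body.ns3:trxn.ResponseCode
| stats avg(time) AS avg_time, count BY trxn, ResponseCode
```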
Hi All, I am working on DB Connect (MySQL connection). I have a table with a column as below:

DATA
DummyValue Host= dummyhost123 |User= dummyuser

I need to extract the values contained in the above column and split them into 3 columns:

Column1: DummyValue
Column2: dummyhost123
Column3: dummyuser

Please help. NOTE: I tried using REGEXP_SUBSTR but it showed the error "REGEXP_SUBSTR is not recognised as a built in function name".
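One workaround, sketched here, is to do the split on the Splunk side with rex rather than in SQL (MySQL only gained REGEXP_SUBSTR in 8.0; on older versions SUBSTRING_INDEX is the usual substitute). The connection and table names below are placeholders:

```spl
| dbxquery connection=my_mysql_connection query="SELECT DATA FROM my_table"
| rex field=DATA "^(?<Column1>\S+)\s+Host=\s*(?<Column2>\S+)\s*\|\s*User=\s*(?<Column3>\S+)"
| table Column1 Column2 Column3
```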
I have events that are being ingested in JSON format. Two of the fields are comma-separated lists of MAC and IPv4 addresses. I would like to convert those fields to multivalue fields at index time.

Example input event:

{"foo": "bar", "host_ip": "122.33.44.120,85.30.248.114,64.4.28.230", "baz": "biz", "mac": "11:22:33:44:55:66,AA:BB:CC:DD:EE:FF,A1:A2:A3:A4:A5:A6", "oof": "rab"}

Example SPL query:

index="foo" sourcetype=mymultivaluesourcetype | table mac

Desired output:

mac
11:22:33:44:55:66
AA:BB:CC:DD:EE:FF
A1:A2:A3:A4:A5:A6

Actual output:

mac
11:22:33:44:55:66,AA:BB:CC:DD:EE:FF,A1:A2:A3:A4:A5:A6

props.conf:

[mymultivaluesourcetype]
REPORT-mv_host_ip = mvhostip
REPORT-mv_mac = mvmac

transforms.conf:

[mvhostip]
SOURCE_KEY = field:host_ip
REGEX = (?<host_ip>\d+\.\d+\.\d+\.\d+)
FORMAT = host_ip::$1
MV_ADD = true
REPEAT_MATCH = true

[mvmac]
SOURCE_KEY = field:mac
REGEX = (?<mac>\w+\:\w+\:\w+\:\w+\:\w+\w+\:\w+)
FORMAT = mac::$1
MV_ADD = true
REPEAT_MATCH = true
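Worth noting: REPORT- extractions in props.conf are applied at search time, not index time, so this setup cannot produce indexed multivalue fields. If search-time splitting is acceptable, a simpler sketch that avoids transforms entirely is eval's split():

```spl
index="foo" sourcetype=mymultivaluesourcetype
| eval mac=split(mac, ","), host_ip=split(host_ip, ",")
| table mac
```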
Here is what I've done. How do I break out the results into individual software entries correctly in Splunk? Any tips would be helpful. Here is the regex being used for software_name and software_version:

| rex max_match=100 field=pluginText "\n+(?<software_name>[^[].*)\s\s\[version\s\d"
| rex max_match=100 field=pluginText "\s\s\[version\s(?<software_version>[^[]*.)\]"
| stats values(software_name) as software_name values(software_version) as software_version by dest

Here is the text being rex'd into field values:

<plugin_output>
The following software are installed on the remote host :
McAfee Agent [version 5.6.6.232]
Mozilla Firefox 84.0.2 (x64 en-US) [version 84.0.2]
Mozilla Maintenance Service [version 84.0.2]
The following updates are installed :
Microsoft Visual C++ 2010 x64 Redistributable - 10.0.40219 :
KB2151757 [version 1] [installed on 3/23/2020]
KB2467173 [version 1] [installed on 3/23/2020]
KB2565063 [version 1] [installed on 9/10/2020]
KB982573 [version 1] [installed on 3/23/2020]
Microsoft Visual C++ 2010 x86 Redistributable - 10.0.40219 :
KB2151757 [version 1] [installed on 3/23/2020]
KB2467173 [version 1] [installed on 3/23/2020]
KB2565063 [version 1] [installed on 3/23/2020]
KB982573 [version 1] [installed on 3/23/2020]
</plugin_output>

I want to break out the results into individual lines with the host repeated, but I don't know where to start. I tried the mvexpand function, but it doesn't keep the correct pairs of data together.
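A common pattern for keeping name/version pairs aligned when expanding (a sketch that assumes the two rex calls return their matches in the same order) is mvzip before mvexpand:

```spl
| eval pair=mvzip(software_name, software_version, " :: ")
| mvexpand pair
| eval software_name=mvindex(split(pair, " :: "), 0),
       software_version=mvindex(split(pair, " :: "), 1)
| table dest software_name software_version
```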
I'm working on the initial setup of a Splunk single instance on prem and I haven't been able to get data in yet. I have installed the universal forwarder on 2 Windows servers and installed the add-on for Windows on those servers. I get this message in the monitoring console:

ulimits.data_segment_size (current / recommended): -1
ulimits.open_files (current / recommended): 4096 / 64000
ulimits.user_processes (current / recommended): 47318 / 16000

Then when I log onto the CentOS server and look at the ulimits, they are set to the recommended minimum values. How can I get Splunk Web to recognize how these settings are set on the server?
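One frequent cause (a guess, since the symptom matches): the limits a login shell reports are not necessarily the limits splunkd runs with. If splunkd is started by systemd, PAM limits from /etc/security/limits.conf do not apply and the limits must be set in the unit instead. A sketch of a systemd drop-in; the unit name may differ on your install:

```ini
# /etc/systemd/system/Splunkd.service.d/limits.conf (drop-in; unit name assumed)
[Service]
LimitNOFILE=64000
LimitNPROC=16000
LimitDATA=infinity
```

followed by systemctl daemon-reload and a splunkd restart.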
Hi, I have data where I want to read the comment lines and store their values in fields. For example, I have a log where the first 4 lines are commented fields for Version, Date, System, Software:

#Version: 1.0
#Date: 2020-04-18 11:10:15
#System: 10.244.32.81 - SCWSA-7HBA-0001.nbnco.local
#Software: ABC for Web 11.8.0-414

My question: I have 4 fields in the data model for ver, date, system, software, and I want to store this commented data in those fields. How do I write the regex expression for this so that I can see the values in the data model for these commented lines?
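A sketch of search-time extractions, assuming the four comment lines land inside a single event (the (?m) flag lets ^ match at each line start); the same patterns could back EXTRACT- entries in props.conf feeding the data model fields:

```spl
| rex field=_raw "(?m)^#Version:\s+(?<ver>.+)$"
| rex field=_raw "(?m)^#Date:\s+(?<date>.+)$"
| rex field=_raw "(?m)^#System:\s+(?<system>.+)$"
| rex field=_raw "(?m)^#Software:\s+(?<software>.+)$"
```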
I have the following data in a CSV file and need to suppress the last one or two columns. Please suggest how to do that. The 1st row contains the header information and comma is the delimiter. I want to suppress/nullify the "myField" column and its associated value. I tried with props.conf and transforms.conf, but it is not working.

BusTargetId,HubId,MsgType,Priority,"Req_RespId","Source_IP","_raw","_time","command_type","endpoint_type",eventtype,gwTimestamp1,host,index,linecount,punct,source,sourcetype,"splunk_server","status_code",timestamp,"myField"
4DED32483ECD428A,100004030,Alert,2,rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7,"2001:470:b:654:225:fe8c:d410","Alert,rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7,2001:470:b:654:225:fe8c:d410,100004030,4DED32483ECD428A,ECHO,OUTAGE,2,1438432955000,,,SUCCESS, ","2020-06-12T12:08:53.000+0000",OUTAGE,ECHO,,1438432955000,uMhtVAppDPj11,"service_audit",1,",,::::::,,,,,,,,,,_","outage.txt",SMWAN,uMhtVAppDPj11,SUCCESS,none,name1
4DED32483ECD428A,100004031,Alert,2,rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7,"2001:470:b:654:225:fe8c:d410","Alert,rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7rQtuPiSZF7,2001:470:b:654:225:fe8c:d410,100004031,4DED32483ECD428A,ECHO,OUTAGE,2,1438432955000,,,SUCCESS, ","2020-06-12T12:08:53.000+0000",OUTAGE,ECHO,,1438432955000,uMhtVAppDPj11,"service_audit",1,",,::::::,,,,,,,,,,_","outage.txt",SMWAN,uMhtVAppDPj11,SUCCESS,none,name2

Appreciate your help!!
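If the goal is to blank the trailing column before it is indexed, one option (a sketch; the sourcetype name is a placeholder, and it assumes the myField values never contain commas) is a SEDCMD in props.conf on the indexer or heavy forwarder:

```ini
# props.conf
[my_csv_sourcetype]
# replace everything after the final comma with nothing
SEDCMD-null_last_column = s/,[^,]*$/,/
```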
Hi all! I followed the instructions in the docs for enabling the HTTP Event Collector as well as setting up a token, but the monitoring dashboard for HEC says I don't have any tokens configured. Has anybody else run into this?
I am looking to completely remove data from an index after 30 days. I'm looking into using frozenTimePeriodInSecs to set the threshold, but from what I've read this applies to an entire bucket. I just need some confirmation of my understanding before moving forward:

- Does frozenTimePeriodInSecs only apply to warm or cold buckets? If I were to set this without configuring warm or cold buckets, would it not touch the hot bucket, since Splunk does not freeze data directly from there?
- If so, am I correct to assume that I also need to set the maxHotSpanSecs AND maxDataSize parameters slightly lower (like a day) so frozenTimePeriodInSecs can then freeze from the warm bucket and delete?

Thank you in advance
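For reference, a minimal indexes.conf sketch (the index name is a placeholder). Note that frozenTimePeriodInSecs is compared against the newest event in each bucket, so a bucket is frozen only once all of its events have passed the threshold; rolling hot buckets more often keeps each bucket's time span narrow:

```ini
# indexes.conf
[my_index]
frozenTimePeriodInSecs = 2592000   # 30 days
maxHotSpanSecs = 86400             # roll hot buckets roughly daily
# with no coldToFrozenDir/coldToFrozenScript set, frozen buckets are deleted
```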
Hi, using Workload Management - Admission Rules, I cannot embed a URL or list a URL in the User Message since it does not accept "/". Is there a workaround? I want to provide users a link to an internal wiki page if they encounter a rule. Thanks, Chris
Hi, I am trying to send values from one panel to another dashboard using a drilldown. Is it possible to split the value and then send it?

I have fields named "host: Running", "service: Running", "URL: Running" in the main dashboard. These are generated fields, and clicking on any of them should trigger the drilldown. In my drilldown dashboard I have a dropdown, and I want only the "host" part from the main dashboard.

Drilldown dropdown (current value): service: Running
Expected dropdown value: service

Basically I want to split the value in the main dashboard. My drilldown form token: form.token=$click.name2$
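One way to do this (a sketch; the target dashboard path and token name are placeholders) is an <eval> inside the panel's <drilldown>, which can transform the clicked value before it is passed to the target form:

```xml
<drilldown>
  <!-- take only the part before ":" from values like "service: Running" -->
  <eval token="field_only">trim(mvindex(split($click.name2$, ":"), 0))</eval>
  <link target="_blank">/app/search/my_drilldown_dashboard?form.token=$field_only$</link>
</drilldown>
```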
I have created a chart with date and end time. I need to chart the end times, but it doesn't work as a visualisation. I need to put the start datetime on the X axis (start date would be fine, actually) and the end datetime as the data points (Y axis). Any help gratefully received; the words "time" and "chart" surface too many unrelated issues in a Splunk base search.

index=billing sourcetype=billing_invoice_poller HikariPool "Start completed."
| bin _time span=1d as day
| convert timeformat="%Y-%m-%d %H:%M:%S" ctime(_time) AS c_time
| stats earliest(c_time) as Start_Time by day
| appendcols
    [search index=billing sourcetype=billing_invoice_poller HikariPool "Shutdown completed."
    | bin _time span=1d as day
    | convert timeformat="%Y-%m-%d %H:%M:%S" ctime(_time) AS d_time
    | stats latest(d_time) as End_Time by day]
| eval it = strptime(Start_Time, "%Y-%m-%d %H:%M:%S.%3N")
| eval ot = strptime(End_Time, "%Y-%m-%d %H:%M:%S.%3N")
| eval diff = tostring((ot - it), "duration")
| eval diff1 = strptime(diff, "%H:%M:%S")
| eval Total_Run_Time = strftime(diff1, "%H:%M:%S")
| sort - Start_Time
| table Start_Time, End_Time
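Charts need a numeric Y axis, so a string timestamp like End_Time will not plot. One sketch is to convert the end time to seconds after midnight before charting (the search terms are copied from the question; the conversion itself is the suggestion):

```spl
index=billing sourcetype=billing_invoice_poller HikariPool "Shutdown completed."
| bin _time span=1d as day
| stats latest(_time) as end_epoch by day
| eval end_secs_after_midnight = end_epoch - relative_time(end_epoch, "@d")
| eval day = strftime(day, "%Y-%m-%d")
| chart values(end_secs_after_midnight) by day
```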
In my Search 1, I list all unique port numbers associated with a certain IP address, i.e. 1.2.3.4:

"MYTOKEN is: fcd4e600-eda2-4ee0-a3b3-093562f49c2e" | rex "1.2.3.4:(?<ipport>.*?) " | dedup ipport | table ipport

Then I'd like to concatenate those ports into one long string delimited with ",", that is, "57432,57453,57198", and finally this concatenated string will be used in another search, i.e.

"https_client-init <HTTP_REQUEST>: " | rex "2.3.4.5:(?<port>.*?) " | search port IN([search "MYTOKEN is: fcd4e600-eda2-4ee0-a3b3-093562f49c2e" | rex "1.2.3.4:(?<ipport>.*?) " | dedup ipport | table ipport])

It would be really appreciated if someone could shed light on how this can be solved. Thanks in advance!
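A subsearch already hands its results back as (field=value OR field=value ...), so the IN() wrapper and the comma-joined string are not needed; renaming ipport to port inside the subsearch is enough for the outer search to match. A sketch:

```spl
"https_client-init <HTTP_REQUEST>: "
| rex "2.3.4.5:(?<port>.*?) "
| search
    [ search "MYTOKEN is: fcd4e600-eda2-4ee0-a3b3-093562f49c2e"
      | rex "1.2.3.4:(?<ipport>.*?) "
      | dedup ipport
      | rename ipport AS port
      | fields port ]
```

If the literal comma-joined string is still wanted elsewhere, `stats values(ipport) as ports | eval ports=mvjoin(ports, ",")` builds it.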
I have a log in Tomcat like this:

... [MerchantEndPoint]: saveMerchantDetails():ednpoint execution enterd ..
... [CreditEndPoint]: saveCreditDetails():ednpoint execution enterd -..

I want to create a chart, based on the entry logs, of how many times each service gets called per day. I have created a regex with the query below, but it's not giving the correct result, even though it works fine in a regex editor:

index=fg_wv_li | sourcetype="fg:mylogs.txt" ":endpoint execution started" | rex field=_raw "\b(?<stype>(\[]a-zA-Z]+\][:]))" | chart count by stype

I want servicename: method name : count per day, e.g.:

[MerchantEndPoint]: saveMerchantDetails(): 10
[CreditEndPoint]: saveCreditDetails(): 15

Can someone help me fix the query above? Thanks.
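Two likely problems in that query: the character class is missing its opening bracket (\[]a-zA-Z]+ was presumably meant to be \[[a-zA-Z]+), and the pipe between index and sourcetype turns sourcetype into a command. A corrected sketch; the quoted search string is taken from the query, not the log sample, so adjust it to whatever the events actually contain:

```spl
index=fg_wv_li sourcetype="fg:mylogs.txt" "endpoint execution"
| rex field=_raw "\[(?<service>[A-Za-z]+)\]:\s*(?<method>[A-Za-z]+\(\))"
| eval stype = "[".service."]: ".method
| timechart span=1d count by stype
```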
Hello Team, we are using the "collect" command by constructing a search that returns the data we want to copy/update and piping the results to an index with collect, as below:

| inputlookup XXX.csv | collect index=Index_file

We are sending it with the HTTP POST method, and we can see the above-mentioned entries in "search" history for validation. But with collect, the mentioned Index_file index is not updating instantly. It updates abruptly and inconsistently, delayed by some hours, sometimes 1 or 2 days. Any suggestion or valid reason is greatly appreciated. Thanks, Nirav
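One possible cause (a guess, not a diagnosis): events written by collect carry timestamps, and if the lookup rows have no _time the summary events can land outside the time range being searched, which looks like a long delay. A sketch that stamps each row explicitly before collecting:

```spl
| inputlookup XXX.csv
| eval _time=now()
| collect index=Index_file
```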
Hi everyone, I have a specific question for all of you. In Splunk ES I created a correlation search and a notable for the monitoring Incident Review section. I have set up the notable with a drilldown to which I pass a field of the correlation search (CS) in order to perform a specific search and display it via the Statistics tab.

Correlation search:

index=* (statusCode=4* OR statusCode=5*)
| rename "requestTime" as Time, "statusCode" as Status, "sourceIp" as SourceIp, "httpMethod" as HttpMethod, "endpointRequestId" as "EndpointReqID"
| stats values(Status) as Status, values(HttpMethod) as HttpMethod, count by index, SourceIp, EndpointReqID

Notable drilldown:

index=* (statusCode=4* OR statusCode=5*)
| search sourceIp="$sourceIp$"
| rename "requestTime" as Time, "statusCode" as Status, "sourceIp" as SourceIp, "httpMethod" as HttpMethod, "endpointRequestId" as "EndpointReqID"
| stats values(Status) as Status, values(HttpMethod) as HttpMethod, count by index, SourceIp, EndpointReqID

When I open the drilldown from the Notable screen, the following query is returned:

index=* (statusCode=4* OR statusCode=5*)
| search sourceIp="$sourceIp$"
| rename "requestTime" as Time, "statusCode" as Status, "sourceIp" as SourceIp, "httpMethod" as HttpMethod, "endpointRequestId" as "EndpointReqID"
| stats values(Status) as Status, values(HttpMethod) as HttpMethod, count by index, SourceIp, EndpointReqID

Instead of:

index=* (statusCode=4* OR statusCode=5*)
| search sourceIp="129.12.x.x"
| rename "requestTime" as Time, "statusCode" as Status, "sourceIp" as SourceIp, "httpMethod" as HttpMethod, "endpointRequestId" as "EndpointReqID"
| stats values(Status) as Status, values(HttpMethod) as HttpMethod, count by index, SourceIp, EndpointReqID

Why is the $sourceIp$ field not recognized and replaced with the IP address from the CS so that it can perform a specific search? What is the error? Thank you all!
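Drilldown tokens in Incident Review are substituted only from fields that actually exist on the notable event. The correlation search above renames sourceIp to SourceIp before stats, so its results contain no sourceIp field and $sourceIp$ has nothing to bind to. One sketch of a fix is to keep the original field name through the stats, so the token can resolve:

```spl
index=* (statusCode=4* OR statusCode=5*)
| stats values(statusCode) AS Status, values(httpMethod) AS HttpMethod, count BY index, sourceIp, endpointRequestId
| rename endpointRequestId AS EndpointReqID
```

The drilldown can then keep using | search sourceIp="$sourceIp$", since the notable now carries a field named exactly sourceIp.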