All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Is there a REST command to delete rows from dmc_forwarder_assets.csv? For example, to remove rows where status=missing, or where hostname matches a value provided in the REST call? Thanks in advance.
Hi guys, I have two fields containing fairly random text, with no similarities except the highlighted portions shown below. Is there a way to build a search that excludes an event when the first four letters of FieldA exist in FieldB? This would be very easy in PowerShell or Python. I'm an intermediate Splunk user, but I'm not sure how I'd do this in Splunk. Your help is appreciated in advance.

FieldA          FieldB
Complete        Exch.Complete
NotComplete     apps.NotC@
Satisftesting   Satiapps
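For illustration, a minimal Python sketch of the intended logic, using the sample values from the table above (in SPL the same idea might be expressed with substr() on FieldA and a like() or match() comparison against FieldB):

```python
# Sketch: drop rows where the first four letters of FieldA occur anywhere in FieldB
rows = [
    ("Complete", "Exch.Complete"),
    ("NotComplete", "apps.NotC@"),
    ("Satisftesting", "Satiapps"),
]
kept = [(a, b) for a, b in rows if a[:4] not in b]
# all three sample rows match on their first four letters, so kept == []
```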
These two cells are examples of results I see in IIS logs. If the field is just a / (forward slash), as in the first example cell, I want to return just the /. If there are characters after the first / (like "search" in #2), I need to return the text between the first two slashes.

/
/search/20191108/master.svc

My results would look like this:

field1
root
search
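As an illustration of the extraction (a hedged Python sketch; a rex extraction would capture the equivalent group):

```python
import re

def first_segment(path):
    # Return "/" for a bare slash, otherwise the text between the first two slashes
    if path == "/":
        return "/"
    m = re.match(r"^/([^/]+)", path)
    return m.group(1) if m else None

print(first_segment("/"))                            # "/"
print(first_segment("/search/20191108/master.svc"))  # "search"
```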
Hi all, Windows event logs generate large volumes of data every day, and this excessive ingestion makes the data noisy and difficult to analyze. I need your help understanding how to find events that can be filtered out to reduce the volume of ingested data without losing visibility into the important events that help track security issues. Thank you.
I have a dashboard which shows results based on a dropdown selection. The selected api should display only api_response events that have a corresponding downstream_response as well. The issue is that downstream_response can only be searched using the trackid of api_response. Is there a way to show the api_response data corresponding to the results of downstream_response?
Our organization currently hosts a "Splunk Health" dashboard. This dashboard has a panel that shows the heartbeat interval and overdue status of all the known Splunk forwarders. However, what this dashboard does not have is a panel that can identify any valid endpoint on the network that does not have the Splunk forwarder installed. Problem statement: add a panel to the dashboard that depicts any valid endpoint without the Splunk forwarder installed. I am fairly new to Splunk. However, I thought the best way to tackle this would be to pull a CSV from AD of all valid endpoints (workstations, servers, etc.) and host it as a lookup table file within Splunk, then write a query that compares the current "clients" in Forwarder Management against the new lookup table file (the AD output). The results would show me any endpoint that is not currently a client within Forwarder Management. Any help would be appreciated. Thank you.
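The comparison itself is just a set difference; here is a minimal Python sketch with made-up hostnames (in Splunk this would typically be an inputlookup of the AD export combined with the forwarder data and a filter on what's missing):

```python
# Sketch: endpoints present in the AD export but absent from Forwarder Management
ad_endpoints = {"ws01", "ws02", "srv01", "srv02"}  # from the AD-derived lookup (hypothetical)
forwarder_clients = {"ws01", "srv01"}              # from Forwarder Management (hypothetical)
missing = sorted(ad_endpoints - forwarder_clients)
# missing == ["srv02", "ws02"] -- the endpoints without a forwarder
```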
Hi, I have a panel on Dashboard Studio where no results are displayed and I get the message "Search ran successfully, but no results were returned". When I investigate the same search via "Open in Search", the expected results do get produced. What can I do in this situation? Thanks.
Hi, I have an XML response in the format below. I'm trying to read its BusinessId value. Since there are multiple, I want to read only the first one and use it as part of my report. This is how my query looks:

index=customer app_name="searchservice"
| rex field=msg "BusinessId>(?P<BusinessId>[0-9]*)<\/"
| table Client, MethodName, BusinessId, CorrelationId

The fields Client, MethodName and CorrelationId have already been parsed out. The issue I'm seeing is that if the response XML has multiple entries of BusinessId, it doesn't show up in the result, as with the first two correlation ids. For the next two, the XML had only one instance of BusinessId, so it showed up in the response. How do I fix the regex to parse only the first instance and ignore the rest? Thanks, Arun
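A Python sketch of the "first instance only" behaviour being asked for (the sample XML values are made up); re.search stops at the first match, which mirrors what rex does with its default max_match=1:

```python
import re

# Hypothetical message with two BusinessId entries
msg = "<BusinessId>12345</BusinessId><BusinessId>67890</BusinessId>"

# Only the first occurrence is captured
m = re.search(r"BusinessId>(?P<BusinessId>[0-9]*)</", msg)
print(m.group("BusinessId"))  # "12345"
```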
Hi guys, I am using Splunk UI to develop a new app on Splunk. My app has these components:

- Setup page: lets people provide configuration such as a Splunk token, Splunk HEC token, and proxy config
- App component
- Custom REST endpoint (Python)

From the app component, I make calls to the custom REST endpoint to get and process data. On the setup page, I save all configuration into the KV store for re-use. My questions:

- How can the custom REST endpoint script (Python file) get information from the KV store? As you know, the Python SDK needs a credential to authorize with Splunk. Is there any way to connect to Splunk from the custom REST endpoint script without authentication?
- Which is the better place to save credentials or configuration in Splunk: the KV store, storage/passwords, or a config file?
- If there is no way to authenticate with Splunk from a custom REST endpoint without credentials, how can I pass credentials to the endpoint? Enter them into a config file, or attach them to the request parameters when calling the endpoint?

Thanks! P.S. This is the first time I've seen a big platform with so many versions/options for development; it makes developing confusing (SplunkJS, Splunk SDK, Splunk UI, ...).
I need to create a dashboard with the columns below from the following event data. I can't get the "Status" column value, which is a combination of eventData{}.StatusCount{}.status and eventData{}.StatusCount{}.count. Thanks in advance!

Dashboard columns and expected values:

Date: "2021-10-14"
eventKey: "event.request"
ReceivedCount: 10
ProcessedCount: 10
MismatchCount: 0
Status: DOCUMENT_REQUEST_RECEIVED:10 DOCUMENT_SUCCESS:10 DOCUMENT_NOTIFY_SUCCESS:10

"eventData": [
  {
    "Date": "2021-10-14",
    "eventKey": "event.request",
    "ReceivedCount": 10,
    "ProcessedCount": 10,
    "MismatchCount": 0,
    "StatusCount": [
      { "status": "DOCUMENT_REQUEST_RECEIVED", "count": 10 },
      { "status": "DOCUMENT_SUCCESS", "count": 10 },
      { "status": "DOCUMENT_NOTIFY_SUCCESS", "count": 10 }
    ]
  }
]
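A Python sketch of how the combined Status value could be assembled from the nested StatusCount entries (using the sample event above):

```python
# Sample event as in the question, reduced to the relevant parts
event = {
    "Date": "2021-10-14",
    "eventKey": "event.request",
    "StatusCount": [
        {"status": "DOCUMENT_REQUEST_RECEIVED", "count": 10},
        {"status": "DOCUMENT_SUCCESS", "count": 10},
        {"status": "DOCUMENT_NOTIFY_SUCCESS", "count": 10},
    ],
}

# Join each status:count pair into one display string for the Status column
status = " ".join(f"{s['status']}:{s['count']}" for s in event["StatusCount"])
print(status)  # DOCUMENT_REQUEST_RECEIVED:10 DOCUMENT_SUCCESS:10 DOCUMENT_NOTIFY_SUCCESS:10
```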
Sample JSON:

Hosts: {
  Nodepool1: {
    Cluster: xyz1
    Accountid: idxyz
  }
  Nodepool3: {
    Cluster: xyz1
    Accountid: idxyz
  }
  Nodepool5: {
    Cluster: xyz1
    Accountid: idxyz
  }
}

I am trying the query below; it displays the list of servers but randomly misses a few. Please correct the query if I am missing something.

index=index1
| eval cluster=""
| foreach hosts.*.cluster
    [| eval cluster=isnotnull('<<FIELD>>'),'<<FIELD>>,cluster)]
| table cluster
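For reference, the intended result of the foreach, sketched in Python against the sample structure (every nodepool should contribute one cluster value):

```python
# Sample structure as in the question
hosts = {
    "Nodepool1": {"Cluster": "xyz1", "Accountid": "idxyz"},
    "Nodepool3": {"Cluster": "xyz1", "Accountid": "idxyz"},
    "Nodepool5": {"Cluster": "xyz1", "Accountid": "idxyz"},
}

# One cluster entry per nodepool, none skipped
clusters = [pool["Cluster"] for pool in hosts.values()]
# clusters == ["xyz1", "xyz1", "xyz1"]
```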
Index A has table1 and index B has table2.

table1   table2   table3 (expected)
aaa      zzz      aaa
bbb      aaa      bbb
ccc      ccc      ddd
ddd

I want to output a new table (table3) with the values that don't exist when comparing table1 with table2.
I recently migrated a clustered index. We wanted to rename the index. I created the new index as you normally would via the CM, put the cluster in maintenance mode, stopped any ingest into the "old" index, and simply copied all the contents of the "old" index into the "new" index on all 6 of our indexers. I then took the cluster out of maintenance mode and did a rolling restart. Everything worked fine, except when I count the events in both indexes over ALL TIME, the old index has ~40 million events and the new index has ~111 million events. We have an SF and RF of 3. My thought is that it's something to do with the RF of 3; however, the math doesn't really work out to 3x.
Hi, I have a general question: which commands do you usually avoid in order to make searches faster? For example, I tend to avoid transaction and join. Instead of join, when possible I try to use lookup. Also in favour of lookup, I try not to use subsearches that rely on the | inputlookup command. What about other commands? What other commands do you avoid to save system resources?
I have Splunk installed on a machine running Windows 10 that is compliant with all Windows 10 STIGs. I can access Splunk from that machine but no others, although I can ping the Splunk box from other machines. I have tried disabling the firewall, but the symptoms persist. I figure it is a setting associated with a STIG and am hoping someone here has run into this before and remembers what it is.
Has anyone solved the issue of the Splunk sendemail command changing the order of the input columns in the resulting email? Apparently, this order-alteration effect has been around for quite some time now. I need to send an email with the columns in a specific order. How can I specify the column order?
We have a requirement to mask data at index time. While the SED below works to mask data in _raw, it does not work for the extracted field "User name". My SED is on a universal forwarder (Windows) and works fine for raw data:

s/(GBW\d{8}\t)(\d{8}\s){0,1}(\w.*?)(\t)/\1\2(masked)\4/g

My props.conf:

[sourcetype]
SEDCMD-username=s/(GBW\d{8}\t)(\d{8}\s){0,1}(\w.*?)(\t)/\1\2(masked)\4/1
FIELD_DELIMITER=tab
HEADER_FIELD_DELIMITER=tab
HEADER_FIELD_LINE_NUMBER=1
MAX_TIMESTAMP_LOOKAHEAD=300
TIMESTAMP_FIELDS=Timestamp
TIME_FORMAT=%Y%m%dT%H%M%S.%3N+%z
TRANSFORMS-anonymize = username-anonymizer

However, the transform does not work. I have tried placing it on the universal forwarder as well as an intermediate heavy forwarder. I created it based on the response from Solved: How can I anonymize fields of data that has underg... - Splunk Community

transforms.conf:

[username-anonymizer]
REGEX = (?m)^(.*User name\:\:)(\d{8}\s){0,1}(\w.*?)$
FORMAT = $1(masked)
WRITE_META = false
SOURCE_KEY = _meta
DEST_KEY = _meta

Related info: we are expecting tab-delimited data. The "User name" field is in the middle and follows hostname, hence GBW in this example. "User name" could be a combination of id and name, and we only want to mask the name:

Value                         Expected masked value
12345678 firstname lastname   12345678 (masked)
12345678 firstname            12345678 (masked)
firstname lastname            (masked)
firstname                     (masked)

It could be blank as well.
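To sanity-check the masking pattern itself, here is a hedged Python approximation of the intended field-value behaviour (Python's re, not Splunk's SED/PCRE flavour, so treat it as an illustration only):

```python
import re

def mask(value):
    # Keep an optional leading 8-digit id, mask everything that follows;
    # a blank value has no match and is left unchanged
    return re.sub(r"^(\d{8}\s)?(\w.*)$",
                  lambda m: (m.group(1) or "") + "(masked)",
                  value)

print(mask("12345678 firstname lastname"))  # 12345678 (masked)
print(mask("firstname lastname"))           # (masked)
print(mask(""))                             # blank stays blank
```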
Hi, inspired by this post: https://community.splunk.com/t5/Dashboards-Visualizations/How-can-i-re-use-Java-scripts-form-one-table-to-another-tables/m-p/414861 I modified my JavaScript to add a lens icon in two tables: same field name "Metrics", same icon "lupe.png", but two different tables. With Splunk 8.1 it works without any error. After the upgrade to 8.2.9 I now get this error message when accessing the app, although the lens icons are still shown in both tables. The output from the developer console:

TypeError: Cannot read property 'getVisualization' of undefined at eval (eval at <anonymous> ...

table_lupe.js

require([
    'underscore',
    'jquery',
    'splunkjs/mvc',
    'splunkjs/mvc/tableview',
    'splunkjs/mvc/simplexml/ready!'
], function(_, $, mvc, TableView) {
    var CustomIconRenderer1 = TableView.BaseCellRenderer.extend({
        canRender: function(cell) {
            return cell.field === 'Metrics';
        },
        render: function($td, cell) {
            var icon = 'lupe';
            // Create the icon element and add it to the table cell
            $td.addClass('icon').html(_.template('<div class="myicon <%- icon%>"></div>', {
                icon: icon,
            }));
        }
    });
    var CustomIconRenderer2 = TableView.BaseCellRenderer.extend({
        canRender: function(cell) {
            return cell.field === 'Metrics';
        },
        render: function($td, cell) {
            var icon = 'lupe';
            // Create the icon element and add it to the table cell
            $td.addClass('icon').html(_.template('<div class="myicon <%- icon%>"></div>', {
                icon: icon,
            }));
        }
    });
    mvc.Components.get('lupe1').getVisualization(function(tableView1) {
        // Register custom cell renderer, the table will re-render automatically
        tableView1.addCellRenderer(new CustomIconRenderer1());
    });
    mvc.Components.get('lupe2').getVisualization(function(tableView2) {
        // Register custom cell renderer, the table will re-render automatically
        tableView2.addCellRenderer(new CustomIconRenderer2());
    });
});

table_lupe.css

/* Custom Icons */
td.icon {
    text-align: center;
}
td.icon .lupe {
    background-image: url('lupe.png') !important;
    background-size: 20px 20px;
}
td.icon .myicon {
    width: 20px;
    height: 20px;
    margin-left: auto;
    margin-right: auto;
}

Maybe someone can help me find what the problem is. Thank you.
Hello, I am trying to write a PowerShell script that runs Splunk searches every morning. Has anyone done this before? Thanks,