All Topics



Hi Team, I have many (10+) large CSV lookup files (around 200 MB each) that are referenced in multiple places. To optimise them, I moved them into compressed files (*.csv.gz) and updated the existing references. In some cases we need to retain the existing file data. That said, | outputlookup *.csv.gz ```will fail to retain old data``` so I am planning to do the following:

| inputlookup "old_file.csv.gz"
| outputlookup "temp_file.csv" ```taking a backup```

| Run a new search
| outputlookup append=t "temp_file.csv" ```append new search results to the temp file```

| inputlookup "temp_file.csv"
| outputlookup "old_file.csv.gz"

| makeresults
| where false()
| outputlookup create_empty=true "temp_file.csv" ```clear the temp file```

And this needs to be done in multiple places :(. Is there a better way to perform this without creating/clearing temp files?
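One pattern that may avoid the temp file entirely, sketched below under assumptions: the old lookup contents and the new results are merged in a single search via an append subsearch and written back in one pass, and it assumes outputlookup is allowed to overwrite the .csv.gz directly. The index, sourcetype and field names are placeholders only.

| inputlookup "old_file.csv.gz"
| append
    [ search index=my_index sourcetype=my_sourcetype
      ``` placeholder for the new search that produces the lookup columns ```
      | table field1 field2 ]
| dedup field1 field2 ``` optional: drop rows duplicated by the merge ```
| outputlookup "old_file.csv.gz"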
Hi Team, I am using Splunk SDK version 1.12.1 (splunk-sdk@1.12.1). We are using oneshotSearch to get Splunk query data from a GET API. Please see the code snippet below for executeSearch:

module.exports.executeSearch = function (query, params, cb) {
  splunkService.oneshotSearch(query, params, function (err, results) {
    console.log("Query is : " + query);
    cb(err, results);
  });
};

Below is the code from where we call the above:

SplunkQuery.executeSearch(splunkSearch, splunkParams, function (err, results) {
  if (err) {
    if (err.data && err.data.messages) {
      Log.error(err.data.messages);
    }
    var error = Boom.badRequest();
    error.reformat();
    error.output.payload = Response.buildResponse(Errors.ERROR_RECORD_RETRIEVAL_FAILURE, []);
    // return reply(error);
    throw err;
  }
  var events = [];
  var rawRowIndex = results.fields.indexOf('_raw');
  if (results.rows.length == 0 && request.query.id) {
    var error = Boom.badRequest();
    error.reformat();
    error.output.payload = Response.buildResponse(Errors.ERROR_INVALID_ID_PARAM, []);
    return h.response(error);
  }
  for (var i = 0; i < results.rows.length; i++) {
    var splunkRecord = results.rows[i];
    Log.info("splunkRecord" + splunkRecord);
    if (splunkRecord && splunkRecord[rawRowIndex]) {
      var rawRecord = splunkRecord[rawRowIndex];
      events.push(Util.splunkRecordToEvent(JSON.parse(rawRecord.replace(/\nValue of UseDynamoDB = True/g, ''))));
    }
  }
  Log.info("end splunck sear");
  Log.info('Splunk search completed, events count:' + events.length);
  h.response(Response.buildResponse(0, events));
});

I can see the results/events in the console, along with the search count log ('Splunk search completed, events count:'), but I am getting a 500 error as the response through both curl and Postman. What code changes do I need to make to get the result data back as the response? Please suggest. Thank you.
Hi Team, We have designed a Dashboard Studio dashboard like the one below with 70 KPI panels on the same page, but we want the KPI panels to be split so that only 20 panels are added per page, with the remaining panels moving on to the next pages. We also want to design these pages so that they rotate from one to the next after a specified time span that we can select, e.g. 10 seconds, 20 seconds or 30 seconds. The image above shows the KPI dashboard we have designed. We request you to kindly help us with this.
Hi Team, We have recently started ingesting Apache access and request logs from an application, but the data parsing isn't working as expected. Could you please let me know the field names for these events so I can try to extract them manually? Alternatively, is there any format or add-on available that would enable automatic field extraction? If so, that would also be fine with me. For your information, our Splunk Search Head is hosted in the cloud and managed by Splunk Support. I have provided the log structure for both log sources for reference; please help check and advise.

Request Logs:
[09/Aug/2024:07:50:37 +0000] xx.yyy.zzz.aa TLSv1.2 ABCDE-FGH-IJK256-LMN-SHA123 "GET /share/page/ HTTP/1.1" xxxxx
[09/Aug/2024:07:50:37 +0000] xx.yyy.zzz.aa TLSv1.2 xxxxx-xxx-xxx256-xxx-xxx123 "GET /share/page/ HTTP/1.1" -

Access Logs:
xx.yyy.zzz.aa - - [09/Aug/2024:07:57:00 +0000] "GET /share/page/ HTTP/1.1" 200 xxxxx
aaa.bbb.ccc.dd - - [09/Aug/2024:07:56:53 +0000] "GET /share/page/ HTTP/1.1" 200 -

Thank you.
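For reference, a minimal extraction sketch for the access-log sample above, assuming the events land under a sourcetype such as apache:access (the sourcetype and field names are illustrative, not an add-on's canonical ones; the Splunk Add-on for Apache Web Server expects the standard access_combined/access_common formats):

sourcetype=apache:access
| rex field=_raw "^(?<clientip>\S+)\s+\S+\s+\S+\s+\[(?<req_time>[^\]]+)\]\s+\"(?<method>\S+)\s+(?<uri_path>\S+)\s+(?<http_version>[^\"]+)\"\s+(?<status>\d+)\s+(?<bytes>\S+)"
| table _time clientip method uri_path status bytes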
Hello Splunkers!! I am executing the script below to backfill the summary index for my saved search. The script works fine for the 4th and 6th of August, but it is not working for the 5th of August. Please help me with some potential reasons why the script is not working for 5th Aug, even though data is available in the main index from which the saved search pushes data to the summary index.

Example of the script I executed for 5th Aug:
splunk cmd python fill_summary_index.py -app customer -name si_summary_search -et 1693883300 -lt 1693969700 -j 8 -owner admin -auth admin:yuuuyyyxx

I am also getting the warning below for the 4th, 5th and 6th only.
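As a quick sanity check (a sketch only, assuming the saved search writes to an index named summary and that the events carry the search_name field that summary indexing normally adds), something like the following can show which days actually have summary data:

index=summary search_name="si_summary_search" earliest=-30d@d
| timechart span=1d count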
Hi Team, I'm working on setting up a dashboard that includes the following EUM Browser metrics:
Monthly Active Users
Bounce Rate
Session Duration
Daily Average Active Users
Could anyone provide guidance on how to retrieve these metrics and display them on a dashboard? Best regards, Nivedita Kumari
Hello, Can anyone help me resolve this error?

2024-08-09 10:50:00,282 DEBUG pid=8956 tid=MainThread file=connectionpool.py:_new_conn:1007 | Starting new HTTPS connection (5): cisco-managed-ap-northeast-2.s3.ap-northeast-2.amazonaws.com:443
2024-08-09 10:50:00,312 DEBUG pid=8956 tid=MainThread file=endpoint.py:_do_get_response:205 | Exception received when sending HTTP request.
Traceback (most recent call last):
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/connectionpool.py", line 710, in urlopen
    chunked=chunked,
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/connectionpool.py", line 386, in _make_request
    self._validate_conn(conn)
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/connectionpool.py", line 1042, in _validate_conn
    conn.connect()
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/connection.py", line 429, in connect
    tls_in_tls=tls_in_tls,
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/util/ssl_.py", line 450, in ssl_wrap_socket
    sock, context, tls_in_tls, server_hostname=server_hostname
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
  File "/splb001/splunk_fw_teams/lib/python3.7/ssl.py", line 423, in wrap_socket
    session=session
  File "/splb001/splunk_fw_teams/lib/python3.7/ssl.py", line 870, in _create
    self.do_handshake()
  File "/splb001/splunk_fw_teams/lib/python3.7/ssl.py", line 1139, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1106)
Hi All, Please provide the conf files (inputs.conf, props.conf, outputs.conf) to index the below format of data on a daily basis.
Hi Team, Is there any way to create a Sankey-style tile for a single value? The image below explains the grouped value. I would like to break it into single values, such as Account Locked and Invalid Login, each in a separate tile.
Hi, I have a few questions regarding Dashboard Studio:
Is there any way to customise the shape menu, e.g. the line?
Can I rotate an image based on my design (Dashboard Studio)?
How or where can I find more shape images (Dashboard Studio)? Image attached below.
How can I make text appear like a shadow in this space?
Hi All, I have created map tiles in Dashboard Studio. The query runs with no issue, but I cannot see the output and I am getting the same error message for multiple map tiles. The map layouts below are from Dashboard Studio:
Marker Layer with Base Configurations
Marker Layer with Dynamic Coloring
Bubble Layer with Single Series
Bubble Layer with Multiple Series
Choropleth Layer
World Choropleth Layer with hidden base layer

Code:
index=test "event.Properties.apikey"="*" "event.endpoint"="*"
| iplocation event.Properties.ip
| dedup event.Properties.ip
| top limit=20 Country

Output: blank, although there is data and no error was triggered.
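For comparison, a sketch only (not a confirmed fix): the marker and bubble layers generally expect latitude/longitude fields, while a choropleth layer expects geographic features, so a choropleth-oriented variant of the search above could pass the country names through the built-in geo_countries lookup. The index and field names are taken from the search as written.

index=test "event.Properties.apikey"="*" "event.endpoint"="*"
| iplocation event.Properties.ip
| stats count by Country
| geom geo_countries featureIdField=Country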
When importing Prometheus metric data into Splunk, the following errors are output (the import is performed using 'Prometheus Metrics for Splunk').

/opt/splunk/var/log/splunk/splunkd.log
WARN PipelineCpuUsageTracker [1627 parsing] - No indexkey available chan=source::prometheusrw:sourcetype::prometheusrw:host::splunk-hf-75869c4964-phm44 timetook=1 msec.
WARN TcpOutputProc [9736 indexerPipe] - Pipeline data does not have indexKey. [_path] = /opt/splunk/etc/apps/modinput_prometheus/linux_x86_64/bin/prometheusrw\n[_raw] = \n[_meta] = punct::\n[_stmid] = 3CUUsSnja9PAAB.B\n[MetaData:Source] = source::prometheusrw\n[MetaData:Host] = host::splunk-hf-6448d7ffdb-ltzbr\n[MetaData:Sourcetype] = sourcetype::prometheusrw\n[_done] = _done\n[_linebreaker] = _linebreaker\n[_charSet] = UTF-8\n[_conf] = source::prometheusrw|host::splunk-hf-6448d7ffdb-ltzbr|prometheusrw|2\n[_channel] = 2\n

Could you tell me the cause of these errors and how to deal with them?
I plan to develop a custom visualization. I edited formatter.html:

<form class="splunk-formatter-section" section-label="Data Series">
  <splunk-control-group label="Data Type">
    <splunk-select id="dataTypeSelect" name="{{VIZ_NAMESPACE}}.dataType" value="Custom">
      <option value="Custom">Custom</option>
      <option value="XBar_R-X">XBar R - X</option>
      <option value="LineChart">LineChart</option>
      <option value="Pie">Pie</option>
      <option value="Gauge">Gauge</option>
    </splunk-select>
  </splunk-control-group>
  <splunk-control-group label="Option">
    <splunk-text-area id="optionTextArea" name="{{VIZ_NAMESPACE}}.option" value="{}">
    </splunk-text-area>
  </splunk-control-group>...

I want the Option textarea to show a different value in the Format menu when I change dataType; the Option menu has many choices. How do I modify the visualization_source.js content to achieve this?
Hi, I recently tried creating a private app on Splunk Cloud. The app is created successfully, but it does not show up in the list of apps on Splunk Cloud. I tried creating the app using both barebones and sample_app as templates, with different App IDs, but it didn't work. The app is still created with no error displayed, and I kept the visibility set to yes. Can someone please assist me with this? Thanks!
Hi guys, I have the following query that produces the table below:

index=core_ct_report_*
| eval brand=case(like(report_model, "cfg%"), "grandstream", like(report_model, "cisco%"), "Cisco", like(report_model, "ata%"), "Cisco", like(report_model, "snom%"), "Snom", like(report_model, "VISION%"), "Snom", like(report_model, "yealink%"), "Yealink", 1=1, "Other")
| stats count by fw_version, report_model, brand
| table brand report_model fw_version count
| sort report_model, count desc

In this table I want to group the rows that have the same value in the report_model column. I use stats values() to achieve that, as follows:

index=core_ct_report_*
| eval brand=case(like(report_model, "cfg%"), "grandstream", like(report_model, "cisco%"), "Cisco", like(report_model, "ata%"), "Cisco", like(report_model, "snom%"), "Snom", like(report_model, "VISION%"), "Snom", like(report_model, "yealink%"), "Yealink", 1=1, "Other")
| stats count by fw_version, report_model, brand
| stats values(brand) as brand values(fw_version) as fw_version values(count) as count by report_model
| table brand report_model fw_version count

But with this query the count is also grouped: on the 6th row there are count values missing. The missing counts all have the value 1, so only a single '1' is shown. I can't remove count from stats values() or the count values don't appear in the final table at all. What am I doing wrong? Thanks in advance for your help.
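One possible variation, sketched below (not necessarily the only fix): values() de-duplicates and sorts the values, which is why the repeated 1s collapse into a single entry, whereas list() keeps duplicates and preserves order, so each firmware version stays paired with its own count.

index=core_ct_report_*
| eval brand=case(like(report_model, "cfg%"), "grandstream", like(report_model, "cisco%"), "Cisco", like(report_model, "ata%"), "Cisco", like(report_model, "snom%"), "Snom", like(report_model, "VISION%"), "Snom", like(report_model, "yealink%"), "Yealink", 1=1, "Other")
| stats count by fw_version, report_model, brand
| stats values(brand) as brand list(fw_version) as fw_version list(count) as count by report_model
| table brand report_model fw_version count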
[serversindex] Configuration initialization for /opt/splunk/var/run/searchpeers/serverhead-1721913866 took longer than expected (1002ms) when dispatching a search with search ID remote_serverhead_userxx__userxx__search__search1_1723144245.50. This usually indicates problems with underlying storage performance.
I have a custom command that calls a script for nslookup and returns the data to Splunk. All of it is working, but I want to use this custom command in Splunk to return the data to an eval and output that into a table. For example, the search string would look something like the following:

index="*" | iplocation src_ip | eval testdata = | nslookupsearch dest_ip | table testdata _time | sort - _time

NOTE: This is not the exact search string, just a mock string. When I run:

| nslookupsearch Record_Here

I get the correct output and the data that I want to see. But when I run the command to attach the returned value to an eval, it fails. I keep getting errors when doing this, but I can't find anything that will work like this. The testdata eval keeps failing.
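For what it's worth, a sketch of the usual streaming-command pattern, under assumptions: nslookupsearch is assumed here to be a streaming custom command that adds its result as a new field on each event, and resolved_host is only a placeholder for whatever field the command actually creates. eval cannot take another command on its right-hand side, so the command runs as its own pipeline stage and eval then refers to the field it produced.

index="*"
| iplocation src_ip
| nslookupsearch dest_ip
| eval testdata=resolved_host ``` resolved_host is a placeholder field name ```
| table testdata _time
| sort - _time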
Hi All, I am new to using Splunk. I am uploading a CSV to Splunk that has a column called 'Transaction Date' with the entries in DD/MM/YYYY format, as shown below. At the Set Source Type step I updated the timestamp format to avoid getting the default modtime; I set it to %d/%m/%Y, as shown below. This partly works, as my '_time' field no longer shows the default modtime. However, it shows the date in the incorrect format of MM/DD/YYYY instead of DD/MM/YYYY (also shown below). Everything else I have left as default. These are my advanced settings: Any ideas how I can fix this to display the correct format? Thank you!
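As an illustration only (a sketch, assuming the %d/%m/%Y format is parsing correctly and only the on-screen rendering differs): _time is stored as an epoch timestamp and its default rendering follows the user's time-format/locale preferences, so one way to show it as DD/MM/YYYY in a search is to format it explicitly. The source value below is a placeholder for the uploaded file.

source="my_upload.csv"
| eval transaction_date_display=strftime(_time, "%d/%m/%Y")
| table transaction_date_display _time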
Pretty green with SOAR and I haven't been able to find a good answer to this. All of our events in SOAR are generated by pulling them in from Splunk ES. This creates one artifact for each event. I'm looking for a way to extract data from that artifact so we can start using and labeling that data. Am I missing something here? I haven't found much in the way of training on the data-extraction part of this, so any tips for that would be great too.
Hello, I have 4 servers: A, B, C & D. These servers point to two different deployment servers: A & B point to the US DS server, and C & D point to the UK DS server. I'm selecting these 4 servers in a multiselect input, and it has to show two different panels (hidden initially). If I select only A & B, it should show only the US DS panel. I don't want to show the DS values in the input values.