All Topics

I am trying to extract fields from this custom data, but I am unable to parse it with:

| extract kv pairdelim="  " kvdelim=" _: "

Sample data (one record per line for readability; each record starts with "Log"):

Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123569 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 123456789 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 123456789 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 12345 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 12345 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing

Can someone please help me parse this, either at search time or via props.conf? Any help would be appreciated.
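One observation on the data above: the tokens are separated by " _: " but there are no actual key=value pairs, so `extract kv` has no keys to find; a positional split (in SPL, something like `eval fields=split(_raw, " _: ")` or a `rex` with named groups) fits better. Here is a quick Python sketch of the positional approach, useful for prototyping outside Splunk; the field names are my own guesses for illustration, not names from the data:

```python
# Prototype: the sample records are positional values separated by " _: ",
# not key=value pairs, so a positional split is one way to parse them.
# The field names below are illustrative assumptions only.
FIELD_NAMES = [
    "record_type", "severity", "timestamp", "id", "host", "environment",
    "application", "version", "policy_number", "process", "workflow",
    "source_file", "function", "success", "message", "payload_key", "payload_value",
]

def parse_record(record: str) -> dict:
    """Split one ' _: '-delimited record into a dict of positional fields."""
    values = [v.strip() for v in record.split(" _: ")]
    return dict(zip(FIELD_NAMES, values))

sample = ("Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA"
          " _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote"
          " _: CreateCalculation.cpp _: CalculateTaxRate _: true"
          " _: Identify Applicatble Tax Rate _: data _: testing")

parsed = parse_record(sample)
print(parsed["severity"], parsed["environment"], parsed["policy_number"])
```

The same positional logic can be mirrored in SPL with `split()` plus `mvindex()`, or baked into props.conf/transforms.conf with a `DELIMS`-style extraction once the field order is confirmed.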
In July, the Splunk Threat Research Team had 3 releases of new security content via the Enterprise Security Content Update (ESCU) app (v4.35.0, v4.36.0, and v4.37.0). With these releases, there are 36 new analytics, 6 new analytic stories, 6 updated analytics, and 20 updated analytic stories now available in Splunk Enterprise Security via the ESCU application update process.

Content highlights include:
- The new AcidPour analytic story includes content to help detect and investigate activity that might relate to AcidPour wiper malware. To learn more about this malware and the content in the related analytic story, check out this blog.
- The new Gozi Malware analytic story covers the detection and analysis of Gozi malware (also known as Ursnif or ISFB), one of the oldest and most persistent banking trojans.
- The new ShrinkLocker analytic story includes detections related to ShrinkLocker, a new ransomware that uses Windows BitLocker to encrypt files by creating new boot partitions.

New Analytics (36)
- Splunk DoS via POST Request Datamodel Endpoint
- Splunk Information Disclosure on Account Login
- Splunk RCE PDFgen Render
- Splunk RCE via External Lookup Copybuckets
- Splunk Stored XSS conf-web Settings on Premises
- Splunk Stored XSS via Specially Crafted Bulletin Message
- Splunk Unauthenticated DoS via Null Pointer References
- Splunk Unauthenticated Path Traversal Modules Messaging
- Splunk Unauthorized Experimental Items Creation
- Splunk XSS Privilege Escalation via Custom Urls in Dashboard
- Splunk XSS Via External Urls in Dashboards SSRF
- Detect Distributed Password Spray Attempts
- Detect Password Spray Attempts
- Internal Horizontal Port Scan
- Internal Vertical Port Scan
- Windows AD add Self to Group
- Windows Increase in Group or Object Modification Activity
- Windows Increase in User Modification Activity
- Windows Network Share Interaction With Net
- Windows Vulnerable Driver Installed
- Ivanti EPM SQL Injection Remote Code Execution
- MOVEit Certificate Store Access Failure
- MOVEit Empty Key Fingerprint Authentication Attempt
- Windows ESX Admins Group Creation Security Event
- Windows ESX Admins Group Creation via Net
- Windows ESX Admins Group Creation via PowerShell
- Windows Known Abused DLL Loaded Suspiciously (External Contributor: @nterl0k)
- Windows LOLBAS Executed As Renamed File (External Contributor: @nterl0k)
- Windows LOLBAS Executed Outside Expected Path (External Contributor: @nterl0k)
- Windows Modify Registry Configure BitLocker
- Windows Modify Registry Delete Firewall Rules
- Windows Modify Registry Disable RDP
- Windows Modify Registry on Smart Card Group Policy
- Windows Modify Registry to Add or Modify Firewall Rule
- Windows Outlook WebView Registry Modification
- Windows Privileged Group Modification (External Contributor: @TheLawsOfChaos)

New Analytic Stories (6)
- AcidPour
- Gozi Malware
- Ivanti EPM Vulnerabilities
- MOVEit Transfer Authentication Bypass
- ShrinkLocker
- VMware ESXi AD Integration Authentication Bypass CVE-2024-37085

Updated Analytics (6)
- Splunk CSRF in the SSG kvstore Client Endpoint
- Splunk Enterprise Windows Deserialization File Partition
- Splunk Stored XSS via Data Model objectName Field
- Splunk XSS in Highlighted JSON Events
- Splunk XSS in Save table dialog header in search page
- Splunk risky Command Abuse disclosed february 2023

Updated Analytic Stories (20)
- Detect Remote Access Software Usage DNS (External Contributor: @nterl0k)
- Detect Remote Access Software Usage FileInfo (External Contributor: @nterl0k)
- Detect Remote Access Software Usage File (External Contributor: @nterl0k)
- Detect Remote Access Software Usage Process (External Contributor: @nterl0k)
- Detect Remote Access Software Usage Traffic (External Contributor: @nterl0k)
- Detect Remote Access Software Usage URL (External Contributor: @nterl0k)
- Possible Lateral Movement PowerShell Spawn
- Linux Obfuscated Files or Information Base64 Decode
- Linux Decode Base64 to Shell
- Windows Protocol Tunneling with Plink
- Malicious PowerShell Process - Encoded Command
- Windows Event Log Cleared
- Azure AD Admin Consent Bypassed by Service Principal (External Contributor: @dluxtron)
- Azure AD Global Administrator Role Assigned (External Contributor: @dluxtron)
- Azure AD Privileged Role Assigned (External Contributor: @dluxtron)
- Azure AD Service Principal New Client Credentials (External Contributor: @dluxtron)
- Detect New Local Admin account (External Contributor: @dluxtron)
- Kerberos Pre-Authentication Flag Disabled in UserAccountControl (External Contributor: @dluxtron)
- Detect Renamed PSExec (External Contributor: Alex Oberkircher, Github)
- Scheduled Task Initiation on Remote Endpoint (External Contributor: @badoodish, Github)

The team also published the following blogs:
- Introducing ShellSweepPlus: Open-Source Web Shell Detection
- Breaking Down Linux.Gomir: Understanding this Backdoor’s TTPs
- Splunk Security Content for Impact Assessment of CrowdStrike Windows Outage
- AcidPour Wiper Malware: Threat Analysis and Detections

For all our tools and security content, please visit research.splunk.com. — The Splunk Threat Research Team
Hi All, HTTP Event Collector (HEC) logs are arriving in Splunk, but the host, source, and sourcetype are not showing. Please find the screenshot below. Any help is appreciated.
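For context on the question above: when HEC events arrive without the expected metadata, one common cause is that the payload never sets those fields and the token's defaults are empty. The JSON envelope sent to the `/services/collector/event` endpoint accepts `host`, `source`, `sourcetype`, and `index` keys alongside the event. A minimal Python sketch of building such a payload (the field values here are placeholders, not values from the post):

```python
import json

# Minimal HEC event envelope for /services/collector/event.
# "host", "source", "sourcetype", and "index" are optional metadata keys;
# if omitted, the values fall back to the HEC token's defaults.
def build_hec_payload(event, host, source, sourcetype, index=None):
    payload = {
        "event": event,
        "host": host,
        "source": source,
        "sourcetype": sourcetype,
    }
    if index is not None:
        payload["index"] = index
    return json.dumps(payload)

body = build_hec_payload(
    {"message": "user login", "status": "ok"},
    host="app01",
    source="myapp",
    sourcetype="myapp:json",
)
print(body)
```

You would POST this body with an `Authorization: Splunk <token>` header. If the fields still do not appear, check the token's default source/sourcetype/index under Settings > Data inputs > HTTP Event Collector.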
Hi All, I am trying to create a scatter-style dashboard (or similar) in Dashboard Studio to show debit transaction amounts over time. A query like this works well in Search, but translates poorly to the dashboard:

source="Transaction File.csv" "Debit Amount"="*" | stats values("Debit Amount") BY "Posted Transactions Date"

I am aware I likely need to convert the date from string format to date format within my search, something to the effect of:

| eval Date = strptime("Posted Transactions Date", "%d/%m/%y")

But I am struggling to get the final result. I have also played around with using the _time field instead of the Posted Transactions Date field, and with timecharts, without success; I think that is likely also a formatting issue. E.g.:

source="Transaction File.csv" | timechart values("Debit Amount")

As there are multiple debit amount values per day in some cases, I would ideally also like a second, similar dashboard that sums these debits per day instead of showing them as individual values, while also removing one outlier debit amount of 7000. I am struggling a bit with the searches required to get my desired dashboard results. Any help would be appreciated, thank you!
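One detail worth checking in the attempt above: in SPL, `strptime("Posted Transactions Date", "%d/%m/%y")` parses the literal string "Posted Transactions Date", because double quotes denote string literals; a field name containing spaces must be wrapped in single quotes, e.g. `strptime('Posted Transactions Date', "%d/%m/%y")`. The format string itself can be sanity-checked outside Splunk, since Splunk's `strptime()` uses the same format codes as Python's. The sample dates, amounts, and the per-day/outlier logic below are illustrative assumptions:

```python
from collections import defaultdict
from datetime import datetime

# Splunk's strptime() uses the same format codes as Python's.
# "%d/%m/%y" expects e.g. 28/07/24 (two-digit year, day first).
raw = "28/07/24"
parsed = datetime.strptime(raw, "%d/%m/%y")
print(parsed.date())  # 2024-07-28

# Summing debits per day while dropping the known outlier value (7000),
# mirroring something like:
#   ... | where 'Debit Amount' != 7000
#       | bin _time span=1d | stats sum('Debit Amount') by _time
transactions = [
    ("28/07/24", 120.0), ("28/07/24", 35.5),
    ("29/07/24", 7000.0), ("29/07/24", 80.0),
]
per_day = defaultdict(float)
for date_str, amount in transactions:
    if amount == 7000.0:  # skip the outlier
        continue
    per_day[date_str] += amount
print(dict(per_day))
```

Once `_time` is populated from the parsed date (e.g. `| eval _time = strptime('Posted Transactions Date', "%d/%m/%y")`), `timechart span=1d sum('Debit Amount')` should translate cleanly into a Dashboard Studio chart.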
Dear All, I need your assistance in fetching Microsoft Exchange Server logs using the Splunk Universal Forwarder. I can provide the paths for the MSG Tracking, SMTP, and OWA log files. The goal is to configure the Universal Forwarder to collect these logs and forward them to a central Splunk server. Given that the Splunk documentation indicates that the MS Exchange App is end-of-life (EOL), is it necessary to use an add-on? The documentation suggests creating GPO policies and making other changes. However, in IBM QRadar, the process is simpler: you install the WinCollect agent, specify the paths for MSG Tracking, SMTP, and OWA logs, and the agent collects and forwards the logs to the QRadar Console. The Auto Discovery feature in QRadar then creates the log source automatically. Is there a simpler and more straightforward method to collect these logs using the Splunk Universal Forwarder? Thank you in advance for your assistance.
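For plain log files like these, the Universal Forwarder only needs `[monitor://]` stanzas in inputs.conf; the full Exchange app/add-on was mainly needed for richer, GPO-driven collection (PowerShell-based inputs, Windows Event Logs, etc.). A minimal sketch, with example paths and sourcetype/index names that must be adjusted to your Exchange version and environment:

```
# inputs.conf on the Universal Forwarder (paths, sourcetypes, and index
# are illustrative placeholders -- adjust to your Exchange layout)
[monitor://C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\Logs\MessageTracking\*.log]
sourcetype = MSExchange:MessageTracking
index = exchange
disabled = 0

[monitor://C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\Logs\ProtocolLog\SmtpReceive\*.log]
sourcetype = MSExchange:SMTP
index = exchange
disabled = 0

[monitor://C:\inetpub\logs\LogFiles\W3SVC1\*.log]
sourcetype = MSExchange:OWA
index = exchange
disabled = 0
```

With outputs.conf pointing at your indexers, this is functionally equivalent to the WinCollect "give me the paths" workflow described above; an add-on is only needed if you want its prebuilt sourcetypes and field extractions.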
How and when does the UF (Universal Forwarder) check in with the cluster manager node to see which indexers are available? Thank you. I have spent some time searching, but no joy.
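For context: a forwarder only polls the cluster manager for the current peer list when indexer discovery is configured in outputs.conf; it then re-polls at a regular, tunable interval and load-balances across whatever peers the manager reports. A sketch of the relevant stanzas (the manager URI, group name, and key are placeholders):

```
# outputs.conf on the Universal Forwarder (values are placeholders)
[indexer_discovery:cluster1]
master_uri = https://cluster-manager.example.com:8089
pass4SymmKey = <your_key>

[tcpout:cluster1_group]
indexerDiscovery = cluster1

[tcpout]
defaultGroup = cluster1_group
```

Without an `indexer_discovery` stanza, the UF never contacts the cluster manager at all and simply uses the static `server =` list in its `tcpout` group.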
Hi Team, I have many (10+) large CSV lookup files (~200MB each) referenced in multiple places. To optimise, I moved them into compressed files (*.csv.gz) and updated the existing references. In some cases we need to retain the existing file data; that said,

| outputlookup *.csv.gz ```will fail to retain old data```

Hence I am planning to do the following:

| inputlookup "old_file.csv.gz" | outputlookup "temp_file.csv" ```take a backup```
| Run a new search | outputlookup append=t "temp_file.csv" ```append new search results to the temp file```
| inputlookup "temp_file.csv" | outputlookup "old_file.csv.gz"
| makeresults | where false() | outputlookup create_empty=true "temp_file.csv"

And this needs to be done in multiple places :(. Is there a better way to perform this without creating/clearing temp files?
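Outside of SPL, the append step itself is cheap, because gzip files may contain multiple concatenated members and readers treat them as one stream; appending rows therefore does not require decompressing and rewriting the whole file. A Python sketch of appending to a .csv.gz lookup in place (file and column names are illustrative), which could back a scripted workaround if the pure-SPL route stays awkward:

```python
import csv
import gzip
import os
import tempfile

def append_rows_gz(path, new_rows):
    """Append dict rows to a gzipped CSV lookup without a temp copy.

    gzip files may contain multiple concatenated members, and gzip.open
    reads them back as one stream, so opening in append mode ('at')
    just adds a new compressed member holding the extra rows.
    """
    # Read the header so the new rows use the same column order.
    with gzip.open(path, "rt", newline="") as f:
        fieldnames = csv.DictReader(f).fieldnames

    with gzip.open(path, "at", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writerows(new_rows)

# Demo on a throwaway lookup file.
tmp = os.path.join(tempfile.mkdtemp(), "old_file.csv.gz")
with gzip.open(tmp, "wt", newline="") as f:
    w = csv.DictWriter(f, fieldnames=["user", "count"])
    w.writeheader()
    w.writerow({"user": "alice", "count": "1"})

append_rows_gz(tmp, [{"user": "bob", "count": "2"}])

with gzip.open(tmp, "rt", newline="") as f:
    rows = list(csv.DictReader(f))
print(rows)  # old row plus the appended row
```

It is also worth testing whether `| outputlookup append=true old_file.csv.gz` retains existing rows directly in your Splunk version; if it does, the temp-file dance disappears entirely.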
Hi Team, I am using Splunk SDK version 1.12.1 (splunk-sdk@1.12.1). We are using oneshotSearch to get Splunk query data from a GET API. Please see the code snippet below for executeSearch:

module.exports.executeSearch = function (query, params, cb) {
  splunkService.oneshotSearch(query, params, function (err, results) {
    console.log("Query is : " + query);
    cb(err, results);
  });
};

Below is the code from which we call the above:

SplunkQuery.executeSearch(splunkSearch, splunkParams, function (err, results) {
  if (err) {
    if (err.data && err.data.messages) {
      Log.error(err.data.messages);
    }
    var error = Boom.badRequest();
    error.reformat();
    error.output.payload = Response.buildResponse(Errors.ERROR_RECORD_RETRIEVAL_FAILURE, []);
    // return reply(error);
    throw err;
  }
  var events = [];
  var rawRowIndex = results.fields.indexOf('_raw');
  if (results.rows.length == 0 && request.query.id) {
    var error = Boom.badRequest();
    error.reformat();
    error.output.payload = Response.buildResponse(Errors.ERROR_INVALID_ID_PARAM, []);
    return h.response(error);
  }
  for (var i = 0; i < results.rows.length; i++) {
    var splunkRecord = results.rows[i];
    Log.info("splunkRecord" + splunkRecord);
    if (splunkRecord && splunkRecord[rawRowIndex]) {
      var rawRecord = splunkRecord[rawRowIndex];
      events.push(Util.splunkRecordToEvent(JSON.parse(rawRecord.replace(/\nValue of UseDynamoDB = True/g, ''))));
    }
  }
  Log.info("end splunk search");
  Log.info('Splunk search completed, events count:' + events.length);
  h.response(Response.buildResponse(0, events));
});

I can see the results/events in the console, including the "Splunk search completed, events count:" log with the right count. But I am getting a 500 error as the response through curl and Postman. What code changes do I need to make so the result data is returned as the response? Please suggest. Thank you.
Hi Team, We have designed a Studio dashboard, shown below, with 70 KPI panels on a single page. We want the KPI panels to be split so that only 20 panels appear per page, with the remaining panels flowing onto subsequent pages. We would also like the pages to rotate automatically, one after another, at a selectable time span of 10, 20, or 30 seconds. The image above shows the KPI dashboard we have designed. We would appreciate your help with this.
Hi Team, We have recently started ingesting Apache access and request logs from an application, but the data parsing isn't working as expected. Could you please let me know the field names for these events so I can try to extract them manually? Alternatively, do we have any format or add-on available that would enable automatic field extraction? If so, that would also be fine with me. For your information, our Splunk Search Head is hosted in the cloud and managed by Splunk Support. I have provided the log structure for both log sources for reference. Please help to check and update.   Request Logs: [09/Aug/2024:07:50:37 +0000] xx.yyy.zzz.aa TLSv1.2 ABCDE-FGH-IJK256-LMN-SHA123 "GET /share/page/ HTTP/1.1" xxxxx [09/Aug/2024:07:50:37 +0000] xx.yyy.zzz.aa TLSv1.2 xxxxx-xxx-xxx256-xxx-xxx123 "GET /share/page/ HTTP/1.1" - Access Logs: xx.yyy.zzz.aa - - [09/Aug/2024:07:57:00 +0000] "GET /share/page/ HTTP/1.1" 200 xxxxx aaa.bbb.ccc.dd - - [09/Aug/2024:07:56:53 +0000] "GET /share/page/ HTTP/1.1" 200 - Thank you.
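A note on the two formats above: the access-log lines follow the standard Apache common log layout (which Splunk's built-in `access_common`/`access_combined` sourcetypes can usually extract automatically), while the request-log lines are a custom SSL request format that will need a manual extraction. If you go the `rex`/props.conf route, the regex can be prototyped outside Splunk first; the field names below are my own labels for illustration, not official CIM names:

```python
import re

# Custom request log: [time] clientip tls_version cipher "METHOD uri HTTP/x.y" bytes|-
request_log = re.compile(
    r'\[(?P<timestamp>[^\]]+)\]\s+'
    r'(?P<clientip>\S+)\s+'
    r'(?P<tls_version>\S+)\s+'
    r'(?P<cipher>\S+)\s+'
    r'"(?P<method>\S+)\s+(?P<uri>\S+)\s+(?P<http_version>[^"]+)"\s+'
    r'(?P<bytes>\S+)'
)

# Access log: clientip - - [time] "METHOD uri HTTP/x.y" status bytes|-
access_log = re.compile(
    r'(?P<clientip>\S+)\s+\S+\s+\S+\s+'
    r'\[(?P<timestamp>[^\]]+)\]\s+'
    r'"(?P<method>\S+)\s+(?P<uri>\S+)\s+(?P<http_version>[^"]+)"\s+'
    r'(?P<status>\d+)\s+(?P<bytes>\S+)'
)

req = request_log.match('[09/Aug/2024:07:50:37 +0000] xx.yyy.zzz.aa TLSv1.2 '
                        'ABCDE-FGH-IJK256-LMN-SHA123 "GET /share/page/ HTTP/1.1" xxxxx')
acc = access_log.match('xx.yyy.zzz.aa - - [09/Aug/2024:07:57:00 +0000] '
                       '"GET /share/page/ HTTP/1.1" 200 xxxxx')
print(req.group('tls_version'), acc.group('status'))
```

The same patterns can then be used in SPL via `| rex field=_raw "..."`, or placed in an `EXTRACT-` setting in props.conf for the custom sourcetype (on Splunk Cloud, via a private app or a support request).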
Hello Splunkers!! I am executing the script below to backfill the summary index for my saved search. The script works fine for 4th and 6th August, but it is not working for 5th August. Please help me with some potential reasons why the script is not working for 5th Aug, even though data is available in the main index from which the saved search pushes data to the summary index.

Example of the command I executed for 5th Aug:

splunk cmd python fill_summary_index.py -app customer -name si_summary_search -et 1693883300 -lt 1693969700 -j 8 -owner admin -auth admin:yuuuyyyxx

I am also getting the warning below for the 4th, 5th, and 6th only.
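One sanity check worth running on the command above: convert the -et/-lt epoch values back to dates. If those are the exact values used, they resolve to early September 2023 (UTC), not 5th August 2024, so the backfill window would not cover the intended day at all. A quick Python check (the "intended" window below assumes UTC day boundaries; adjust for your timezone):

```python
from datetime import datetime, timezone

et, lt = 1693883300, 1693969700

# fill_summary_index.py's -et/-lt are Unix epoch seconds; convert to UTC to verify.
et_dt = datetime.fromtimestamp(et, tz=timezone.utc)
lt_dt = datetime.fromtimestamp(lt, tz=timezone.utc)
print(et_dt.isoformat())  # 2023-09-05T03:08:20+00:00
print(lt_dt.isoformat())  # 2023-09-06T03:08:20+00:00

# Epochs for a 5 Aug 2024 window (UTC day boundaries, as an example):
aug5 = datetime(2024, 8, 5, tzinfo=timezone.utc)
aug6 = datetime(2024, 8, 6, tzinfo=timezone.utc)
print(int(aug5.timestamp()), int(aug6.timestamp()))
```

If the real command used different epochs, the same two-line conversion still confirms whether the window actually brackets the day that fails.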
Hi Team, I'm working on setting up a dashboard that includes the following EUM Browser metrics: Monthly Active Users Bounce Rate Session Duration Daily Average Active Users Could anyone provide guidance on how to retrieve these metrics and display them on a dashboard? Best regards, Nivedita Kumari
Hello, Can anyone help me resolve this error?

2024-08-09 10:50:00,282 DEBUG pid=8956 tid=MainThread file=connectionpool.py:_new_conn:1007 | Starting new HTTPS connection (5): cisco-managed-ap-northeast-2.s3.ap-northeast-2.amazonaws.com:443
2024-08-09 10:50:00,312 DEBUG pid=8956 tid=MainThread file=endpoint.py:_do_get_response:205 | Exception received when sending HTTP request.
Traceback (most recent call last):
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/connectionpool.py", line 710, in urlopen
    chunked=chunked,
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/connectionpool.py", line 386, in _make_request
    self._validate_conn(conn)
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/connectionpool.py", line 1042, in _validate_conn
    conn.connect()
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/connection.py", line 429, in connect
    tls_in_tls=tls_in_tls,
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/util/ssl_.py", line 450, in ssl_wrap_socket
    sock, context, tls_in_tls, server_hostname=server_hostname
  File "/splb001/splunk_fw_teams/etc/apps/TA-cisco-cloud-security-umbrella-addon/bin/ta_cisco_cloud_security_umbrella_addon/aob_py3/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
  File "/splb001/splunk_fw_teams/lib/python3.7/ssl.py", line 423, in wrap_socket
    session=session
  File "/splb001/splunk_fw_teams/lib/python3.7/ssl.py", line 870, in _create
    self.do_handshake()
  File "/splb001/splunk_fw_teams/lib/python3.7/ssl.py", line 1139, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1106)
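For context on the final error above: CERTIFICATE_VERIFY_FAILED / "unable to get local issuer certificate" means the Python bundled with the forwarder cannot find a root CA that signs the S3 endpoint's certificate chain, which is typical when a TLS-inspecting proxy re-signs outbound traffic. A common fix is to add the proxy or corporate root CA to the CA bundle the add-on's Python actually uses. This sketch shows how to see the default CA paths and how a context would be built against a custom bundle (the bundle path is a placeholder):

```python
import ssl

# Where this Python interpreter looks for CA certificates by default.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)
print("capath:", paths.capath)

# To trust a corporate/proxy root CA, point a context at a bundle that
# includes it (placeholder path -- supply your own PEM bundle):
#   ctx = ssl.create_default_context(cafile="/path/to/corporate-ca-bundle.pem")
ctx = ssl.create_default_context()
print("verify_mode:", ctx.verify_mode)  # CERT_REQUIRED by default
```

For AOB-built add-ons like this one, the bundle in use is often a certifi `cacert.pem` inside the add-on's `aob_py3` directory; appending the missing root CA (in PEM form) to that file is a frequently reported resolution, though you should confirm the exact file for this add-on before editing it.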
Hi All, please provide the conf files (inputs.conf, props.conf, outputs.conf) needed to index the below format of data on a daily basis.
Tech Talk: Security Edition It’s time to accelerate your approach to asset and risk management, enhanced with the power of your Splunk data! Join us as we discuss the dynamic capabilities of Splunk® Asset and Risk Intelligence. Learn how you can seamlessly leverage your existing Splunk data sources to create a comprehensive and accurate asset inventory including endpoints, servers, users, cloud, OT/IoT and more. We’ll highlight how you can reduce the time spent pivoting between systems and how Splunk Asset and Risk Intelligence can empower your security team by providing them real-time compliance metrics, enriched asset data, and contextual asset insights. Watch full Tech Talk here:
Tech Talk: App Dev Edition

Splunk has tons of out-of-the-box functionality, and you’ve likely used Splunkbase apps to extend Splunk even further. What if you’re looking for even more? This Tech Talk is an introduction to Splunk app development — your first step in understanding and getting started developing your first Splunk application to maximize the value of Splunk. Join this Tech Talk to learn:
- What is a Splunk app, and how do I get started building one?
- How do I test and deploy my app on Splunk?
- How can I share my app with the broader Splunk community?
Hi Team, is there any way to create a Sankey-style tile for a single value? The image below explains the grouped value; I would like to break it out into single values, such as Account Locked and Invalid Login, each in a separate tile.
Hi, I have a few questions regarding Dashboard Studio:
- Is there any way to customise the shape menu, e.g. the line shape?
- Can I rotate an image to fit my design?
- How or where can I find more shape images? (See the attached image below.)
- How can I make text appear like a shadow in this space?
Hi All, I have created map tiles in Dashboard Studio. The query runs without issue, but I cannot see the output, and I get the same error message for multiple map tiles. The map layouts below are from Dashboard Studio:
- Marker Layer with Base Configurations
- Marker Layer with Dynamic Coloring
- Bubble Layer with Single Series
- Bubble Layer with Multiple Series
- Choropleth Layer World
- Choropleth Layer with hidden base layer

Code:
index=test "event.Properties.apikey"="*" "event.endpoint"="*" | iplocation event.Properties.ip | dedup event.Properties.ip | top limit=20 Country

Output: blank, even though the query returns data and no error is triggered.
When importing Prometheus metric data into Splunk, the following errors are output. (Importing is performed using 'Prometheus Metrics for Splunk'.)

From /opt/splunk/var/log/splunk/splunkd.log:

WARN PipelineCpuUsageTracker [1627 parsing] - No indexkey available chan=source::prometheusrw:sourcetype::prometheusrw:host::splunk-hf-75869c4964-phm44 timetook=1 msec.

WARN TcpOutputProc [9736 indexerPipe] - Pipeline data does not have indexKey. [_path] = /opt/splunk/etc/apps/modinput_prometheus/linux_x86_64/bin/prometheusrw\n[_raw] = \n[_meta] = punct::\n[_stmid] = 3CUUsSnja9PAAB.B\n[MetaData:Source] = source::prometheusrw\n[MetaData:Host] = host::splunk-hf-6448d7ffdb-ltzbr\n[MetaData:Sourcetype] = sourcetype::prometheusrw\n[_done] = _done\n[_linebreaker] = _linebreaker\n[_charSet] = UTF-8\n[_conf] = source::prometheusrw|host::splunk-hf-6448d7ffdb-ltzbr|prometheusrw|2\n[_channel] = 2\n

Please tell me the cause of the error and how to deal with it.
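For context on the warnings above: "No indexkey available" / "Pipeline data does not have indexKey" means the event reached the output pipeline without a target index being resolved for the prometheusrw input (note also that `[_raw]` is empty in the second event, i.e. the record itself carried no payload). A common cause is that the input stanza does not set an `index`, or names one that does not exist as a metrics index on the indexers. As a heavily hedged sketch, assuming the modinput_prometheus app's stanza naming (verify the exact setting names against the app's README/inputs.conf.spec):

```
# inputs.conf on the heavy forwarder -- stanza name, port, and index
# are illustrative; check the app's documentation for your version
[prometheusrw://my_metrics]
port = 8098
index = prometheus_metrics
sourcetype = prometheusrw
disabled = 0
```

Make sure the named index exists as a *metrics* index on the indexers (or is created there), and restart the forwarder after changing the stanza; if the warnings persist for events with empty `_raw`, they may just reflect empty remote-write batches rather than lost data.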