All Topics


I'm not sure what I've done. I'm getting an error when trying to use the Webtools curl add-on that I don't get from Postman or PowerShell:

https://<myhost>.splunkcloud.com:8089/services/server/introspection/kvstore/serverstatus

HTTPSConnectionPool(host='<myhost>.splunkcloud.com', port=8089): Max retries exceeded with url: /services/server/introspection/kvstore/serverstatus (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fb7520b0990>: Failed to establish a new connection: [Errno 110] Connection timed out'))

Are there any internal settings within Splunk Web likely to have that effect?
Does anyone know if it's possible to rename an HEC token, or do you have to create a new one and update the token everywhere? It seems like renaming should be an option under Edit, but I'm not seeing anything. Thanks!
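In case it helps to take inventory of the existing tokens before recreating one, a REST search along these lines should list the HEC inputs defined on the instance (the endpoint is the standard HTTP inputs API; the exact fields returned may vary by version):

```
| rest /services/data/inputs/http splunk_server=local
| table title, token, index, sourcetype
```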
Similar to some other existing community posts, I am having issues sending POST requests to the https://.../services/collector/event endpoint of my Splunk Enterprise server running on AWS, after following Splunk guides on creating a self-signed SSL certificate and using it.

Using -k in curl to skip verification works, but including --cacert myselfsignedca does not. I've gone further and even added relevant x509 extensions like SANs, with no success. The result from curl:

...
* successfully set certificate verify locations:
*   CAfile: ./splunkCA.pem
    CApath: /etc/ssl/certs
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (OUT), TLS alert, Server hello (2):
* SSL certificate problem: self signed certificate in certificate chain
* stopped the pause stream!
* Closing connection 0
curl: (60) SSL certificate problem: self signed certificate in certificate chain
More details here: https://curl.haxx.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not establish a secure connection to it. To learn more about this situation and how to fix it, please visit the web page mentioned above.
...

Any help is appreciated!
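One common cause of this exact error is the HEC listener presenting a certificate file that doesn't match the chain your --cacert file expects: the server cert file should contain the full chain (server cert, any intermediates, then the CA cert). A sketch of the relevant HEC SSL settings, with hypothetical file names:

```
# inputs.conf on the instance hosting HEC -- a sketch, paths are placeholders
[http]
enableSSL = 1
serverCert = /opt/splunk/etc/auth/mycerts/hec_server_chain.pem
sslPassword = <your key password>
```

If the serverCert file only contains the self-signed server certificate itself, curl will report exactly "self signed certificate in certificate chain" even when the CA file is correct.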
Hey Splunkers!

When an error occurs during the integration process, will it be recorded in the _internal index? Are data on-boarding / data parsing errors recorded in the _internal index? If so, a logical SPL query to troubleshoot those errors would be welcome. What kinds of integration errors are recorded in the _internal index?
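Yes — splunkd logs its own ingestion pipeline to _internal, including line-breaking and timestamp-parsing problems. A starting sketch (the components listed are the common parsing-related ones; adjust the filter to taste):

```
index=_internal sourcetype=splunkd (log_level=ERROR OR log_level=WARN)
    (component=AggregatorMiningProcessor OR component=LineBreakingProcessor OR component=DateParserVerbose)
| stats count by host, component, log_level
```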
According to my tests, the Authorization header should not have a space between the colon and the Splunk keyword. It should be "Authorization:Splunk ###-####..." and not "Authorization: Splunk ###-####...".

https://docs.splunk.com/Documentation/Splunk/9.0.1/Data/FormateventsforHTTPEventCollector

In other words, this works:

curl -k https://prd-p.splunkcloud.com:8088/services/collector -H "Authorization:Splunk ###-######" -d "{\"sourcetype\":\"_json\",\"index\": \"job1\",\"event\": {\"a\": \"value1\", \"b\": [\"value1\", \"value1\"]}}"

Whereas this does not work:

curl -k https://prd-p.splunkcloud.com:8088/services/collector -H "Authorization: Splunk ###-######-b680-72c7bd33f9bb" -d "{\"sourcetype\":\"_json\",\"index\": \"job1\",\"event\": {\"a\": \"value1\", \"b\": [\"value1\", \"value1\"]}}"
I want to create a subsearch based on the parent search's fields. I want to show only rows from cor_inbox_entry that include keys.OrderID (keys.OrderID is a substring of fullBodID).

Example fullBodID: infor-nid:infor:111:APRD00908_2022-09-06T12:01:26Z:?ProductionOrder&verb=Process&event=10545
Example keys.OrderID: APRD00908

index=elbit_im sourcetype=cor_inbox_entry
| spath input=C_XML output=bod path=ConfirmBOD.DataArea.BOD
| xpath outfield=fullBodID field=bod "//NameValue[@name='MessageId']"
| appendpipe [ search "metadata.Composite"=ReportOPMes2LN | search fullBodID = "*".keys.OrderID."*" ]
| table _time, fullBodID

Any ideas?
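The `"*".keys.OrderID."*"` concatenation is eval syntax and won't be evaluated inside a search command. One alternative sketch, assuming both event types live in the same index, is to let a subsearch turn each OrderID into a wildcard search term (index, sourcetype, and field names taken from the post; untested against real data):

```
index=elbit_im sourcetype=cor_inbox_entry
    [ search index=elbit_im "metadata.Composite"=ReportOPMes2LN
      | eval search="*" . 'keys.OrderID' . "*"
      | fields search ]
| spath input=C_XML output=bod path=ConfirmBOD.DataArea.BOD
| xpath outfield=fullBodID field=bod "//NameValue[@name='MessageId']"
| table _time, fullBodID
```

The subsearch's `search` field is applied as raw-text terms against the parent events, which works here only because the OrderID appears literally in the XML payload.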
Hi, the initial situation is the following: I have an all-in-one instance that simultaneously takes on the role of the DS, and a UF that sends its data to the AiO. The required stanzas were distributed as a separate app, in addition to the Linux TA, via the DS. Scripted inputs from the TA like "vmstat.sh" or "netstat.sh" can be browsed on the AiO and work so far.

In the next step I wanted to activate the "cpu_metric.sh" stanza and proceeded like this:

1. I created a metric index on the AiO, called "linux_metrics".
2. I configured the inputs.conf under "deployment-apps" on the AiO and enabled the stanza. This config was pushed to the UF, or rather pulled by the UF. Config:

[script:///opt/splunkforwarder/etc/apps/Splunk_TA_nix/bin/cpu_metric.sh]
interval = 30
disabled = 0
index = linux_metrics

Unfortunately, no data ran into my metric index. "For fun" I tried the same procedure for other metric stanzas, which then immediately passed their data to the dedicated indexer. Standard solutions, like installing sysstat, have already been tried. Maybe one of you can think of something else. Thanks in advance.
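Two quick checks that might narrow this down (a sketch; the host name is a placeholder): whether any measurements arrived in the metric index at all, and whether the script is logging errors on the forwarder:

```
| mcatalog values(metric_name) WHERE index=linux_metrics

index=_internal host=<uf_host> source=*splunkd.log* cpu_metric
```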
Hi All, I am eager to know which would be the preferred monitoring tool, and which is relatively cost-effective as well: Splunk or Microsoft Sentinel. If there are any comparison documents or supporting documents, kindly share the link.
Hi everybody, I need your assistance if you have encountered this problem: I want to mask a particular field before it is processed by the Splunk indexers. I need to mask this field because the data will be transferred via a universal forwarder (an externally installed agent). Regards, Amira
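Since a universal forwarder does not parse events, masking is normally applied where parsing happens — on the indexers or a heavy forwarder. A minimal SEDCMD sketch, with a hypothetical sourcetype and field pattern:

```
# props.conf on the indexers (or a heavy forwarder) -- sourcetype and regex are placeholders
[my_sourcetype]
SEDCMD-mask_account = s/accountNumber=\d+/accountNumber=########/g
```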
Hello everyone, I am trying to create a REST API add-on with Splunk to download a .csv file from a Confluence page and push the information to Splunk. I would like to have a table with the content of the .csv file, but so far I have only been able to get one entry with 130 lines in the index and sourcetype defined by the add-on. It would help if I could split at the end of every line. I've tried to define the sourcetype in props.conf with a line breaker and some other regex params, and at this point I don't know how to move on.

Data sample (130 lines):

UserID,Profile,Role1,Role2,Licenses,Programs,Domain,FieldX
user139,Profile description,Role 1,Role 2,"lic1,lic2,lic3,lic4,lic5,lic6",protram,domain,extra_field
...
user139,Profile description,Role 1,Role 2,"lic1,lic2,lic3,lic4,lic5,lic6",protram,domain,extra_field

I've tried to reproduce the content as best as I could, including spaces. The content between the quotes all belongs to the "Licenses" header.
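For reference, a line-breaking sketch that would apply if the data goes through a parsing tier (the sourcetype name is a placeholder). Note that events emitted by a modular input script may arrive pre-formed as a single event, in which case the split has to happen in the add-on code itself rather than in props.conf:

```
# props.conf -- a sketch, not a verified fix for this particular add-on
[confluence_csv]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
INDEXED_EXTRACTIONS = csv
```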
Hi All, we are using the Microsoft Cloud Services Add-on for Splunk to integrate and ingest logs from Azure Storage Table and Azure Storage Blob, and we have already ingested data from both into Splunk successfully.

But I have a doubt: once the logs are ingested into Splunk, the volume ingested per day is tracked in GB and counted against the daily Splunk license. But since we are pulling the logs from Azure Storage Table and Blob, would there be any cost on the Azure side when we pull the logs from Azure to Splunk? And if yes, how much would Azure charge for that log transfer? If there are useful links to check on this, kindly share those as well.
Hi, I'm a new Splunk user and I'm asking for your help.

I would like to create a simple dashboard with VPN data. My search:

index="fw_paloalto" (sourcetype="pan:globalprotect" log_subtype="connected") OR (sourcetype="pan:system" log_subtype=auth signature="auth-fail")

With that data, I would like to get the following values in a single timechart with a 1d span:

> dc(user) where log_subtype=connected and host="PA-3020*"
> dc(user) where log_subtype=connected and host="PA-820*"
> c(user) where signature="auth-fail" and host="PA-3020*"
> c(user) where signature="auth-fail" and host="PA-820*"

For the moment I'm not able to display those values in the same chart; I'm forced to have one chart per host. Hope it is clear enough. Thanks a lot for your help, Dimitri
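A pattern that usually collapses this into one chart is timechart with eval-based aggregations, one series per host/condition pair. A sketch built from the search in the post (series names are arbitrary; untested against real data):

```
index="fw_paloalto" ((sourcetype="pan:globalprotect" log_subtype="connected") OR (sourcetype="pan:system" log_subtype=auth signature="auth-fail"))
| timechart span=1d
    dc(eval(if(log_subtype="connected" AND like(host,"PA-3020%"), user, null()))) as connected_PA3020
    dc(eval(if(log_subtype="connected" AND like(host,"PA-820%"), user, null()))) as connected_PA820
    count(eval(signature="auth-fail" AND like(host,"PA-3020%"))) as authfail_PA3020
    count(eval(signature="auth-fail" AND like(host,"PA-820%"))) as authfail_PA820
```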
Hello, in my dashboard I use the "stats count" function to show the number of results for a particular search. Now I want to divide this number by 3 and display the result instead. How do I do this? Thank you for your help!
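An eval right after the stats should do it (the result field name is arbitrary):

```
<your search> | stats count | eval result = round(count / 3, 2) | fields result
```

The dashboard panel would then display the `result` field instead of `count`.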
Hello Community, can you please advise me: where in the configuration can I find out which SMTP mail server my Splunk uses to send notifications to employees?

My configuration uses a Search Head Cluster. I'm trying to find the configuration file where my company's SMTP server is listed, through which Splunk sends alerts within our domain. I went to one of the search heads and wanted to see the configuration under Settings / Server settings / Email settings, but there are no settings listed there, even though this search head is sending alerts. Thank you for your feedback.
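The SMTP host lives in alert_actions.conf under the [email] stanza, which can sit in any app's local directory on the cluster members (which is why the Server settings page can appear empty). A REST search run on the search head should surface the effective value; `mailserver` is the standard setting name:

```
| rest /services/configs/conf-alert_actions splunk_server=local
| search title=email
| fields mailserver, from, use_tls
```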
Hi, I recently installed the Elastic Data Integrator app for migrating data from an ELK server to Splunk. After adding an input option and enabling the modular input, no data is received in Splunk. Is there anything additional that needs to be done while installing the modular input, or in the input options? My project requires obtaining raw data from ELK into Splunk. I am also new to Splunk; any help will be much appreciated. I am also attaching screenshots depicting my problem. Thanking you, Hari
Hi all, I am calculating a value from data and I want to plot it in a timechart.

| where status!="ABORTED"
| streamstats count as start reset_on_change=true by status URL
| where start=1
| streamstats count(eval(status=="FAILURE")) as fails by status URL
| eval fails=if(fails=0,null(),fails)
| filldown fails
| stats list(*) as * by fails URL
| where mvcount(status) = 2
| eval stime=mvindex(TIME, 0)
| eval etime=mvindex(TIME,-1)
| eval diff=(etime - stime)/3600/1000
| timechart span=1mon avg(diff) as MTTR by URL
| eval MTTR = round(MTTR,2)

I tried to plot a timechart like this, but it is not working and gives "no results found". Does anything special need to be done to plot a calculated value in a timechart?
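One likely cause, reading the pipeline above, is that `_time` no longer exists after the `stats ... by` command, and timechart needs it. Restoring it from one of the captured timestamps might help; a sketch of the tail of the pipeline (assuming TIME holds epoch milliseconds, as the /1000 division suggests):

```
| eval stime=mvindex(TIME,0), etime=mvindex(TIME,-1)
| eval diff=(etime - stime)/3600/1000
| eval _time=etime/1000
| timechart span=1mon avg(diff) as MTTR by URL
| foreach * [ eval <<FIELD>> = round('<<FIELD>>', 2) ]
```

Note also that after `by URL` the result columns are named per URL, so a plain `eval MTTR=round(MTTR,2)` would find no MTTR field; `foreach` is the usual way to round every split column.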
I whitelisted some hosts in the DS, but those servers are not being deployed to. So I checked the internal splunkd logs, and I am seeing the error below:

-0400 ERROR HttpListener - Handler for /services/streams/deployment?name=default:fli_events_prod_12963_d209_corp_dist:fli_events_prod_12963_d209_corp_dist sent a 300 byte response after earlier claiming a Content-Length of 10240!

Please help me figure out how to fix this.
Hi All, we used an Azure certificate for single sign-on for our application. It was working fine until we updated the certificate because its validity had expired. Everything is the same as before; we have not made any changes in any conf files. The path of the IdP cert is "\etc\auth\idpCerts". The authentication file is in \etc\system\local:

[saml]
idpCertPath = idpCert.pem

I don't understand what is causing this issue. Can anybody please help me with this?
Splunk memory usage is over 95%, and the process with the highest usage, named [Splunk server], is at over 60%. The high memory usage has persisted since Aug 29th, and it seems to have caused the Splunk server to go down. After I restarted it, memory usage soon climbed back to 95%. I also looked at the following page, and after researching I think it's a splunkd problem, so I need your help to solve it.

URL: https://docs.splunk.com/Documentation/Splunk/6.5.7/Troubleshooting/Troubleshootmemoryusage

Thanks a lot.
Country/Location: China/Mainland
Version Number: Splunk Enterprise Server 6.6.7
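To see which splunkd process is actually growing over time, the _introspection index records per-process resource usage. A sketch, using the standard resource_usage field names:

```
index=_introspection sourcetype=splunk_resource_usage component=PerProcess
| eval process='data.process', mem_used_mb='data.mem_used'
| timechart span=1h max(mem_used_mb) by process
```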
Hi, I'm trying to change the color of a line chart with:

<option name="charting.seriesColors">[000000FF]</option>

but the color remains the default red.
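For what it's worth, charting.seriesColors expects 6-digit hex values with a 0x prefix, one per series; an 8-digit value like 000000FF may simply be ignored, leaving the default palette. A sketch:

```
<option name="charting.seriesColors">[0x000000]</option>
```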