All Topics

Hi Team, I am only getting a 2-hour span in the line chart visualization, even if I set the span to 1, 3, or 4 hours. Running the command below returns the correct 1-hour span, but I am still facing the issue in the visualization; I have attached a reference. index="xx" * "*" | eval Day = case(like(Date,"%22-AUg-22"),"work", like(Date,"%23-AUg-22"),"work", like(Date,"%24-AUg-22"),"week", like(Date,"%25-AUg-22"),"week", 1=1, Day) | timechart span=1h max(YYY) by Day   Thanks in advance.
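If it helps narrow this down, here is a small diagnostic sketch (index and field names are taken from the post) to confirm whether the underlying results really are hourly; if the gaps come back as 60 minutes, the search side is fine and the 2-hour buckets are a chart-rendering effect rather than a search problem:

index="xx"
| timechart span=1h max(YYY)
| streamstats current=f last(_time) as prev_time
| eval gap_minutes = (_time - prev_time) / 60
| table _time, gap_minutes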
Hi, I would like to create a dropdown menu with two goals in mind: 1) adjust the button width to the size of the selected text so that the whole selected text can be read, and 2) in the dropdown list (when choosing a new item), many strings are too long to fit on one line, so I want to increase the list width to the size of the largest selectable item. I managed to fix issue 1) thanks to the helpful answer at https://community.splunk.com/t5/Dashboards-Visualizations/Why-is-the-dropdown-input-width-not-increasing/m-p/416340 by adding <html> <style> #Selection_DropDown div[data-component="splunk-core:/splunkjs/mvc/components/Dropdown"] {display: inline-block !important; width: auto !important } </style> </html> I am still struggling to find a reasonably simple SimpleXML/CSS solution to issue 2). Is it possible to meet both of these requirements? Cheers
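A possible starting point for issue 2), assuming the open dropdown list is rendered by the select2 widget that SimpleXML inputs use (the selectors below are an assumption — verify them with the browser inspector before relying on this):

<html>
  <style>
    /* Assumption: the open dropdown list is a select2 popup attached to the page body */
    .select2-drop, .select2-results li {
      width: auto !important;      /* let the list grow to fit its content */
      max-width: none !important;  /* remove any cap on the popup width */
      white-space: nowrap;         /* keep long labels on a single line */
    }
  </style>
</html>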
Greetings, I have been creating a search that collects all the sourcetypes that have not collected any information during the last 4 hours (which I was able to accomplish). The thing is, I also need to know which indexes these sourcetypes belong to in the same search. Any ideas? This is the search: | metadata type=sourcetypes index=* | search sourcetype=* | where lastTime<now()-14400 | eval ageInSeconds = (now()- firstTime) | search ageInSeconds > 86400 | convert ctime(lastTime) ctime(recentTime) ctime(firstTime) | table sourcetype, lastTime
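Since | metadata aggregates across indexes and does not return an index field, a tstats-based sketch along these lines (thresholds copied from the post) may be closer to what is needed:

| tstats max(_time) as lastTime, min(_time) as firstTime where index=* by index, sourcetype
| where lastTime < now() - 14400 AND (now() - firstTime) > 86400
| convert ctime(lastTime) ctime(firstTime)
| table index, sourcetype, lastTime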
I am trying the helloworld app from BlogProjects/splunk-custom-search-command-python/hello_world at master · CptOfEvilMinions/BlogProjects · GitHub. I compressed it, installed it from file (hello_world.spl), then restarted Splunk. But when trying "index="zeek" sourcetype="bro:conn:json" | helloworld" I get: Unknown search command 'helloworld'.
# ls -l /opt/splunk/etc/apps/hello_world/
total 4
drwxr-xr-x 2 splunk splunk 28 Sep 21 06:44 bin
drwxr-xr-x 2 splunk splunk 43 Sep 21 06:44 default
drwxr-xr-x 7 splunk splunk 140 Sep 21 06:44 lib
drwxr-xr-x 2 splunk splunk 6 Sep 21 06:44 local
drwx------ 2 splunk splunk 24 Sep 21 06:44 metadata
-rw-r--r-- 1 splunk splunk 46 Sep 21 06:44 README.md
# ls -l /opt/splunk/etc/apps/hello_world/bin
total 4
-rwxr-xr-x 1 splunk splunk 491 Sep 21 06:44 hello_world.py
# cat /opt/splunk/etc/apps/hello_world/default/commands.conf
[helloworld]
python.version = python3
chunked = true
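Two things worth checking, offered only as a sketch based on the listing above: commands.conf for a chunked (v2 protocol) command normally also names the script with filename, and the command has to be exported if the search is run from another app context:

# default/commands.conf
[helloworld]
filename = hello_world.py
chunked = true
python.version = python3

# metadata/default.meta (assumption: make the command visible outside the app)
[commands/helloworld]
export = system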
I am trying to create an alert to record failed logins for the Splunk servers; however, not all of them show up in my current alert. I can get the Search Heads and one of my Heavy Forwarders, but my Indexers, Deployment/License server, and Cluster Master/Monitoring Console server are not reporting. Is there something that needs to be added or enabled on these servers?
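For reference, a minimal sketch of the kind of search often used for this, assuming the alert is based on Splunk's own audit events (for every server to appear, its internal indexes such as _audit must be forwarded to the indexers that the search head searches):

index=_audit action="login attempt" info=failed
| stats count as failures, latest(_time) as last_failure by user, host
| convert ctime(last_failure)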
Hello, I would like to display dates in a Dashboard Studio table. I want the format to be "%Y-%m-%d", but it is not displayed as such. Here is the SPL excerpt: | eval vuln_publication_date_string = strftime(normalized_publication_time,"%Y-%m-%d") Attached are the result of the search associated with the table (the type of the field is a string) and the table itself. I guess it is due to the format, but I cannot change it. Does anybody have an idea how to force the format in the table? Thank you
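A small sketch of an alternative worth trying, in case the table is formatting the underlying epoch value rather than the string: fieldformat changes only how a numeric field is displayed while leaving its value intact (the field name is taken from the post):

| fieldformat normalized_publication_time = strftime(normalized_publication_time, "%Y-%m-%d")
| table normalized_publication_time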
Hello fellow Splunkers. I am trying to set the sourcetype name using a part of the source path. I've read the answers to the same question on the community, but I just can't get it working, so I'll give it a shot and ask here. The goal: set the sourcetype name from the third folder in the source path. For example, automatically set the sourcetype to "foobar" for logs collected from source C:\data\Logs\foobar\logfile.log. What I have tried: (on the universal forwarder) inputs.conf:
[monitor://C:\data\Logs\...\*]
index = main
(on the indexer receiving from the universal forwarder) props.conf:
[source::C:\data\Logs\*]
TRANSFORMS-changesourcetype = changesourcetype
transforms.conf:
[changesourcetype]
SOURCE_KEY = MetaData:Source
REGEX = C:\\data\\[^\\]+\\([^\\]+)\\
#REGEX = C:\\data\\Logs\\(\w+)\\   (commented out for debugging purposes)
FORMAT = sourcetype::$1
DEST_KEY = MetaData:Sourcetype
I've also tried (for debugging reasons) solving it differently, by tagging a temporary sourcetype at the UF: inputs.conf:
[monitor://C:\data\Logs\...\*]
index = main
sourcetype = changemeplease
props.conf:
[changemeplease]
TRANSFORMS-changesourcetype = changesourcetype
transforms.conf:
[changesourcetype]
SOURCE_KEY = MetaData:Source
REGEX = C:\\data\\[^\\]+\\([^\\]+)\\
#REGEX = C:\\data\\Logs\\(\w+)\\   (commented out for debugging purposes)
FORMAT = sourcetype::$1
DEST_KEY = MetaData:Sourcetype
The data gets forwarded and indexed, but the transforms do not seem to hit at all. Any suggestions on what I am doing wrong here?
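One detail that may explain the transform never firing, offered as a sketch: in a props.conf [source::...] stanza, * does not match across path separators while ... does, so [source::C:\data\Logs\*] will not match files one folder deeper. The value behind MetaData:Source also carries a source:: prefix, so an unanchored regex is safer. Paths are taken from the post:

# props.conf (on the indexer / first full Splunk instance in the path)
[source::C:\data\Logs\...]
TRANSFORMS-changesourcetype = changesourcetype

# transforms.conf
[changesourcetype]
SOURCE_KEY = MetaData:Source
# Capture the folder directly under C:\data\Logs
REGEX = (?i)C:\\data\\Logs\\([^\\]+)\\
FORMAT = sourcetype::$1
DEST_KEY = MetaData:Sourcetype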
I have to ingest some data, so I've created a field for customer data, and the regex works fine: ^[0-9]{16}.{249}(?<customer_information>.{174}). As it contains PII, I need to mask it but keep the format of the event, so the 174 characters within the customer_information field need to show as ####. I've created this within the props.conf file, but I can't get the data to be shown as ###. Can you help?
[mask_customer_data]
DEST_KEY = _raw
REGEX = ^[0-9]{16}.{249}(?<customer_information>.{174})
FORMAT = $1CI##############################################################################################################################################################################
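A sketch of how index-time masking is usually wired up, for comparison: the REGEX/FORMAT/DEST_KEY settings live in transforms.conf, props.conf points the sourcetype at them, and $1 only keeps what the regex actually captures. The sourcetype name is a placeholder, and the run of # is abbreviated here — it should be 174 characters long to preserve the event length:

# props.conf
[your_sourcetype]
TRANSFORMS-mask_customer_data = mask_customer_data

# transforms.conf
[mask_customer_data]
# Capture the leading 265 characters to keep, then overwrite the 174 PII characters
REGEX = ^([0-9]{16}.{249}).{174}
FORMAT = $1########
DEST_KEY = _raw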
Hi Splunkers, I work with Splunk Enterprise and wonder if I can modify content in the share folder. I would like to change the fonts of the web GUI to our corporate fonts. I found the fonts used by Splunk in $SPLUNK_HOME/share/splunk/search_mrsparkle/exposed/fonts and the CSS for them in ../exposed/build/css/bootstrap-enterprise.css (searched for proxima in the CSS). Will it have any impact on Splunk version updates or Splunk Support if I change the fonts to our corporate fonts? Thanks in advance for your answers.
I wonder if someone can help. We are getting the following error when trying to send data into Splunk. This previously worked, but now we can't seem to get it working at all. I have tried to curl the event manually and it succeeds, which is even stranger. The error message is: token name=xxxx, channel=********* source_IP=******, reply=6, events_processed=0, http_input_body_size=101493, parsing_err="While expecting event object to start: Unexpected character while looking for value: 'E', totalRequestSize=101493" The event we are trying to send looks like this:
{
    "time": 1663679182,
    "host": "test-sandbox",
    "source": "aws/lambda",
    "sourcetype": "aws:lambda",
    "index": "xxx-xxx",
    "event": {
        "message": "2022/09/20 14:06:22 node=test-sandbox Starting to move cantabm_testfile-9.zip from c21-metadata-dropzone-sandbox to c21-metadata-dest-sandbox/Metadata/cantabm_testfile-9.zip\n",
        "account": "11111111111"
    }
}
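That parsing error generally means the body that actually reached HEC did not start with a JSON event object — for example the sender prepended literal text (the unexpected 'E' could be the start of something like "Error" or "Event"), double-encoded the payload, or batched events with a separator HEC does not accept. For comparison, a minimal request the event endpoint accepts (the host and token below are placeholders):

# Assumed HEC host/port and token, for illustration only
curl -k https://splunk.example.com:8088/services/collector/event \
  -H "Authorization: Splunk 00000000-0000-0000-0000-000000000000" \
  -d '{"time": 1663679182, "host": "test-sandbox", "sourcetype": "aws:lambda", "index": "xxx-xxx", "event": {"message": "test", "account": "11111111111"}}'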
Hello! Using AppDynamics (SaaS Pro Edition), we would like to collect logs from Azure so we are able to create useful custom dashboards. The data we are interested in is currently collected by Log Analytics in Azure, but we would like to have some of that data in AppDynamics as well. All of the documentation related to Log Analytics in AppDynamics mentions Analytics Agent installation, but I don't think it applies to Azure App Services, for example; there is just the AppDynamics App Agent extension available, and afterwards there is no configuration to enable an Analytics Agent that you would need to select while configuring Agent Scope/Source Rules. Is there a possibility to make use of Analytics Agents on Azure to fetch some logs? Or is there another way to fetch the logs from Azure App Services that we can later see in Log Analytics or App Insights? I couldn't find an answer in the documentation. Thank you in advance! Marcin
I have one host that I want to remove from all my premade dashboards in the Splunk App for AWS Security Dashboards. Can someone tell me where I would enter this in the source code for the dashboards so that they always exclude this host?
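As a sketch only: one common approach is to edit each panel's query in the dashboard source and append a host exclusion such as the one below (the host name is a placeholder); the same filter could instead go into whatever shared macro or eventtype the panels are built on:

| search host!="unwanted-host"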
Dear Splunkers, is enabling maintenance mode on an indexer cluster needed when rebuilding frozen buckets?
Hi, I have configured an application on the SaaS platform; the application agent is successfully installed and is sending business transaction data. But under Application Infrastructure Performance -> Tier -> Hardware Resource, the value for all the metrics is null ("no data to display" message). Please advise whether any change needs to be made to capture the host metrics data.
Hi, I'm using the community edition of SOAR. I created a label, then created a playbook and set it to operate on that label. For the automated run, I used the Timer app to run the playbook on that label every morning. Now an error occurs: in automation, the newly created label results in an error, while an older label (created two days earlier) works properly. If I run the playbook manually on the newly created label, it also works properly. This is the error: 'id: 789, version: 32, pyversion: 3, scm id: 2' playbook cannot be run on 'test' Kindly help me out with this error. Thanks in advance.
In the Splunk Fortinet FortiGate app, the Wireless and System dashboards are not working; neither dashboard shows any data. As far as I can check on the Fortinet side, all logs are forwarded, and the other FortiGate dashboards in Splunk (network security, traffic, Unified Threat Management, VPN) work perfectly. Please help.
Just putting this here for others who come across this problem, since I got no results when I searched here. After upgrading to Splunk 9.0.1 and configuring an SSL cert for the kvstore, I got this error on one of my two instances. On Windows, the kvstore relies on the server cert being in the Windows local machine certificate store. At startup it converts the supplied PEM (with embedded cert and password-protected key) into a PFX, which it then imports into the store. The error referenced in the subject is preceded by:
ERROR MongodRunner [9060 KVStoreConfigurationThread] - Command cmd="{CMD.EXE /C ( "C:\Program Files\Splunk\bin\openssl.exe" pkcs12 -inkey "C:\Program Files\Splunk\etc\auth\mycerts\splunkd.pem" -in "C:\Program Files\Splunk\etc\auth\mycerts\splunkd.pem" -passin pass:xxxx -export -out "C:\Program Files\Splunk\etc\auth\mycerts\splunkd.pem.pfx" -passout pass:xxxx )}" failed: exited with code 1. unable to load private key\r\n10460:error:06065064:digital envelope routines:EVP_DecryptFinal_ex:bad decrypt:.\crypto\evp\evp_enc.c:590:\r\n10460:error:0906A065:PEM routines:PEM_do_header:bad decrypt:.\crypto\pem\pem_lib.c:476:\r\n
I tried this command and it did not work. I tried the openssl command by itself in PowerShell and it DID work. The problem turned out to be one of the special characters in the password I had set on my private key. I used OpenSSL to write out a new copy of the key with a different password with no special characters, and lo and behold, it worked. Just be careful to replace the new password in all locations it might be used (e.g., twice in server.conf and in inputs.conf on an indexer). The command for changing the password on your key is as follows:
.\openssl.exe rsa -aes256 -in mykey.key -out mynewkey.key -passin pass:oldpassword -passout pass:newpassword
Just make sure you are aware of what the special character in your old password might be, and use PowerShell, not the command prompt.
Hi Team, I have a field whose values are in the string format HH:MM:SS.3N, for example: 0:00:43.096, 22:09:50.174, 1:59:54.382, 5:41:21.623, 0:01:56.597. I want to convert the whole duration into minutes, with anything under a minute counted as 1 minute.
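A minimal SPL sketch of one way to do this, assuming the field is called duration (the field name is a placeholder); ceiling() rounds partial minutes up, and max(..., 1) enforces the one-minute floor for zero-length values:

| eval parts = split(duration, ":")
| eval total_seconds = tonumber(mvindex(parts, 0)) * 3600 + tonumber(mvindex(parts, 1)) * 60 + tonumber(mvindex(parts, 2))
| eval minutes = max(ceiling(total_seconds / 60), 1)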
The app is unable to collect metric data (metric_name="Memory.Page_Reads/sec"). Can anyone help with the app script? The operating system is Linux.