I saw there are responses from 2013 and 2015 saying you cannot rename a report. Why is this still not a thing? Is there something preventing this from being added? This seems very basic, and it is disappointing that I cannot rename my report without deleting and recreating it.
All, what is the best way to update a KV store using automation: a Python script or the REST APIs? I am looking to take data from log files and update a KV store based on that data or an extraction.
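A minimal sketch of the Python route using the splunk-sdk package, assuming a KV store collection already exists; the collection name, host, and credentials below are placeholders:

import json
import splunklib.client as client  # pip install splunk-sdk

# Placeholder connection details -- adjust for your environment.
service = client.connect(
    host="localhost", port=8089,
    username="admin", password="changeme",
    owner="nobody", app="search",
)

# "assets" is a hypothetical collection that must already be defined
# (collections.conf) in the app above.
collection = service.kvstore["assets"]

# Insert a record built from values parsed out of a log file.
collection.data.insert(json.dumps({"host": "web01", "status": "patched"}))

# Update an existing record by its _key.
for record in collection.data.query(query=json.dumps({"host": "web01"})):
    record["status"] = "verified"
    collection.data.update(record["_key"], json.dumps(record))

The same operations are also exposed over REST at /servicesNS/nobody/<app>/storage/collections/data/<collection>, so a plain HTTP client works too; the SDK just wraps those endpoints.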
Can we invoke custom JavaScript or CSS in the Dashboard Studio app to add animations? If yes, how?
How do I display input or value errors in a pop-up? I am trying to build a custom command and want to show errors raised or returned in a pop-up or modal. For example, with inputlookup, if no CSV name is provided, it returns the error below. How can I show it in the form of a pop-up or modal? Also, how do I remove the first line and display only the second and third lines? @splunk @niketnilay
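On the custom-command side, a hedged sketch of surfacing a clean, single-line error with the splunklib searchcommands protocol; this lands in Splunk's standard error banner rather than a true modal (a modal would still need dashboard JavaScript), and the command and option names here are made up:

import sys
from splunklib.searchcommands import dispatch, GeneratingCommand, Configuration, Option

@Configuration()
class MyLookupCommand(GeneratingCommand):
    # "filename" is a hypothetical option for this sketch.
    filename = Option(require=False)

    def generate(self):
        if not self.filename:
            # write_error shows just this message in the search UI's red banner.
            self.write_error("You must specify a CSV file, e.g. mylookup filename=foo.csv")
            return
        yield {"_raw": "ok", "filename": self.filename}

dispatch(MyLookupCommand, sys.argv, sys.stdin, sys.stdout, __name__)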
Hi Community! If I know the SID (search ID), is there a way I can see the scheduled job/report/search associated with that SID?
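Two hedged pointers: for scheduled searches the SID itself usually embeds the saved search name (scheduler__<user>__<app>__<name>_at_<epoch>_<n>), and while the job artifact still exists you can read its label via the Python SDK; host and credentials below are placeholders:

import splunklib.client as client  # pip install splunk-sdk

service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme")

sid = "scheduler__admin__search__my_report_at_1660000000_123"  # hypothetical SID
job = service.jobs[sid]           # raises KeyError once the job has expired
print(job["label"])               # saved search name, when the job came from one
print(job["isSavedSearch"])       # "1" for scheduled saved searches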
We removed a number of files to prevent problems with log4j. Now when I run a file integrity check, those removed files show up as "missing". Since we know we removed them, I would like the file integrity check to skip those files. How do I do this?
I'm using my on-prem DS to push out apps to my UFs. The current cert has expired; how can I push a new cert to my UFs? I see that on my DS I have a directory /opt/splunk/etc/deployment-apps/100_splunkcloud/default/, and in this directory there is a server.pem file with last year's date. Is this where I need to put the new pem file? I thought it went in the /opt/splunk/etc/deployment-apps/100_splunkcloud/local directory instead. Thank you!
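A hedged sketch of the usual local-override route, on the assumption that the app's default/outputs.conf is what references server.pem (check the actual tcpout group name there; "splunkcloud" below is a guess): drop the new pem into the app's local directory, override the path, then reload the deployment server so the app is repushed.

# /opt/splunk/etc/deployment-apps/100_splunkcloud/local/outputs.conf
# Stanza name must match the tcpout group in default/outputs.conf.
[tcpout:splunkcloud]
clientCert = $SPLUNK_HOME/etc/apps/100_splunkcloud/local/server.pem

Overriding in local survives app updates, whereas swapping the pem inside default would be lost if the app is ever replaced from source.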
Hello, I have one data source whose feed comes in through an inputs.conf file located under a default folder, and it is currently assigned to one sourcetype. It has files with three different naming conventions, and I have to create three sourcetypes based on that. How should I do it? Should I create separate configuration files (props and inputs) inside the local folder and assign the three sourcetypes there, leaving the inputs.conf under the default folder as it is? Or should I make the changes within the inputs.conf located in the default folder? But it is recommended not to make any changes within the default folder. Your recommendation would be highly appreciated. Thank you!
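The local-folder route is the safe one, since settings in local override default and survive app upgrades. A minimal sketch assuming the three conventions can be separated by filename pattern; the paths and sourcetype names below are placeholders:

# $SPLUNK_HOME/etc/apps/<your_app>/local/inputs.conf
# One monitor stanza per naming convention, each with its own sourcetype.
[monitor:///var/log/feed/app_*.log]
sourcetype = myfeed:app

[monitor:///var/log/feed/db_*.log]
sourcetype = myfeed:db

[monitor:///var/log/feed/web_*.log]
sourcetype = myfeed:web

You would then disable or narrow the original stanza so the same files are not picked up twice, and add matching stanzas in local/props.conf for parsing.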
Hi there, I first downloaded the Machine Learning Toolkit app but was not able to run it due to this error: "Python for Scientific Computing is a Splunk Add-on that includes several Python libraries for scientific computing, including numpy, scipy, pandas, scikit-learn, and statsmodels. Several of the dashboards included in the Machine Learning Toolkit require these modules. Please download and install the platform-specific version of this add-on that is appropriate for your Splunk Search Head:" So I downloaded the correct add-on, Python for Scientific Computing, but neither of the apps is working.
Below is a sample log:

{
    context: default
    level: INFO
    logger: logginfdata.pre-request.util
    mdc: { }
    message: this is a json request [evenId=76546787678888899999]]
    thread: RealtimeExecutor-1999
    timestamp: 2022-03-23 15:44:41.965
}

May I know how I can write props for this kind of log?
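A hedged starting point for props.conf, assuming each event begins with { and uses the timestamp format shown above; the sourcetype name is a placeholder:

# props.conf
[my_json_app]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\{)
TIME_PREFIX = timestamp:\s+
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 40
KV_MODE = json

Note that KV_MODE = json only extracts if the events are valid JSON (quoted keys); if they are not, you would fall back to EXTRACT-* regexes, and if they are, INDEXED_EXTRACTIONS = json on the forwarder may be the better fit.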
In Splunk Enterprise 9.0.0.1, I scheduled a saved search with an invalid macro name in it. When run, I receive the following error message, as I should: Error in 'SearchParser': The search specifies a macro 'my_macro' that cannot be found. Reasons include: the macro name is misspelled, you do not have "read" permission for the macro, or the macro has not been shared with this application. Click Settings, Advanced search, Search Macros to view macro information. The search was skipped, the error was logged to scheduler.log, and the log was ingested into _internal, all as expected. However, the reason field gets cut off because of the quotation marks in the error message: it thinks the field value ends at "have" when it should end at "information." I believe this is a minor defect. Is there any way to submit a bug report? I tried creating a case but received a message saying I don't have a Support Contract or entitlement to do so. Can anyone point me in the right direction? Thanks! Edit: Created Splunk Ideas post: https://ideas.splunk.com/ideas/EID-I-1586
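As a search-time workaround until the extraction behaves, a hedged SPL sketch that re-extracts the full message from _raw with rex; the field name full_reason is made up here:

index=_internal sourcetype=scheduler status=skipped
| rex field=_raw "(?<full_reason>Error in 'SearchParser':[^\r\n]+)"
| table _time savedsearch_name full_reason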
I recently re-installed the MS Windows AD Objects app due to some issues. After the re-install, I tried the lookup build configuration wizard, but it doesn't seem to build the lookups, even though the wizard ran successfully with all-green "successful" messages. I tried resetting the admon baseline and adding a manual domain input, but still no luck. The indexes look correct and the logs are still being ingested. I used the pre-defined TA inputs.conf files and am mainly working with one DC. This DC has the apps below:

Splunk_TA_windows
Splunk_TA_windows_dc
Splunk_TA_windows_admon

The main lookup I'm trying to build is 'AD_User_LDAP_list', as my searches with this lookup show the error message "The lookup table 'AD_User_LDAP_list' requires a .csv or KV store lookup definition." Can somebody point me in the right direction to fix this issue?
Want a clear path forward to getting started with Splunk training, and to feel empowered by better self-service to discover the educational courses that best fit your needs? Visit the new Splunk Training and Certification experience on our newly designed main page! A sample of the new pages included in this launch:

Training & Certification Page
Learning Paths
Course Catalog
FAQs

You'll be inspired to discover, share, and connect for an overall improved learning experience by finding the right learning paths and exploring the course catalog. We've also launched a new Learning Rewards Program! Find out what this new program is all about by visiting its page after completing Splunk Training and Certification classes, and earn points that you can redeem for Splunk swag. Splunk courses are all about empowerment: increasing relevance for your role and your team(s), unlocking innovation, and driving your education journey so it's designed just for you. We offer both free and paid learning options (including bundles), so you have optimal access to the courses and training and can choose what you need. Get started by visiting the Splunk Training and Certification page and register today! — Michelle Schlachta, Community + Content at Splunk
Hello, I currently have the DB Connect plugin installed to receive logs from an Aurora database. To date everything works without problems, but my client tells me that he needs to go from version 11.6 to version 11.5. I would like to know whether I need to do something, or whether the fact that it is already working with the current version implies that a different version should not affect anything. https://docs.splunk.com/Documentation/DBX/3.8.0/JDBCPostgres/About
Hello, I am currently receiving ADAudit Plus logs, but I have no idea what use cases I can draw from this source. I also do not see an app with dashboards that would help me. Any suggestions?
Hello, I am performing a Splunk UF installation of version 8.0.5 and getting the following errors logged:

==> splunkd.log <==
08-23-2022 10:00:29.018 -0400 WARN TcpOutputProc - Pipeline data does not have indexKey. [_conf] = |||\n
08-23-2022 10:00:29.018 -0400 WARN TcpOutputProc - The event is missing source information. Event : no raw data

I am not sure what these errors mean. I can see that the sources defined in the config files are sending logs to Splunk. Could someone suggest how I can fix these errors to make sure there is no data loss?
Hi Team, I'm trying to get response times from the logs below by using TRACE_ID (or any other unique value), as my logs don't have any specific URL.

http-nio-8080-exec-8,WARN,com.xxx.product.stoc.jpa.graph.AgreementProcessorServiceImpl, CHANNEL_ID : UI, RUN_ID : F3E51C72B62AC15C4E3FF2458A30C88F, TRACE_ID : 7uITsJ7CQ7MbWZZQZ9Ntz3, COLLATE_USER_ID : mashetta, EXTERNAL_USER_ID : _]  dt.trace_sampled: true, dt.trace_id: ed71da8c7bedadc2c9c568c04d91eafe, dt.span_id: feb179cfd106945fFacility 5766 has no Exposures, which are needed for a LGD calculation
scheduling-1,INFO,com.xxxeventbus.sdk.listenerimpl.service.RetryCron, CHANNEL_ID : , RUN_ID : , TRACE_ID : , COLLATE_USER_ID : , EXTERNAL_USER_ID : ] Deleted 6 Completed or Failed received events.
elastic-595,WARN,com.xxx.product.stoc.jpa.graph.AgreementProcessorServiceImpl, CHANNEL_ID : , RUN_ID : , TRACE_ID : , COLLATE_USER_ID : , EXTERNAL_USER_ID : ] All the Exposures have no Start Date and all the LED/Exposure links have no Rank
EntityChangeExecutor_39,WARN,org.elasticsearch.client.RestClient, CHANNEL_ID : , RUN_ID : , TRACE_ID : , COLLATE_USER_ID : , EXTERNAL_USER_ID : ]  dt.trace_sampled: true, dt.trace_id: 73a4a5de4e28679b9c9330c852d9cc59, dt.span_id: 229f2f590836fc19request [POST http://<myurl:9200/] returned 2 warnings: [299 Elasticsearch-7.17.2-de7261de50d90919ae53b0eff9413fd7e5307301 "Elasticsearch built-in security features are not enabled. Without authentication, your cluster could be accessible to anyone. See <HTML File> to enable security."],[299 Elasticsearch-7.17.2-de7261de50d90919ae53b0eff9413fd7e5307301 "[ignore_throttled] parameter is deprecated because frozen indices have been deprecated. Consider cold or frozen tiers in place of frozen indices."]
elastic-602,INFO,com.xxx.product.stoc.jpa.service.eligibility.EligibilityServiceImpl, CHANNEL_ID : , RUN_ID : , TRACE_ID : , COLLATE_USER_ID : , EXTERNAL_USER_ID : ] Calculated eligibility of LED 6209 as 0 with result id 1885.
elastic-602,INFO,com.xxx.product.stoc.jpa.service.eligibility.EligibilityServiceImpl, CHANNEL_ID : , RUN_ID : , TRACE_ID : , COLLATE_USER_ID : , EXTERNAL_USER_ID : ] Checking eligibility of LED 6209...
INFO,com.xxx.product.stoc.jpa.service.eligibility.EligibilityServiceImpl, CHANNEL_ID : , RUN_ID : , TRACE_ID : , COLLATE_USER_ID : , EXTERNAL_USER_ID : ] Calculated eligibility of LED 6235 as 0 with result id 1883.

Your help is much appreciated.
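A hedged SPL sketch, assuming the earliest and latest events sharing a TRACE_ID bracket one request; the index/sourcetype and the rex pattern are assumptions against the sample above, and events with an empty TRACE_ID simply drop out:

index=myapp sourcetype=mylogs
| rex "TRACE_ID : (?<trace_id>[^\s,]+)"
| where isnotnull(trace_id)
| stats range(_time) as response_time_sec, count by trace_id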
I have a field value like this that I want to exclude:

[22minfo[3: host.console[0]

The searches I can think of either don't do anything or return an error. Note, I am trying to speed up a search, so I do not want to use regex. Searches I tried:

message != [*
message != "["*
message != "[*"
message != '[*'
message != '['*
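One hedged alternative that avoids regex is eval's like() with % as the wildcard, where [ has no special meaning (index and field names are placeholders):

index=myindex
| where NOT like(message, "[%")

Also worth checking: [22m looks like an ANSI color code, so there may be a non-printing escape character in front of the [ in the raw data, which would explain why literal matches on a leading [ never hit.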
We have some servers deployed in AWS, and we want to monitor some files on them. Typically I'd go with the UF, but in this case our indexers only have private IPs. We do have some Heavy Forwarders that can be publicly addressed; we have only used those for HEC, though. The Heavy Forwarders do have receiving set up on port 9997, but wouldn't that index the data locally on those servers? Have any of you had a similar issue? We have a clustered on-prem environment, BTW.
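A hedged sketch of the usual intermediate-forwarder pattern (hostnames below are placeholders): point the AWS UFs at the public HFs, and on each HF forward onward to the indexers while disabling local indexing:

# On each UF: outputs.conf (e.g. via a deployment app)
[tcpout]
defaultGroup = hf_tier

[tcpout:hf_tier]
server = hf1.example.com:9997, hf2.example.com:9997

# On each HF: outputs.conf
[tcpout]
defaultGroup = indexers

[tcpout:indexers]
server = idx1.internal:9997, idx2.internal:9997

[indexAndForward]
index = false

With [indexAndForward] index = false, the HF parses and relays the data but keeps nothing locally.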
Hi everyone, I'm looking for a solution here. While playing around with the App Builder on SOAR, I can get the asset interface to work fine, and from the code I can get the values from there, but the password type returns an encrypted string instead (as the field is a password field). How can I decrypt it so the code can use that value at runtime?
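If this is inside the app's connector code, a hedged sketch of the usual pattern: read the asset through BaseConnector.get_config(), which normally hands back password fields already decrypted at runtime (fetching the asset over the REST API returns them encrypted instead); the field name below is hypothetical:

import phantom.app as phantom
from phantom.base_connector import BaseConnector

class MyConnector(BaseConnector):
    def initialize(self):
        config = self.get_config()
        # "my_password_field" is whatever name the asset field was given.
        self._password = config.get("my_password_field")
        return phantom.APP_SUCCESS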