All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


02-10-2022 09:00:35.120 -0500 INFO TailingProcessor [5728 MainTailingThread] - Adding watch on path: C:\.
I need to create a search (or an embedded search that feeds data to another search). What I'm trying to get is a search like |tstats values(host) where index=* by index, which might feed a spreadsheet that has server and host, and then another search on top of it to match up host with index (NOT indexers). |tstats values(host) where index=* by index
Hello to all. I am using the CEF Extraction TA for extracting CEF fields in a FireEye log. When I test this on a standalone system with Indexer and Search Head, the cs#Label fields extract correctly. As soon as I put this in an environment with a Heavy Forwarder, Indexer, and Search Head distributed (or even just Indexer and Search Head), the fields will not extract. I am at my wit's end here. Help? Thanks!
Hi, I recently received a warning message like the following when installing Enterprise Console on a Linux machine. Apparently because of that, I'm not able to log on to the Enterprise Console. FirewallD is not running, so I assume there's no blocking. I retried the installation several times, but always with the same result. Any thoughts on this one?
The existing release of signalfx-tracing uses the "tar" package v4, which has the following vulnerability: tar package versions before 6.1.4 are vulnerable to Regular Expression Denial of Service (ReDoS). When stripping the trailing slash from files arguments, it was using f.replace(/\/+$/, ''), which can get exponentially slow when f contains many / characters. This is "unlikely but theoretically possible" because it requires that the user is passing untrusted input into the tar.extract() or tar.list() array of entries to parse/extract, which would be quite unusual. As a security-first policy in our organization, we strive to keep updating to the latest fixes for all vulnerable packages. We are currently blocked because we use signalfx-tracing@latest. We need signalfx-tracing to update to the latest version of tar and release a package with no other breaking changes. For more details, refer to this GitHub issue: https://github.com/signalfx/signalfx-nodejs-tracing/issues/97
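For context on why the upgrade matters: the patched tar versions strip trailing slashes without regex backtracking. A rough Python illustration of the linear-time approach (the function name is mine, not from the package):

```python
def strip_trailing_slashes(path: str) -> str:
    """Remove trailing '/' characters in linear time.

    rstrip scans from the end of the string and never backtracks,
    unlike a naive regex such as /\/+$/ applied to adversarial input.
    """
    return path.rstrip("/")


print(strip_trailing_slashes("a/b///"))  # a/b
```

This is only a sketch of the idea behind the fix, not the actual patch in the tar package.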
Hello Everyone, I have a requirement where I have to generate a query.
Event 1: <l:event dateTime="2023-02-10 11:28:49.299"......some ******data*****<ns2:orderNumber>111040481</ns2:orderNumber>*****some****data****<ns2:customerType>B2C</ns2:customerType>
Event 2: event dateTime="2023-02-15 11:28:49.299"......some ******data*****<ns2:orderNumber>111040481</ns2:orderNumber>*****some****data****
I have to fetch orderNumber from event 2 and customerType from event 1, as orderNumber is unique. Since event 1 and event 2 are on different dates, can we write a query to get a report?
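Not an official answer, but the correlation itself is just a group-by on orderNumber (in SPL this would typically be something like stats values over the extracted fields by orderNumber). A rough Python sketch of that merge logic, with regexes I made up to match the sample events above:

```python
import re
from collections import defaultdict

# Hypothetical extraction patterns based on the sample events in the post.
ORDER_RE = re.compile(r"<ns2:orderNumber>(\d+)</ns2:orderNumber>")
CUST_RE = re.compile(r"<ns2:customerType>(\w+)</ns2:customerType>")


def correlate(raw_events):
    """Group events by order number, merging customerType across events."""
    merged = defaultdict(dict)
    for raw in raw_events:
        order = ORDER_RE.search(raw)
        if not order:
            continue  # skip events with no order number
        row = merged[order.group(1)]
        cust = CUST_RE.search(raw)
        if cust:
            row["customerType"] = cust.group(1)
    return dict(merged)
```

Under these assumptions, feeding both sample events in yields one row keyed by 111040481 carrying the B2C customer type from event 1.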
Hi there everyone, I'm a fresh beginner with the Splunk SDK. I was exploring and tried to add a user using the library with the code below, and it ended up with the error at the bottom.

import splunklib.client as client

HOST = HOST
PORT = 8089
BEARER_TOKEN = BEARER_TOKEN

# Create a Service instance and log in
try:
    service = client.connect(
        host=HOST,
        port=PORT,
        splunkToken=BEARER_TOKEN,
        verify=False
    )
    if service:
        print("connected....")
except Exception as e:
    print(e)

try:
    service.users.create(
        username="test123",
        password="test321",
        roles="admin"
    )
except Exception as e:
    print(f"and as it goeeees {e}")

TimeoutError: [Errno 60] Operation timed out

Any suggestions or guidance would be awesome. Sincerely, Haydar
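A TimeoutError at connect time usually means the management port (8089) is unreachable from the client before any SDK logic runs. A small standard-library sketch (not part of the Splunk SDK) for checking TCP reachability first, so you can tell a network problem apart from an authentication one:

```python
import socket


def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and DNS failures alike.
        return False
```

If this returns False for your Splunk host on port 8089, the fix lies in firewalls, routing, or the management-port binding rather than in the SDK call itself.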
We have upgraded Splunk Enterprise to 8.1 and Alert Manager to 3.1.11. After the upgrade, alerts are not getting auto-assigned; they keep sitting in "new" status and hence are not getting processed. Any leads on this problem will be appreciated. Thanks
We use Splunk Cloud and one on-premises HF. Using Splunk_TA_juniper in Splunk Cloud, we get Juniper logs as syslog. What do I need to do to get field extraction working?
Hello. I am trying to test an app version before updating it in a test environment, but I receive an error after running this command: ./splunk apply shcluster-bundle -target https://vxxxxxxx:8089

Error:

WARNING: Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details. Error while creating deployable apps: Error copying src="/opt/splunk/etc/shcluster/apps/Splunk_TA_snow" to staging area="/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow". 9 errors occurred. Description for first 9: [{operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/metadata/local.meta", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/metadata/local.meta"}, {operation:"transfering contents from source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/metadata", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/metadata"}, {operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/local/passwords.conf", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/local/passwords.conf"}, {operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/local/splunk_ta_snow_settings.conf", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/local/splunk_ta_snow_settings.conf"}, {operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/local/splunk_ta_snow_account.conf", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/local/splunk_ta_snow_account.conf"}, {operation:"transfering contents from source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/local", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/local"}, {operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/lookups/snow_cmdb_ci_list.csv", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/lookups/snow_cmdb_ci_list.csv"}, {operation:"transfering contents from source to destination", error:"No such file or directory", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/lookups", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/lookups"}, {operation:"transfering contents from source to destination", error:"No such file or directory", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow"}]

Looks like permission denied. I tried with both the root user and the splunk user; the error stays the same.
Hello, I am looking for some guidance on licensing, please. AppD has two licensing models for Commercial SaaS: ABL and IBL.
1) Is it possible to convert a customer's licensing from ABL to IBL?
2) If this is possible, would the controller(s) need to be re-configured? And would this mean effectively starting over in terms of application mapping, health alerts, dashboards, and non-out-of-the-box instrumentation? The License Entitlements and Restrictions page does not cover this: License Entitlements and Restrictions (appdynamics.com)
3) Both ABL and IBL licensing models are orderable via Cisco Commerce. However, it appears that for Cisco Enterprise Agreements 3.0, only IBL licensing (Enterprise and Premium tiers) is covered, and not ABL (Peak, Pro, Advanced). Does this mean that you can only move a customer to an EA if they are licensed for IBL?
I appreciate your input. Thanks
Hi team, we are using Splunk Cloud and one on-prem HF. We are getting Juniper logs as syslog, and we are using Splunk_TA_juniper in Splunk Cloud. How do I do field extraction from my end?
I have installed the Splunk forwarder on a Windows server. I would like to configure an alert so that every time the disk is getting full, an email is sent to my email address. The same if the server is turned off or not detected on the network. Could you please help me? I checked the forum, which has related info:

sourcetype="WMI:FreeDiskSpace" PercentFreeSpace<10

What is the metrics index meant for? And I'm looking for which part of Splunk to use to set up my alert.

Splunk Search / Explanation:

| mstats avg(LogicalDisk.%_Free_Space) AS "win_storage_free" WHERE index="<name of your metrics index>" host="<names of the hosts you want to check>" instance="<names of drives you want to check>" instance!="_Total" BY host, instance span=1m
Search the metrics index(es) where perfmon disk space data is being collected and filter down to the desired host(s) to check.
| eval storage_used_percent=round(100-win_storage_free,2)
Convert percent storage free to percent storage used for readability.
| eval host_dev=printf("%s:%s\\",host,instance)
Create a new field that combines the host and disk drive.
| timechart max(storage_used_percent) AS storage_used_percent BY host_dev
Plot the storage used for each host and disk over time.

Windows disk drive utilization nearing capacity - Splunk Lantern
Monitor data through Windows Management Instrumentation (WMI) - Splunk Documentation
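The arithmetic in the eval step above is straightforward; a tiny Python sketch of the same free-to-used conversion, with a hypothetical alert threshold (the 90% cutoff is my assumption, not from the Lantern article):

```python
def storage_used_percent(free_percent: float) -> float:
    """Mirror of the SPL eval: round(100 - win_storage_free, 2)."""
    return round(100 - free_percent, 2)


def should_alert(free_percent: float, threshold: float = 90.0) -> bool:
    """Fire when used space reaches the (assumed) threshold percentage."""
    return storage_used_percent(free_percent) >= threshold
```

For example, 12.5% free converts to 87.5% used, which would not trigger a 90% alert, whereas 5% free (95% used) would.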
How do I route ECS real-time logs in Splunk? Please give me an idea about how to route ECS real-time logs in Splunk Enterprise. I can do this with Splunk Cloud, but Splunk Enterprise doesn't seem to have an option for that. Please let me know if anyone has any idea about real-time logs in Splunk. Please help.
Hello everyone, is it possible to collect logs from a Telegram chat into Splunk? Do any ready-made solutions exist?
I've a couple of queries:

index="main" app="student-api" "tags.studentId"=3B70E5 message="Id and pwd entered correctly" | sort _time desc

and

index="main" app="student-api" "tags.decision"=SOP_REQUIRED "tags.studentId"=3B70E5 | sort _time desc

I'd like to grab just the latest timestamp from both results (and the status code from one of them). However, I'd like to do this reading tags.studentId from a CSV file (the field name is student_id and it has ~100 entries). So the output should look like:

student_id | latest timestamp from 1st query | latest timestamp from 2nd query | status code from 2nd query

I installed Lookup Editor. Please let me know what next steps to follow (if there is an alternative to Lookup Editor, please suggest that too). Thanks
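Not a full SPL answer (which would likely combine inputlookup with stats max(_time) by student_id), but a rough Python sketch of the desired merge logic on sample events, to make the target table concrete; all field names beyond student_id are my assumptions:

```python
def latest_per_student(student_ids, q1_events, q2_events):
    """For each student_id from the lookup, take the latest _time from each
    result set, plus the status from the latest event in the second set."""
    rows = []
    for sid in student_ids:
        e1 = [e for e in q1_events if e["student_id"] == sid]
        e2 = [e for e in q2_events if e["student_id"] == sid]
        latest2 = max(e2, key=lambda e: e["_time"], default=None)
        rows.append({
            "student_id": sid,
            "latest_q1": max((e["_time"] for e in e1), default=None),
            "latest_q2": latest2["_time"] if latest2 else None,
            "status": latest2.get("status") if latest2 else None,
        })
    return rows
```

Students with no matching events simply get None in those columns, which mirrors what an outer-join style SPL search would produce.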
Q. Splunk Universal Forwarder (Ubuntu) -> Splunk Enterprise (Ubuntu). I set up inputs.conf after installing the UF, and I started the splunk service. But no data accumulates when I check on the Enterprise server. ㅠ.ㅠ
I need to upgrade the IBM WAS add-on. I'm getting "No spec file", as you can see from the attached. Can anyone help with how to solve it? Thank you in advance.
We have been using this add-on with Splunk Enterprise and Jira Service Desk on-prem, and we are moving to Splunk Cloud soon. I'm wondering if this app works with Splunk Cloud and the on-prem version of Jira Service Desk. Has anyone been able to successfully configure it this way?
Howdy, I was wondering if anyone has guidance on how to ingest data from Nagios Log Server. Prior to my arrival, we used Nagios-LS (I think 2.1) for several years to search and store logs from our devices. I've since implemented Splunk Enterprise, and we are no longer supporting Nagios-LS and RHEL. But I need to ingest the Nagios-LS database into Splunk in order to enable searching of the historical logs (several TBs' worth), and I'm not sure how to make that happen.
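Not an official migration path, but since Nagios Log Server stores its data in Elasticsearch, one common approach is exporting the documents and replaying them into Splunk via the HTTP Event Collector. A rough Python sketch of reshaping one exported document into a HEC payload; the index name, sourcetype, and field names here are assumptions, not anything Nagios or Splunk mandates:

```python
import json
from datetime import datetime


def to_hec_event(doc, index="nagios_history", sourcetype="nagios:ls"):
    """Wrap one exported Nagios-LS (Elasticsearch) document as a Splunk
    HTTP Event Collector payload, preserving the original timestamp."""
    ts = doc.get("@timestamp")  # ES timestamps are ISO 8601 strings
    epoch = None
    if ts:
        # HEC's "time" field expects epoch seconds.
        epoch = datetime.fromisoformat(ts.replace("Z", "+00:00")).timestamp()
    return json.dumps({
        "time": epoch,
        "index": index,
        "sourcetype": sourcetype,
        "event": doc,
    })
```

Batches of these payloads could then be POSTed to the HEC endpoint; keeping the original @timestamp means the historical events land at their real times rather than at ingest time.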