Within Splunk SOAR (Phantom), is there a way to have a prompt message pop up for the user running the playbook, as opposed to having to go to Notifications and then click the prompt notification to open it? The idea is that when certain playbooks are launched from workbooks, the prompt presents the user with a form that affects the actions/outcomes of the playbook. While having the prompt pop up directly isn't necessary for this, it would be a quality-of-life improvement and would help reduce confusion for new users. I could have sworn I saw this type of feature during a SOAR demo by Splunk, but I can't find it documented anywhere, if it exists.
I've configured a HEC to receive events from a Telegraf emitter, which provides metrics in the form: {"time":1676415410,"event":"metric","host":"VaultNonProd-us-east-2a","index":"vault-metrics","fields":{"_value":0.022299762544379577,"cluster":"vault_nonprod","datacenter":"us-east-2","metric_name":"vault.raft.replication.heartbeat.NonProd-us-east-2b-d992bf60.stddev","metric_type":"timing","role":"vault-server"}} All of the fields come across from the HF to our indexers except the one we're most interested in, the _value field. Searching around, I found https://docs.splunk.com/Documentation/DSP/1.3.1/Connection/IndexEvent which, in part, states that "Entries that are not included in fields include: any key that starts with underscore (such as _time)". Is it possible to include an underscore-prefixed field in the forwarded event? Thanks
I am trying to create a query that gets the sum of multiple fields, grouped by another field:

index="*****" | stats sum(field_A) as A by field_C, sum(field_B) as B by field_C | table field_C, field_A, field_B

This query gives an error.
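A corrected form of the search might look like the sketch below. In SPL, `stats` takes all of its aggregation functions before a single `by` clause, so the repeated `by field_C` is what triggers the error; note also that after renaming the sums to `A` and `B`, the `table` command has to reference those new names rather than the original field names:

```spl
index="*****"
| stats sum(field_A) AS A, sum(field_B) AS B BY field_C
| table field_C, A, B
```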
I am using Splunk to search old log files, and _time differs from the timestamp in the log. Is that expected, or do I have to parse the log so that _time is set to the log time? Thanks.
How do I verify that the forwarder is sending data to the indexer? What search do I need to perform, other than checking Forwarder Management?
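One common approach (a sketch, assuming the forwarder ships its internal logs to the indexer, which is the default behavior; replace the hostname placeholder with your forwarder's actual hostname) is to search the _internal index for the forwarder:

```spl
index=_internal host=<your_forwarder_hostname> source=*metrics.log*
| stats count BY host, sourcetype
```

If events come back, the forwarder is connecting and sending data. To confirm the actual monitored data is arriving, a quick check over all indexes also works: `| tstats count where index=* by host`.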
Hello, I'm a new Splunk compliance manager and I need some assistance. How do I check Splunk compliance, and how do I better manage licensing? Thanks, Rodney
Hi all, Splunk newbie with what I hope is a simple question... I have a UF installed on my Windows file server, and it is set to monitor a directory; see below:

[WinEventLog://Security]
checkpointInterval = 5
current_only = 0
disabled = 0
start_from = oldest
monit

[WinEventLog://System]
checkpointInterval = 5
current_only = 0
disabled = 0
start_from = oldest

[monitor://D:\documents\Confidential]
disabled = false

The intent is for it to report access/modifications/deletions to files in that directory, but I am not getting any file-monitoring activity back on my Splunk server when I perform a simple query for the Windows host. I do get all the System and Security events, though. Any ideas on why I'm not getting the file-monitoring activity? Thanks!
02-10-2022 09:00:35.120 -0500 INFO TailingProcessor [5728 MainTailingThread] - Adding watch on path: C:\.
I need to create a search (or an embedded search that feeds data to another search). What I'm trying to get is a search like

|tstats values(host) where index=* by index

which might feed into a spreadsheet that has server and host, and then another search on top of it to match up host with index (NOT indexers).
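A sketch of how the two pieces might be combined, assuming the spreadsheet has been uploaded as a CSV lookup file named `server_inventory.csv` with columns `host` and `server` (both the lookup filename and its column names are assumptions; adjust to your actual lookup):

```spl
| tstats values(host) AS host WHERE index=* BY index
| mvexpand host
| lookup server_inventory.csv host OUTPUT server
| table index, host, server
```

`mvexpand` splits the multivalue host list so each host/index pair becomes its own row before the lookup runs.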
Hello to all. I am using the CEF Extraction TA to extract CEF fields from a FireEye log. When I test this on a standalone system with an Indexer and Search Head, the cs#Label fields extract correctly. As soon as I put this in a distributed environment with a Heavy Forwarder, Indexer, and Search Head (or even just an Indexer and Search Head), the fields will not extract. I am at my wit's end here. Help? Thanks!
Hi, Recently I received a warning message like the following when installing Enterprise Console on a Linux machine. Apparently because of that, I'm not able to log on to the Enterprise Console. FirewallD is not running, so I assume there's no blocking. I retried the installation several times, but always with the same result. Any thoughts on this one?
The existing release of signalfx-tracing uses the "tar" package v4, which has the following vulnerability: tar package versions before 6.1.4 are vulnerable to Regular Expression Denial of Service (ReDoS). When stripping the trailing slash from files arguments, we were using f.replace(/\/+$/, ''), which can get exponentially slow when f contains many / characters. This is "unlikely but theoretically possible" because it requires that the user is passing untrusted input into the tar.extract() or tar.list() array of entries to parse/extract, which would be quite unusual. As a security-first policy, our organization strives to stay current with fixes for all vulnerable packages. We are currently blocked because we use signalfx-tracing@latest. We need signalfx-tracing to update to the latest version of tar and release a package with no other breaking changes. For more details, see this GitHub issue: https://github.com/signalfx/signalfx-nodejs-tracing/issues/97
Hello everyone, I have a requirement to generate a query.

Event 1: <l:event dateTime="2023-02-10 11:28:49.299"......some ******data*****<ns2:orderNumber>111040481</ns2:orderNumber>*****some****data****<ns2:customerType>B2C</ns2:customerType>

Event 2: event dateTime="2023-02-15 11:28:49.299"......some ******data*****<ns2:orderNumber>111040481</ns2:orderNumber>*****some****data****

I have to fetch orderNumber from event 2 and customerType from event 1, as orderNumber is unique. Since event 1 and event 2 are on different dates, can we write a query to get a report?
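One possible sketch: extract both fields with `rex` and then group by order number, which stitches the two events together as long as the search time range covers both dates. The index name is a placeholder and the regex patterns are assumptions based on the sample XML; adjust them to the real event layout:

```spl
index=<your_index> "<ns2:orderNumber>"
| rex "<ns2:orderNumber>(?<orderNumber>\d+)</ns2:orderNumber>"
| rex "<ns2:customerType>(?<customerType>\w+)</ns2:customerType>"
| stats values(customerType) AS customerType, min(_time) AS firstSeen, max(_time) AS lastSeen BY orderNumber
| convert ctime(firstSeen) ctime(lastSeen)
```

Because only event 1 carries customerType, `values()` picks it up for the shared orderNumber even though event 2 lacks the field.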
Hi there everyone, I'm a fresh beginner with the Splunk SDK. I was exploring and tried to add a user using the library and the code below, and it ended up with the error at the bottom.

import splunklib.client as client

HOST = HOST
PORT = 8089
BEARER_TOKEN = BEARER_TOKEN

# Create a Service instance and log in
try:
    service = client.connect(
        host=HOST,
        port=PORT,
        splunkToken=BEARER_TOKEN,
        verify=False
    )
    if service:
        print("connected....")
except Exception as e:
    print(e)

try:
    service.users.create(
        username="test123",
        password="test321",
        roles="admin"
    )
except Exception as e:
    print(f"and as it goeeees {e}")

TimeoutError: [Errno 60] Operation timed out

Suggestions or guidance would be awesome, if any. Sincerely, Haydar
We have upgraded Splunk Enterprise to 8.1 and Alert Manager to 3.1.11. After the upgrade, alerts are not getting auto-assigned and keep sitting in "new" status, and hence are not getting processed. Any leads on this problem will be appreciated. Thanks
We use Splunk Cloud and one on-premises HF. Using Splunk_TA_juniper in Splunk Cloud, we get Juniper logs as syslog. What do I need to do to get field extraction working?
Hello. I am trying to test an app version before updating it in the test environment, but I receive an error after running this command: ./splunk apply shcluster-bundle -target https://vxxxxxxx:8089

Error:

WARNING: Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details. Error while creating deployable apps: Error copying src="/opt/splunk/etc/shcluster/apps/Splunk_TA_snow" to staging area="/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow". 9 errors occurred. Description for first 9: [{operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/metadata/local.meta", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/metadata/local.meta"}, {operation:"transfering contents from source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/metadata", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/metadata"}, {operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/local/passwords.conf", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/local/passwords.conf"}, {operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/local/splunk_ta_snow_settings.conf", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/local/splunk_ta_snow_settings.conf"}, {operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/local/splunk_ta_snow_account.conf", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/local/splunk_ta_snow_account.conf"}, {operation:"transfering contents from source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/local", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/local"}, {operation:"copying source to destination", error:"Permission denied", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/lookups/snow_cmdb_ci_list.csv", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/lookups/snow_cmdb_ci_list.csv"}, {operation:"transfering contents from source to destination", error:"No such file or directory", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow/lookups", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow/lookups"}, {operation:"transfering contents from source to destination", error:"No such file or directory", src:"/opt/splunk/etc/shcluster/apps/Splunk_TA_snow", dest:"/opt/splunk/var/run/splunk/deploy.618738529d3d8db4.tmp/apps/Splunk_TA_snow"}]

It looks like permission-denied errors. I tried with both the root user and the splunk user; the error stays the same.
Hello, I am looking for some guidance on licensing, please. AppD has two licensing models for Commercial SaaS: ABL and IBL.
1) Is it possible to convert a customer's licensing from ABL to IBL?
2) If this is possible, would the controller(s) need to be re-configured? And would this effectively mean starting over in terms of application mapping, health alerts, dashboards, and non-out-of-the-box instrumentation? The License Entitlements and Restrictions page does not cover this: License Entitlements and Restrictions (appdynamics.com)
3) Both ABL and IBL licensing models are orderable via Cisco Commerce. However, it appears that for Cisco Enterprise Agreements 3.0, only IBL licensing (Enterprise and Premium tiers) is covered, and not ABL (Peak, Pro, Advanced). Does this mean that you can only move a customer to an EA if they are licensed for IBL?
Appreciate your input. Thanks
I have installed the Splunk forwarder on a Windows server. I would like to configure an alert so that every time the disk is getting full, an email is sent to my email address; the same if the server is turned off or not detected on the network. Could you please help me? I checked the forum for related info:

sourcetype="WMI:FreeDiskSpace" PercentFreeSpace<10

What is a metrics index meant for? And which Splunk portal do I use to set up my alert?

Splunk search and explanation:

| mstats avg(LogicalDisk.%_Free_Space) AS "win_storage_free" WHERE index="<name of your metrics index>" host="<names of the hosts you want to check>" instance="<names of drives you want to check)>" instance!="_Total" BY host, instance span=1m
Search the metrics index(es) where perfmon disk space data is being collected and filter down to the desired host(s) to check.

| eval storage_used_percent=round(100-win_storage_free,2)
Convert percent storage free to percent storage used for readability.

| eval host_dev=printf("%s:%s\\",host,instance)
Create a new field that combines the host and disk drive.

| timechart max(storage_used_percent) AS storage_used_percent BY host_dev
Plot the storage used for each host and disk over time.

References: Windows disk drive utilization nearing capacity - Splunk Lantern; Monitor data through Windows Management Instrumentation (WMI) - Splunk Documentation