Dear Support,

I have downloaded the Splunk Add-on for Sysmon. I am also using the Sysmon App for Splunk, which requires the former. My Sysmon data is stored in an index named os_sysmon. Some dashboards of the Sysmon App for Splunk show up empty because they rely on a field named EventDescription.

I checked the deployment of the Splunk Add-on for Sysmon and, under its lookups folder, found a file named microsoft_sysmon_eventcode.csv, just as the doc "Lookups for the Splunk Add-on for Sysmon" says. The file is populated with 28 entries and has two fields: EventCode and EventDescription. When I search my index (index=os_sysmon) I do get the field EventCode, but not EventDescription. (The same goes for the lookup file microsoft_sysmon_record_type.csv: I do have record_type but not record_type_name.)

Now, the Sysmon App for Splunk has only one macro, named sysmon, with an original sourcetype=....., which I changed to index=sysmon. There is no attempt to derive an EventDescription field from EventCode via the lookup file. It seems strange that the developers of the Sysmon App for Splunk forgot to derive the EventDescription field they use from EventCode (via the lookup) in their only macro. Should I do it myself there, or is it something to fix in the Splunk Add-on for Sysmon, and how?

Best regards, Altin
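One way to check whether the lookup itself works is to apply it explicitly; the add-on's automatic lookup may simply be scoped to the original sourcetype rather than to a custom index. A sketch, assuming the lookup definitions are named after the CSV files (microsoft_sysmon_eventcode and microsoft_sysmon_record_type):

```
index=os_sysmon
| lookup microsoft_sysmon_eventcode EventCode OUTPUT EventDescription
| lookup microsoft_sysmon_record_type record_type OUTPUT record_type_name
```

If this populates EventDescription, the same lookup lines could be added inside the app's sysmon macro as a workaround.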
I have found a search in the chargeback application that might fit for seeing the SVCs by index. Unfortunately that's how my company manages costs: by index. The search is good, but I'm still having issues getting just the SVCs and index as my return. I did modify it from one day to one month, but I only want it to bring back one month and thus have only one line of results. Any help would be appreciated.

index=summary source="splunk-ingestion"
| `sim_filter_stack(myimplementation)`
| dedup keepempty=t _time idx st
| stats sum(ingestion_gb) as ingestion_gb by _time idx
| eventstats sum(ingestion_gb) as total_gb by _time
| eval pct=ingestion_gb/total_gb
| bin _time span=1m
| join _time
    [ search index=summary source="splunk-svc-consumer" svc_consumer="data services" svc_usage=*
    | fillnull value="" svc_consumer process_type search_provenances search_type search_app search_label search_user unified_sid search_modes labels search_head_names usage_source
    | eval unified_sid=if(unified_sid="",usage_source,unified_sid)
    | stats max(svc_usage) as utilized_svc by _time svc_consumer search_type search_app search_label search_user search_head_names unified_sid process_type
    | timechart span=1m sum(utilized_svc) as svc_usage ]
| eval svc_usage=svc_usage*pct
| timechart useother=false span=1m sum(svc_usage) by idx limit=200
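If the goal is a single monthly row per index, one option (a sketch, not tested against this data) is to widen every span from 1m to 1mon and restrict the time range, applying the same change inside the subsearch's timechart so the join keys still line up:

```
index=summary source="splunk-ingestion" earliest=-1mon@mon latest=@mon
...
| bin _time span=1mon
| join _time
    [ ...
    | timechart span=1mon sum(utilized_svc) as svc_usage ]
| eval svc_usage=svc_usage*pct
| timechart useother=false span=1mon sum(svc_usage) by idx limit=200
```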
Hi, what's the best way to run a lookup only on some of the results of the main search? I want to run it only when two fields don't match. Pseudocode would be:

if field1 != field2 then | lookup accounts department as field2 OUTPUT

So, like the if/then statement most programming languages allow. Thanks, Lee
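SPL lookups run on every result, so instead of a conditional branch the usual pattern is to run the lookup unconditionally and then null out its output where the condition fails. A sketch, assuming the lookup returns a field named department_name:

```
| lookup accounts department AS field2 OUTPUT department_name
| eval department_name=if(field1!=field2, department_name, null())
```

The lookup cost is the same, but the output field is only kept on the rows where field1 and field2 differ.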
Hi, a quick summary of our deployment:
- Splunk standalone 9.0.6
- Palo Alto Add-on and App freshly installed (8.1.0)
- SC4S v3.4.4 sending logs to Splunk
- PA logs ingested into indexes and sourcetypes according to the official SC4S doc https://splunk.github.io/splunk-connect-for-syslog/main/sources/vendor/PaloaltoNetworks/panos/
- I see events in all indexes and with all sourcetypes. Indexes: netfw, netproxy, netauth, netops. Sourcetypes: pan:traffic, pan:threat, pan:userid, pan:system, pan:globalprotect, pan:config

What else do I need to do to make the official Palo Alto App work? I checked the documentation https://pan.dev/splunk/docs/installation/ and enabled the data acceleration, and still no data is shown in any dashboard. I don't know what else is missing; any suggestion? Thanks a lot
If I have a multisite architecture with site A and site B, can they live on different cloud environments and still have index replication? For example, if I have site A components on Azure but site B is on AWS, can I still utilize index clustering across the two sites for replication?
It's already October. When can we receive the result of the exam? The exam is also no longer in beta. Can we get any specifics? @cert-team-admin   
My query returns many events; each event is a JSON object, i.e. { "key1": "val1", "key2": "val2" }. I would like to convert all events into one event that contains all the original events, using the sha256 of each original event as its key, so the new JSON will look like:

{
  sha256a: { "key1": "val1", "key2": "val2" },
  sha256b: { "key1": "val1a", "key2": "val2a" }
}

where sha256a comes from | eval sha256a=sha256({ "key1": "val1", "key2":"val2"})
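One possible approach (a sketch; the index name is a placeholder) is to build each key/value pair as a string per event, then collapse everything with stats and join the pairs into a single JSON string:

```
index=my_index
| eval hash=sha256(_raw)
| eval pair="\"".hash."\": "._raw
| stats values(pair) AS pairs
| eval json="{".mvjoin(pairs, ", ")."}"
| table json
```

Note this builds the JSON as text; it assumes each _raw event is itself valid JSON so the concatenated result stays well-formed.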
I have an alert, but I want to suppress it during holidays. How can I do that?
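One common pattern (a sketch; the lookup name holidays and its date field are assumptions — you would maintain that CSV of holiday dates yourself) is to make the alert search return no results on holiday dates:

```
<your alert search>
| eval today=strftime(now(), "%Y-%m-%d")
| lookup holidays date AS today OUTPUT date AS holiday_match
| where isnull(holiday_match)
```

When today matches a row in the lookup, every result is filtered out and the alert does not fire.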
Hello everyone, I am trying to enable some basic detections that I found in the Splunk Security Essentials app. We do have ES; however, we are still in the process of getting all of our data CIM compliant. Do alerts from the Splunk Security Essentials app need to be mapped to ES using the "Add mapping" option, or do these basic alerts have an equivalent in the ES Content Management use cases tab?
Hi, while using Splunk SOAR we have several apps for several integrations with Azure/Graph. Examples of such apps are Microsoft 365 Defender, MS Graph for SharePoint, etc. However, most of these apps have limited functionality (i.e. they do not have an action for every possible API that can be used). Hence, in order to use other APIs (not available through the standard apps) we thought to configure the HTTP app with Graph (where we already have an app registration and several permissions, done via Azure). However, when we configure the client_id and the secret_id along with the other parameters, we receive the following answer from the app: [screenshot not shown]. This is the asset configuration: [screenshot not shown]. Does anyone know what's wrong with my configuration? Did anyone get it to work? Thank you in advance!
Hi, I have submitted my app to the Splunk Cloud Platform vetting process and it has been in "Pending" status for more than 3 weeks. Is this timespan normal? Is there any way to contact the team and check for a more specific status? Thanks
I have data like provided below:

field A | Field B | Field C | Field D
abc.com | 1 | 1 | AB CD 1 1
xyz.com | 2 | 2 | AB CD 1 1
abc.com | 1 | 1 | AB CD 1 1
xyz.com | 2 | 2 | AB CD 1 1
def.com | 1 | AB CD | 0

I want to group field A values such that all abc.com values come in one row with an associated count. I want output like:

field A | count | Field B | Field C | Field D
abc.com | 2 | 1 | 1 | AB CD 1 1
xyz.com | 2 | 2 | 2 | AB CD 1 1
def.com | 1 | 1 | AB CD | 0

If I take the path of stats count then it splits fields C and D, which I don't want; I want them to be uniquely compared as a group value. Looking for suggestions. Thanks in advance.
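If Field C and Field D should act as part of the group key rather than be aggregated, one sketch (field names are guesses from the table above) is to include them in the by clause, so rows only merge when all fields match:

```
index=my_index
| stats count BY fieldA fieldB fieldC fieldD
| table fieldA count fieldB fieldC fieldD
```

Because C and D are grouping keys here, their values stay together as a unit instead of being split or multivalued.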
Hi, I am sending logs to another product, without indexing them in Splunk, by using the SYSLOG_ROUTING DEST_KEY in the transforms.conf file. Looking at the documentation of "How Splunk licensing works", it says: "When ingesting event data, the measured data volume is based on the raw data that is placed into the indexing pipeline." By looking at the Monitoring Console I realized that the indexing pipeline is made up of syslog out, tcp out and indexer lines, so it seems that by using the syslog_routing DEST_KEY I could also consume Splunk license. Can you confirm this? Kind regards, Angelo
Hey all, I've configured tcp-ssl on a HF, created certificates, and the following configuration. The HF receives syslog from a third party; I'll send the third-party company the CA (combined certificate) I created based on these docs:
1. How to create and sign your own TLS certificates
2. Create a single combined certificate file

inputs.conf
[tcp-ssl://2222]
index = test
sourcetype = st_test

[SSL]
serverCert = C:\Program Files\Splunk\etc\auth\mycerts\myServerCertificate.pem
sslPassword = <Server.key password>
sslRootCAPath = C:\Program Files\Splunk\etc\auth\mycerts\myCertAuthCertificate.pem

server.conf
[sslConfig]
sslPassword = <encrypted password that I didn't configure>

And yet Splunk isn't listening on the requested port, for example 2222. What am I missing? The error I get in Splunk _internal is:

SSL context not found. Will not open raw (SSL) IPv4 port 2222

Please assist, and thank you!!!
The app "Splunk App for Fraud Analytics" says that we "can download and install test data from here. Please consider that using test data can use up to 7 GB and will take 10-30 minutes for the test data to initialize correctly". But I did not find any test data attached.
Hi all, I successfully forward data from Windows using the command

msiexec.exe /i splunkuniversalforwarder_x86.msi RECEIVING_INDEXER="indexer1:9997" WINEVENTLOG_SEC_ENABLE=1 WINEVENTLOG_SYS_ENABLE=1 AGREETOLICENSE=Yes /quiet

from "Install a Windows universal forwarder". The same for Linux with the command ./splunk add monitor /var/log from "Configure the universal forwarder using configuration files". Both work fine and I can see the hosts in the Data Summary, as visible in the following figure. [Figure: Data Summary]

If I instead set up the input in the local inputs.conf file after a basic installation, like

[perfmon://LocalPhysicalDisk]
interval = 10
object = PhysicalDisk
counters = Disk Bytes/sec; % Disk Read Time; % Disk Write Time; % Disk Time
instances = *
disabled = 0
index = winfwtestinger

for example, and assign a specific index, I can see that data is ingested if I search for the specific index, but it will not appear in the Data Summary. I would be very happy about any suggestion as to what I am doing wrong here.

Best regards
Hello, when I run a search I get the message "could not load lookup" with different lookup names. For example:

Could not load lookup=LOOKUP-Kerberosfailurecode
Could not load lookup=LOOKUP-Kerberosresultcode
Could not load lookup=LOOKUP-syscall

I had a look in the lookup definitions menu and I can see that some lookups are referenced in my Splunk apps even though I don't use these lookups in my apps! But can I change the name of the apps? Is it possible to change it? Moreover, some lookups like "syscall" don't exist in my lookup definitions menu, so how do I solve this issue, please?
Let's say I have a table of two fields, and some of the cells are empty. How do I find the number of empty cells using "addcoltotals"?
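Since addcoltotals sums numeric columns, one approach (a sketch; field1 and field2 are placeholder names) is to first turn "empty" into a 1/0 flag per cell, then total the flags:

```
| eval empty1=if(isnull(field1) OR field1="", 1, 0)
| eval empty2=if(isnull(field2) OR field2="", 1, 0)
| addcoltotals empty1 empty2 label="empty cells" labelfield=field1
```

The added last row then holds the number of empty cells per column; summing empty1 and empty2 on that row gives the overall count.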
Hello friends! I get JSON like this

{"key":"27.09.2023","value_sum":35476232.82,"value_cnt":2338}

and so on...

{
   key: 29.09.2023
   value_cnt: 2736
   value_sum: 51150570.59
}

and the raw event looks like this:

10/4/23 1:23:03.000 PM  {"key":"27.09.2023","value_sum":35476232.82,"value_cnt":2338}
host = app-damu.hcb.kz | source = /opt/splunkforwarder/etc/apps/XXX/pays_7d.sh | sourcetype = damu_pays_7d

And I want to get a table like this:

days | sum | cnt
27.09.2023 | 35476232.82 | 2338
29.09.2023 | 51150570.59 | 2736

So I have to get the latest events and put them into a table. Please help.
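Since each event is self-describing JSON, one sketch (the index name is a placeholder; spath is used in case automatic KV extraction does not already give you key, value_sum and value_cnt) is to keep only the latest values per key:

```
index=my_index sourcetype=damu_pays_7d
| spath
| stats latest(value_sum) AS sum latest(value_cnt) AS cnt BY key
| rename key AS days
| table days sum cnt
```

stats latest(...) BY key keeps one row per date while preferring the most recent event for each, which matches "get the latest events and put them into a table".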