All Topics



I have a field [user] in one sourcetype [sourcetype=A1] and a field [ap_user] in a different sourcetype [sourcetype=B1]. I want to create a real-time alert that determines whether the user names in [user] and [ap_user] are the same. Using a subsearch with append or join in a real-time search did not work. If there is a way to solve this, please let me know.

sourcetype="A1"
| fields user
| join [ search sourcetype="B1" | fields ap_user ]
| table user, ap_user
| eval match=if(user==ap_user, "〇", "×")
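One approach that does work in real time (a sketch, assuming the sourcetype and field names from the post) is to drop the subsearch entirely and correlate the two sourcetypes with stats, since append and join subsearches are not supported in real-time searches:

(sourcetype="A1" OR sourcetype="B1")
| eval username=coalesce(user, ap_user)
| stats dc(sourcetype) AS st_count values(sourcetype) AS sourcetypes BY username
| eval match=if(st_count==2, "〇", "×")

A username that appears in both sourcetypes gets st_count=2, which replaces the pairwise user==ap_user comparison.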
Hi, I'm currently trying to connect Splunk with Zscaler NSS cloud, which are on different networks. I entered the public IP of my firewall and opened port 8089 in the Zscaler admin portal, and an error popped up saying: Test Connectivity failed: SSL Certs missing for SIEM Host (0). Where do I get/make this cert, and where do I upload it? Thanks!
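I can't speak for Zscaler's exact requirements, but if the NSS feed just needs a certificate for the SIEM host and a self-signed one is acceptable in your environment, a common way to generate the pair is openssl (hypothetical file names; the upload location should be in the NSS feed's SIEM configuration):

# Generate a self-signed certificate and key, valid for one year
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout siem_host.key -out siem_host.pem \
  -subj "/CN=<public IP or DNS name of your firewall>"

The Splunk port that Zscaler connects to would then need to present this certificate; whether Zscaler accepts self-signed certificates is an assumption to verify against their docs.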
Hi everyone, I am trying to create an investigation in ES using SPL. Since ES mostly works with lookups/the KV store, I tried to run the following SPL:

| makeresults
| eval class_name="investigation", collaborators="[{\"name\": \"AAAAAA\", \"write\": true}, {\"name\": \"BBBBBB\", \"write\": true}]", create_time=1668731443, creator="CCCCCC", description="DDDDDDD", mod_time=1668731608, status="[{\"name\": \"In Progress\", \"time\": 1668739809, \"id\": \"investigation:2\"}]", title="EEEEEEE", version=1, comments="[]", tags="[]"
| table class_name, collaborators, create_time, creator, description, mod_time, status, title, version, comments, tags
| outputlookup append=true investigation

I am able to add an entry to the KV store, but when I load the Investigations tab in ES, it breaks with the error "Expect an array" and the page fails to load. Has anyone done this before? Is this the right way, or is there another way to use SPL to create an investigation?
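One possible cause (an assumption based on the error text) is that outputlookup stores collaborators, status, comments, and tags as JSON-encoded strings, while the Investigations UI expects real JSON arrays in the KV store. Writing the record through the KV store REST API preserves the types; a sketch with placeholder credentials, leaving the app that owns the collection for you to fill in for your ES version:

curl -k -u admin:changeme \
  https://localhost:8089/servicesNS/nobody/<es_app_owning_collection>/storage/collections/data/investigation \
  -H "Content-Type: application/json" \
  -d '{"title": "EEEEEEE", "description": "DDDDDDD", "creator": "CCCCCC", "create_time": 1668731443, "mod_time": 1668731608, "version": 1, "collaborators": [{"name": "AAAAAA", "write": true}, {"name": "BBBBBB", "write": true}], "status": [{"name": "In Progress", "time": 1668739809, "id": "investigation:2"}], "comments": [], "tags": []}'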
Hi, I'm trying to get audit logs from GitHub Cloud into a Splunk instance that has limited network access. The problem is that the GitHub IP that sends the data to Splunk changes often. Instead of granting access to each changed IP, which takes some time to get approved, I'd like to install another Splunk instance in the DMZ, where there are no network restrictions, and forward the data to the Splunk instance in the restricted network. GitHub needs a Splunk HTTP Event Collector to verify before sending data, so I'm guessing that only a heavy forwarder (a full Splunk instance, to my knowledge, right?) will work. Is this something that can be done? If so, could you please point me to the steps or docs I could reference? Thank you in advance.
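Yes, this is a standard intermediate-forwarder pattern: the heavy forwarder in the DMZ terminates HEC for GitHub and forwards everything to the restricted instance, so the internal firewall only needs to allow one known host. A minimal sketch of the two configs on the DMZ heavy forwarder, with hypothetical port, index, and indexer address:

# inputs.conf: HEC endpoint that GitHub sends to (token is generated in the HEC UI)
[http]
disabled = 0
port = 8088

[http://github_audit]
token = <your-generated-token>
index = github

# outputs.conf: forward onward to the Splunk instance in the restricted network
[tcpout]
defaultGroup = internal

[tcpout:internal]
server = <internal-splunk-host>:9997

The receiving instance needs a matching splunktcp listener on 9997 (Settings > Forwarding and receiving).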
We use a custom app in our Splunk Cloud instance to segregate dashboards and searches from other teams. With the recent update in Splunk 9 that adds dark theme compatibility to the search view in Search & Reporting, I was wondering if there is a way we could enable that in our custom app as well. Currently, while in this custom app, trying to switch the theme to dark gives this error message: "The Theme setting is not supported by your current app context."
I'd like to build a search targeting media transfers and add it to my dashboard. Using the index of the security logs, I'd like to pick up all user-initiated data transfers such as CD burns, USB access, etc. My client requires data-transfer accounts to have a specific suffix such as "-xxx". What's the best search for these requirements?
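A sketch under the assumption that these are Windows Security logs with object-access/removable-storage auditing enabled; the index, sourcetype, and field names below are placeholders to adapt:

index=wineventlog sourcetype="WinEventLog:Security" EventCode=4663 user="*-xxx"
| stats count BY user, Object_Name, Process_Name

EventCode 4663 ("an attempt was made to access an object") covers removable-storage access when that audit policy is on; CD burns may surface under different event codes or application logs, so treat this as a starting point.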
Hi Splunkers, I want to search for a string like abc/efg in my log using a multiselect field. I defined the search value abc/efg directly in the multiselect field with the token name "keyword", and in my query I use $keyword$ to search, but it doesn't work. I also tried abc\/efg, which doesn't work either, yet other normal strings work fine here. Any ideas? Thanks in advance. Kevin
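Slashes are not special characters in SPL, so the likely culprit is how the multiselect renders the token into the query (an unquoted abc/efg can be mis-tokenized). Quoting each selected value and joining them with OR usually fixes this; a SimpleXML sketch with the token name from the post:

<input type="multiselect" token="keyword">
  <label>Keyword</label>
  <choice value="abc/efg">abc/efg</choice>
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter> OR </delimiter>
</input>

With this, selecting abc/efg expands $keyword$ to "abc/efg" inside the search.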
I am trying to add a field to a search using a lookup table. However, my key field is sometimes blank, and I get an error that the lookup table does not exist or is not available.

... search with an output field of user
| lookup userList.csv id as user OUTPUT title

There are a few rows where id is blank, and I do not have permissions to edit the table.
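"Lookup table does not exist or is not available" usually points at sharing/permissions (the lookup file must be shared to the app and role you search from) rather than blank keys, but blank keys can be neutralized without editing the table. A sketch using a substitute key, with the field names from the post:

... search with an output field of user
| eval lookup_key=if(isnull(user) OR user=="", "__none__", user)
| lookup userList.csv id AS lookup_key OUTPUTNEW title
| fields - lookup_key

The "__none__" placeholder is hypothetical; any value that never appears in the id column works.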
I was able to deploy the OTel collector into our AKS cluster to send logs to our Splunk Cloud instance, and I can see application pod logs successfully. I was also able to use the option to add an environment in values.yaml, and after configuring it I can see the environment name in our Splunk instance logs. Now I would like to know if there is a way to add extra attributes, the same way as environment, to allow more filtering when searching in Splunk, since we have multiple AKS clusters. I would like to add, the same way as environment, other values such as aksvalue1, myvalue2, myvalue3, so that those values are also created and usable in Splunk Cloud queries. I noticed that values.yaml has an extra-attributes option, but that is for adding tags from pod apps or namespaces. Is there a native way to achieve this with the same approach as environment? Thank you.
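Recent versions of the splunk-otel-collector Helm chart expose extraAttributes.custom for exactly this: arbitrary name/value resource attributes attached to all telemetry, filterable in searches just like environment. A values.yaml sketch (attribute names and values are placeholders; confirm the key against your chart version's values.yaml):

clusterName: my-aks-cluster
environment: production
extraAttributes:
  custom:
    - name: cluster_group
      value: "aksvalue1"
    - name: team
      value: "myvalue2"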
I am VERY new to Splunk, so please bear with me. I have a search:

index=vulnerability "list of packages installed on the remote" myserver.com
| rex field=output "\n{1,3}\s{2,4}(?<ProgramNameOutput>[^|]+)" max_match=5000
| table ProgramNameOutput

which produces the following fictional output:

libhangul-0.1.0-8.el7
intltool-0.50.2-7.el7
libvert-7.6.1-120.el7
at-spi2-core-2.28.0-1.el7
spice-gtk3-0.35-5.el7_9.1
perl-Digest-MD5-2.52-3.el7
mesa-libvert-18.3.4-12.el7_9
hyperv-daemons-license-0-0.34.20180415git.el7
libsysfs-2.1.0-16.el7
openldap-clients-2.4.44-25.el7_9
libvirt-gconfig-1.0.0-1.el7
libpeas-gtk-1.22.0-1.el7
NetworkManager-adsl-1.18.8-2.el7_9
perl-Locale-Maketext-1.23-3.el7

What I'd like to do is pare ProgramNameOutput down to only the values that contain "libvert". The output field looks similar to this; it's basically one big text block:

"output: Here is the list of packages installed on the remote CentOS Linux system :
libhangul-0.1.0-8.el7|(none) Mon 01 Nov 2021 12:06:47 PM EDT
intltool-0.50.2-7.el7|(none) Mon 01 Nov 2021 12:10:13 PM EDT
gdb-7.6.1-120.el7|(none) Mon 01 Nov 2021 12:06:15 PM EDT
at-spi2-core-2.28.0-1.el7|(none) Mon 01 Nov 2021 12:08:53 PM EDT
spice-gtk3-0.35-5.el7_9.1|(none) Mon 01 Nov 2021 12:40:05 PM EDT
perl-Digest-MD5-2.52-3.el7|(none) Mon 01 Nov 2021 12:05:15 PM EDT
mesa-filesystem-18.3.4-12.el7_9|(none) Mon 01 Nov 2021 12:38:01 PM EDT"
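Because max_match=5000 makes ProgramNameOutput a multivalue field, mvfilter can keep only the values matching a pattern. A sketch built on the posted search:

index=vulnerability "list of packages installed on the remote" myserver.com
| rex field=output "\n{1,3}\s{2,4}(?<ProgramNameOutput>[^|]+)" max_match=5000
| eval ProgramNameOutput=mvfilter(match(ProgramNameOutput, "libvert"))
| table ProgramNameOutput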
The search head cluster captain's /opt/splunk/var/run/file.bundle still contains the csv even though the file was added to [replicationBlacklist] in /opt/splunk/etc/system/local/distsearch.conf. $SPLUNK_HOME/bin/splunk btool distsearch list --debug shows the csv file in the [replicationBlacklist] list, but the csv file is still in the latest bundle on the SH captain. Could this be a bug in Splunk 8.2.4 (build 87e2dda940d1) when the number of entries in [replicationBlacklist] exceeds some threshold? In this case there are entries from blacklist_lookups_1 to blacklist_lookups_79. Thanks in advance for any input.
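Not sure about a hard entry limit, but one way to test that theory (and shrink the config) is to collapse the 79 numbered entries into a few wildcard patterns in distsearch.conf; a hypothetical consolidation, with the path pattern to adjust to where your csvs actually live:

[replicationBlacklist]
all_blacklisted_lookups = apps/*/lookups/blacklisted_*.csv

A new bundle is only built on the next replication cycle, so compare the bundle contents again after the captain pushes an updated bundle.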
Are there currently supported methods for ingesting and monitoring Suricata events in Splunk?
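The common pattern is to have Suricata write EVE JSON and monitor that file with a forwarder; there are also community Suricata TAs on Splunkbase that supply field extractions, so check those for current support. A minimal inputs.conf sketch, with the sourcetype and index names as assumptions:

[monitor:///var/log/suricata/eve.json]
sourcetype = suricata
index = ids
disabled = 0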
Are there any existing parsers for Samba smbd_audit records? Or other ways to collect access to files with Samba?
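I'm not aware of a canonical parser, but if your smb.conf uses the vfs_full_audit module with the common prefix %u|%I|%m|%S, the records are pipe-delimited and a search-time extraction is straightforward. A props.conf sketch, assuming that prefix and a hypothetical sourcetype name (adjust the regex to your actual full_audit:prefix):

[samba:audit]
EXTRACT-smbd_audit = smbd_audit:\s+(?<user>[^|]+)\|(?<src_ip>[^|]+)\|(?<src_host>[^|]+)\|(?<share>[^|]+)\|(?<operation>[^|]+)\|(?<result>[^|]+)\|(?<file_path>.*)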
Hi Splunkers, I want to create a macro that looks inside a lookup file, but in a way that will not break the search if the lookup no longer exists at some point. Is there any Splunk equivalent of, for example, the Linux "test -f filename"?
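There is no direct equivalent of test -f in SPL (inputlookup on a missing file aborts the search, which is exactly the problem), but the REST API can report whether the lookup file exists, and the macro's logic can be built around that. A sketch of the existence check, with a placeholder file name:

| rest /servicesNS/-/-/data/lookup-table-files splunk_server=local
| search title="my_lookup.csv"
| stats count AS lookup_exists

A non-zero lookup_exists confirms the file is present before anything tries to read it.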
I'm trying to finally make my Bareos logs "work" properly. Parsing the fields out of the events is one thing, but I was wondering if there is any good schema for backup-related fields. There is no such data model in CIM, so I can't rely on that, but maybe there is some well-known and widely used TA to take a pattern from?
I see that there is a journald_input app in the Splunk forwarder install, but I can't seem to find any information on how to use it. I ran:

/opt/splunkforwarder/bin/splunk enable app journald_input

but it doesn't appear to be ingesting any entries from the journal.
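Enabling the app only makes the modular input available; collection starts once an inputs.conf stanza references it. A minimal sketch (stanza name and index are assumptions; the journald settings such as filters are documented in the inputs.conf reference):

# e.g. /opt/splunkforwarder/etc/system/local/inputs.conf
[journald://default]
index = linux
disabled = 0

Restart the forwarder afterwards and check for journal entries under that index.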
I want the results to be in the order I list below:

Very High
High
Medium
Low
Very Low
Info
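Assuming these values live in a field called severity (a placeholder name), a common approach is to map them to a numeric rank, sort on it, then drop the helper field:

| eval sort_order=case(severity=="Very High",1, severity=="High",2, severity=="Medium",3, severity=="Low",4, severity=="Very Low",5, severity=="Info",6)
| sort sort_order
| fields - sort_order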
Hi, I use a search that transposes events with a span of 30 minutes. The end of the search is this:

| where _time <= now() AND _time >= now()-14400
| eval time=strftime(_time,"%H:%M")
| sort time
| fields - _time _span _origtime _events
| fillnull value=0
| transpose 0 header_field=time column_name=KPI include_empty=true
| sort + KPI

As you can see, I only display events that exist in a specific time range:

| where _time <= now() AND _time >= now()-14400

It works fine, but only when the time picker choice is "today". I would like to do the same thing with other time picker choices like "last 7 days" or "last 30 days". Could you help, please?
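One way to make the window follow the time picker is addinfo, which exposes the picker's boundaries as info_min_time and info_max_time. A sketch replacing the hard-coded where and the label format (note info_max_time can be "+Infinity" for all-time searches):

| addinfo
| where _time >= info_min_time AND _time <= info_max_time
| eval time=if(info_max_time - info_min_time <= 86400, strftime(_time,"%H:%M"), strftime(_time,"%m/%d %H:%M"))

For ranges longer than a day the label includes the date, so the transpose column headers stay unique and sort correctly.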
As of today, data models like the Network Traffic data model have fields for src, src_ip, dest, and dest_ip, but not src_dns and dest_dns. The way I understand it, DNS names should then be used in the src and dest fields, and IPs in the src_ip and dest_ip fields. Some logs don't have DNS names available in the log itself. However, if you have Splunk ES with a populated asset framework, it will automatically add the fields src_dns and dest_dns to the events if the fields src and dest are already available. If I want the src_dns and dest_dns fields from the events to be added to the src and dest fields in the data model, I would normally solve this by adding a coalesce for src in props.conf for the source type. But since lookups are applied after evals in search-time parsing, this is not possible when src_dns and dest_dns come from a lookup, as is the case with Splunk ES. Therefore I propose the following change to all data models that use the src and dest fields:

Change the eval for src from

if(isnull(src) OR src="","unknown",src)

to

case((isnull(src_dns) OR src_dns="") AND (isnull(src) OR src=""),"unknown",NOT (isnull(src_dns) OR src_dns=""),src_dns,true(),src)

and likewise change the eval for dest from

if(isnull(dest) OR dest="","unknown",dest)

to

case((isnull(dest_dns) OR dest_dns="") AND (isnull(dest) OR dest=""),"unknown",NOT (isnull(dest_dns) OR dest_dns=""),dest_dns,true(),dest)