All Topics


Hi All,
We are using the Splunk Add-on for Box to get Box logs. With this Add-on, it appears that only [Events, Users, Folders, Groups] of the Box API endpoints are available. Furthermore, it seems that only "Standard" columns can be retrieved for each API (API Reference - Box Developer Documentation).
1. Is there any way to get logs from other endpoints like COLLABORATIONS or DOWNLOADS?
2. Is it possible to retrieve all available columns, for example FILES (FULL)?
Best Regards,
Hi Guys,
We have 1 indexer and 1 search head in 2 different datacenter locations (let's say DC-A and DC-B). Since DC-A is being decommissioned, we have been directed to copy the indexed data from the indexer in DC-A to the indexer in DC-B.
Now, the indexer in DC-B has enough SAN storage to hold the indexed data from both datacenters, but we want to move/store the data in such a way that the SH in DC-B is not able to search the data from DC-A. So basically, I am looking at how to store data on the indexer but make it non-searchable.
Any ideas on how best to proceed with this? Appreciate the help!
Thanks,
Neerav Mathur
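One hedged approach, assuming the DC-A buckets can be copied into a dedicated index on the DC-B indexer: create a separate index for the migrated buckets and leave it out of the search-allowed indexes for the roles your DC-B users hold, so the data is stored but not searchable by them. The index name dca_archive and role name below are hypothetical:

# indexes.conf on the DC-B indexer (sketch; copy the DC-A buckets into these paths)
[dca_archive]
homePath   = $SPLUNK_DB/dca_archive/db
coldPath   = $SPLUNK_DB/dca_archive/colddb
thawedPath = $SPLUNK_DB/dca_archive/thaweddb

# authorize.conf on the search head: do not include dca_archive in
# srchIndexesAllowed for any role that should not see the DC-A data
[role_dcb_user]
srchIndexesAllowed = main

The data stays on disk and can be re-opened later by granting a role access to that index.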
I would like to find the details of the built-in custom threats as a file (via the CLI, not via the GUI). Can I do that?
I need your help. Could you please help me combine these 3 separate queries into a single query? The index is the same in queries 1 and 2. The source types of all 3 are different. Thank you.

1. index="abc_oracle" source=audit_19c sourcetype="audit"
| eval "Database Modifications:" = "Modification on " + host, "Date and Time" = TIMESTAMP, "Type" = SQL_TEXT, "User" = DB_USER, "Source" = sourcetype
| search "Database Modifications:"="Modification on *" NOT select
| rex field=_raw "SQL_TEXT=\S(?P<Type>\W?......)\s"
| rex field=_raw "DB_USER=(?P<UserName>..........)"
| table "Date and Time", "Database Modifications:", "Type", "User", "Source"

2. index="abc_oracle" source=audit_row_19c sourcetype="audit"
| eval "Database Modifications:" = "Modification on " + host, "Date and Time" = TIMESTAMP, "Type" = SQL_TEXT, "User" = DB_USER, "Source" = sourcetype
| search "Database Modifications:"="Modification on *" NOT select
| rex field=_raw "SQL_TEXT=\S(?P<Type>\W?......)\s"
| rex field=_raw "DB_USER=(?P<UserName>..........)"
| table "Date and Time", "Database Modifications:", "Type", "User", "Source"

3. index="abc_11g" source=oracle_11g sourcetype="audit"
| eval "Database Modifications:" = "Modification on " + host, "Date and Time" = TIMESTAMP_qab, "Type" = SQL_TEXT, "User" = DB_USER, "Source" = sourcetype
| search "Database Modifications:"="Modification on *" NOT select
| rex field=_raw "SQL_TEXT=\S(?P<Type>\W?......)\s"
| rex field=_raw "DB_USER=(?P<UserName>..........)"
| table "Date and Time", "Database Modifications:", "Type", "User", "Source"

Thank you
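A hedged sketch of one combined search, assuming the three pipelines really differ only in index/source and in the timestamp field name (TIMESTAMP vs. TIMESTAMP_qab), which coalesce() can reconcile:

((index="abc_oracle" (source=audit_19c OR source=audit_row_19c)) OR (index="abc_11g" source=oracle_11g)) sourcetype="audit"
| eval ts=coalesce(TIMESTAMP, TIMESTAMP_qab) ``` use whichever timestamp field this sourcetype carries ```
| eval "Database Modifications:" = "Modification on " + host, "Date and Time" = ts, "Type" = SQL_TEXT, "User" = DB_USER, "Source" = sourcetype
| search "Database Modifications:"="Modification on *" NOT select
| rex field=_raw "SQL_TEXT=\S(?P<Type>\W?......)\s"
| rex field=_raw "DB_USER=(?P<UserName>..........)"
| table "Date and Time", "Database Modifications:", "Type", "User", "Source"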
Hello, I'd like to ask for help on how to write a query that alerts when a user is added to a specific group and then removed from that group within 1 hour. I'm new to Splunk; any help is appreciated.
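A hedged sketch, assuming Windows Security auditing where EventCode 4728 is "member added to a security-enabled global group" and 4729 is "member removed"; the index, group name, and field names (Member_Name, Group_Name) are assumptions that depend on your data source:

index=wineventlog (EventCode=4728 OR EventCode=4729) Group_Name="Your_Specific_Group"
| transaction Member_Name startswith="EventCode=4728" endswith="EventCode=4729" maxspan=1h ``` pair each add with a later remove no more than 1h apart ```
| table _time, Member_Name, Group_Name, duration

Saving this as a scheduled alert (for example, running every hour over the last hour) would then fire whenever such a pair is found.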
I am using Splunk DB Connect v3.7.0, and there seems to be a major security hole. I want to give some users access to some of the connections/identities. I set the permissions for what they can see, and that works. BUT if a user explicitly asks for a connection that they cannot see, they are still allowed to access it?! This cannot be correct?
I am trying to set up a test environment so I can practice the new SPL that I am learning. I am trying to work with botsv1. I have downloaded and installed Splunk Enterprise along with the Splunk App for Stream, TA-Suricata, and the botsv1_data_set.tgz. At this point I should be able to run "index=botsv1", which does run successfully, but it returns zero events. That makes me think I have the app installed but not the data. When I click the link on GitHub to download the botsv1.json.gz file, it opens a new Chrome browser tab rather than downloading the file. The same happens with all the individual JSON files. I know I am just doing it wrong (newbie), but how do I pull the data into Splunk so I can start searching it?
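If you do get the raw botsv1.json file onto disk, a minimal sketch of ingesting it with a monitor stanza; the path and sourcetype here are assumptions, and note that botsv1_data_set.tgz is normally a pre-indexed app that only needs to be extracted into $SPLUNK_HOME/etc/apps followed by a Splunk restart:

# inputs.conf (sketch; adjust the path and sourcetype to your download)
[monitor:///opt/datasets/botsv1.json]
index = botsv1
sourcetype = _json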
I need help with development. I have a requirement to capture the logs of the file paths "care\outbound\prod" and "care\outbound\Test". Both file names are the same; one goes to the Test folder and the other to the Prod folder. As per the initial requirement, I want to capture the test data that is coming to the "care\outbound\Test" path. I need help with the coding part. Code:

index=*** doc_name= ***** "*care*"

I chose "care" as the key point: whatever files pass through the "care" folder get captured. But I need to capture only the files coming to "care\outbound\Test". Please let me know if you need more clarification.
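A hedged sketch, assuming the file path ends up in the source field (backslashes must be doubled in the search string; the index is left as your placeholder):

index=*** source="*\\care\\outbound\\Test\\*" ``` match only files under the Test folder ```

If the path lives in another field instead, the same wildcard pattern can be applied to that field.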
It looks like the Splunk Universal Forwarder service on Linux enables CPU accounting and CPU shares. If this is enabled, another program cannot manually assign scheduling. Does the Splunk service need CPU accounting, and can it be disabled when Splunk starts? We want to determine whether the CPUShares= setting is absolutely necessary for the service, or whether there are workarounds for setting CPU scheduling for the service in the legacy style.
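A hedged sketch using a standard systemd drop-in override; the unit name SplunkForwarder.service and the choice to disable accounting entirely are assumptions, so verify the unit name with systemctl and test on a non-production host first:

# /etc/systemd/system/SplunkForwarder.service.d/override.conf
[Service]
# turn off per-unit CPU accounting for this service
CPUAccounting=no
# an empty assignment resets CPUShares to the systemd default
CPUShares=

After writing the drop-in, run systemctl daemon-reload and restart the service for the override to take effect.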
I am running DB Connect version 3.4.2 on Splunk 8.2.4. I have many Splunk DB Connect output cron jobs scheduled at 6 AM EST, but after the daylight saving time shift the jobs are not running at 6 AM but at 7 AM. The OS date and time have already shifted to daylight saving time. Is anyone else having this issue?
How many Sybase database servers can be onboarded on a single DB Connect server? Is there any limit on the number of stanzas configured in db_inputs.conf for the configuration of new Sybase database servers in the DB Connect app?
I hope this is the right place to post this; if not, please let me know where to post it. There are multiple use cases for Task Scheduler in the SSE app; my question pertains to all that are based on EventID=4698. None of these searches seem to work in my environment out of the box. I checked, and my Windows TA is up to date. Not sure if there is another TA required? Here is one as an example, and how I fixed it:

`wineventlog_security` EventCode=4698
| xmlkv Message
| search Command IN ("*\\users\\public\\*", "*\\programdata\\*", "*\\temp\\*", "*\\Windows\\Tasks\\*", "*\\appdata\\*")
| stats count min(_time) as firstTime max(_time) as lastTime by dest, Task_Name, Command, Author, Enabled, Hidden
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`
| `winevent_scheduled_task_created_within_public_path_filter`

To fix this query I ended up changing line 2 to:

| xmlkv TaskContent

And line 4 to:

| stats count min(_time) as firstTime max(_time) as lastTime by dest, TaskName, Command, Author, Enabled, Hidden, Arguments

I don't know if I am missing something or if this is broken out of the box; if so, is there somewhere to report this?
Hi, I send email data to an HTTP Event Collector in JSON format like this:

{ "sender-domain":"domain.com", "sender":"sender.test@domain.com", "recipient":"Name1 Surname1<name1.surname1@domain.com>, "Name2 Surname2<name2.surname2@domain.com>" }

I would like to extract the email addresses from the recipient field and save them as a multivalue field with the same name (the recipient field will be used in the Email data model). Do you have any idea how I can do this? The only idea I have is to use SEDCMD to rename recipient to another field name, and then use a regex to extract the email addresses from that field into a recipient field. The regex is:

SOURCE_KEY = changed_recipient_field_name
REGEX = (?<recipient>[\w\d\.\-\=\+]+\@[\w\d\.\-]+)
FORMAT = recipient::$1

What is the best solution for this? Thank you in advance.
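A hedged sketch using the MV_ADD setting in transforms.conf, which appends each regex match as an additional value of the target field, so the SEDCMD rename step may not be needed; the sourcetype name below is a placeholder for whatever your HEC input uses:

# transforms.conf
[recipient_mv]
SOURCE_KEY = recipient
REGEX = ([\w\.\-=\+]+@[\w\.\-]+)
FORMAT = recipient::$1
# MV_ADD turns each match into another value of the multivalue field
MV_ADD = true

# props.conf
[your_hec_sourcetype]
REPORT-recipient_mv = recipient_mv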
I am facing the following challenge. I have a lookup table myids.csv with IDs in it:

ID
1
2
3

I have an index also with IDs in it (fewer than in the lookup):

ID
1
2

I am looking for a way to show only the IDs from the lookup that are not present in the index:

ID
3

Any suggestions?
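A hedged sketch using a NOT subsearch; the index name is a placeholder, and the default subsearch limits (10,000 results) matter if the index holds many distinct IDs:

| inputlookup myids.csv
| search NOT [ search index=your_index | stats count by ID | fields ID ] ``` keep only lookup IDs the index never reported ```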
We are working with several remote datasets that are combined to give our end user a specific result. Federated Search gives us an LDAP dn, which we are trying to use to pull enriching information from another remote source via a REST API. The following search works:

index=federated:remote_dataset userid="cn="
| eval dn=lower(userid)
| dedup dn
| table dn

The idea is to use a scheduled search to populate a CSV with a list of DNs at the top of every hour, then use a cron job to spawn a Python script which generates a new CSV containing the DN and the enriching data from the REST API source. Our Python script is working; however, when we add "| outputlookup dn.csv append=true" to the otherwise functional SPL, we get nothing. This fails:

index=federated:remote_dataset userid="cn="
| eval dn=lower(userid)
| dedup dn
| table dn
| outputlookup dn.csv append=true

Is this a limitation of Federated Search? Thank you
I am trying to configure controller settings in a C++ application in a Unix environment. I see these configuration settings in the example:

const char APP_NAME[] = "SampleC";
const char TIER_NAME[] = "SampleCTier1";
const char NODE_NAME[] = "SampleCNode1";
const char CONTROLLER_HOST[] = "controller.somehost.com";
const int CONTROLLER_PORT = 8080;
const char CONTROLLER_ACCOUNT[] = "customer1";
const char CONTROLLER_ACCESS_KEY[] = "MyAccessKey";
const int CONTROLLER_USE_SSL = 0;

Is a node name necessary to have in the configuration? And where can I find the controller account and controller access key?
Hi, I am facing an issue while installing Splunk Enterprise Security on my Windows 10 system. It shows an error: "Splunk Enterprise Setup Wizard ended prematurely". Please tell me the solution for this.
Hi All,
I have the below query which gives the columns: Name, Count, Percentage, ControlID

| spath evaluation_results
| search gear_name
| spath input=evaluation_results
| foreach *.compliant
    [| eval Compliance=if('<<FIELD>>'="Compliant",if(isnull(Compliance),"<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlVersion',mvappend(Compliance,"<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlVersion')),Compliance)
    | eval NonCompliance=if('<<FIELD>>'="Compliant",NonCompliance,if(isnull(NonCompliance),"<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlID',mvappend(NonCompliance,"<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlVersion')))]
| top 50 NonCompliance
| eval controlVersion=mvindex(split(NonCompliance,":"),1)
| eval NonCompliance=mvindex(split(NonCompliance,":"),0)
| rename NonCompliance as "Name"

I have modified the above query to add a new column which shows RiskRating:

| spath evaluation_results
| search gear_name
| spath input=evaluation_results
| foreach *.compliant
    [| eval Compliance=if('<<FIELD>>'="Compliant",if(isnull(Compliance),"<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlVersion',"<<MATCHSEG2>>".":".'<<MATCHSEG2>>.riskRating',mvappend(Compliance,"<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlVersion',mvappend(Compliance,"<<MATCHSEG2>>".":".'<<MATCHSEG2>>.riskRating')),Compliance)
    | eval NonCompliance=if('<<FIELD>>'="Compliant",NonCompliance,if(isnull(NonCompliance),"<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlID',"<<MATCHSEG2>>".":".'<<MATCHSEG2>>.riskRating',mvappend(NonCompliance,"<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlVersion',"<<MATCHSEG2>>".":".'<<MATCHSEG2>>.riskRating')))]
| top 50 NonCompliance
| eval controlVersion=mvindex(split(NonCompliance,":"),1)
| eval riskRating=mvindex(split(NonCompliance,":"),2)
| eval NonCompliance=mvindex(split(NonCompliance,":"),0)
| rename NonCompliance as "Name"

Unfortunately the query doesn't execute. Can someone please help with where I got it wrong and what needs to be modified? The output columns should show as below:
Name, Count, Percentage, ControlID, Risk Rating

Below is my raw source:

| makeresults
| eval _raw="{\"job_id\": \"abc123\", \"gear_event_id\": \"aaaa\", \"event_id\": \"7cf6-4ff0\", \"execution_start\": \"2021-10-06 13:29:31.143\", \"execution_end\": \"2021-10-06 13:29:50.104\", \"gear_version\": \"3.0.16\",\"gear_name\": \"sns\", \"resource_type\": [\"sns_topic\"], \"event_status\": \"SUCCESS\", \"compliance_result\": \"Compliant\", \"evaluation_results\": {\"Tags\": {\"compliant\": \"Compliant\", \"controlVersion\": \"1.0\", \"evaluationDetails\": \"\", \"riskRating\": \"Low\"}, \"Tags\": {\"compliant\": \"Compliant\", \"controlVersion\": \"1.0\", \"evaluationDetails\": \"not approved\", \"riskRating\": \"Low\"}, \"correlation_id\": \"4362-47fb\", \"service\": \"biosevent\", \"timestamp\": \"2021-10-06 13:29:31.143\", \"version\": \"3.0.16\", \"duration\": 18.961}}"
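A hedged sketch of a corrected version. Two likely causes of the failure: the foreach template *.compliant contains only one wildcard, so <<MATCHSEG2>> is never defined, and the extra comma-separated arguments push if() past its three-argument signature. Building the colon-delimited entry once per iteration keeps each if() well-formed; the riskRating path comes from the sample raw event, so adjust it to your data:

| spath evaluation_results
| search gear_name
| spath input=evaluation_results
| foreach *.compliant
    [| eval entry="<<MATCHSEG1>>".":".'<<MATCHSEG1>>.controlVersion'.":".'<<MATCHSEG1>>.riskRating'
    | eval Compliance=if('<<FIELD>>'="Compliant", if(isnull(Compliance), entry, mvappend(Compliance, entry)), Compliance)
    | eval NonCompliance=if('<<FIELD>>'="Compliant", NonCompliance, if(isnull(NonCompliance), entry, mvappend(NonCompliance, entry)))]
| fields - entry ``` drop the scratch field used inside the loop ```
| top 50 NonCompliance
| eval controlVersion=mvindex(split(NonCompliance,":"),1)
| eval riskRating=mvindex(split(NonCompliance,":"),2)
| eval NonCompliance=mvindex(split(NonCompliance,":"),0)
| rename NonCompliance as "Name"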
Hi All, I have logs as below to check certificate validity:

Valid from: Tue Jul 13 02:51:21 EDT 2021 until: Thu Jul 13 02:51:21 EDT 2023

I have extracted the from_date and until_date by using the below query:

..... | rex field=_raw "from\:\s(?P<Valid_From>\w+\s\w+\s(\s{0,1})\d+\s\d+\:\d+\:\d+\s\w+\s\d+)\s"
| rex field=_raw "until\:\s(?P<Valid_Until>\w+\s\w+\s(\s{0,1})\d+\s\d+\:\d+\:\d+\s\w+\s\d+)"

Now I want to get the number of days between these two dates to get the certificate validity. Please help me create a query to get the desired output.
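A hedged sketch: parse both extracted strings with strptime() and divide the difference by 86400 seconds per day. The format string matches the sample "Tue Jul 13 02:51:21 EDT 2021"; handling of named time zones via %Z can vary by platform, so verify on a few events:

..... | eval from_epoch=strptime(Valid_From, "%a %b %d %H:%M:%S %Z %Y")
| eval until_epoch=strptime(Valid_Until, "%a %b %d %H:%M:%S %Z %Y")
| eval validity_days=round((until_epoch - from_epoch) / 86400) ``` 86400 seconds per day ```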
I have added the latest version of Splunk_TA_windows to my environment using a deployment server. The app has been pushed to all Windows machines, the search heads, and the heavy forwarders. I have only been receiving data into the "main" index and have been unsuccessful at redirecting the data to our preferred collection point, index=wineventlog. On the deployment server I have created a Splunk_TA_windows/local/inputs.conf file containing the following:

[WinEventLog://ForwardedEvents]
index = wineventlog
disabled = 0

[WinEventLog://Application]
index = wineventlog
disabled = 0

[WinEventLog://System]
index = wineventlog
disabled = 0

[XmlWinEventLog]
index = wineventlog

[WinEventLog]
index = wineventlog

I am primarily a Linux guy for Splunk admin and only have 1 Windows host monitored at the moment (all Windows events are forwarded to and collected from this node). Is there something that needs to be done differently to redirect the index for this application? The next consideration I have is using props/transforms to change the index, although I am worried about the hardware impact of that on 5 million events a day.
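One hedged observation: I am not certain the bare [WinEventLog] and [XmlWinEventLog] stanzas cascade index= onto the channel-specific stanzas, so the usual pattern is to set index explicitly in a local stanza whose name exactly matches each stanza the TA enables in Splunk_TA_windows/default/inputs.conf. A sketch, assuming the Security channel is among them (mirror whatever stanza names your TA version actually uses, including any renderXml variants):

# Splunk_TA_windows/local/inputs.conf (sketch; one stanza per enabled channel)
[WinEventLog://Security]
index = wineventlog
disabled = 0

Since all events funnel through one collecting node, confirming that this node actually received the updated app (and was restarted) is also worth checking before reaching for props/transforms.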