All Topics


Hello Splunkers, I have created the search below to detect only the local IP intel that is specified manually by the user:

| tstats min(_time) as firstSeen max(_time) as lastSeen count from datamodel="Threat_Intelligence"."Threat_Activity" where Threat_Activity.threat_key=local_ip_intel by Threat_Activity.weight Threat_Activity.threat_match_value Threat_Activity.threat_match_field Threat_Activity.src Threat_Activity.dest Threat_Activity.orig_sourcetype Threat_Activity.threat_collection Threat_Activity.threat_collection_key
| rename Threat_Activity.* as *
| join type=left threat_match_value
    [| inputlookup local_ip_intel.csv
     | rename ip as threat_match_value description as desc
     | fields threat_match_value desc]

My goal is to show a description next to each local IP threat match, to ease the analysis and record why the intel was added in the first place. The search runs and the results do match the local IP intel, but it also returns matches for IPs that are not in the list at all, so no description can be joined for those results. PS: the subsearch on its own does return all the IPs I added manually. I would appreciate anyone showing me where I am going wrong.
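A minimal sketch of an alternative, assuming the CSV from the post (local_ip_intel.csv with ip and description columns) is usable directly as a lookup; a lookup avoids join's subsearch limits, and filtering on the looked-up description drops the matches that are not in the local list:

| tstats min(_time) as firstSeen max(_time) as lastSeen count from datamodel="Threat_Intelligence"."Threat_Activity" where Threat_Activity.threat_key=local_ip_intel by Threat_Activity.threat_match_value Threat_Activity.src Threat_Activity.dest
| rename Threat_Activity.* as *
| lookup local_ip_intel.csv ip AS threat_match_value OUTPUT description AS desc
| where isnotnull(desc)

If extra IPs still appear after this filter, it suggests the threat_key=local_ip_intel constraint matches more than the manually loaded collection, which would be the next place to look.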
Hi, consider this event structure:

{"result" : {"dogs" : [{"name" : "dog-a", "food" : ["pizza", "burger"]}, {"name" : "dog-b", "food" : ["pasta"]}]}}

Now I want to filter the dogs by name and show only the relevant food. When I try this search (with the relevant index):

result.dogs{}.name = dog_a | table result.dogs{}.food{}

I get this result: pizza, burger, pasta. I am expecting to get only dog-a's food (pizza and burger).
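A minimal sketch, assuming the event is valid JSON as shown: because result.dogs{}.food{} is a multivalue field covering every dog in the event, each array element has to be expanded into its own row before filtering:

| spath path=result.dogs{} output=dog
| mvexpand dog
| spath input=dog path=name output=dog_name
| spath input=dog path=food{} output=food
| where dog_name="dog-a"
| table dog_name food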
Hello, I am attempting to create a heat gauge from two timestamp fields to determine the average time an issue was worked. I'm running into issues because these fields are stored as strings in ISO 8601 format. I'd like to know if there's a good way to convert the string as simply as possible, or to extract certain portions of it, so I can use a numeric value in the average calculation (ideally extract the MM:SS from the ISO string). Thanks!
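A minimal sketch, assuming hypothetical field names opened_time and closed_time holding the ISO 8601 strings; strptime() turns them into epoch seconds so the difference can be averaged, and tostring(..., "duration") renders the result as HH:MM:SS (adjust the format string to match the exact timestamps, e.g. add %z or subseconds):

| eval open_epoch = strptime(opened_time, "%Y-%m-%dT%H:%M:%S")
| eval close_epoch = strptime(closed_time, "%Y-%m-%dT%H:%M:%S")
| eval worked_secs = close_epoch - open_epoch
| stats avg(worked_secs) as avg_worked_secs
| eval avg_worked = tostring(round(avg_worked_secs), "duration")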
Hello, I am attempting to make a table and hopefully integrate it into a dashboard. The goal is to report on two fields and pull stats accordingly. FieldA has multiple values, and the table should show all values of FieldA, using stats count for how many daily transactions have been processed by each unique value of FieldA. The portion I am having difficulty with: for the daily count of each unique value of FieldA, I want to break that count down by FieldB, to see how many of that count is a hit for any value of FieldB. This is the search I am using:

table FieldA FieldB | fields "FieldB", "FieldA " | fields "FieldB", "FieldA " | stats count by FieldA, FieldB | sort -"count"

The second count (FieldB hits out of the FieldA instances) always shows up as zero, despite FieldB having values other than zero. FieldB values should all be numeric.
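A minimal sketch of one way to get both numbers on a single row per FieldA value, assuming a FieldB value is only present on the events that count as a hit (the index name is a placeholder); note that table/fields before stats is unnecessary and the trailing space inside "FieldA " would break the field reference:

index=your_index earliest=-1d@d latest=@d
| stats count as total_transactions count(FieldB) as fieldB_hits by FieldA
| sort - total_transactions

count(FieldB) counts only the events where FieldB is non-null, which gives the per-FieldA hit count alongside the overall daily count.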
Hi, I would like to set time ranges from two different types of inputs, a Dropdown and a Time input, as shared tokens for a panel. This is the code I currently have; the Dropdown has 4 options, and the Time input appears only for the last dropdown option. I am stuck on passing the time range from the Time input into the "custom_earliest" and "custom_latest" tokens.

<fieldset submitButton="false">
  <input type="dropdown" token="field1">
    <label>Time Selection</label>
    <choice value="yesterday">Yesterday</choice>
    <choice value="-7d">Last 7 Days</choice>
    <choice value="mtd">Month To Date</choice>
    <choice value="custom">Custom Time</choice>
    <change>
      <condition label="Yesterday">
        <set token="custom_earliest">-7d@d+7h</set>
        <set token="custom_latest">@d+7h</set>
        <unset token="showCustom"></unset>
      </condition>
      <condition label="Last 7 Days">
        <set token="custom_earliest">-4w@d+7h</set>
        <set token="custom_latest">@d+7h</set>
        <unset token="showCustom"></unset>
      </condition>
      <condition label="Month To Date">
        <set token="custom_earliest">-5mon@mon+7h</set>
        <set token="custom_latest">@d+7h</set>
        <unset token="showCustom"></unset>
      </condition>
      <condition label="Custom Time">
        <set token="showCustom">Y</set>
      </condition>
    </change>
    <default>yesterday</default>
    <initialValue>yesterday</initialValue>
  </input>
  <input type="time" token="customTime" depends="$showCustom$">
    <label>Time Range</label>
    <default>
      <earliest>-3d@d+7h</earliest>
      <latest>-2d@d+7h</latest>
    </default>
  </input>
</fieldset>

Any help would be appreciated, thanks!
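A minimal sketch of one way to pass the Time input's range into the shared tokens, assuming Simple XML: a <change> handler on the time input can copy $customTime.earliest$ and $customTime.latest$ into custom_earliest and custom_latest whenever the user picks a custom range:

<input type="time" token="customTime" depends="$showCustom$">
  <label>Time Range</label>
  <default>
    <earliest>-3d@d+7h</earliest>
    <latest>-2d@d+7h</latest>
  </default>
  <change>
    <set token="custom_earliest">$customTime.earliest$</set>
    <set token="custom_latest">$customTime.latest$</set>
  </change>
</input>

The panel search can then keep using $custom_earliest$ and $custom_latest$ regardless of which dropdown option is selected.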
Dear all, I want to combine 2 search jobs into 1 job. My first search job finds all the alert_id values that occurred in the past 24 hours and lists them as a table. The 2nd search job takes each alert_id from the first job and tries to match which alert_id has a "packet filtered" event. I am able to generate the desired result by using map:

index="security_device" sourcetype=security_log "abnormal Protocol" alert_id
| table alert_id
| map search="search index="security_device" sourcetype=security_log "Filter action" $alert_id$" maxsearches=500
| table filter-discard

However, I notice that map is very inefficient; it takes forever if I select 30 days. Can anyone recommend a better way to do it? FYI, I have tried a nested search, but no luck, it returns 0 results:

index="security_device" sourcetype=security_log "Filter action" [ search index="security_device" sourcetype=security_log "abnormal Protocol" alert_id | table alert_id ]
| table filter-discard

Thank you.
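A minimal sketch of a single-pass alternative, assuming both kinds of events carry an extracted alert_id field: pull both event types in one search and let stats correlate them by alert_id, which scales far better than map over long ranges:

index="security_device" sourcetype=security_log ("abnormal Protocol" OR "Filter action") alert_id=*
| rename filter-discard as filter_discard
| eval is_abnormal=if(searchmatch("abnormal Protocol"), 1, 0)
| stats max(is_abnormal) as has_abnormal values(filter_discard) as filter_discard by alert_id
| where has_abnormal=1
| table alert_id filter_discard

On the nested-search attempt: the subsearch returns rows under the field name alert_id, so the outer search effectively filters on alert_id=<value>; if the "Filter action" events do not have alert_id extracted as a field, that is one common reason it returns 0 results.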
I have two values in the source field, and I need to hide one of them, i.e. http:kafka.
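A minimal sketch, assuming the goal is simply to exclude those events from the results:

... | search NOT source="http:kafka"

If instead source is a multivalue field and only that one value should be dropped while keeping the event, mvfilter can do it: | eval source=mvfilter(source!="http:kafka")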
Which apps are used in a Splunk SOC at a bank, e.g. for threat intel, incident response, and so on?
Hi all, I am quite new to Splunk and am now trying to create a dashboard panel using a query that does the following: pulls the required fields from an index based on a text-field input; checks one specific field "opsID" from the index against a field "code" in a CSV I uploaded; and, if it is present in the CSV, returns a simple output that I can display in table form. The CSV looks something like this:

code, notes
123, User
456, Admin
789, User

Example of my query:

index=userdatabase "abc12345"
| eval abc=[|inputlookup Lookup.csv | where code=opsID | fields notes]
| eval isPresent=if(abc!="", YES, NO)
| table username, isPresent

However I am getting errors like: Error in 'eval' command: The expression is malformed. An unexpected character is reached at ')'. I have tried for a few days and can't seem to figure out my mistake, so I'm hoping for some help with this basic question. I have a feeling my logic could be wrong to begin with.
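A minimal sketch of a lookup-based version, assuming Lookup.csv has the code and notes columns shown in the post: the lookup command does the per-event matching that a subsearch inside eval cannot, and the YES/NO literals need to be quoted strings:

index=userdatabase "abc12345"
| lookup Lookup.csv code AS opsID OUTPUT notes
| eval isPresent=if(isnotnull(notes), "YES", "NO")
| table username isPresent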
Hi Team, I'm new to the Splunk tool and I have a question about how to hunt for the following in Splunk: 1) investigate network connections associated with GitHub usage; 2) look for unusual downloads and command-line / code executions involving GitHub.
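A minimal sketch of a starting point, with heavy assumptions: the index names (proxy, endpoint) and the Sysmon sourcetype are placeholders for whatever your environment actually ingests, so substitute your own data sources:

index=proxy (dest="*github.com*" OR url="*github.com*")
| stats count earliest(_time) as first_seen latest(_time) as last_seen by src user

index=endpoint sourcetype="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational" EventCode=1 (CommandLine="*github.com*" OR CommandLine="*git clone*")
| stats count by host User Image CommandLine

The first search profiles which hosts and users talk to GitHub over the proxy; the second looks for process creations whose command line references GitHub or a git clone.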
Hi Splunkers. I have two levels of logs (NOTICE, ERROR). For ERROR logs (JSON), method_name and message are extracted automatically, but not for NOTICE logs, so I have written my case statement like the one below in the UI and it's working fine, but I'm not sure how to deploy this in props.conf:

index=index_name sourcetype=sourctype_name log_level=NOTICE
| eval message=case(method_name='protopayload.table.create'=="table created", method_name='protopayload.table.delete'=="table deleted")

I don't want to write a case statement for ERROR logs, as they are already extracted fine. To be precise: I want field extraction to happen automatically for ERROR logs (as it already does) and my case statement to apply only to NOTICE logs. Please assist on this.
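A minimal sketch of how this could be deployed as a calculated field in props.conf on the search head; the stanza name is copied from the post, the case() is rewritten in the usual condition/value pairs, and the outer if() is an assumption that the automatically extracted message should be kept for anything that is not a NOTICE event:

[sourctype_name]
EVAL-message = if(log_level=="NOTICE", case(method_name=="protopayload.table.create", "table created", method_name=="protopayload.table.delete", "table deleted"), message)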
Hello, I am using rex to remove everything after a specific character, but I need to keep that character. Currently I am using this:

| rex mode=sed field=Cluster "s/[k].*//g"

Unfortunately it is also removing the 'k'. Can I amend this argument slightly so it removes everything after the k but the k remains? Unfortunately I don't have any / to work with. Thanks!
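A minimal sketch: either replace from the k onward with a literal k, or capture the k and put it back with a backreference:

| rex mode=sed field=Cluster "s/k.*/k/"

or, equivalently, with a capture group:

| rex mode=sed field=Cluster "s/(k).*/\1/"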
Hello! I am sending disk-usage logs from a host to Splunk Cloud. Here is an example of the events:

/dev/nvme4n1p1 xfs 50G 555M 50G 2% 25M 33K 25M 1% /var/www

where the "2%" field is extracted as "Use_". How can I create an alert/search for when this number (without the percentage symbol) is higher than 80? Thanks!
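A minimal sketch, assuming Use_ is already extracted as described: strip the % sign, compare numerically, and save the search as an alert that triggers when results are returned:

... | eval use_pct=tonumber(replace(Use_, "%", ""))
| where use_pct > 80
| table host Use_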
Hello, I'm fairly new to Splunk and I'm hoping someone could help me rename the API names in my dashboard. I have several names associated with urlPthTxt.
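Without more detail it is hard to be specific, but a minimal sketch of one common approach is to map the raw urlPthTxt values to friendly names with case(); the paths and labels below are hypothetical placeholders:

| eval api_name=case(urlPthTxt=="/v1/orders", "Orders API", urlPthTxt=="/v1/users", "Users API", true(), urlPthTxt)
| stats count by api_name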
Hi, I need help with a left join. There are two queries, as below.

Query 1: index=abc sourcetype=123 | table a.b.requestGUID EmplId
Query 2: index=adef sourcetype=456 | table c.requestGUID VacationStartDate

In query 1 the request GUID is under an object b within an object a (hence a.b.requestGUID). In query 2 the request GUID is under a different object c (hence c.requestGUID). What is the syntax to join query 1 and query 2 on the GUID under the two different objects, to see if an employee has a vacation coming up (basically get the GUID, EmplId and VacationStartDate in one shot)? These two queries only have the GUID as a common field, but it sits under different objects. Thanks for the help.
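A minimal sketch: rename both GUID fields to a common name before joining, so the join key matches on both sides:

index=abc sourcetype=123
| rename a.b.requestGUID as requestGUID
| join type=left requestGUID
    [ search index=adef sourcetype=456
      | rename c.requestGUID as requestGUID
      | fields requestGUID VacationStartDate ]
| table requestGUID EmplId VacationStartDate

If the subsearch can exceed join's row limits, a stats-based merge over both indexes (rename the GUIDs, then stats values(*) by requestGUID) is a common alternative.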
Hello, is there a way to specify the target index in the curl command? For example, with the following command, how can I target an index named scheduler on the command line?

curl -k https://prd-plot.splunkcloud.com:8088/services/collector -H "Authorization: Splunk #####-4f99-b680-72c7bd33f9bb" -d "{\"sourcetype\": \"_json\", \"event\": {\"a\": \"value1\", \"b\": [\"value1_1\", \"value1_2\"]}}"

Thanks, Mark
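A minimal sketch: the HTTP Event Collector accepts an "index" key in the event payload, so adding it to the JSON sends the event to the scheduler index (the HEC token must also be allowed to write to that index):

curl -k https://prd-plot.splunkcloud.com:8088/services/collector -H "Authorization: Splunk #####-4f99-b680-72c7bd33f9bb" -d "{\"index\": \"scheduler\", \"sourcetype\": \"_json\", \"event\": {\"a\": \"value1\", \"b\": [\"value1_1\", \"value1_2\"]}}"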
I inherited Splunk Enterprise installed on EC2 instances. We have CloudWatch sending log data to our Splunk. How do I get that data feed into an index (not indexers)? So confused.
Does anyone feel like we are going to be able to create modern dashboards that allow us to interact with KV store data in the same way we could in Simple XML? The old-school dashboards feel a bit clunky. The alternative is exposing the data feeds via REST and using a third-party tool such as retool.com to allow CRUD of KV store items. The dedicated lookup-file-editing app is useful, but not great for end-user consumption. I was wondering what people are using? Apologies if I missed how to achieve this - most of what I found seemed to be old-school Simple XML dashboards.
Hi all, for our environment we want to ingest VMware data for ITSI. The documentation tells us we need an OVA, so we installed one; however, this took a long time. Now I see we also need a separate Scheduler, since the search head is actually on Windows, which I did not notice in time apparently cannot be combined with a Scheduler. It is also a bit vague to me how the connection works: with the Scheduler you link a vCenter, so the Scheduler needs a connection to the vCenter as well? And that data is then used on the DCN to also connect to the vCenter? My main question is: is it possible to skip the Scheduler and edit a conf file on the DCN itself to start ingesting VMware and vCenter data right away? We have a limited time schedule, since this is just a test environment and the ITSI license doesn't last forever. Regards, Tim
I've been running into errors where larger searches are getting cancelled. I read this could be due to running out of memory. I looked at my search head, which runs on a server with 32 GB but only uses 8 GB (numbers from the Monitoring Console). I'm assuming there's some setting to increase how much memory is allocated to Splunk, but I haven't found it. I've seen settings for memory per search - is the overall memory calculated from the allowed number of searches and the memory per search? Thanks.
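A minimal sketch of the limits.conf settings that are usually involved, under the assumption that searches are being terminated by Splunk's per-search memory tracker rather than by the OS; the values shown are illustrative, not recommendations:

# limits.conf on the search head
[search]
# turn on per-search memory tracking (required for the threshold below to take effect)
enable_memory_tracker = true
# per-search memory ceiling in MB; a search process above this can be terminated
search_process_memory_usage_threshold = 8000

Note that Splunk does not pre-allocate a fixed pool of memory: search concurrency is derived from CPU cores (roughly base_max_searches + max_searches_per_cpu x number of cores), and each search process takes whatever memory it needs up to the tracker's threshold, which is why the host can show only 8 GB in use.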