All Topics

We are deploying Alert Manager with a new client. Most of my alerts have a "DRILLDOWN_URL" field which contains context-specific SPL. I am trying to configure this to be a one-click operation to run using "Drilldown Actions", but these do not work:

_key: 6169aad5005c277d3b3788d5
name: Splunk search to show contributing events for this instance of this alert
label: Contributing Events
url: https://localhost:8000/en-US/app/alert_manager/search?q=$DRILLDOWN_SPL|u$

_key: 6169af6f005c277d3b3788d6
name: Splunk search to show contributing events for this instance of this alert
label: Contributing_Events_2
url: https://localhost:8000/en-US/app/alert_manager/search?q=$DRILLDOWN_SPL$

I still get no drilldowns defined in the investigation screen.
Can we automate adding inputs to Splunk AWS for the respective heavy forwarders?
We have a standalone install which has to follow specific guidance and documentation. Without getting too much into things, I need to document each open port, and if certain ones don't already have a vulnerability assessment on file I need to generate a local report on what the port is for and how it's utilized in the system(s). My clients have Splunk installed but don't tap into a lot of its power currently, so I expect a lot of the extra ports can be turned off (at least for now) and save me a lot of paperwork. This brings me to ports 8065 and 8191.

8065 is a local listening port tied to the Splunk app server. The problem is I can't find what exactly Splunk uses it for beyond "app server". If we don't utilize Splunk apps, is this port required? If we did, what does this port provide and why would it be required? When are calls made to it? How would I turn it off in version 8 if I don't need it?

8191 is used for the app KV store. If apps are not utilized, can this be turned off? If so, how? If apps are not utilized, this seems like it wouldn't be required.
Hi, I tried searching all over but can't seem to find a good approach to do this. Basically, I have a multiselect input that needs to be used to filter a search on a field that is an array. For instance: the multiselect input can be "value1", "value2", and the field from the search can be a list or array of "value1", "value2", "value3", etc. How can we check and filter out events whose field does not contain all the elements from the multiselect input? Thanks in advance.
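A minimal search-time sketch of one way to do this, assuming the multiselect is configured with a comma delimiter and its token is named multiselect_tok, the multivalue field is called my_array_field, and Splunk 8.0+ is available for mvmap; all of these names are placeholders, not anything from the original question:

index=your_index sourcetype=your_sourcetype
| eval selected=split("$multiselect_tok$", ",")
| eval found=mvcount(mvmap(selected, if(isnotnull(mvfind(my_array_field, "^".selected."$")), selected, null())))
| where found=mvcount(selected)

The idea is that mvfind checks each selected value against the multivalue field, found counts how many of the selected values are present, and the where clause keeps only events where every selected value was found.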
Hi dear Splunkers, I have three searches that display their output in a dashboard in three different panels, but I want to combine them into one linear chart. Thank you.

(index=ONE) (sourcetype="ONE") (ID1="*") | eval ID1 = lower(ID1) | timechart span=1d distinct_count(ID1)

(index=TWO) (sourcetype="TWO") (ID2="*") | eval ID2 = lower(ID2) | timechart span=1d distinct_count(ID2)

(index=THREE) (sourcetype="THREE") (ID3="*") | eval ID3 = lower(ID3) | timechart span=1d distinct_count(ID3)
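One possible way to combine them is a single search over all three indexes, normalizing the ID into one field and deriving a series label per event; a sketch along those lines (the series names ONE/TWO/THREE are only illustrative):

(index=ONE sourcetype="ONE" ID1="*") OR (index=TWO sourcetype="TWO" ID2="*") OR (index=THREE sourcetype="THREE" ID3="*")
| eval id=lower(coalesce(ID1, ID2, ID3))
| eval series=case(isnotnull(ID1), "ONE", isnotnull(ID2), "TWO", isnotnull(ID3), "THREE")
| timechart span=1d distinct_count(id) by series

The trailing "by series" makes timechart draw one line per original search on the same chart.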
Are there REST APIs to create the HTTP request templates and Actions/Policies? I would like to automate the procedures for creating the HTTP request templates, Actions and Policies.
Greetings. I'm trying to rewrite (because converting didn't work so well) a dashboard I'd written in Simple XML, which relies heavily on a base search, in Dashboard Studio. Whenever I try to create a map viz object in Studio, it only prompts me with the option to create a new data source. However, I've already set up a base search and built multiple search chains off of it. I'd like to reference the search chain to feed the visualization, but I haven't been able to do this successfully. Any ideas on how to approach it?
Pulling database events with Splunk DB Connect, I noticed that:
1. New (non-existing) fields are created
2. Text fields containing special characters are cut

The only cause I have been able to identify is the presence of special characters of the kind ( ) " : ... in fields like "SQL _Text", which by their nature can contain quotes, brackets and so on. How can I escape these problem-causing characters? This would be done inside Splunk, and not anywhere on the DB side or in the SQL command for pulling records. What are the characters that must be escaped? Asking the latter because, when working on a previous project ingesting events from a DB into Splunk via a TCP data input, I noticed that not all special characters were causing the same problem as above, but only a few of them.

Best regards,
Altin
Hi, just a query: I have some manual lookups in some of my dashboards. If I create an automatic lookup, will this break the manual lookups in the dashboards? I don't believe it will, but just wanted to ask.

Thanks,

Joe
All, I'm setting up an index cluster of 3 nodes soon and sizing some disks. It feels like you would always want a replication factor of 2 with 2 searchable copies. But I see that I can, in theory, set a replication factor of 2 with only 1 searchable copy, which leaves out the tsidx/bloom files etc. The docs really seem to gloss over this. What benefit does this have? What would recovery look like in an RF=2/SF=1 situation with a lost indexer? How would I bring that replicated copy online if I lost an indexer for good and didn't have a second searchable copy?
I have two different sets of data files which are related by a single named field. Let's call that field common_field. From one set of data files I can get the count per common_field, and from the other set of data files I can get the count of errors per common_field. I want to create a query which will give me the ratio of error count to total count per common_field. I have tried to use subsearches but it is not working.
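One approach that avoids subsearches is to run both data sets in a single search and split the counts with eval inside stats. A sketch, assuming the two sets can be told apart by sourcetype (the names totals_sourcetype and errors_sourcetype are placeholders):

index=your_index (sourcetype=totals_sourcetype OR sourcetype=errors_sourcetype)
| stats count(eval(sourcetype="totals_sourcetype")) AS total count(eval(sourcetype="errors_sourcetype")) AS errors by common_field
| eval error_ratio=round(errors/total, 4)

If the error events are instead marked by some field rather than a separate sourcetype, swap that condition into the eval inside count().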
Is there any plan to support PyODBC as a DB exit? This library makes it easy for us to support multiple DB types in our analytical Python environment.
I am new to Splunk, and the answer can help me learn more. I have a message in a log which looks something like: k45ksp: k45kspProcessControlBuff task 1 (p_id: 2). I need to extract just k45kspProcessControlBuff from the above message field and count how many times it has occurred in the log.
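A minimal sketch of one way to do this with rex, assuming the token always sits between "k45ksp:" and "task" (the index name is a placeholder):

index=your_index "k45ksp:"
| rex "k45ksp:\s+(?<proc_name>\S+)\s+task"
| stats count by proc_name

This extracts the token (k45kspProcessControlBuff in the example) into proc_name and counts its occurrences; add field=message to the rex command if the text lives in an extracted field rather than in _raw.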
Hi Splunkers, I need some ideas for showing KPIs in Splunk on Windows or Linux logs. We have AD logs, system logs and application logs. In Linux, we have secure logs. We are not trying to go with ITSI as of now, but wanted to demo a KPI in Splunk Enterprise to other teams to showcase the potential of Splunk. Please provide me some recommendations. Thanks in advance.
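As one concrete example of the kind of KPI that demos well from Windows Security logs, failed logons per hour (EventCode 4625) is a simple start; the index and sourcetype names below are assumptions to swap for your own:

index=wineventlog sourcetype="WinEventLog:Security" EventCode=4625
| timechart span=1h count AS failed_logons

A comparable Linux KPI could count failed SSH authentications from the secure logs over the same time span.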
I have been unable to find a suitable driver to get the DB Connect app to work for ingesting some Azure Table data from a Cosmos DB database. Has anyone had any luck doing so, or is there a better way to go about it?
Hi! I have a panel in a dashboard that uses timechart. I want it to zoom in automatically on the highest count, or on count>0, after it is done loading. Is there a way to achieve this? Thanks
I have two charts that work as expected when separate, but I'm having a hard time combining them into one chart, as they have different search criteria (but the same index/source), so search 2 ends up being wrong when using the criteria from search 1. I tried combining them using chart overlays but I couldn't get it to work. Any pointers would be very much appreciated!

Search 1 - last 30 days:

index=foo source=bar criticality=high state=open | bin _time span=1d | stats count AS warnings by _time

Search 2 - last 30 days:

index=foo source=bar | bin _time span=1d | stats dc(accountId) AS Accounts by _time
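Since both searches read the same index/source, one option is a single search that applies the stricter criteria only inside the stats aggregation; a sketch of that idea:

index=foo source=bar
| bin _time span=1d
| stats count(eval(criticality="high" AND state="open")) AS warnings dc(accountId) AS Accounts by _time

This produces both series in one result set, so a single line/column chart (or a chart overlay for Accounts) can display them together.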
I've been working with Splunk for many years and have always made changes via the .conf files. However, I recently added the /var/log directory by using:

./splunk add monitor /var/log -index main -sourcetype linux

It's working, but I want to modify it a bit. However, I have been pulling my hair out trying to figure out which inputs.conf file was modified by the command. Any assistance appreciated. Tim
I made a clone of an existing, empty XML dashboard as the means to start a new Studio one. I added text boxes and an image. All looks fine in edit mode. Any time I save and click View, the dashboard's title remains but the contents disappear. Going back to edit mode shows the contents. I've restarted, but the page and the page's source don't have my text. How can I debug this ridiculously simple, beginner problem?
Hi all! How can I extract and create different fields via transforms when there is a JSON array with several fields that have the same name but different values? For example, the "text" field in the first case means "action", and in the second case the "text" field means "hostname". Likewise, "port" appears twice and would have to be identified as src_port and dst_port. Sample:

"{
    detail: {
        indicators: [
            {
                filterId: [ ]
                id: 1
                objectType: port
                objectValue: 445
                relatedEntities: [ ]
            }
            {
                filterId: [ ]
                id: 2
                objectType: text
                objectValue: Reset
                relatedEntities: [ ]
            }
            {
                filterId: [ ]
                id: 3
                objectType: port
                objectValue: 36880
                relatedEntities: [ ]
            }
            {
                filterId: [ ]
                id: 6
                objectType: text
                objectValue: SERVERWIN01
                relatedEntities: [ ]
            }
            {
                filterId: [ ]
                id: 7
                objectType: detection_name
                objectValue: Microsoft Windows SMB Information Disclosure Vulnerability (CVE-2017-0147)
                relatedEntities: [ ]
            }"

Thanks!
James
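The question asks about transforms, but as a search-time alternative here is a rough SPL sketch that pairs each objectType with its objectValue and then splits the pairs out by position. Which port is src_port vs dst_port, and which text value is the action vs the hostname, is assumed purely from the order they appear, so adjust that to your data; the index and sourcetype names are placeholders:

index=your_index sourcetype=your_sourcetype
| spath path=detail.indicators{}.objectType output=obj_type
| spath path=detail.indicators{}.objectValue output=obj_value
| eval pair=mvzip(obj_type, obj_value, "::")
| eval ports=mvfilter(match(pair, "^port::")), texts=mvfilter(match(pair, "^text::"))
| eval src_port=replace(mvindex(ports, 0), "^port::", ""), dst_port=replace(mvindex(ports, 1), "^port::", "")
| eval action=replace(mvindex(texts, 0), "^text::", ""), hostname=replace(mvindex(texts, 1), "^text::", "")
| eval detection_name=replace(mvindex(mvfilter(match(pair, "^detection_name::")), 0), "^detection_name::", "")

A props/transforms extraction could pull the same values with regexes, but mapping the first vs. second occurrence of a type to different field names is easier to express at search time like this.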