All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello @jrs42, In Dashboard Studio there's no option to specify a drilldown for a particular cell or row. When you enable the drilldown, by default it gets applied to a cell. You can use the following JSON source code as an example of a drilldown that sets a token in Dashboard Studio.

{
  "visualizations": {
    "viz_dNS83Gj5": {
      "type": "splunk.table",
      "dataSources": {
        "primary": "ds_aQ7285AG"
      },
      "eventHandlers": [
        {
          "type": "drilldown.setToken",
          "options": {
            "tokens": [
              {
                "token": "log_level_tok",
                "key": "row.log_level.value"
              }
            ]
          }
        }
      ]
    },
    "viz_qGr86Sbm": {
      "type": "splunk.events",
      "options": {},
      "dataSources": {
        "primary": "ds_MmJUCreO"
      }
    }
  },
  "dataSources": {
    "ds_aQ7285AG": {
      "type": "ds.search",
      "options": {
        "query": "index=_internal source=\"*splunkd.log\"\n| stats count by log_level",
        "queryParameters": {
          "earliest": "$global_time.earliest$",
          "latest": "$global_time.latest$"
        }
      },
      "name": "Search_1"
    },
    "ds_MmJUCreO": {
      "type": "ds.search",
      "options": {
        "query": "index=_internal source=\"*splunkd.log\" log_level=\"$log_level_tok$\"",
        "queryParameters": {
          "earliest": "$global_time.earliest$",
          "latest": "$global_time.latest$"
        }
      },
      "name": "Search_2"
    }
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "latest": "$global_time.latest$",
            "earliest": "$global_time.earliest$"
          }
        }
      }
    }
  },
  "inputs": {
    "input_global_trp": {
      "type": "input.timerange",
      "options": {
        "token": "global_time",
        "defaultValue": "-24h@h,now"
      },
      "title": "Global Time Range"
    }
  },
  "layout": {
    "type": "absolute",
    "options": {
      "width": 1440,
      "height": 960,
      "display": "auto"
    },
    "structure": [
      {
        "item": "viz_dNS83Gj5",
        "type": "block",
        "position": { "x": 0, "y": 0, "w": 300, "h": 300 }
      },
      {
        "item": "viz_qGr86Sbm",
        "type": "block",
        "position": { "x": 300, "y": 0, "w": 1140, "h": 300 }
      }
    ],
    "globalInputs": [ "input_global_trp" ]
  },
  "description": "",
  "title": "Test Input Placeholder"
}

Thanks, Tejas.

---
If the above solution helps, an upvote is appreciated..!!
Hi @Cyner__, you have to enable receiving on Splunk Enterprise, then check the route from the Universal Forwarder on port 9997 to Splunk Enterprise (using telnet), and then configure your outputs.conf (as described in the above link) on the Universal Forwarder. Ciao. Giuseppe
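As a sketch of what that outputs.conf on the Universal Forwarder might look like (the indexer hostname is a placeholder):

```
# $SPLUNK_HOME/etc/system/local/outputs.conf on the Universal Forwarder
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
# Replace with your Splunk Enterprise host; 9997 is the conventional receiving port
server = splunk-server.example.com:9997
```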
Thanks for your answer. I'm not sure if this is what I want. Because the advanced hunting app requires an API call with a limit on calls, I start by doing a call on DeviceProcessEvents. Then I'm not sure if I need to do another API call on DeviceRegistryEvents, since I'd like to join these two result sets.
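If both tables are already indexed in Splunk, a single search could combine them instead of a second API call; a sketch (the index name, sourcetypes, and the DeviceId join key are assumptions):

```
index=defender sourcetype="DeviceProcessEvents"
| join type=inner DeviceId
    [ search index=defender sourcetype="DeviceRegistryEvents" ]
```

Note that the join subsearch is subject to Splunk's subsearch result limits, which may matter at high volume.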
Hi @anandhalagaras1 , you should take the searches in Workload and adapt them to your requirements. Ciao. Giuseppe
I have the same question, which capabilities are needed for the "Add Data" button?
@gcusello We are using Splunk Cloud version 9.1.2308.203. Following your instructions, I navigated to Cloud Monitoring Console --> License Usage and found the following options in the Cloud Monitoring Console App:
- Entitlement
- Ingest
- Workload
- Storage Summary
- Searchable Storage (DDAS)
- Archive Storage (DDAA)
- Federated Search for Amazon S3
Our Cloud Monitoring Console app is version 3.25.0. Please let me know how to pull the top 20 or top 50 sources with the index and sourcetype information.
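On Splunk Enterprise this breakdown is usually pulled from license_usage.log; whether _internal is searchable on your Splunk Cloud stack may vary, so treat this as a sketch:

```
index=_internal source=*license_usage.log* type=Usage
| stats sum(b) AS bytes BY s, st, idx
| eval GB = round(bytes/1024/1024/1024, 3)
| sort - bytes
| head 20
```

Here s, st, and idx are the source, sourcetype, and index fields of the license usage events; change head 20 to head 50 for the top 50.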
Hi rsreese, I know this post is some years old already, but maybe it can help someone in the future. McAfee ePO, now called Trellix Orchestrator, can only send data to TCP ports via SSL. So switch the input from [tcp://514] to [tcp-ssl:514]. Be sure to fulfill the configuration requirements for tcp-ssl inputs.
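A sketch of the corresponding inputs.conf on the receiving Splunk instance (the certificate path, password, and sourcetype are placeholders):

```
# inputs.conf
[tcp-ssl:514]
sourcetype = mcafee:epo:syslog

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/mycert.pem
sslPassword = <certificate password>
```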
I don't need this course. It will absolutely not help me with what I have to do, which is pretty advanced in terms of classic Splunk architecture.
Linecount is not a significant factor when comparing event formats.  Most significant are the timestamp format and location, and how fields are delimited (key=value, JSON, etc.).
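Those distinctions are what gets encoded per sourcetype in props.conf; an illustrative stanza (all values are examples, not a recommendation for any specific format):

```
[my:custom:sourcetype]
# Where the timestamp sits and how it is formatted
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
# Event breaking
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
# How fields are delimited: json, xml, or auto for key=value
KV_MODE = auto
```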
Hi @gcusello, yes, I did all of that. What do you mean by client: the server with the forwarder, or Splunk Enterprise? When I try to telnet the Splunk server from the forwarder server (which I think is the client), the connection always times out. I also saw that my Splunk server (my computer, I guess) doesn't have any inputs.conf in the C:\Program Files\Splunk\etc\system\local directory. What should I do? Best regards
Hello @Lidiane.Wiesner, I did some digging around and I've seen people suggesting to make sure java is running on a supported environment.  https://docs.appdynamics.com/appd/24.x/24.5/en/application-monitoring/install-app-server-agents/java-agent/java-supported-environments
Hi @Cyner__, port 9997 must be opened on Splunk Enterprise, not on the client; you can open the port in [Settings > Forwarding and Receiving > Receiving]. In fact, the telnet test must be done from the client, not from the Splunk server. Did you complete all the steps described in the document or in my previous answer? Ciao. Giuseppe
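Enabling receiving in that menu writes a stanza like this to inputs.conf on the Splunk Enterprise server (a sketch):

```
# $SPLUNK_HOME/etc/system/local/inputs.conf on Splunk Enterprise
[splunktcp://9997]
disabled = 0
```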
Hi @anandhalagaras1, if you look in the Monitoring Console App [Settings > Monitoring Console > Indexing > License Usage > Historic License Usage] or in the License Consumption Report [Settings > Licensing > Usage Report > Previous 60 days > Split by ...], you can find the searches you need. Ciao. Giuseppe
I finished the setup several times with my org/key on the setup page, and I don't have the password.conf. Splunk is hosted on a server and I am doing the setup from my laptop; I don't know if that can be the reason why I didn't get the password.conf.
Hi @deepakc, I was checking if I can use SEDCMD to remove that blank event; however, I am not sure how to use it. Or try this: https://community.splunk.com/t5/Splunk-Search/Why-is-the-regex-creating-empty-events-from-incoming-data/m-p/396432 The previous event ends with a "." so can I try the above method?
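For what it's worth, SEDCMD (in props.conf) rewrites the _raw text of an event but cannot drop an event entirely; routing empty events to nullQueue is the usual approach for that. A sketch, with the sourcetype and stanza names as placeholders:

```
# props.conf
[your:sourcetype]
TRANSFORMS-drop_empty = drop_empty_events

# transforms.conf
[drop_empty_events]
REGEX = ^\s*$
DEST_KEY = queue
FORMAT = nullQueue
```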
I suspect that the MSSQL TA is normally supported and works in conjunction with the DB add-on for sourcetype formatting (it uses SQL queries and then does the props and transforms part), hence you're not seeing any values. So I suspect it's not being parsed correctly. For the Windows logs you normally use the Windows TA, which contains the props and transforms for the standard Windows event channels (Application/Information/Security etc.) and the parsing code. I don't have a test environment, so I can't check, but you could try: Change your sourcetype, as there is a typo: sourcetype = mssql:aud should be sourcetype = mssql:audit; see if that works. Perhaps set renderXml = true in inputs.conf with a new sourcetype mssql:aud:xml, and create a props.conf stanza for the mssql:aud:xml sourcetype with KV_MODE = xml (this is just a try-and-see without testing). If that doesn't work, then stick with the DB Connect solution.
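The second suggestion above, written out as config fragments (untested, as noted; the WinEventLog channel name is a placeholder):

```
# inputs.conf on the forwarder (channel name is a placeholder)
[WinEventLog://Application]
renderXml = true
sourcetype = mssql:aud:xml

# props.conf on the search head
[mssql:aud:xml]
KV_MODE = xml
```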
I tried this but got an error: Error in 'EvalCommand': Failed to parse the provided arguments. Usage: eval dest_key = expression. The search job has failed due to an error. You may be able to view the job in the
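That error means eval expects the form field = expression; for example (the field names here are illustrative):

```
... | eval severity = if(log_level == "ERROR", "high", "low")
```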
You say you didn't have a TA (props and transforms) on the HF before. Normally, as the HF is a full Splunk instance, you should have the TA there for parsing, so the data is "cooked" before it reaches the indexer; if you were sending direct, then the TA on the indexer would suffice. (Why it worked before the upgrade I don't know, but as for the upgrade path, you should always follow it, as upgrades can introduce breaking changes, so that could be a factor.) I would try to deploy your custom TA (props code) onto the HF and see if that makes a difference; as you already have this TA deployed to the current SH/IDX, you should be able to continue with normal field extractions once it sees the sourcetype. So, ensure the config for this data source lives in a custom TA, or copy the config as it is on the SH/IDX and deploy that to the HF, then restart. Tip: for consistency, keeping the config in one custom TA app is best practice; otherwise use /local/props.conf.
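A minimal custom TA layout for the HF might look like this (the app name is a placeholder):

```
$SPLUNK_HOME/etc/apps/TA-my-source/
    default/
        app.conf
        props.conf      # same parsing stanzas as deployed on the SH/IDX
    metadata/
        default.meta
```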
How do I allow Splunk to be accessed publicly? If I am using Splunk from a different gateway, what will I have to do to use Splunk Web?
HTTP Event Collector examples - Splunk Documentation. I need a troubleshooting suggestion, if possible/available, with user-level access.