Hi Everyone! We are looking to standardize our monitoring solution. From our initial research and development, OpenTelemetry has become the established standard for monitoring and observability, and our target is to adopt OpenTelemetry as part of our monitoring policies and standards. We are aware of the product "Splunk Observability Cloud", which onboards OTLP and other supported platforms into a unified observability stack. For AIOps, I believe that still sits within Splunk Enterprise. While we have previously explored a possible move to the cloud, we are currently still using Splunk Enterprise. We would like to know whether there is any way to forward log events to OpenTelemetry, and then on to Splunk Enterprise. I know this might add overhead, since the extra leg (OpenTelemetry) adds additional workload, but this is critical for standardizing our current monitoring. Here is what we have researched so far: Splunk Ingest Actions - I think this is only available on the Heavy Forwarder, and the documentation doesn't detail whether an OTel endpoint is supported. Splunk Transforms and Outputs (Heavy Forwarder) - in our initial testing, we weren't able to capture the data on the OTel Collector. Universal Forwarder - I don't think a configuration exists to send from a Universal Forwarder to an OTel Collector. May I kindly ask for input or any insights on possible solutions? Thank you very much in advance!
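For the OTel-to-Splunk-Enterprise leg, the OpenTelemetry Collector's `splunk_hec` exporter can forward logs received over OTLP to a Splunk Enterprise HEC input. A minimal sketch of the Collector config, assuming the HEC token, hostname, and index shown are placeholders you would replace with your own:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

exporters:
  splunk_hec:
    # Placeholder values - substitute your own HEC token, host, and index
    token: "00000000-0000-0000-0000-000000000000"
    endpoint: "https://your-splunk-enterprise:8088/services/collector"
    index: "main"
    tls:
      # Only for self-signed certs in a lab; use proper certs in production
      insecure_skip_verify: true

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [splunk_hec]
```

This keeps OpenTelemetry as the standard transport while Splunk Enterprise remains the destination, and requires only that an HTTP Event Collector input be enabled on the Splunk side.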
Hi Community, Does anyone know if the 14 day Splunk Cloud Platform Trial allows you to create multiple users and roles?  I need to test some capabilities. Thank you.
Hi all, I have encountered a weird issue in Splunk. Basically I have added a dropdown input/filter with the following settings, but after hitting "Apply", it says "Search produced no results". The weird thing is that if I run the search separately, it does have results. So does Splunk disallow using the following query in a filter/input?  |dbxquery connection=100892_intelligence query="SELECT zone_name from win_data"
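A common cause is that the dropdown's "Field For Label" / "Field For Value" settings don't match a field the populating search actually returns. One sketch of a populating search, assuming the connection and table names from the post:

```
| dbxquery connection=100892_intelligence query="SELECT zone_name from win_data"
| fields zone_name
| dedup zone_name
```

With this, both "Field For Label" and "Field For Value" in the dropdown settings would be set to `zone_name`; if they point at any other field name, the input reports "Search produced no results" even when the search itself returns rows.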
Hi, what would the btool command be to find a certain part of an inputs.conf file? Thanks
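A sketch of the usual pattern - note that btool takes the conf name without the `.conf` extension, and the stanza name below is a placeholder:

```shell
# Show all merged inputs.conf settings, with the file each one comes from
splunk btool inputs list --debug

# Narrow to one stanza (placeholder name) plus a few lines of context
splunk btool inputs list --debug | grep -A 5 "monitor:///var/log/your_app"
```

The `--debug` flag is what prints the originating file path next to each setting, which is usually the "certain part" people are after.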
Hi, on Splunk Cloud can you create a blank Splunk app for storing dashboards, alerts and reports, or does it need reviewing by Splunk? Thanks
Hi, we are looking to get some more experience with Splunk Enterprise, so we want to create a small distributed deployment. We are only going to index about 5 GB per day. The documented minimum requirements for a distributed deployment assume 300 GB per day. So what are the minimum requirements for the search head and indexer in such a small distributed deployment? Thank you for any help.
Does the indexer cluster have to use HTTP? I'm just trying to figure out why it gives that warning.
Hi, I am using an inner join to build a table from two searches. The search works fine, but I want to subtract two fields, where one field comes from the first search and the other from the second, and display the combined data from both searches in one table. Example:

line1: datetime: , trace: 12345 , Request Received: {1}, URL:http://
line2: datetime: , trace: 12346 , Request Received: {2}, URL:http://
line3: datetime: , trace: 12345 , Reponse provided: {3}
line4: datetime: , trace: 12346 , Reponse provided: {4}

The trace field is common to lines 1 and 3, and likewise to lines 2 and 4. I have combined the results as:

... | table trace, Request, startTime | join type=inner trace [ search ... | table trace, Response, EndTime ]

which gives me:

trace      request     startTime     response     EndTime
12345   {1}              09:18:20       {3}               09:18:50
12346   {2}              09:19:20       {4}               09:20:21

I want to find the response time by subtracting EndTime - startTime.
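If startTime and EndTime are strings like "09:18:20" rather than epoch numbers, they need converting with strptime before subtracting. A sketch, assuming the field names and HH:MM:SS format from the example above:

```
... | table trace, Request, startTime
| join type=inner trace [ search ... | table trace, Response, EndTime ]
| eval responseTime = strptime(EndTime, "%H:%M:%S") - strptime(startTime, "%H:%M:%S")
```

If both fields are already epoch timestamps, a plain `| eval responseTime = EndTime - startTime` suffices. As a design note, `stats range(_time) as responseTime by trace` over both event types in one search often achieves the same result without the cost and subsearch limits of `join`.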
As security threats and their complexity surge, security analysts face increasing challenges, and best-in-class security tools are essential for every enterprise. Splunk's latest integration with VirusTotal not only allows customers to access insights from VirusTotal datasets in a one-click experience, but also allows informed decisions to be made quickly and accurately. VirusTotal is one of the most popular and close-to-real-time crowdsourced malware datasets - the company was launched in June 2004 and acquired by Google in September 2014. Data sources in VirusTotal include crowdsourced YARA rules, sandboxed dynamic analysis, Sigma rules acting on detonation behavior, IDS detections on network traffic, and many security vendors. VirusTotal's latest addition to Splunkbase, VT4Splunk, provides insights and enrichment on IOCs from a single pane of glass. With VT4Splunk, customers can discover CVEs affecting events and run Splunk searches on top of IOCs from these cases. This Google-supported add-on provides native integration with the VirusTotal API from a Splunk interface, making security researchers' investigations more effective. With over 2,800 unique apps and add-ons in Splunkbase, native integrations enable Splunk partners to reach a large set of enterprise customers worldwide and foster innovation, enhance security practices, and cultivate resilience. To install the free VT4Splunk add-on, log in to Splunkbase and view the step-by-step installation guide. - Alexey Bokov, Cloud Strategist at Splunk
Hello Splunkers, I am attempting to gather the free disk space of all servers and create a report / alert based on it. Thus far I have the SPL set so it outputs the Time, Host, Drive and % Free, but the results come back in a long list of pages. What I'd like to do is two-fold. First, get one result per drive, so one result for each drive on a host; then I'd like to set up an alert for low disk space. Here's my SPL so far:

```
(index=main) sourcetype=perfmon:LogicalDisk instance!=_Total instance!=Harddisk*
| eval FreePct-Other=case(match(instance, "C:"), null(), match(instance, "D:"), null(), true(), storage_free_percent),
       FreeMB-Other=case(match(instance, "C:"), null(), match(instance, "D:"), null(), true(), Free_Megabytes),
       FreePct-{instance}=storage_free_percent, FreeMB-{instance}=Free_Megabytes
| search counter="% Free Space"
| eval Time=strftime(_time, "%Y-%m-%d %H:%M:%S")
| table Time, host, instance, Value
| eval Value=round(Value, 0)
| rename Value AS "Free%"
| rename instance AS "Drive"
| rename host AS "Host"
```

The result is:
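To collapse the long list to one row per drive per host, `stats latest(...)` by host and instance is the usual approach. A sketch, assuming `Value` carries the "% Free Space" counter value as in the table above, and using a placeholder 10% threshold for the alert condition:

```
index=main sourcetype=perfmon:LogicalDisk counter="% Free Space" instance!=_Total instance!=Harddisk*
| stats latest(Value) as FreePct latest(_time) as Time by host, instance
| eval Time=strftime(Time, "%Y-%m-%d %H:%M:%S"), FreePct=round(FreePct, 0)
| rename host AS Host, instance AS Drive, FreePct AS "Free%"
| where 'Free%' < 10
```

Saved as an alert that triggers when the number of results is greater than zero, this fires only for drives currently below the threshold, rather than paging through every sample.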
Hello I've been looking at the new _configtracker index and I would like to know how I could get the User details associated with the configuration change. Regards
Hi, I am trying to configure Security Essentials 3.7.0 running in Splunk Cloud.  The documentation tells me to go to Data > Data Inventory to use introspection, but there is no Data menu that I can see. The closest thing I can find is Configuration > Data Inventory. This popup shows Data Source Category Configuration with a status of Not Started, and Product Configuration also with a status of Not Started. There is no option to kick off introspection. Thanks in advance for any help!
Hello Everyone, this time I'm presenting an incompatibility between MSSQL Server 2022 and the driver installed on Splunk DB Connect (11.2). I installed the driver with the official add-on from Splunkbase, but when I perform the health check the following message appears: "Driver version is invalid, connection: SQL_SERVER, connection_type: generic_mssql." Please help me, because Splunk DB Connect is unstable with the mentioned connection. Best regards, Diego T.
I'm trying to use Python scripts with my own virtual environment. I found that PyDen is handy in this situation, so I downloaded and installed PyDen and PyDen Manager. But as I started to install a Python distribution, an error occurred: configure: error: expected an absolute directory name for --prefix: local/lib/dist/3.9.0. How should I fix this, or is there any other way to use Python scripts with a virtual environment?
Hi, could you help me in editing the below search?

```
index=test sourcetype="centino"
| stats count, values(change_asset) as changed_asset, values(brief) as description, values(severity) as severity, values(exploitation_method) as exploitation_method, values(first_find) as first_find, values(last_find) as last_find, values(systems) as system by id
```

1. In the output, the first_find and last_find fields should display only the date, e.g. 2023-01-22.
2. Instead of receiving all the notifications, we want an alert raised only if today's date matches first_find or last_find. (Today's date changes every day - do not hard-code an actual date.)
Thanks...
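One sketch of both requirements, assuming first_find/last_find arrive as parseable timestamps and that a "%Y-%m-%d" date form (like 2023-01-22) is the desired display format:

```
index=test sourcetype="centino"
| stats count, values(change_asset) as changed_asset, values(brief) as description,
        values(severity) as severity, values(exploitation_method) as exploitation_method,
        values(first_find) as first_find, values(last_find) as last_find,
        values(systems) as system by id
| eval first_find=strftime(strptime(first_find, "%Y-%m-%d"), "%Y-%m-%d"),
       last_find=strftime(strptime(last_find, "%Y-%m-%d"), "%Y-%m-%d"),
       today=strftime(now(), "%Y-%m-%d")
| where first_find=today OR last_find=today
```

The `now()` call re-evaluates at each run, so a saved alert on this search compares against the current date automatically; adjust the `strptime` format string to match however the fields are actually stored.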
I'm trying to get the Kong plugin to work with Splunk Observability Cloud. Here is my agent_config.yaml relating to Kong:

```
receivers:
  smartagent/kong:
    type: collectd/kong
    host: 127.0.0.1
    port: 8000

service:
  pipelines:
    metrics:
      receivers: [hostmetrics, otlp, signalfx, smartagent/signalfx-forwarder, smartagent/kong]
```

When I start my Splunk OTel Collector I am getting metrics from the server but not the Kong service. Checking journalctl I see:

```
otelcol[25528]: 2023-01-30T17:08:05.855Z error signalfx/handler.go:189 Traceback (most recent call last):
otelcol[25528]: File "/usr/lib/splunk-otel-collector/agent-bundle/lib/python3.8/site-packages/sfxrunner/scheduler/simple.py", line 57, in _call_on_interval
otelcol[25528]: func()
otelcol[25528]: File "/usr/lib/splunk-otel-collector/agent-bundle/collectd-python/kong/kong/reporter.py", line 56, in update_and_report
otelcol[25528]: self.kong_state.update_from_sfx()
otelcol[25528]: File "/usr/lib/splunk-otel-collector/agent-bundle/collectd-python/kong/kong/kong_state.py", line 63, in update_from_sfx
otelcol[25528]: self.update_resource_metrics(status['signalfx'])
otelcol[25528]: KeyError: 'signalfx'
otelcol[25528]: {"kind": "receiver", "name": "smartagent/kong", "pipeline": "metrics", "monitorID": "smartagentkong", "monitorType": "collectd/kong", "runnerPID": 25545, "createdTime": 1675098485.8552756, "logger": "root", "sourcePath": "/usr/lib/splunk-otel-collector/agent-bundle/lib/python3.8/site-packages/sfxrunner/logs.py", "lineno": 56}
```

I have installed the Kong plugin using these instructions: https://docs.splunk.com/Observability/gdi/kong/kong.html
I have 3 panels across the top of my dashboard and one table panel underneath. How do I stop the top row (3 panels) from scrolling, and how do I remove the larger vertical scroll bar? The image below shows that the table has a scroll bar, but the overall dashboard also has a scroll bar. How do I remove this outer scroll bar?
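In a Simple XML dashboard, one common workaround is a hidden HTML panel that injects CSS to suppress the page-level scroll bar. This is only a sketch - the selector can vary between Splunk versions and themes, so it may need adjusting via the browser's dev tools:

```xml
<row depends="$alwaysHideCSSPanel$">
  <panel>
    <html>
      <style>
        /* Suppress the dashboard-level vertical scroll bar;
           the table panel keeps its own internal scroll bar */
        .dashboard-body {
          overflow-y: hidden !important;
        }
      </style>
    </html>
  </panel>
</row>
```

Note that hiding the outer scroll bar only looks right if the panels actually fit the viewport; otherwise content below the fold becomes unreachable, so sizing the table panel's height is part of the same fix.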
Hello! I am trying to use `map` in a Splunk Dashboard Studio search to create a time chart showing a machine's utilization per day. I want to show it by day so I can add a trend line to my single-value utilization panel. To do this, I am mapping my search by day, so the utilization is calculated per day rather than over the whole time range. Using the code below I am able to make a time chart displaying the machine's daily utilization in Dashboard Classic but not in Dashboard Studio:

```
index=example
| bin span=1d _time
| dedup _time
| eval start=relative_time(_time,"@d-1d"), end=relative_time(_time,"@d")
| eval day=strftime(_time,"%D %T")
| eval End=strftime(end,"%D %T")
| map maxsearches=30 search="search index=example earliest=\"$$start$$\" latest=$$end$$
    | transaction Machine maxpause=300s maxspan=1d keepevicted=T keeporphans=T
    | addinfo | bin span=1d _time
    | eval timepast=info_max_time-info_min_time
    | eventstats sum(duration) as totsum by Machine _time
    | dedup Machine _time
    | eval Util=min(round((totsum)/(timepast)*100,1),100)
    | stats values(Util) as \"Utilization\" by Machine _time date_mday"
| table _time Utilization Machine
| chart values(Utilization) by _time Machine
| fillnull value="0"
```

The code works in Dashboard Classic, but in Dashboard Studio it states it is waiting for an input. Why can't I map on Dashboard Studio? How can I break up utilization by day to show the trend line?
I added 2 buttons (Delete + Update) to each row in a table. I used the example script from https://community.splunk.com/t5/Splunk-Search/How-do-you-add-buttons-on-table-view/m-p/384712 -> table_with_buttons.js. In general all is working fine most of the time, but sometimes when I do a browser reload the JavaScript does not run and the buttons are not colored. If I use one of the dropdowns and select an item, the buttons are colored immediately. It should look like this: I adapted the script a little and I always ran https://host/en-US/_bump after each change.

```
require([
    'underscore',
    'jquery',
    'splunkjs/mvc',
    'splunkjs/mvc/tableview',
    'splunkjs/mvc/simplexml/ready!'
], function(_, $, mvc, TableView) {
    var CustomRangeRenderer = TableView.BaseCellRenderer.extend({
        canRender: function(cell) {
            // Enable this custom cell renderer only for the Update and Delete fields
            return _(["Update", "Delete"]).contains(cell.field);
        },
        render: function($td, cell) {
            // Render a colored button in the cell based on the field name
            var strCellValue = cell.value;
            var strHtmlInput;
            if (cell.field === "Update") {
                strHtmlInput = "<input type='button' style='background-color:DodgerBlue' class='table-button btn-primary' value='" + strCellValue + "'></input>";
            } else if (cell.field === "Delete") {
                strHtmlInput = "<input type='button' style='background-color:OrangeRed' class='table-button btn-primary' value='" + strCellValue + "'></input>";
            }
            $td.append(strHtmlInput);
        }
    });

    mvc.Components.get('taskCollectionTable').getVisualization(function(tableView) {
        // Add custom cell renderer; the table will re-render automatically
        tableView.table.addCellRenderer(new CustomRangeRenderer());
        tableView.table.render();
    });
});
```

And this part in the dashboard:

```
<row depends="$alwaysHideCSSPanel$">
  <panel>
    <html>
      <style>
        #taskCollectionTable table tbody tr td{
          cursor: default !important;
        }
        #taskCollectionTable table tbody tr td input.table-button{
          width: 83px !important;
          position: relative;
          left: 5%;
        }
      </style>
    </html>
  </panel>
</row>
```

What am I missing or doing wrong? Thanks