All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

I'm trying to forward events to a Splunk instance using the HTTP Event Collector (http://<splunk_instance>:8088/services/collector/event), but the connection seems to be rejected by Splunk. The error I'm getting is: "read tcp 127.0.0.1:46660->127.0.1.1:8088: read: connection reset by peer". The HTTP Event Collector is configured with Enable SSL: true and HTTP Port number: 8088.
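A likely cause: with Enable SSL set to true, the collector listens for HTTPS, so a plain-HTTP client gets reset during the TLS handshake. A quick way to verify is to post over HTTPS instead (hostname and token are placeholders):

```
# HTTPS matches the collector's SSL setting; -k skips certificate
# verification, which is needed for Splunk's default self-signed cert
curl -k https://<splunk_instance>:8088/services/collector/event \
     -H "Authorization: Splunk <hec_token>" \
     -d '{"event": "hello"}'
```

If this returns {"text":"Success","code":0}, the original client just needs to be switched from http:// to https://.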
I created a dashboard a couple of weeks ago, but now I'm unable to edit it. Any change that I make results in a "Server Error" message. I've tried restarting Splunk, but no dice. We're running 8.2.5. [Screenshot of the error message attached.]
I know this is an unsupported app (https://splunkbase.splunk.com/app/3501/), but does anyone have any suggestions? After the app is installed and launched, the inputs and configuration page is stuck on a loading screen with an endlessly spinning circle. I have installed it on 2 heavy forwarders with the same result. Any insight is highly appreciated.
Splunk newbie here again. I'm currently searching for a way to create a dependent drilldown where the Year is the independent input and the Month is the dependent one, since the generated reports need to look back over previous, current, and possibly future periods. My temporary fix is using an input panel to pass the range as a string, but the team I am creating this for wants to avoid having users type into the input. Attaching a screenshot along with the current code.     <form theme="dark"> <label>CSC/ERSC/PSI PAGING Report</label> <fieldset submitButton="true" autoRun="true"> <input type="dropdown" token="lpar"> <label>Select to View</label> <choice value="----">----</choice> <choice value="D7X0">D7X0</choice> <choice value="H7X0">H7X0</choice> <choice value="D1D0">D1D0</choice> <choice value="DAD0">DAD0</choice> <choice value="E1D0">E1D0</choice> <choice value="H1D0">H1D0</choice> <choice value="WSYS">WSYS</choice> <choice value="YSYS">YSYS</choice> <default>----</default> </input> <input type="dropdown" token="&quot;y&quot;"> <label>Select Year</label> <choice value="2022">2022</choice> <default>2022</default> </input> <input type="dropdown" token="month"> <label>Select Month</label> <choice value="earliest=@y latest=@y+1mon">January</choice> <choice value="earliest=@y+1mon latest=@y+2mon">February</choice> <choice value="earliest=@y+2mon latest=@y+3mon">March</choice> <choice value="earliest=@y+3mon latest=@y+4mon">April</choice> <choice value="earliest=@y+4mon latest=@y+5mon">May</choice> <choice value="earliest=@y+5mon latest=@y+6mon">June</choice> <choice value="earliest=@y+6mon latest=@y+7mon">July</choice> <choice value="earliest=@y+7mon latest=@y+8mon">August</choice> <choice value="earliest=@y+8mon latest=@y+9mon">September</choice> <choice value="earliest=@y+9mon latest=@y+10mon">October</choice> <choice value="earliest=@y+10mon latest=@y+11mon">November</choice> <choice
value="earliest=@y+11mon latest=@y+12mon">December</choice> </input> <input type="text" token="from"> <label>From MM/DD/YYYY</label> <default>01/01/2022</default> </input> <input type="text" token="to"> <label>To MM/DD/YYYY</label> <default>01/31/2022</default> </input> </fieldset> <row> <panel> <title>$lpar$ Date Panel</title> <chart> <title>From &amp; To Input: $from$ - $to$</title> <search> <query>index=mainframe-platform sourcetype="mainframe:mpage" MVS_SYSTEM_ID=$lpar$ | eval DATE=strftime(strptime(DATE,"%d%b%Y"),"%Y-%m-%d") | eval _time=strptime(DATE." ","%Y-%m-%d") | where _time &gt;= strptime("$from$", "%m/%d/%Y") AND _time &lt;= strptime("$to$", "%m/%d/%Y") | eval epochtime=strptime(TIME, "%H:%M:%S")| eval desired_time=strftime(epochtime, "%H:%M:%S") | chart sum(VIO_PAGING_SEC) as "$lpar$ Sum of VIO_PAGING_SEC" sum(SYSTEM_PAGEFAULTS_SEC) as "$lpar$ SYSTEM_PAGEFAULTS_SEC" sum(SWAP_PAGIN_SEC) as "$lpar$ SWAP_PAGIN_SEC" sum(LOCAL_PAGEFAULTS_SEC) as "$lpar$ LOCAL_PAGEFAULTS_SEC" over _time</query> <earliest>0</earliest> <latest></latest> </search> <option name="charting.axisLabelsX.majorLabelStyle.rotation">45</option> <option name="charting.axisTitleX.text">Date of Occurrence</option> <option name="charting.chart">column</option> <option name="charting.drilldown">none</option> <option name="charting.legend.placement">bottom</option> <option name="height">789</option> <option name="refresh.display">progressbar</option> <drilldown> <link target="_blank">/app/mainframe-platform/csierscpsi_paging_individual_report?_time=$click.name2$</link> </drilldown> </chart> </panel> </row> <row> <panel> <chart> <title>Select a Month using $month$</title> <search> <query>index=mainframe-platform sourcetype="mainframe:mpage" MVS_SYSTEM_ID=$lpar$ | eval DATE=strftime(strptime(DATE,"%d%b%Y"),"%Y-%m-%d") | eval _time=strptime(DATE." 
","%Y-%m-%d") | where $month$ | chart sum(VIO_PAGING_SEC) as "$lpar$ Sum of VIO_PAGING_SEC" sum(SYSTEM_PAGEFAULTS_SEC) as "$lpar$ SYSTEM_PAGEFAULTS_SEC" sum(SWAP_PAGIN_SEC) as "$lpar$ SWAP_PAGIN_SEC" sum(LOCAL_PAGEFAULTS_SEC) as "$lpar$ LOCAL_PAGEFAULTS_SEC" over _time</query> <earliest>$range.earliest$</earliest> <latest>$range.latest$</latest> </search> <option name="charting.chart">column</option> <option name="charting.drilldown">none</option> <option name="height">789</option> <option name="refresh.display">progressbar</option> </chart> </panel> </row> </form>  
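One way to make the month dropdown genuinely depend on the selected year (rather than on the @y snap, which always means "this year") is to store only the month number in the month token and compute the range with a <change> handler. This is a sketch under my own token names (y, m, range.earliest, range.latest), not a drop-in fix; note the year token should be named plainly (token="y"), not token="&quot;y&quot;":

```
<input type="dropdown" token="y">
  <label>Select Year</label>
  <choice value="2022">2022</choice>
  <default>2022</default>
</input>
<input type="dropdown" token="m">
  <label>Select Month</label>
  <choice value="1">January</choice>
  <choice value="2">February</choice>
  <!-- ... remaining months ... -->
  <change>
    <!-- compute epoch earliest/latest from the selected year and month -->
    <eval token="range.earliest">strptime($y$."-".$m$."-01","%Y-%m-%d")</eval>
    <eval token="range.latest">relative_time(strptime($y$."-".$m$."-01","%Y-%m-%d"),"+1mon")</eval>
  </change>
</input>
```

The second panel's <earliest>$range.earliest$</earliest> and <latest>$range.latest$</latest> then pick up whichever year/month combination the user selects, with no free-text typing.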
Hi again. Seeking your advice on the topic above. The link target method was suggested to me as a workaround for the standard Splunk chart limitations. However, I am unable to find a way to pass a link target where the target is the X-axis value, which in this instance is the specific date. For reference, attaching a screenshot of the chart along with the entire XML code. I have already considered making a separate panel where the dates would be generated, but I was wondering if this type of link target is possible so that I could apply it to my other reports.     <form theme="dark"> <label>CSC/ERSC/PSI PAGING Report</label> <fieldset submitButton="true" autoRun="false"> <input type="dropdown" token="lpar"> <label>Select to View</label> <choice value="----">----</choice> <choice value="D7X0">D7X0</choice> <choice value="H7X0">H7X0</choice> <choice value="D1D0">D1D0</choice> <choice value="DAD0">DAD0</choice> <choice value="E1D0">E1D0</choice> <choice value="H1D0">H1D0</choice> <choice value="WSYS">WSYS</choice> <choice value="YSYS">YSYS</choice> <default>----</default> </input> <input type="text" token="from"> <label>From MM/DD/YYYY</label> <default>01/01/2022</default> </input> <input type="text" token="to"> <label>To MM/DD/YYYY</label> <default>01/31/2022</default> </input> </fieldset> <row> <panel> <title>$lpar$ Date Panel</title> <chart> <title>From &amp; To Input: $from$ - $to$</title> <search> <query>index=mainframe-platform sourcetype="mainframe:mpage" MVS_SYSTEM_ID=$lpar$ | eval DATE=strftime(strptime(DATE,"%d%b%Y"),"%Y-%m-%d") | eval _time=strptime(DATE."
","%Y-%m-%d") | where _time &gt;= strptime("$from$", "%m/%d/%Y") AND _time &lt;= strptime("$to$", "%m/%d/%Y") | eval epochtime=strptime(TIME, "%H:%M:%S")| eval desired_time=strftime(epochtime, "%H:%M:%S") | chart sum(VIO_PAGING_SEC) as "$lpar$ Sum of VIO_PAGING_SEC" sum(SYSTEM_PAGEFAULTS_SEC) as "$lpar$ SYSTEM_PAGEFAULTS_SEC" sum(SWAP_PAGIN_SEC) as "$lpar$ SWAP_PAGIN_SEC" sum(LOCAL_PAGEFAULTS_SEC) as "$lpar$ LOCAL_PAGEFAULTS_SEC" over _time</query> <earliest>0</earliest> <latest></latest> </search> <option name="charting.axisLabelsX.majorLabelStyle.rotation">45</option> <option name="charting.axisTitleX.text">Date of Occurrence</option> <option name="charting.chart">column</option> <option name="charting.chart.overlayFields">"D1D0 Sum of VIO_PAGING_SEC","D1D0 SYSTEM_PAGEFAULTS_SEC","D1D0 SWAP_PAGIN_SEC",D1D0_LOCAL_PAGEFAULTS_SEC</option> <option name="charting.drilldown">none</option> <option name="charting.legend.placement">bottom</option> <option name="height">789</option> <option name="refresh.display">progressbar</option> <drilldown> <link target="_blank">/app/mainframe-platform/csierscpsi_paging_individual_report?_time=$click.name2$</link> </drilldown> </chart> </panel> </row> </form>        
Hi, I am trying a simple query where I want to see the percentage of calls with a particular response time in Splunk, but somehow the percent field comes out empty in the table.   index = xyz http_status=200 | stats count(request_ms) as web-calls by request_ms | eventstats sum(web-calls) as totalwb | eval percent=(web-calls*100/totalwb) | table request_ms web-calls totalwb percent
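The empty percent is almost certainly the hyphenated field name: inside eval, an unquoted web-calls is parsed as the arithmetic expression web minus calls, which evaluates to null. Either wrap the name in single quotes inside eval ('web-calls') or sidestep the issue with underscores. A sketch of the latter, keeping the rest of the query as posted:

```
index = xyz http_status=200
| stats count as web_calls by request_ms
| eventstats sum(web_calls) as totalwb
| eval percent=round(web_calls*100/totalwb, 2)
| table request_ms web_calls totalwb percent
```

In SPL, single quotes refer to field names and double quotes to string literals, which is why 'web-calls' works in eval but web-calls does not.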
Hello All, I want to create a user-defined "dictionary" for a dashboard, and I'd like the user to click a link within the dashboard to launch the Lookup Editor app on a specific *.csv file. I would seed the dictionary with specific columns. Is this possible? I tried something like the following but got the error "The lookup could not be loaded from the server": <drilldown> <link> http://myServer:8000/en-US/app/lookup_editor/lookup_edit?owner=nobody&amp;namespace=myAppName&amp;type=csv&amp;Name=client_information.csv </link> </drilldown> Appreciate the help.
I want to show, on a dashboard panel, the actual time range selected in the time picker drop-down. For instance, if I select Last 90 minutes at 1 PM, the panel should show something like Earliest_Time: 11:30:00 AM To Latest_Time: 13:00:00. I tried using $field2.earliest$ To Latest_Time: $field2.latest$, but the output shows the relative values instead: Earliest_Time: -90m@m To Latest_Time: now. Thanks in advance.
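The picker tokens only ever hold the relative strings (-90m@m, now). One way to display resolved clock times is to let a search compute them: the addinfo command attaches the search's resolved time boundaries as info_min_time and info_max_time. A sketch (the index here is just a placeholder for any base search):

```
index=_internal earliest=$field2.earliest$ latest=$field2.latest$
| addinfo
| eval Earliest_Time=strftime(info_min_time,"%H:%M:%S"),
       Latest_Time=strftime(info_max_time,"%H:%M:%S")
| table Earliest_Time Latest_Time
```

The panel title or a single-value viz can then render these fields instead of the raw token values.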
We would like to send our wineventlog data to the on-prem cluster as well as to the cloud. How can we do that? We can fork at the UF level, but we are not happy with that approach.
Hi All, what does eliminated_buckets mean in index=_internal <sourcetype> in Splunk? Regards, NVP
I have multiple tables on a dashboard. Is there a way I can surface all the values that show up in red on the page below as highlights at the top of the page?
Hi forum! I have a couple of tricky questions on working with the same input data and the same type of graphs. I am currently working on weekly Jira ticket inflow/outflow/backlog column graphs, based on ticket data ingested nightly into a Splunk index from the Confluence REST API as simple CSV data. Essential JIRA fields -> CSV fields used: key (ticket ID), created (datestamp), resolved (datestamp), priority. Each day we ingest ALL JIRA tickets for a project into a Splunk index (let's say index "project" keyed with source "jira_bugs"). Theoretically I should be able to use the last 24-hour ingest into Splunk to drive the graphs based on the dates in the data. Based on that I create a weekly Jira ticket inflow graph over the last 6 months of data (from the last 24 hours of ingest):       index="project" source="jira_bugs" | dedup key | eval created_date = strptime(created,"%Y-%m-%d") | eval resolved_date = strptime(resolved,"%Y-%m-%d") | eval _time = created_date | where _time > now()-15811200 | timechart span=1w partial=false count AS "TR inflow" BY priority | addtotals | convert timeformat="W%y%V" ctime(_time) AS date | sort _time | fields - _time | table date *       So, based on the ticket's created date, I use that as _time, span the data in timechart into weekly buckets, and then change the time label to a week label Wyyww. The output drives stacked columns (per priority) and an overlay line graph for totals. Outflow is ditto, but _time is instead driven by resolved_date. The problem with this approach is that if no tickets are created in a given week, it should (but does not) render an empty space for that week. I am thinking that I may need to chain this query with a preceding gentimes (7d increments starting 6 months ago?), and then somehow group the count of tickets into the generated time events.   Secondly, I need to create a weekly Jira ticket backlog graph, and this feels even more tricky.
For this, I need to count the number of tickets per week that fit within a certain time range, meaning I need to count a ticket for each week during which it was open (evaluating whether created_date < [week or certain day] < resolved_date). So the same ticket shall be counted (duplicated with different _time stamps?) over the several weeks (columns in the graph) for which it was open. Seems like a simple thing, but each time I attack this problem I give up after googling and testing a number of ideas from the Splunk reference and forum.
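On the backlog question, one approach that matches "count a ticket in every week it was open" is to expand each ticket into one row per open week with mvrange and mvexpand. A sketch, with field names following the post (the 604800 step is one week in seconds, and unresolved tickets are treated as open until now):

```
index="project" source="jira_bugs"
| dedup key
| eval created_epoch=strptime(created,"%Y-%m-%d")
| eval resolved_epoch=coalesce(strptime(resolved,"%Y-%m-%d"), now())
| eval open_week=mvrange(relative_time(created_epoch,"@w"), resolved_epoch, 604800)
| mvexpand open_week
| eval _time=open_week
| timechart span=1w dc(key) AS "TR backlog"
```

For the empty-week gaps in the inflow graph, note that timechart only creates buckets across the range of _time values actually present; pinning the search window, or piping through makecontinuous _time span=1w before the final formatting, should force zero-count weeks to appear.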
Hello, I have a huge volume of data coming in under different source types (or indexes) for different applications/projects. Is there a way to assign indexed fields for each of the data sources/indexes/apps? As an example, in most cases ACCOUNTID and IPAddress are the unique fields for each of the applications/projects. How would I assign these 2 fields as indexed fields? Any thoughts or recommendations would be highly appreciated. Thank you so much.
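Index-time fields are typically defined per sourcetype on the indexing tier (indexers or heavy forwarders): a transform extracts the value and writes it to _meta, and fields.conf marks the field as indexed so searches like ACCOUNTID=foo use it efficiently. A sketch, assuming the values can be captured from the raw event by regex (the sourcetype, stanza names, and patterns here are placeholders):

```
# props.conf
[your:sourcetype]
TRANSFORMS-idxfields = add_accountid, add_ipaddress

# transforms.conf
[add_accountid]
REGEX = ACCOUNTID=(\w+)
FORMAT = ACCOUNTID::$1
WRITE_META = true

[add_ipaddress]
REGEX = IPAddress=(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})
FORMAT = IPAddress::$1
WRITE_META = true

# fields.conf
[ACCOUNTID]
INDEXED = true

[IPAddress]
INDEXED = true
```

Worth weighing first: indexed fields increase index size and cannot be changed retroactively, so Splunk generally recommends them only when search-time extraction is demonstrably too slow.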
Does anyone know where I can find a copy of the original query for the "Spike in DNS Traffic (Last Day)" panel? This panel is under the InfoSec dashboard, Advanced Threats, Network Anomalies section. We've been altering the search for that panel and need to revert to the original search query. Unfortunately, no copy was made prior to making changes.
Hello, we are trying to find a way to import Sumo Logic data into Splunk; the existing Sumo Logic deployment is being replaced by Splunk. Is there any document we can refer to? Any help would be greatly appreciated. Thanks.
Currently .conf 2022 isn't available in the Splunk Events app. Is this going to be added? Is there any other way to view the session agenda easily?
Hello everyone, I'd like to update one of the Splunk apps, Splunk Common Information Model, from 4.20.2 to 5.0.1 to eliminate its jQuery 3.5 incompatibility issue. The platform I'm using is Splunk Cloud 8.2.2201.1 (Victoria Experience).   Although I can update it to 5.0.1 via Splunk Cloud > Apps > Manage Apps, to my surprise it rolled back to the previous version 4.20.2 after the Splunk Cloud restart. One thing I should mention: immediately after the update there was a Setup request. Although I didn't know how to set it up, I did change many default configurations; I even provided it with a "fake" API key. However, it still rolled back to the previous version, and the provided key also disappeared after the rollback.   Any suggestions on how to update it, please? Thank you very much!
Is it possible to ship only specific statements from a log file to be indexed in Splunk, rather than indexing the entire log file? For example, let's say I have a log file that contains some data about a failed process. If there were a one-line statement in this log file that told me the time the process failed, would it be possible to index that line of data and not the entire log file? How would that be done? Thank you in advance!
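Yes; the usual pattern on an indexer or heavy forwarder is to send everything to nullQueue by default and route only the matching lines back to indexQueue, via props.conf and transforms.conf. A sketch, with the sourcetype and the failure-line regex as placeholders for your own:

```
# props.conf
[your:sourcetype]
TRANSFORMS-keeponly = drop_all, keep_failures

# transforms.conf
[drop_all]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_failures]
REGEX = process failed
DEST_KEY = queue
FORMAT = indexQueue
```

Order matters: transforms run left to right as listed in TRANSFORMS-, so the catch-all drop comes first and the keep rule overrides it for matching events. Dropped lines do not count against license usage.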
Big warning at the beginning: this is not a question of "should I do that", and not a question of best practices. I'm not going to do something like this in production (and probably not even in a lab environment). It's a purely theoretical question. As we all know, there is usually a separate port for the deployment server, a separate one for HEC, and a separate one for REST calls between SH(s) and indexer(s). I was wondering how many of those functionalities could be squashed onto a single port (possibly with the help of an external reverse proxy). I suppose HEC and DS would be really good candidates to squish together. Any others? Just to make myself absolutely clear: I don't need it to be quick or well-performing. I'm just wondering if it would work at all.
While trying to ingest logs from Log Analytics, I'm getting the error below. The TA version is 1.0.3.

ERROR pid=40806 tid=MainThread file=base_modinput.py:log_error:307 | Get error when collecting events.
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/ta_ms_loganalytics/modinput_wrapper/base_modinput.py", line 127, in stream_events
    self.collect_events(ew)
  File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/log_analytics.py", line 96, in collect_events
    input_module.collect_events(self, ew)
  File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/input_module_log_analytics.py", line 72, in collect_events
    response = requests.post(uri,json=search_params,headers=headers)
  File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/ta_ms_loganalytics/requests/api.py", line 110, in post
    return request('post', url, data=data, json=json, **kwargs)
  File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/ta_ms_loganalytics/requests/api.py", line 56, in request
    return session.request(method=method, url=url, **kwargs)
  File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/ta_ms_loganalytics/requests/sessions.py", line 488, in request
    resp = self.send(prep, **send_kwargs)
  File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/ta_ms_loganalytics/requests/sessions.py", line 609, in send
    r = adapter.send(request, **kwargs)
  File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/ta_ms_loganalytics/requests/adapters.py", line 473, in send
    raise ConnectionError(err, request=request)
ConnectionError: ('Connection aborted.', error(104, 'Connection reset by peer'))