Hello Splunk community, We have a device running Windows. I tried to find a log file on it that covers the Internet connection and connection quality, but unfortunately this device saves only a limited amount of information about the Internet connection in its log files. I wanted to know: does Splunk have a solution for such situations? Perhaps there is an application we can install on this device that will allow us to collect the necessary logs? Thank you in advance for your answer.
We are building a small Splunk installation in Azure and I'm not sure what the architecture should look like. The client came up with the idea based on the following link - Deploying Splunk on Microsoft Azure. They created an indexer, a search head, and a license server/cluster master. We do need to ingest syslog data from Meraki devices, so I wonder whether we need a heavy forwarder. Any thoughts?
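For Meraki syslog, a heavy forwarder isn't strictly required; a common pattern is a syslog server (or Splunk Connect for Syslog) writing to disk with a universal forwarder monitoring the files, though a direct network input on a forwarder also works for small volumes. A minimal sketch, where the port, index, and sourcetype names are assumptions rather than Meraki-specific requirements:

```ini
# inputs.conf on the instance receiving Meraki syslog
[udp://514]
sourcetype = meraki:syslog
index = network
connection_host = ip
no_appending_timestamp = true
```

The trade-off: a network input ties syslog reception to one Splunk process (data is lost during restarts), which is why the syslog-server-plus-forwarder pattern is usually preferred.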
I want to use HTML on multiple panels in order to create a custom layout of my Splunk Dashboard. I want to use this layout where each rectangle is a panel - Please advise. Is this possible to implement in a Splunk Dashboard?
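Simple XML does allow an `<html>` element inside each panel, so each rectangle can be a panel carrying its own HTML; for fully free-form rectangles, Dashboard Studio's absolute layout may be the easier route. A minimal Simple XML sketch (labels, sizes, and content are placeholders):

```xml
<dashboard>
  <label>Custom layout sketch</label>
  <row>
    <panel>
      <html>
        <div style="height:150px;border:1px solid #ccc">Rectangle 1</div>
      </html>
    </panel>
    <panel>
      <html>
        <div style="height:150px;border:1px solid #ccc">Rectangle 2</div>
      </html>
    </panel>
  </row>
</dashboard>
```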
index=test pod=poddy1 "severity"="INFO" "message"="IamExample*"
| rex field=message "IamExample(?<total>.*)"
| rex field=message ".*ACCOUNT(?<accountreg>.*):"
| rex field=message ".*Login(?<login>.*)"
| rex field=message ".*Profile(?<profile>.*)"
| rex field=message ".*Card(?<card>.*)"
| rex field=message ".*Online(?<online>.*)"
| stats count(total) as "Total" count(accountreg) as "Account" count(login) as "Login" count(profile) as "Profile" count(card) as "Card" count(online) as "Online"
Choosing a bar chart to display this, "Total" shows on the left-hand side; is there a way to remove it? Also, hovering over the chart shows the count; is there a way to make it display like the example below instead: field, count, percentage? We want to divide Account, Login, Profile, and Online by the Total we have above.
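One way (a sketch, not the only answer) to get field/count/percentage rows out of the single-row stats result is to transpose it and divide each count by Total:

```
| stats count(total) as Total count(accountreg) as Account count(login) as Login count(profile) as Profile count(card) as Card count(online) as Online
| transpose 0 column_name=field
| rename "row 1" as count
| eval count=tonumber(count)
| eventstats max(eval(if(field=="Total", count, null()))) as grand_total
| eval percentage=round(100*count/grand_total, 1)
| where field!="Total"
| fields field count percentage
```

Dropping the Total row with `where` also removes it from the chart, answering the first question.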
How do I break multiple events into single events based on a timestamp? My logs don't have a date, only a timestamp. For example, each entry starts in the format below: 17:22:29.875 Splunk version: 9.2.1. I have tried many options in props.conf but no luck; I can still see multiple events merged together in my search, and the events are not broken at each timestamp. Will LINE_BREAKER work, or BREAK_ONLY_BEFORE? I tried both with no luck. Is it possible to break events on a timestamp alone in Splunk, or can events only be broken on a full date and time? Thanks in advance.
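Breaking on a time-only timestamp is possible; Splunk fills in the missing date from the file's modification time or the current date. A props.conf sketch, assuming each event starts with HH:MM:SS.mmm at the beginning of a line (the sourcetype name is a placeholder):

```ini
[my_timestamp_only_logs]
SHOULD_LINEMERGE = false
# break before a leading HH:MM:SS.mmm; the lookahead keeps the timestamp in the next event
LINE_BREAKER = ([\r\n]+)(?=\d{2}:\d{2}:\d{2}\.\d{3})
TIME_PREFIX = ^
TIME_FORMAT = %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 13
```

Note that these settings only take effect on the first full Splunk instance the data passes through (indexer or heavy forwarder), and only for newly indexed data.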
Some time ago, on Splunk Cloud, I deleted a couple of apps that were used only for testing. These apps had some alerts configured. Now, I see that those test alerts are still running. I found them by searching: index=_internal sourcetype=scheduler app=<deleted app name> However, I can't see these apps in the app list anymore. How can I fix this? Thanks!
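A sketch for listing the leftover schedules via the REST endpoint (keep your own app name in the placeholder):

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| search is_scheduled=1 eai:acl.app="<deleted app name>"
| table title eai:acl.app eai:acl.owner next_scheduled_time
```

If the saved searches survived the app deletion as orphaned knowledge objects, you may be able to disable them from Settings > Searches, reports, and alerts (with "All apps" and owner filters widened); otherwise, since the filesystem isn't accessible on Splunk Cloud, a support ticket is typically needed to remove them.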
How do I create a custom data link in Splunk Observability Cloud that passes filtered values from a chart, so the root cause of an issue can be identified by navigating to the APM, RUM, or Synthetics page?
Hi, we have instrumented SQL Server metrics using OTel: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/receiver/sqlserverreceiver/documentation.md We have a tempdb. 1. We need to identify its space usage. 2. Which query contributes most to tempdb usage? Can both be answered using the sqlserver receiver's OTel metrics?
Hi, in the App menu I have a situation where I need to keep installing apps with different version names. However, when this gets to high numbers it might not look so great (it might be difficult to find the app you need). I have 2 questions. 1st: Can I increase the size of the row? When the text wraps around it does not look good (in my image I needed to shorten the name to stop it wrapping). 2nd: Can I add a multi-level drop-down to the right, like the image below?
Good day! We would like to know if it is possible to reduce the number of fields displayed in the Alert Fields section or hide the section entirely for incidents created in Splunk OnCall (VictorOps), please see the attached screenshot. Currently, ITSI is passing an excessive number of fields. Can the Splunk OnCall incident details UI be customized to address this? Thank you.
Hi all, I am trying to ingest data from an Oracle DB into Splunk Observability Cloud. Q1: Should I create a dedicated database user for this monitor, or just use the default account? Q2: Per the sample datasource: "oracle://<username>:<password>@<host>:<port>/<database>" — should I create a database, or can I use the default database? Thanks in advance.
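Assuming this refers to the OpenTelemetry oracledb receiver, the datasource points at an existing Oracle service rather than a new database created for monitoring, and a dedicated read-only monitoring user (with SELECT on the relevant V$ views) is the safer choice over a default administrative account. A collector config sketch where the user, host, port, and service name are placeholders:

```yaml
receivers:
  oracledb:
    # points at your existing service/PDB; no new database is created
    datasource: "oracle://monitor_user:secret@oracle-host:1521/ORCLPDB1"
```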
Hi, index=idx_myindex source="/var/log/mylog.log" host="myhost-*" "memoryError" — I know that with the conditions above I can search for the log that caused the memoryError. As in the example above, when such a log occurs on myhost-*, I would like to send a command to the host where the log occurred and execute a specific command via the agent. Is there a way?
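Splunk itself won't run commands on the originating host, but an alert on that search can trigger a custom/scripted action that does. A sketch of such a script, assuming the alert's search returns a host column and passwordless SSH from the Splunk server to the targets; the remote command and SSH user are placeholders:

```python
import csv
import gzip
import subprocess
import sys

REMOTE_CMD = "systemctl restart myservice"  # placeholder remote command

def affected_hosts(results_path):
    """Return the unique hosts from the alert's gzipped results CSV."""
    with gzip.open(results_path, "rt", newline="") as f:
        return sorted({row["host"] for row in csv.DictReader(f) if row.get("host")})

def run(results_path, dry_run=False):
    """Build (and optionally execute) one ssh command per affected host."""
    cmds = [["ssh", "splunk@" + h, REMOTE_CMD] for h in affected_hosts(results_path)]
    if not dry_run:
        for cmd in cmds:
            subprocess.run(cmd, check=False)
    return cmds

if __name__ == "__main__" and len(sys.argv) > 8:
    # legacy scripted alerts receive the gzipped results file path as argument 8
    run(sys.argv[8])
```

A pull-based alternative worth considering: have an agent on the hosts poll Splunk's REST API for alert results instead of accepting inbound SSH.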
Good morning, I am having consistent trouble with the UI in the editor in both Firefox and Chrome: I cannot get the Dynamic Element selector to do anything. It displays the available options, but I cannot select any of them. When I click on one, e.g. Background, nothing happens and it still says Select. Has anyone seen this before and have a workaround, or know what's causing it and how to fix it? Thank you, Charles
Hi team, I can see events related to all hosts in the _internal index, but only a few hosts' data is available in the newly created index. Please help me troubleshoot the issue. Thanks in advance.
I have a dataset which has a field INSERT_DATE. Now I want to perform a search based on the date matching the Global Time Picker. What I want is:

index = ******* host=transaction source=prd
| spath
| mvexpand message
| rename message as _raw
| fields - {}.* ``` optional ```
| spath path={}
| mvexpand {}
| fields - _* ``` optional ```
| spath input={}
| search TARGET_SYSTEM="EAS"
| eval _time=strptime(INSERT_DATE, "%m/%d/%Y")
| chart sum(TRANSACTION_COUNT) as TRANSACTION_COUNT by INSERT_DATE
| where INSERT_DATE=strftime($global_time.latest$, "%m/%d/%Y")
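A sketch that avoids comparing formatted strings against a dashboard token: convert INSERT_DATE to epoch and compare it against the time-picker bounds that `addinfo` attaches to each result (`info_min_time`/`info_max_time`):

```
... base search and spath steps as above ...
| eval insert_epoch=strptime(INSERT_DATE, "%m/%d/%Y")
| addinfo
| where insert_epoch>=info_min_time AND insert_epoch<info_max_time
| chart sum(TRANSACTION_COUNT) as TRANSACTION_COUNT by INSERT_DATE
```

This honors the full earliest/latest range rather than matching only the single day of `$global_time.latest$`.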
How does Splunk call the coldToFrozen.py script automatically once the script is placed in /opt/splunk/bin and indexes.conf is configured with the needed arguments? Once cold_db is full, how does this script get invoked by Splunk?
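In short: splunkd enforces each index's retention settings itself. When a cold bucket ages past frozenTimePeriodInSecs, or the index grows past maxTotalDataSizeMB, the indexer invokes the configured script once per bucket, passing the bucket directory path as the first argument, and removes the bucket only after the script exits successfully. A sketch (index name and limits are placeholders):

```ini
[my_index]
coldToFrozenScript = "$SPLUNK_HOME/bin/python" "/opt/splunk/bin/coldToFrozen.py"
frozenTimePeriodInSecs = 7776000
maxTotalDataSizeMB = 500000
```

If no coldToFrozenScript (or coldToFrozenDir) is set, frozen buckets are simply deleted.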
Dear experts, the basic idea of what I am trying to do: the results of a search should be filtered so that only data points are displayed which are not part of a "blacklist" maintained as a lookup table. The challenging thing is that there are 3 columns that have to be taken into account at the same time for filtering. After a lot of trials, I ended up creating a key from the 3 columns (which is unique) and then filtering on that key. It is working, I just don't understand why :-(.

Question: Does anybody have an idea why the Version 1 filter works, and why the Version 2 filter fails? Question: What needs to be changed to get Version 2 to work as well?

index="pm-azlm_internal_prod_events" sourcetype="azlmj"
| strcat ocp "_" fr "_" el unique_id
| table _time ocp fr el unique_id d_1
| search d_1="DEF ges AZ*"
``` VERSION 1: the working one ```
``` As long as the subsearch returns a table with the column unique_id, ```
``` which is exactly the name of the column I want to filter on, all works great. ```
| search NOT [| inputlookup pm-azlm-aufschneidmelder-j
| strcat ocp "_" fr "_" sec unique_id
| table unique_id]
``` VERSION 2: NOT working ```
``` As soon as I change the name of the column in the subsearch, the filter won't work anymore. ```
| search NOT [| inputlookup pm-azlm-aufschneidmelder-j
| strcat ocp "_" fr "_" sec ignore
| table ignore]
| timechart span=1d limit=0 count by unique_id

And the final question: is there a way to do such filtering without going through the key creation? Thank you in advance.
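For context, a subsearch's result table is expanded into implicit field=value clauses, so the column names returned by the subsearch must match field names that exist in the events (which is why a column named "ignore" filters nothing). That same mechanism allows multi-column filtering without building a key, as in this sketch: return all three columns, renamed to match the event fields:

```
index="pm-azlm_internal_prod_events" sourcetype="azlmj"
| search NOT [| inputlookup pm-azlm-aufschneidmelder-j
    | rename sec as el
    | fields ocp fr el]
| timechart span=1d limit=0 count by el
```

Each lookup row becomes an (ocp AND fr AND el) clause, OR-ed together and negated by NOT.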
Hello, I would like to confirm if it is possible to upgrade Splunk directly from version 9.1.1 to 9.3 on Linux, without going through version 9.2. Could you please clarify if this is supported and if there are any specific considerations for this process? Best regards,
How can we locate usage-related data in Splunk? I have an on-premises Splunk instance and am looking for usage and billing-related data grouped by day. I am not able to locate the data in any index.
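On-premises license usage is recorded in the _internal index on the license manager (license_usage.log), so a daily rollup can be sketched as:

```
index=_internal source=*license_usage.log* type="Usage"
| eval GB=b/1024/1024/1024
| timechart span=1d sum(GB) as daily_GB
```

The `b` field is bytes indexed; add `by idx` or `by pool` to the timechart to break usage down by index or license pool. The Monitoring Console's License Usage dashboards are built on the same data.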