All Topics

Hello everyone, I have been receiving the following message: "Splunk could not get the description for this event. Either the component that raises this event is not installed on your local computer or the installation is corrupt. FormatMessage error. Got the following information from this event: <Shows the information>". The source is WinEventLog:Application. But not all hosts have the same issue, and I don't know how to fix it. The version I am using is 7.3.5. Thanks in advance.
As part of a health check of the universal forwarder (UF), which information should I check in a scheduled search, apart from verifying whether the UF is up or not?
Several months back I created a macro with the following regular expressions to "clean up" and concatenate several strings that I often use. Is there a website or tool that would help me understand regex so that I can figure out how to simplify the search string? My goal is to speed up the search. I think eliminating the redundant rex commands would help, but if there is an even better solution I want to know what it is. The macro currently contains the following:

| eval source_clean=source
| rex field=source_clean mode=sed "s/\\\u_\S+//g"
| rex field=source_clean mode=sed "s/^[^\\\]*\\\//"
| rex field=source_clean mode=sed "s/^[^\\\]*\\\//"
| rex field=source_clean mode=sed "s/^[^\\\]*\\\//"
| lookup Source-Lookup.csv source AS source_clean OUTPUT web_domain
| eval pages = web_domain+cs_uri_stem

I do not have access to the lookup table that would allow me to add slashes to the `source` column as a way to eliminate the need for lines 3-5.
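The three identical `s/^[^\\\]*\\\//` substitutions each strip one leading backslash-delimited segment, so they can usually be collapsed into one rex with a repetition count, e.g. `| rex field=source_clean mode=sed "s/^([^\\\]*\\\){3}//"`. A quick Python sketch of the same regex logic (the sample path is made up for illustration):

```python
import re

def strip_segments(path: str, n: int = 3) -> str:
    """Remove the first n backslash-delimited segments, mirroring
    n applications of the sed expression s/^[^\\]*\\//."""
    return re.sub(r"^(?:[^\\]*\\){%d}" % n, "", path)

sample = r"C:\Users\websrv01\logs\access.log"
print(strip_segments(sample))  # -> logs\access.log
```

The `{3}` quantifier does in one pass what the three chained rex commands do in three, which also avoids re-scanning the field twice.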
We use DB Connect to retrieve SQL data. We have a critical feed that is fairly stable. I added a retrieval for just the SQL row count (a batch retrieval), plus an alert to let our team know when/if the SQL table row count differs from what we actually retrieved. I'd like to add an action to the alert; running a script looks like the likely candidate, so I need a script for DB Connect. The following log line shows the input name in DB Connect (Device_Inventory):

2020-09-25 03:33:00.001 -0500 [QuartzScheduler_Worker-9] INFO org.easybatch.core.job.BatchJob - Job 'Device_Inventory' starting.

Any help would be most appreciated. I'm trying to avoid a 24/7 response team for this data.
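A minimal sketch of what such an alert-action script could look like, assuming the legacy "Run a script" alert interface (the script lives under $SPLUNK_HOME/bin/scripts and Splunk passes positional arguments; the argument indices below follow the documented legacy convention, but verify them against your Splunk version):

```python
#!/usr/bin/env python
# Sketch of a legacy Splunk scripted alert action. Per the legacy interface:
#   argv[1] = number of events returned
#   argv[4] = name of the saved search that fired
#   argv[8] = path to the gzipped raw results file
import sys

def build_message(argv):
    count = argv[1] if len(argv) > 1 else "?"
    name = argv[4] if len(argv) > 4 else "?"
    results = argv[8] if len(argv) > 8 else ""
    return f"Alert '{name}' fired with {count} events; results in {results}"

if __name__ == "__main__":
    # Replace print with your real notification: mail, ticket, webhook, ...
    print(build_message(sys.argv))
```

From there the script can open the results file and compare the retrieved count against the SQL row count before notifying anyone.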
Hello, we have Splunk 7.1.1 with 16 CPUs and 8 GB of physical memory, and it keeps shutting down by itself. At the beginning it happened every day and the disk was over 80% full, so we set frozenTimePeriodInSecs=31536000, which stabilized the disk at 66%. The problem improved for a few days but then returned, with a message about the number of search artifacts exceeding 5000. To work around that we run a script that empties the dispatch directory every hour. That made searches faster but didn't fix the problem. Memory usage is now always above 70% and very unstable (CPU usage is fine, never more than 20%).
Hi, I am trying to connect the Splunk DB Connect app to an Azure Synapse SQL pool (SQL DW) using a JDBC connection string with Azure Active Directory authentication, and it throws an error about the ADAL4J library failing to load. The same connectivity works fine with SQL authentication.

JDBC connection string:
jdbc:sqlserver://sqlserversplunktest.database.windows.net:1433;database=spsplunktest;user=xxxx;password=xxxxx;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;authentication=ActiveDirectoryPassword

I am getting the error message: "Failed to load ADAL4J Java library for performing ActiveDirectory authentication"

Can someone please assist? Regards, IY
Hi Splunkers, I'm collecting only the internal logs of my UFs in a single Splunk Enterprise instance that acts as search head and deployment server too. Is fetching only the UF internal logs enough to get reports and alerts from the DMC, purely for monitoring the UFs?
Hello Splunkers, is there any way to identify from which source logs are not getting forwarded? For example, if we have stanzas such as:

[monitor://$WINDIR\System32\DHCP]
[WinEventLog://System]
[monitor://C:\Windows\System32\newapp.log]
[script://.\bin\win_timesync_status.bat]

I need to know which source's event forwarding is failing, and why. Can I get this from the _internal logs of the UF with certainty? We don't have access to the indexers or the actual logs.
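One hedged starting point: the forwarder's own splunkd log reports per-input problems, so a search over the UF's _internal data can surface failing inputs. Component names vary by Splunk version and input type, so treat this as a sketch and adjust:

```
index=_internal sourcetype=splunkd host=<your_uf_host> (log_level=ERROR OR log_level=WARN)
    (component=TailingProcessor OR component=TailReader OR component=ExecProcessor)
| stats count latest(_raw) AS last_message BY component
```

File monitor failures tend to surface under the tailing components, scripted inputs under ExecProcessor; a source with no recent events in `| metadata type=sources` is another clue.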
Hi team, I'm new to Splunk. I need to monitor whether the programs running in Task Manager execute successfully or not. I would appreciate a sample. Thanks!
I need to get 3 columns (host, port, description) from text like this:

10.224.19.18 | 2222| New server
10.198.18.18 | 2443 | IFT
etc.

I use a curl command with output=text.
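In SPL this is typically a single rex, e.g. `| rex "(?<host>\S+)\s*\|\s*(?<port>\d+)\s*\|\s*(?<description>.+)"`. A Python sketch of the same parse, using the sample lines from the question (note the spacing around `|` is inconsistent, so the pattern allows optional whitespace):

```python
import re

# One line per record: host | port | description, with flexible spacing.
LINE_RE = re.compile(r"^\s*(?P<host>\S+)\s*\|\s*(?P<port>\d+)\s*\|\s*(?P<description>.+?)\s*$")

def parse(text):
    """Return (host, port, description) tuples for every matching line."""
    rows = []
    for line in text.splitlines():
        m = LINE_RE.match(line)
        if m:
            rows.append((m["host"], m["port"], m["description"]))
    return rows

text = "10.224.19.18 | 2222| New server\n10.198.18.18 | 2443 | IFT"
print(parse(text))  # -> [('10.224.19.18', '2222', 'New server'), ('10.198.18.18', '2443', 'IFT')]
```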
I am following this GitHub example to get the setup page of my tech add-on working correctly: https://github.com/splunk/Developer_Guidance_Setup_View_Example. The add-on previously used a setup.xml in the default folder which would take in a URL and an API key. I have changed the add-on to use HTML, JS, and CSS as the example suggests. Once I hit the `Submit` button, the following error comes up whenever I try to navigate to the /inputs endpoint of my TA:

Internal configuration file error. Something wrong within the package or installation step. Contact your administrator for support. Detail: instance.pages.configuration.tabs[0].entity[1].type is not one of enum values: text,singleSelect,checkbox,multipleSelect,radio

Watching the logs on both the Splunk side and the server side where the API call is made, I see that the correct URL/domain and API key are being used, because the request goes through correctly. The corresponding config file also looks correct:

cat etc/apps/TA-canary/local/ta_canary_settings.conf
[additional_parameters]
api_key = ********
canary_domain = <server_domain>
disabled = 0

Following the logs doesn't give any helpful output as to which field in the config files has a type outside the allowed set: text, singleSelect, checkbox, multipleSelect, radio.
Hi Community, my indexer stopped indexing data after I tried accelerating 7+ data models for CIM. Since I'm working remotely I'm unable to reach the search head GUI (server not reachable). Please help me fix this issue. Thanks!
Hi, I am trying to maintain a CSV table of users that have logged in to the system. The CSV file contains the fields "Time" and "Username":
1. Time = last login time of the user.
2. Username = the user that logged in.
The goal is to update the last login time of each user without deleting the other entries. Here is what happens now. Say the first input corresponds to this example:

Time       | Username
1601251200 | User_Alpha
1601254800 | User_Bravo
           | User_Charlie
1601265600 | User_Delta

What happens sometimes is that when I'm searching for new entries, no login timestamp is found for a user, so it is left blank like this:

Time       | Username
           | User_Alpha
           | User_Bravo
1601272800 | User_Charlie
1601280000 | User_Delta

The output that I would like is this:

Time       | Username
1601251200 | User_Alpha
1601254800 | User_Bravo
1601272800 | User_Charlie
1601280000 | User_Delta

Basically, the desired result keeps the content of example one and adds the content of example two: it updates the last login time of "User_Delta" with the last known login time. I'm getting stuck at this point. Can anyone help me figure out how to do this? My whole search works, except for this part. Thanks anyway!
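A common SPL pattern for this kind of "merge and keep the newest" lookup update (the lookup name last_login.csv is a placeholder; adjust the base search and field names to your setup): append the existing lookup to the new results, then keep the maximum Time per user, so users with no new login retain their old timestamp:

```
<your base search>
| stats max(_time) AS Time BY Username
| append [| inputlookup last_login.csv]
| stats max(Time) AS Time BY Username
| outputlookup last_login.csv
```

Because `max` ignores null values, a user who appears only in the old lookup keeps their previous Time, while a user seen in both gets the more recent of the two.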
Hi everybody! I have a map with points whose values are longitude, latitude, and the name of a city. When the user clicks a point, I open a dashboard with the name of the clicked city ($click.name2$). But for all points the value of click.name2 is the same, whereas when I hover over the dots, the names are quite different. I have tried everything: $click.name$, $click.name2$, $click.value$, $click.value2$. I do this:

<link target="_blank">/app/dashboard_?name=$click.name2$</link>

with this query for the map:

. . . . . | geostats latfield=latitude_dgr longfield=longitude_dgr count by name

Thanks!
Hi, we use a lot of base64-encoded fields to save traffic bandwidth. Is there any way to decode these fields at index time so the decoded values are automatically available in the index, with the encoded originals removed? Ideally all of this would use a 'base64' macro. I have tried to do this with field transformations but failed. Thanks.
Hello guys, I am getting the below errors on the indexer after migrating to a new cluster master (CM). Has anyone faced this issue?

09-26-2020 08:59:37.385 -0400 INFO CMSlave - bid=abc~243~D779C6CE-5D88-4AA7-81F0-6A1C4036C3AE remoteGuid=D495A561-B0B2-4508-9ADC-12D3C3F53A0B remoteError=yes vsz=6417 vet=0 vlt=0 queued streaming error job
I have one dropdown with month and year combined. How can I separate it into two dropdowns, one for month and another for year? My query is:

| inputlookup ...
| search TITLE = "*Microsoft*" OR TITLE = "*Windows*"
| eval new_date=strftime(strptime(PUBLISHED_DATETIME,"%Y-%m-%d"),"%Y %b")
| dedup new_date
| table new_date PUBLISHED_DATETIME
| sort - new_date
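One way to do this (a sketch reusing the lookup and field names from the question): give each dropdown its own populating search, splitting the formatted date into separate year and month fields instead of the combined "%Y %b" string. For the year dropdown:

```
| inputlookup ...
| search TITLE = "*Microsoft*" OR TITLE = "*Windows*"
| eval year=strftime(strptime(PUBLISHED_DATETIME,"%Y-%m-%d"),"%Y")
| dedup year
| table year
| sort - year
```

and the same search with `month=strftime(strptime(PUBLISHED_DATETIME,"%Y-%m-%d"),"%b")` for the month dropdown; the panel searches then filter on the two tokens instead of one combined value.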
Hello, I would like to display some VPN information in Splunk, such as username, host information, and session ID. My problem is that I cannot display the username and host information in the same table; the user field doesn't seem to exist. Below is the search I tried. I search for users of a specific application and try to catch the username, session_id, and client_info_host, but I get nothing:

index="index_X" partition="/Common/XXXXXXXXXXXX:Common" | stats count by user session_id client_info_host | table session_id user client_info_host

If I remove the field "client_info_host" I do get results:

index="index_X" partition="/Common/XXXXXXXXX:Common" | stats count by user session_id | table session_id user client_info_host

If I filter only on the field "client_info_host", I don't get a username value on that event. But for all events the common value is the session_id. How can I correlate all the fields by session_id? Regards,
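Since user and client_info_host arrive in different events that share a session_id, the usual correlation pattern is to group only by session_id and pull the other fields in with aggregate functions (a `by` clause over all three fields drops every event that is missing any one of them). A sketch against the search from the question:

```
index="index_X" partition="/Common/XXXXXXXXX:Common"
| stats values(user) AS user values(client_info_host) AS client_info_host BY session_id
| table session_id user client_info_host
```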
I want to search logs like this:

{ Operation:hoge Workload:ほげ }

This search works fine:

index="default" Operation="hoge"

But this search doesn't work:

index="default" Workload="ほげ"

I would like to be able to search in Japanese as well; what should I do?
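This is typically a tokenization issue: Splunk's default segmenters split on ASCII punctuation and whitespace, so a Japanese value like ほげ may not exist as a standalone indexed term, and `Workload="ほげ"` then finds nothing at the index-term lookup stage even though the field extracts correctly. A hedged workaround is to let search-time field extraction do the comparison instead of the indexed-term filter:

```
index="default" | where Workload="ほげ"
```

This scans more events than a term-filtered search, so narrow the base search (time range, other indexed terms such as `Operation`) where possible.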
Is there a mechanism in Splunk for registering and monitoring log flows for onboarded assets and data sources for a SIEM? I can see that the Meta Woot! Splunk app from Discovered Intelligence provides insight and intelligence from Splunk metadata and license data. Is there any other method apart from Meta Woot?