All Topics

I almost presume this has been asked before, but I still haven't found any answers here that solve my problem. When I use trellis on a single value panel, the font sizes for each single value display are all over the place. I guess this is because of the differences in text length, but surely there must be a way to keep the font sizes equal? I have tried different CSS override solutions suggested on the forum, but have had no success yet. I am currently stuck trying to figure this out; is anyone here able to help me? Also, let me know if it is possible to remove the ugly trellis "split by" category values of 1, 3, 4. I tried renaming these, but that breaks the coloring using ranges. Edit to add another question: is it possible to stack these vertically instead of horizontally?
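One approach that is sometimes suggested (a sketch, not a tested solution for every Splunk version): add an inline `<style>` block in an HTML panel and pin the font size of the single value results with CSS. The panel id (`trellis_single`) and the `.single-result` selector are assumptions that may need adjusting to the markup your Splunk version generates:

```xml
<row>
  <panel>
    <html depends="$alwaysHide$">
      <style>
        /* Pin one font size for every trellis cell; selector may differ by version */
        #trellis_single .single-result { font-size: 28px !important; }
      </style>
    </html>
    <single id="trellis_single">
      <search>
        <query>index=_internal | stats count by log_level</query>
        <earliest>-60m</earliest>
        <latest>now</latest>
      </search>
      <option name="trellis.enabled">1</option>
      <option name="trellis.splitBy">log_level</option>
    </single>
  </panel>
</row>
```

The `depends="$alwaysHide$"` trick keeps the HTML panel itself invisible while its `<style>` block still applies.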
Dears, What are the main differences between calculating the Splunk daily license usage from the ready-made query located on the Splunk master (DMC Alert - Total License Usage Near Daily Quota) and from using the query below:

index=_internal source=*license_usage.log type=Usage
| stats sum(b) as bytes by idx
| eval LicenseUsage_GB = round(bytes/1024/1024/1024,5)
| table idx LicenseUsage_GB

I found remarkable differences between the two approaches, even when calculating the total available license. Thanks
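For comparison, a common source of discrepancy is that type=Usage rows are emitted throughout the day (and the idx field can be squashed to blank when there are many index/sourcetype combinations), while daily quota checks are typically based on the once-per-day RolloverSummary events written on the license master. A hedged sketch of the daily-summary view:

```
index=_internal source=*license_usage.log type=RolloverSummary earliest=-30d@d
| eval daily_GB = round(b/1024/1024/1024, 5)
| timechart span=1d sum(daily_GB) as LicenseUsage_GB
```

Note this must run on (or search the _internal index of) the license master to see the RolloverSummary events.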
Hi, I have a panel title like this: <title>Report $time_token.earliest$</title> The result is: Report -30d@d Can the result be changed to show: Report 30 days
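One possible approach (an untested sketch; the exact `<condition>` match syntax may need adjusting for your Splunk version): use a `<change>` handler on the time input to set a friendly label token per preset, then use that token in the title instead of the raw earliest value:

```xml
<input type="time" token="time_token">
  <label>Time range</label>
  <change>
    <!-- Add one condition per preset you expect -->
    <condition match="$time_token.earliest$ == &quot;-30d@d&quot;">
      <set token="time_label">30 days</set>
    </condition>
    <condition>
      <set token="time_label">$time_token.earliest$</set>
    </condition>
  </change>
</input>
```

The panel title then becomes `<title>Report $time_label$</title>`, falling back to the raw value for unmapped presets.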
Hi, We have standard OpenLDAP logs which look like this:

Mar 17 07:01:46 abc123 slapd[1234]: conn=1001 op=1 RESULT tag=97 err=49 text=
Mar 17 07:01:45 abc123 slapd[1234]: conn=1001 op=1 BIND dn="cn=my_username,ou=users,dc=my_dc,dc=fr" method=128
Mar 17 07:01:44 abc123 slapd[1234]: conn=1001 fd=12 ACCEPT from IP=10.111.22.123:46662 (IP=10.111.22.123:636)

We'd like to detect error codes other than "success" in our LDAP logs: index=ldap sourcetype="openldap:access" err!=0 Unfortunately, we can't find a way to correlate this event with other events in order to find the username and the IP address related to the same conn. Do you have an idea how we can do it? Thanks for the help. Regards.
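One way to pull the BIND dn and client IP onto the RESULT event is to aggregate everything by conn. The rex patterns below are assumptions based on the sample lines above and may need tuning to your exact format:

```
index=ldap sourcetype="openldap:access"
| rex "conn=(?<conn>\d+)"
| rex "BIND dn=\"(?<bind_dn>[^\"]+)\""
| rex "ACCEPT from IP=(?<src_ip>[\d.]+):"
| rex "err=(?<err>\d+)"
| stats values(bind_dn) as user values(src_ip) as src_ip max(err) as max_err by conn
| where max_err > 0
```

Grouping with stats by conn is usually cheaper than transaction; max_err > 0 keeps only connections that had at least one non-success result.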
How do I adjust a trellis to the center of the panel, with all three trellis cells equally sized? I have a trellis with three fields, and it shows at the left edge of the panel. I tried to increase the width and height, but it had no effect. Would you help me with this?
Hi, Can you please let me know if the latest Splunk version (8.1.2) supports/works on CentOS 8 in a distributed environment? Right now we are using SLES Linux, and we are planning to migrate to CentOS 8. The Splunk Enterprise version we are using is 7.2.3, and we are planning to upgrade it to the latest version.
Splunk internal logs: INFO StreamedSearch - Streamed search connection terminated

Splunk search:

index=oswinsec source="*WinEventLog:Security" action=success
| stats count min(_time) as earliest max(_time) as latest by user
| multireport
  [| stats values(*) as * by user
   | lookup account_status_tracker user OUTPUT count as prior_count earliest as prior_earliest latest as prior_latest
   | where prior_latest < relative_time(now(), "-30d")
   | eval explanation="The last login from this user was " . (round( (earliest-prior_latest) / 3600/24, 2) ) . " days ago."
   | convert ctime(earliest) ctime(latest) ctime(prior_earliest) ctime(prior_latest) ]
  [| inputlookup append=t account_status_tracker
   | stats min(earliest) as earliest max(latest) as latest sum(count) as count by user
   | outputlookup account_status_tracker ]
| stats count

The search was working fine until today. Kindly suggest.
Hi, I cleaned the KV store on the machine that also contains the Splunk Add-on Builder projects. The list of Add-on Builder projects is empty now. Is there any way to restore (or rebuild) the created projects in the Add-on Builder app? There is no backup of the KV store, nor was an export of the projects created. The apps created with the Add-on Builder are still there. Thanks
I am trying to understand what the option event_format_flags in the inputs.conf file can be used for:

[mscs_azure_event_hub://<name>]
event_format_flags = <integer>
* The bitwise flags that determine the format of output events
Currently I have some blank cells (null data) in some rows. How can I fill those cells with the data from the row below in the same column? Currently my output for the query is: The output that I want is: Can someone please help me with this?
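filldown copies values downward from the row above; to fill from the row below instead, a common trick is to reverse the results, filldown, then reverse back. A self-contained sketch with a hypothetical value field:

```
| makeresults count=4
| streamstats count as row
| eval value=case(row=2, "A", row=4, "B")
| reverse
| filldown value
| reverse
```

Replace the makeresults/eval lines with your own search, and list your own field names after filldown (or use filldown * to fill every field).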
The structure is designed as a search head cluster with 3 search heads, and one of them shows errors as below (the rest operate normally). When users access port 8000, it displays the following XML:

This XML file does not appear to have any style information associated with it. The document tree is shown below.
<msg type="ERROR">Connection reset by peer</msg>

When users check splunkd.log of the search head where the error occurred, it shows the following two logs:

WARN HttpListener - Socket error from x.x.x.x:51229 while idling: error 14094416:SSL routines:ssl3_read_bytes:sslv3 alert certificate unknown
WARN SSLCommon - Received fatal SSL3 alert. ssl_state='SSLv3 read client key exchange A', alert_description='certificate unknown'.

Even after users restart the mentioned search head instance, port 8000 does not open immediately but takes some time to open. However, the web interface still doesn't operate properly due to the phenomenon above. Also, in this situation, the following log can be found in splunkd.log for port 8000:

ERROR HttpClientRequest - HTTP client error=Connection reset by peer while accessing server=http://127.0.0.1:8065 for request=http://127.0.0.1:8065/ko-KR/.

How can we resolve such an error?
Hi, I have the log file below; I need to match the part of the line in bold (which has only EmployeeServices/com) and index only those whole lines in my index. How do I write a regex to capture only the lines in bold? Please help me with props.conf and transforms.conf.
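The usual index-time pattern for this is a pair of transforms: route every event to the nullQueue first, then re-route the lines matching your pattern back to the indexQueue. The sourcetype stanza name and the regex below are assumptions based on the question:

```
# props.conf
[my_sourcetype]
TRANSFORMS-filter = setnull, keep_employeeservices

# transforms.conf
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_employeeservices]
REGEX = EmployeeServices/com
DEST_KEY = queue
FORMAT = indexQueue
```

Order matters: the transforms run left to right, so the keep rule must come after the setnull rule. These must live on the indexer (or heavy forwarder) that parses the data.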
We have observed that disabled rules got enabled automatically. What are the possible reasons for this? We need to find the root cause.
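One place to start is Splunk's own access log, which records who POSTed changes to a saved search and when (field names are assumptions; splunkd_access auto-extracts user, method, and uri in most versions). Also note that app upgrades or configuration pushes from a deployer/deployment server can overwrite a local disablement:

```
index=_internal sourcetype=splunkd_access method=POST uri="*/saved/searches/*"
| table _time user clientip method uri status
```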
Currently, we are using the Splunk Python SDK to get Splunk events based on a query and parse them. We sometimes make multiple searches over overlapping time frames, and we have a deduping mechanism based on hashing the entire JSON of each event. However, this mechanism relies on the same event returning exactly the same content in each search - which doesn't happen. For example, the "_serial" field might be different for the same event in consecutive searches. My question is: are there any other fields like "_serial" that, under some preconditions (any at all), might change their value between searches without any actual change to the event? Thanks so much for the help!
Hi, I tried to use the Field Extractor (in "Extract New Fields") to extract some fields from raw logs to make a table. I can successfully see the fields on my end, but others can't. So I want to apply the query that it auto-generates in my own search. The query is:

^[^>\n]*>\s+\w+<(?P<Portname>[^>]+)[^:\n]*:\s+(?P<Status>\w+) at <(?P<IP>[^:]+):(?P<Port>[^>]+)

How do I apply it in a Splunk search?
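The auto-generated regex can be applied directly with the rex command (the index and sourcetype below are placeholders). Note also that extractions created through the Field Extractor default to private, which is usually why others can't see them; sharing them via Settings > Fields > Field extractions > Permissions fixes that without rex:

```
index=your_index sourcetype=your_sourcetype
| rex "^[^>\n]*>\s+\w+<(?<Portname>[^>]+)[^:\n]*:\s+(?<Status>\w+) at <(?<IP>[^:]+):(?<Port>[^>]+)"
| table _time Portname Status IP Port
```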
base search
| stats dc(TransactionID), count(StatusSuccess), count(StatusFailure) as count by MainMethod
| rename MainMethod as MicroService count(StatusSuccess) as Success count as Failed
| replace LostStolen with LostandStolen RetailConversion with TransactionRetailConversion TempCreditLimit with TemporaryCreditLimitIncrease

Hi, in the above query I want to add a pie chart as the visualization. Right now the pie chart only shows the number of transactions per function and ignores the Success and Failed status. What I want is: when I hover over the functions in the pie chart, I want to see the number of Failed and Success transactions. How do I do that?
Hello you guys! I'm new to Splunk and I have a BIG question; thanks in advance to everyone who is willing to take on this challenge. My data: events that contain only two fields: 1) ID_CLIENT, and 2) a field named OP_CODE. The latter contains numbers that represent where on a webpage a customer is at the moment. For instance, the number 34 represents "candy products" and the number 18 represents "stuffed animals". What I want to do: I want to be able to count how many times an ID_CLIENT goes from OP_CODE=34 to OP_CODE=18 in a day, in the last hour, etc. If you can help me with this I will be forever thankful!
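One way to count 34→18 transitions per client is streamstats, which can carry the previous OP_CODE forward within each ID_CLIENT (the index name is a placeholder):

```
index=web_events
| sort 0 ID_CLIENT _time
| streamstats current=f window=1 last(OP_CODE) as prev_op by ID_CLIENT
| where prev_op=34 AND OP_CODE=18
| stats count as transitions by ID_CLIENT
```

The "per day" or "last hour" part then comes from the time range picker (or earliest/latest in the search itself).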
I'm trying to develop a dashboard panel on Splunk Enterprise where, according to the range the single value result of a search falls in, the panel displays a success symbol (e.g. a large green filled circle) with the result value next to it in green coloured font, or an error symbol (e.g. a large red filled circle) with the result value in red coloured font next to it. It should be capable of adding a link for drilldown. The idea is to have a simplified panel with a traffic light effect (green for all good and red for errors) for a monitoring use case. The Single Value Decorations examples using "single_decorations.css" shown in https://www.splunk.com/en_us/blog/tips-and-tricks/shiny-icons.html are exactly what I'm looking for, but since I cannot install apps I need an alternative. A way to achieve a similar effect with inline code would be perfect. Limitations: I do not have access to asset upload or app installation (I can't upload CSS/JS files to the app, so thus far I've been using inline CSS).
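A sketch that stays within inline Simple XML (no uploaded assets): use the built-in rangeValues/rangeColors options for the green/red font, plus an inline `<style>` block that draws a coloured dot before the value via `::before`. The CSS selector is an assumption and may differ by Splunk version:

```xml
<panel>
  <html depends="$alwaysHide$">
    <style>
      /* Hypothetical selector: prepend a dot that inherits the range colour */
      #status_single .single-result::before { content: "\25CF  "; }
    </style>
  </html>
  <single id="status_single">
    <search>
      <query>index=_internal log_level=ERROR | stats count</query>
      <earliest>-60m</earliest>
      <latest>now</latest>
    </search>
    <option name="rangeValues">[0]</option>
    <option name="rangeColors">["0x53a051","0xdc4e41"]</option>
    <option name="useColors">1</option>
    <option name="drilldown">all</option>
  </single>
</panel>
```

With rangeValues of [0], a count of 0 renders green and anything above renders red; the drilldown option keeps the click-through link.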
Hi all, I've installed the free 60-day Splunk Enterprise trial for testing purposes on a CentOS box, but am encountering an issue where every time I try to install an add-on, the Splunk service crashes. I am attempting to install through the browser, and this issue is not specific to any add-on (I tried multiple different ones with the same result). The crash log files don't seem to give any clues as to why this is occurring:
Hi all, I've installed the free 60-day Splunk Enterprise trial for testing purposes on a CentOS box, but am having encountering an issue where every time I try to install an add-on, the Splunk service crashes.  I am attempting to install through the browser, and this issue is not specific to any add-on (tried multiple different ones with the same result). The crash log files don't seem to give any clues as to why this is occurring:     TcpClientConnection: peer=172.17.1.69, port=8080In TcpOutputLoop 0x7f5eb6a5ab80, _toloopp=0x7f5eb73f26f0, _tstate=1, no async write data, isTerminated=N, destLoop=(nil), c/r/w/s timeouts=12.000/100.000/100.000/2.000, paused=0, timeout_count=0, ssl_shutdown_returned_zero=N SSL: version="TLSv1.2", state="SSL negotiation finished successfully", cipher="ECDHE-RSA-AES128-GCM-SHA256", compression="none" serr: No error, _wantEvents=8216, _setEvents=8219 rbuf: ptr=0x7f5eb6a5ad38, size=0x4000, rptr=0x0, wptr=0x313 HttpClientConnection: _hc_state=12, _gunzip_initialized=N _had_previous_transactions=Y, _can_reuse_connection=Y ApplicationUpdateTransaction: file="/opt/splunk/var/run/4024e94e0f412193.tar.gz", failureStr="", open=N HttpClientTransaction: Connecting to host=http://172.17.1.69:8080 Request details: GET https://cdn.apps.splunk.com/media/private/51e661e4-edfc-11ea-8fee-06add55d78f8.tgz?response-content-disposition=attachment%3Bfilename%3D%22microsoft-graph-security-api-add-on-for-splunk_121.tgz%22&Expires=1615870031&Signature=dPDDrPEHcebgKhSKS4SiX4BqntPkcvvJK1PAzosWRTkJUrf2JoRroh10sdTuFHZNcoDRi4qIqDFLT7WP6s29KZjDeEfe~tGrNeApUbggrienfdN49BjcVcsh0UXi1XPsYUXJaRAWb53jdHy13Qc856b8wRBYFESep8qMC~VADGGll4TPUROgIz5bHWEn0e~z8BycCGmOHSFdqssfmI9LIX2O7R6vkV5z-WD~HhjYOs~egPTn1knZkK0XuIzvOqUftBDXG6i070CpmBZ3XguRStHyFVqgZPn0B8QXdUBhIQBMYkHvZY62szK1NRFx6wolbrrkd73Hr0W5c~hrlY4QrA__&Key-Pair-Id=APKAISM7Q7KZPNKOIT7A X-Auth-Token: 9fdmh9gpu5z6bw2tekbu9k37sd62nn0h _lastError=No error, _terminateEloopAfter=Y _connect_done=Y, _addrElem=0, 
_connectErrorPriority=0, _resolveError="" _useHttp11=Y, _allowTrailers=Y, _use_idle_connection=Y, _avoid_idle_connection_for_next_only=N, _last_on_connection=N, _send_content_type_even_if_no_body=N, _sniToSend="" _interpret_redirects=Y, _redirects_left=29, _redirectReply=2 _doneSendingRequestData=N, _requestBytesExpected=0 RESPONSE: HTTP/1.1 302 Found Content-Type: text/html; charset=utf-8 Date: Tue, 16 Mar 2021 04:22:11 GMT Location: https://cdn.apps.splunk.com/media/private/51e661e4-edfc-11ea-8fee-06add55d78f8.tgz?response-content-disposition=attachment%3Bfilename%3D%22microsoft-graph-security-api-add-on-for-splunk_121.tgz%22&Expires=1615870031&Signature=dPDDrPEHcebgKhSKS4SiX4BqntPkcvvJK1PAzosWRTkJUrf2JoRroh10sdTuFHZNcoDRi4qIqDFLT7WP6s29KZjDeEfe~tGrNeApUbggrienfdN49BjcVcsh0UXi1XPsYUXJaRAWb53jdHy13Qc856b8wRBYFESep8qMC~VADGGll4TPUROgIz5bHWEn0e~z8BycCGmOHSFdqssfmI9LIX2O7R6vkV5z-WD~HhjYOs~egPTn1knZkK0XuIzvOqUftBDXG6i070CpmBZ3XguRStHyFVqgZPn0B8QXdUBhIQBMYkHvZY62szK1NRFx6wolbrrkd73Hr0W5c~hrlY4QrA__&Key-Pair-Id=APKAISM7Q7KZPNKOIT7A Server: Apache Vary: Cookie Content-Length: 0 Connection: keep-alive _bytesRx=0, _maybeCompressedBytesRx=0, _bytesExpected=0, _maxResponseSize=576460752303423487 _acceptAndPass=identity, _acceptAndDecompress=identity, _activeDecompressPolicy=0, _remoteIndicatedCompression=identity _connectTimeout=10.000, _readTimeout=100.000, _writeTimeout=100.000 TcpClientConnectionPool: allowSsl=Y, _idleCount=0, _maxIdle=25, _addressOrder=0 _sslShutdownTimeout=2.000, _idleTimeout=28.000, _idle_connection_trimmer_scheduled=N x86 CPUID registers: 0: 0000000D 756E6547 6C65746E 49656E69 1: 000206D2 04010800 9FBA2203 0F8BFBFF 2: 76035A01 00F0B2FF 00000000 00CA0000 3: 00000000 00000000 00000000 00000000 4: 00000000 00000000 00000000 00000000 5: 00000000 00000000 00000000 00000000 6: 00000004 00000000 00000000 00000000 7: 00000000 00000000 00000000 00000000 8: 00000000 00000000 00000000 00000000 9: 00000000 00000000 00000000 00000000 A: 07300401 0000007F 00000000 
00000000 B: 00000000 00000000 000000FD 00000004 C: 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 80000000: 80000008 00000000 00000000 00000000 80000001: 00000000 00000000 00000001 28100800 80000002: 20202020 49202020 6C65746E 20295228 80000003: 6E6F6558 20295228 20555043 322D3545 80000004: 20303836 20402030 30372E32 007A4847 80000005: 00000000 00000000 00000000 00000000 80000006: 00000000 00000000 01006040 00000000 80000007: 00000000 00000000 00000000 00000100 80000008: 0000302B 00000000 00000000 00000000 terminating...     splunkd.log also doesn't show much:   03-16-2021 15:35:07.658 +1100 INFO DatabaseDirectoryManager - Finished writing bucket manifest in hotWarmPath=/opt/splunk/var/lib/splunk/_metrics/db duration=0.003 03-16-2021 15:35:08.092 +1100 INFO IndexerIf - Asked to add or update bucket manifest values, bid=_metrics~49~97934EC8-853C-4F5B-A25A-A2F76DB6E0FB 03-16-2021 15:35:08.093 +1100 INFO IndexerIf - Asked to add or update bucket manifest values, bid=_metrics~50~97934EC8-853C-4F5B-A25A-A2F76DB6E0FB 03-16-2021 15:35:08.309 +1100 INFO ProcessTracker - (child_6__Fsck) Fsck - (entire bucket) Rebuild for bucket='/opt/splunk/var/lib/splunk/_metrics/db/db_1615869125_1615868908_51' took 128.5 milliseconds 03-16-2021 15:35:09.303 +1100 INFO ProcessTracker - (child_7__Fsck) Fsck - (entire bucket) Rebuild for bucket='/opt/splunk/var/lib/splunk/_metrics/db/db_1615869125_1615868939_52' took 141.4 milliseconds 03-16-2021 15:35:09.759 +1100 INFO KeyManagerLocalhost - Checking for localhost key pair 03-16-2021 15:35:09.759 +1100 INFO KeyManagerLocalhost - Public key already exists: /opt/splunk/etc/auth/distServerKeys/trusted.pem 03-16-2021 15:35:09.759 +1100 INFO KeyManagerLocalhost - Reading public key for localhost: /opt/splunk/etc/auth/distServerKeys/trusted.pem 03-16-2021 15:35:09.759 +1100 INFO KeyManagerLocalhost - Finished reading public key for localhost: /opt/splunk/etc/auth/distServerKeys/trusted.pem 03-16-2021 15:35:09.759 +1100 
INFO KeyManagerLocalhost - Reading private key for localhost: /opt/splunk/etc/auth/distServerKeys/private.pem 03-16-2021 15:35:09.760 +1100 INFO KeyManagerLocalhost - Finished reading private key for localhost: /opt/splunk/etc/auth/distServerKeys/private.pem 03-16-2021 15:35:10.072 +1100 INFO IndexerIf - Asked to add or update bucket manifest values, bid=_metrics~51~97934EC8-853C-4F5B-A25A-A2F76DB6E0FB 03-16-2021 15:35:10.160 +1100 WARN ProcessTracker - (child_8__Fsck) Fsck - Rebuilding entire bucket is not supported for "metric" bucket that has a "stubbed-out" rawdata journal. Only bloomfilter will be build 03-16-2021 15:35:10.160 +1100 INFO ProcessTracker - (child_8__Fsck) bloomfiltermaker - distinct_term_count failed: rc=-4 03-16-2021 15:35:10.160 +1100 WARN ProcessTracker - (child_8__Fsck) Fsck - Repair entire bucket, index=_metrics, tryWarmThenCold=1, bucket=/opt/splunk/var/lib/splunk/_metrics/db/db_1615535486_1615532541_4, exists=1, localrc=101, failReason=Bloomfilter rebuild for bkt='/opt/splunk/var/lib/splunk/_metrics/db/db_1615535486_1615532541_4' failed; rc=-4 03-16-2021 15:35:11.071 +1100 INFO IndexerIf - Asked to add or update bucket manifest values, bid=_metrics~52~97934EC8-853C-4F5B-A25A-A2F76DB6E0FB 03-16-2021 15:35:12.041 +1100 WARN BucketMover - BucketManifestUpdateExitHandler: process handling bucket="db_1615535486_1615532541_4" exited with code=101; search for any previous messages that might have been produced by the external process 03-16-2021 15:35:12.041 +1100 INFO IndexerIf - Asked to add or update bucket manifest values, bid=_metrics~4~97934EC8-853C-4F5B-A25A-A2F76DB6E0FB 03-16-2021 15:35:18.590 +1100 WARN LocalAppsAdminHandler - Using deprecated capabilities for write: admin_all_objects or edit_local_apps. See enable_install_apps in limits.conf 03-16-2021 15:35:31.579 +1100 INFO ScheduledViewsReaper - Scheduled views reaper run complete. 
Reaped count=0 scheduled views 03-16-2021 15:35:31.579 +1100 INFO CascadingReplicationManager - Using value for property max_replication_threads=2. 03-16-2021 15:35:31.579 +1100 INFO CascadingReplicationManager - Using value for property max_replication_jobs=5. 03-16-2021 15:35:34.964 +1100 INFO MetricSchemaProcessor - channel confkey=source::/opt/splunk/var/log/splunk/metrics.log|host::AUSPS1SL0041|splunk_metrics_log|CLONE_CHANNEL has an event with no measure, will be skipped. 03-16-2021 15:35:49.966 +1100 WARN LocalAppsAdminHandler - Using deprecated capabilities for write: admin_all_objects or edit_local_apps. See enable_install_apps in limits.conf 03-16-2021 15:35:50.236 +1100 WARN DateParserVerbose - Failed to parse timestamp in first MAX_TIMESTAMP_LOOKAHEAD (40) characters of event. Defaulting to timestamp of previous event (Tue Mar 16 15:35:01 2021). Context: source=/opt/splunk/var/log/splunk/splunkd_stderr.log|host=AUSPS1SL0041|splunkd_stderr|72   Also ran a packet capture and found nothing out of the ordinary. I'm aware that we can manually install add-ons from Splunkbase by extracting the .tar.gz but want to understand and solve the issue.  Anyone have any ideas?
How can I check whether a couple of hosts/VMs ever reported to Splunk? I have looked in the deployment server, and there is no sign of them or their history. Please advise.
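Note that the deployment server only tracks forwarders that phone home, not what was indexed. To check whether a host ever sent data, the metadata command is a cheap sketch (the host names are placeholders); set the time picker to "All time":

```
| metadata type=hosts index=*
| search host IN ("host1", "host2")
| eval first_seen=strftime(firstTime, "%F %T"), last_seen=strftime(lastTime, "%F %T")
| table host first_seen last_seen totalCount
```

If the hosts appear here, they sent data at some point; firstTime/lastTime bracket when.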