All Topics


When I configure INGEST_EVAL to replace _raw with something else, it duplicates the event. Splunk Enterprise version 8.2.1.

props.conf:

[source::http:splunk_hec_token]
TRUNCATE = 500000
SHOULD_LINEMERGE = false
KV_MODE = json
TRANSFORMS-fdz_event = fdz_event

transforms.conf:

[fdz_event]
INGEST_EVAL = _raw="Test"

Output:
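A small hedged diagnostic sketch to confirm and count the duplicated copies at search time; the index name is an assumption, since the post does not name one:

index=your_index source="http:splunk_hec_token"
| stats count BY _time, _raw
| where count > 1

Any row with count greater than 1 represents an event that was indexed more than once with the same timestamp and raw text.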
Hello, I am trying to install and run a Splunk Universal Forwarder v8.2.1 on a number of Solaris SPARC 11.3 servers, but I am getting this error message:

$ /opt/splunkforwarder/bin/splunk start --accept-license --answer-yes
ld.so.1: splunk: fatal: relocation error: file /opt/splunkforwarder/bin/splunk: symbol in6addr_any: referenced symbol not found
Killed

The requirements at https://docs.splunk.com/Documentation/Forwarder/8.2.1/Forwarder/Systemrequirements state that the system should have SUNW_1.22.7 or later in the libc.so.1 library, and it does:

# pvs /usr/lib/libc.so.1
libc.so.1;
SUNWpublic; SUNW_1.23; SUNW_1.22.7; SUNW_1.22.6; SUNW_1.22.5; SUNW_1.22.4; SUNW_1.22.3; SUNW_1.22.2; SUNW_1.22.1; SUNW_1.22; SUNW_1.21.3; SUNW_1.21.2; SUNW_1.21.1; SUNW_1.21; SUNW_1.20.4; SUNW_1.20.1; SUNW_1.20; SUNW_1.19; SUNW_1.18.1; SUNW_1.18; SUNW_1.17; SUNW_1.16; SUNW_1.15; SUNW_1.14; SUNW_1.13; SUNW_1.12; SUNW_1.11; SUNW_1.10; SUNW_1.9; SUNW_1.8; SUNW_1.7; SUNW_1.6; SUNW_1.5; SUNW_1.4; SUNW_1.3; SUNW_1.2; SUNW_1.1; SUNW_0.9; SUNW_0.8; SUNW_0.7; SISCD_2.3; SYSVABI_1.3; SUNWprivate_1.1;

Does anyone have any suggestions? Thanks
How can I pass a field from a subsearch to the main search and then search on another source? I am trying to use the search below to look up all the UUIDs returned from the subsearch on Path1 against Path2, but it is not working properly:

source="Path2" | eval id=[search source="Path1" "HTTP/1.1\" 500" OR "HTTP/1.1\" 400" OR "HTTP/1.1\" 404" | rex "universal-request-id- (?<UUID>.*?)\s*X-df-elapsed-time-ms" | return $UUID]

Can you suggest where I am going wrong?
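A hedged sketch of the usual pattern (the rex expression is taken from the post as-is; the source names are the ones given): run the subsearch as a filter on the outer search rather than inside eval, and use return to emit the UUID values as search terms:

source="Path2"
    [ search source="Path1" ("HTTP/1.1\" 500" OR "HTTP/1.1\" 400" OR "HTTP/1.1\" 404")
      | rex "universal-request-id- (?<UUID>.*?)\s*X-df-elapsed-time-ms"
      | dedup UUID
      | return 10000 $UUID ]

The subsearch expands to (value1) OR (value2) OR ..., so the outer search only returns Path2 events containing one of the UUIDs; the 10000 raises the return command's default limit of one row.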
I have data in Splunk like:

Fname    Lname    Country
fname1   lname1   USA
fname2   lname2   USA
fname3   lname3   USA

And I have a file on the Splunk server, MyFile.csv, that contains one name per line:

Name
fname1
lname3
fname123

I want to show only the rows from my index whose Fname or Lname equals a Name in the CSV. In my example the result should be:

Fname    Lname    Country
fname1   lname1   USA
fname3   lname3   USA

How can I do that?
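A minimal sketch, assuming MyFile.csv has been uploaded to Splunk as a lookup and that the index is named your_index: use two inputlookup subsearches, one matching on Fname and one on Lname:

index=your_index
    ([| inputlookup MyFile.csv | rename Name AS Fname | fields Fname]
  OR [| inputlookup MyFile.csv | rename Name AS Lname | fields Lname])
| table Fname Lname Country

Each subsearch expands into an OR of field=value terms, so only events whose Fname or Lname appears in the Name column are returned.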
I have a scheduled search that outputs results to local disk every 5 minutes using the outputcsv command. The file is stored as abc_dns.csv:

index=abc | fields _time _raw | fields - _indextime _sourcetype _subsecond | outputcsv abc_dns

Then I am forwarding that file to an external indexer.

inputs.conf:

[monitor:///opt/splunk/var/run/splunk/csv/abc_dns.csv]
index = abc_dns_logs
sourcetype = abc_dns
#crcSalt = <SOURCE>

props.conf:

[abc_dns]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = structured
TRANSFORMS-t1 = eliminate_header

transforms.conf:

[eliminate_header]
REGEX = ^"_time","_raw"$
DEST_KEY = queue
FORMAT = nullQueue

When I validate the results, I see the data is being duplicated on the external indexer. I tried adding crcSalt = <SOURCE> to see if it made any difference; it seemed to at first, but after a while the data started duplicating again. The original logs do contain some duplicates themselves, but beyond that I am seeing data from the monitored file being duplicated as well. Can anyone please help with this?
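A hedged thought rather than a confirmed fix: because the scheduled search rewrites the whole CSV every 5 minutes, the monitor input can end up re-reading rows it has already indexed. One sketch of an alternative is to append new results instead of overwriting, so the monitored file only grows:

index=abc | fields _time _raw | fields - _indextime _sourcetype _subsecond | outputcsv append=true abc_dns

Whether this fits depends on how the file is rotated and whether each run of the search can be limited to only new events.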
We encounter an error configuring the VMware Carbon Black Cloud application (vmware_app_for_splunk 1.1.1 with Splunk Common Information Model Splunk_SA_CIM 4.20.0) on Splunk Enterprise 8.2.1. In Application Configuration > API Token Configuration, when we select "+" we get the error messages "Something went wrong. TypeError: Cannot read property 'length' of undefined" and "vmware_app_for_splunk: pavo_message:UNKNOWN".
Hello, I performed a "fresh" installation of ES 4.6.1 in a search head cluster through the deployer. The Splunk version is 8.0.9. The apps for ES were pulled from a repository solution to the deployer and then pushed to the search head cluster. When I try to open Content Management it is stuck on a blank page, and Incident Review displays the error "Operation Failed, Internal Error. __enter__". Is there any log file I could check, or a permission I need to change, as this behavior is quite strange? Thank you in advance
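A small hedged sketch of where to start looking (generic internal logging, not ES-specific guidance): search the search heads' internal index for Python errors and tracebacks around the time the pages fail to load:

index=_internal source=*python.log* (ERROR OR Traceback)
| sort - _time

The file-system equivalents live under $SPLUNK_HOME/var/log/splunk/ (for example python.log and web_service.log).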
Hoping to find some physical copies of the Quick Reference Guide on card stock. I was hoping they would be available from the online Splunk store here: https://www.mypromomall.com/splunk but they are not. I usually pick them up at .conf, but with last year's event being virtual I didn't have that opportunity. I'm working with a rotating base of junior folks who use the heck out of them. The cards have been an awesome aid in getting them up to speed.
Hi everyone, I am looking for any document that can help me calculate log source volume. I have 10 different types of log sources, and I only have their descriptions and quantities. Now I have been told to calculate the estimated total volume per day. If someone could point me to a log source volume calculator, that would help.
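For sources that are already onboarded, a minimal sketch for measuring actual daily volume from the license usage logs (run on the license master; b and st are the standard bytes and sourcetype fields in license_usage.log):

index=_internal source=*license_usage.log* type=Usage
| eval GB = b/1024/1024/1024
| timechart span=1d sum(GB) BY st

For sources that are not onboarded yet, the estimate usually has to come from vendor sizing guides: average event size multiplied by expected events per day per device, multiplied by the device count.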
Hi all, in our identity feed there are some instances where different identities are registered with the same email address. ES by default merges identities using "key" fields and email. I want to disable this behaviour, but I cannot find how to do that. The documentation says "The key field is identity and the default merge convention is email." Does anyone know how I can change the default merge convention? Thanks Mario
I have a requirement to list the most used indexes on the platform. For this I need to prepare a report that shows when each index was last used and how it was used, e.g. whether a user queried it via an ad-hoc search or whether it is part of a scheduled saved search. I am looking into the Audit data model for this, but it does not list indexes when they are defined inside a macro. For example, I have a scheduled saved search that uses a macro containing the index and sourcetype definition; how can I extract that index name from the audit logs of the saved search execution? Any inputs please. Thank you!
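A partial sketch against the raw audit index (not the data model); the rex below is an assumption about how index= appears in the logged search string, and it will not see indexes hidden inside macros, since audit records the search as typed:

index=_audit action=search info=completed search=*
| rex field=search max_match=10 "index\s*=\s*\"?(?<searched_index>[a-zA-Z0-9_\*-]+)"
| mvexpand searched_index
| stats latest(_time) AS last_used count BY searched_index, user

For the macro case, one direction is to pull the macro definitions (for example with | rest /servicesNS/-/-/configs/conf-macros) and match them against the audited search strings, but that part is left as a sketch here.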
Hi all, I usually onboard Windows Server 2008 and newer, but 2003 is not working with the stanzas below:

# Windows platform specific input processor.
[WinEventLog://Application]
disabled = 0
[WinEventLog://Security]
disabled = 0
[WinEventLog://System]
disabled = 0

Is it possible to read the files directly, like this?

[monitor://C:\WINDOWS\System32\config\AppEvent.Evt]

Best, N.
Question: How can we find the diff between log statements before and after a given date?

Applicability: Let's say we release new application code and I want to see all the new events the application has started logging. The definition of "new" is vague here, but any suggestion would help. The idea is that Splunk should compare the types of events that were being logged earlier and only show new events that were not present before. That would help find any new exceptions, errors, or warnings that are being logged but have not yet surfaced as a failed customer interaction.

Example: After a new code release, our application started logging a WARN event about "open file handlers" that kept building up over time and ultimately reached a stage where no more Unix file handles were available to process any new request.
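One hedged sketch of the idea, using the cluster command to group similar events and then keeping only the clusters that appear after the release date; the index name, the release date, and the ERROR/WARN filter are all assumptions:

index=your_app_index ("ERROR" OR "WARN") earliest=-30d@d
| eval period = if(_time < strptime("2021-08-01", "%Y-%m-%d"), "before", "after")
| cluster t=0.8 labelonly=true
| stats dc(period) AS period_count values(period) AS periods earliest(_raw) AS sample_event BY cluster_label
| where period_count=1 AND periods="after"
| table cluster_label sample_event

Tuning t (the similarity threshold) controls how aggressively near-identical messages are grouped; lowering it merges more variants of the same log line into one cluster.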
Hi, I have a custom search command that takes a raw string as input, but when I combine it into a search Splunk doesn't understand it and always returns an error. Example:

|example rawstring="{"EventCode": "13","EventType": "SetValue","TargetObject": "(?mi)Software[//\\\\]{0,2}Microsoft[//\\\\]{0,2}Windows[//\\\\]{0,2}CurrentVersion[//\\\\]{0,2}Run"}}"

Can anyone help me pass it? Thanks in advance.
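A hedged sketch of the usual workaround, assuming the example command just needs the value delivered as one string: escape every double quote inside the argument with a backslash so the SPL parser does not terminate the string early (shown here with a shortened version of the JSON):

|example rawstring="{\"EventCode\": \"13\", \"EventType\": \"SetValue\", \"TargetObject\": \"...\"}"

Backslashes that are part of the regex itself may then need to be doubled again, since the command receives the string after SPL unescaping.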
Hi all, I have created a lookup table and imported it into Splunk. It has 2 columns, one called hosts and the other called IPs. The columns are populated with the hosts I want to query. I'm very new to Splunk and would like to create a search that returns errors/events worth investigating from the hosts specified in the lookup file. I'll display the results on a dashboard and will be checking this daily for preventative maintenance on my system. I'm just after events worth looking into and need to filter out irrelevant events to save time. Can anyone help? Thanks in advance.
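A minimal sketch, with the lookup file name (my_hosts.csv), the index, and the error keywords all being assumptions to swap for your own; the subsearch turns the hosts column into host=<value> filters for the outer search:

index=your_index ("error" OR "fail" OR "critical")
    [| inputlookup my_hosts.csv | rename hosts AS host | fields host]
| stats count BY host, sourcetype
| sort - count

On a dashboard this could be a single panel with a time picker set to the last 24 hours.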
Hi all, first post here - I'm a Splunk beginner and recently got this tricky task. Let's say I have these rows in my log file:

2020-01-01: error778
2020-01-02: error778
2020-01-03: error778
2020-01-16: error778
2020-02-01: error778
2020-02-04: error778
2020-02-06: error778
2020-02-10: error778
2020-02-18: error778
2020-02-19: error778

In Jan 2020 there are 4 rows of error778; in Feb 2020 there are 6 rows of error778. That means from Jan 2020 to Feb 2020 there is a 50% increase in error778.

The questions: How can I get/display the % difference? Ideally the delimiters could be days, months, years, or date ranges (such as the diff of error778 between 1-5 Jan 2020 and 5-31 Jan 2020). And what's the best way to set an alert based on the percentage (say, alert when the diff is > 15%)?

I'm able to display the daily/weekly/monthly trend of a keyword using timechart like below:

index=mylog "error778" | timechart span=1month count by date

But I believe it's far from what I need. Any help would be appreciated, thanks.
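A minimal sketch of the month-over-month percentage, assuming index=mylog as in the post; delta computes the difference from the previous bucket, and the where clause is the alert condition:

index=mylog "error778"
| timechart span=1mon count AS errors
| delta errors AS change
| eval pct_change = round(100 * change / (errors - change), 2)
| where pct_change > 15

(errors - change) is the previous period's count, so pct_change is undefined when that count is 0. Saving this as an alert that triggers when the number of results is greater than 0 covers the "> 15%" case, and changing span (1d, 1w, 1mon) or the earliest/latest time range covers the other delimiters.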
<title> A B </title>

How do I add a line break between A and B?
Hi. I have a Splunk server running on Windows, and a VMware VM, also running Windows, from which I want to forward data to the host system. I read the documentation and followed it step by step to install and configure the universal forwarder on the VM. The receiving ports on both machines are open in the firewall rules. But when I try to add data on the Splunk server, the message "There are currently no forwarders configured as deployment clients to this instance" appears. I have searched further but have not fixed it. There are solutions for this subject in this community, but they did not work for me. Please help. Thanks.
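A hedged sketch of the two pieces of forwarder configuration this message usually points at; the IP 192.168.1.10 and the default ports 8089/9997 are assumptions standing in for your Splunk server.

deploymentclient.conf on the universal forwarder (only needed if the forwarder should register as a deployment client):

[deployment-client]

[target-broker:deploymentServer]
targetUri = 192.168.1.10:8089

outputs.conf on the universal forwarder (this is what actually sends the data):

[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = 192.168.1.10:9997

The Splunk server also needs receiving enabled on port 9997 (Settings > Forwarding and receiving > Configure receiving), and the forwarder must be restarted after editing the files.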
Hello, I need help. I have 6500 IINs (like an ID), put them into a lookup, and then tried this search:

index=alfa [|inputlookup IIN_oleg.csv |rename IIN as search | fields search]

It gives results only for the first IIN in the lookup. If I search without the lookup, using just 10 IINs combined with "OR", it gives me 10 results.
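A hedged sketch of one thing to try, assuming the IIN values should be ORed together as raw search terms: add format so the subsearch explicitly builds an OR expression, and inspect what it produces:

index=alfa [| inputlookup IIN_oleg.csv | rename IIN AS search | fields search | format]

Running the subsearch on its own (| inputlookup IIN_oleg.csv | rename IIN AS search | fields search | format) shows exactly what is handed to the outer search. Subsearches are also capped by default result and runtime limits, which is worth checking with 6500 values.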
Hi, I have a log that contains the following: dn=site,dn=com,dn=au. I would like to extract all these values and concatenate them into a single field with periods between the words, so the extracted field looks like site.com.au. How can I do this with regex?
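A minimal sketch, assuming the dn= pairs always appear comma-separated as in the example (the index and sourcetype are placeholders): rex with max_match=0 captures every occurrence into a multivalue field, and mvjoin glues the parts together with periods:

index=your_index sourcetype=your_sourcetype
| rex max_match=0 "dn=(?<dn_part>[^,\s]+)"
| eval dn_full = mvjoin(dn_part, ".")

For the sample event this yields dn_full=site.com.au.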