All Topics

Below are two different dropdown lists, because we have a different host and index for each. Based on the index selection I set/unset (show/hide) them. Can we make this a single query with a single dropdown list, driven by the value of whichever dropdown is visible?

1) index=aaa (source="/log/test.log" $host1$) | rex field=name "(?<DB>[^\.]*)" | rename DB AS "Database" | table Database | dedup Database
2) index=bbb (source="/log/test.log" $host2$) | rex field=name "(?<DB>[^\.]*)" | rename DB AS "Database" | table Database | dedup Database

If ddl1 is visible, fetch its value and pass it to $host1$ in the query; if ddl2 is visible, fetch its value and pass it to $host2$. Or, based on the selected dropdown value, can we set a value and pass it to the query, to avoid maintaining multiple queries that differ only in host/index?
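One possible way to collapse this into a single dropdown, sketched below with illustrative names: encode both index and host filter in each choice value, split them into two tokens with eval handlers on change, and drive one query from those tokens (assumes Simple XML; the choice values here are hypothetical).

<input type="dropdown" token="env_selector">
  <label>Environment</label>
  <choice value="aaa|host=host1value">Environment A</choice>
  <choice value="bbb|host=host2value">Environment B</choice>
  <change>
    <eval token="idx_tok">mvindex(split($value$, "|"), 0)</eval>
    <eval token="host_tok">mvindex(split($value$, "|"), 1)</eval>
  </change>
</input>

The single search then becomes: index=$idx_tok$ (source="/log/test.log" $host_tok$) | rex field=name "(?<DB>[^\.]*)" | rename DB AS "Database" | dedup Database | table Database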
I am looking to write a simple search that tells me whether a host or hosts are reaching out to a specific IP address. So far I have:

index="firewall" host=hostname src_addr=x.x.x.x dest_addr=x.x.x.x

When I run this it doesn't return anything. Should I be searching under my domain instead? I would like the output lined up like below:

Hostname | source IP | destination IP
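A common reason for zero results is a field-name mismatch: many firewall add-ons extract src_ip/dest_ip rather than src_addr/dest_addr. A sketch assuming CIM-style names (verify the actual names on a small sample with | fieldsummary first):

index="firewall" src_ip="x.x.x.x" dest_ip="x.x.x.x"
| stats count BY host, src_ip, dest_ip
| rename host AS Hostname, src_ip AS "source IP", dest_ip AS "destination IP"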
Our Splunk instance is currently set up as a deployment server. All our clients have the Universal Forwarder installed and set up as deployment clients, phoning home to the server to get their necessary apps. Under the "Forwarder Management" page of the Distributed Environment settings, I can see all 20 of our clients, with their respective host names and IP addresses, actively talking with the server by phoning home and getting apps deployed. However, when I go to the Monitoring Console's "Forwarders: Deployment" page, only 6 of the 20 Universal Forwarders show as installed and active. I'm sure we're messing up one of the many different config files somewhere, but I'm not sure which one.
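Worth knowing: the Monitoring Console's forwarder pages are populated by the scheduled search "DMC Forwarder - Build Asset Table", so a stale or narrowly scoped run of that search can hide forwarders that are otherwise healthy. As a cross-check, this sketch lists every forwarder actually sending data, assuming the forwarders' _internal metrics reach your indexers (they do by default):

index=_internal source=*metrics.log* group=tcpin_connections
| stats latest(_time) AS last_seen BY hostname, sourceIp, fwdType, version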
Hello, is there any way to add a custom logo in place of the Splunk logo in an exported PDF? I am on Splunk Cloud, not Enterprise, so I'm not sure whether I can access a static folder to use with Server settings > Email settings > PDF report settings. Alternatively, I've tried just adding an image to the dashboard, but even after getting it to appear (using embedded base64), the image does not show up when exported to PDF. Any guidance or alternatives would be appreciated.
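For what it's worth, alert_actions.conf documents a pdf.logo_path setting (syntax <app>:<path-to-image>) that replaces the Splunk logo in rendered PDFs; on Splunk Cloud this would have to be delivered inside an app rather than by editing server files. A sketch, with a hypothetical app name my_branding and image path:

# alert_actions.conf, shipped in the my_branding app
[email]
pdf.logo_path = my_branding:appserver/static/logo.png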
Hi, I have raw data as below, with the fields ID, Date, Level, Logger, and Message, which need to be displayed in a dashboard: index="wireless_retail" source="CPS.cpsLog" Level="ERROR", Logger="Utils.Helpers.LogHelper". Can someone help me with creating a dashboard for this?

ID="39090", Date="2024-05-07 14:12:53.313", Thread="4", Level="ERROR", Logger=".Utils.Helpers.LogHelper", Message="UserName: abc Location:  Sales Channel: GW_STORE Error in Path: /pap/getcpsinput Raw Url: /pap/getcpsinput User Name: Error: System.Data.Entity.Core.EntityException: An error occurred while starting a transaction on the provider connection. See the inner exception for details. ---> System.Data.SqlClient.SqlException: Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception: The wait operation timed out --- End of inner exception stack trace --- at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose) at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error) at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync() at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket() at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer() at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value) at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
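A minimal starting point for the dashboard's table panel, assuming the fields shown in the sample (ID, Date, Level, Logger, Message) are already extracted at search time; the Logger value is wildcarded here to match the leading dot seen in the sample:

index="wireless_retail" source="CPS.cpsLog" Level="ERROR" Logger="*Utils.Helpers.LogHelper"
| table ID Date Level Logger Message
| sort - Date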
Hello! My trial has reached its end; how can I activate the Lite version? Thank you.
Hi, I just installed the TA-tenable add-on and was going to configure it; however, when I get to the account configuration it does not matter what account type I use, I always get "Error in processing the request". Has anyone seen this before? If so, what is the fix?
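The generic UI error usually has a more specific cause in the add-on's own logs. A sketch for surfacing it, assuming the add-on writes to _internal under a tenable-named log source:

index=_internal source=*tenable* (ERROR OR CRITICAL)
| sort - _time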
Hello, good day! MS Graph API duplicate email ingestion into Splunk SOAR: we have Splunk SOAR v6.1.1 and the Graph API app v3, with ingestion set to interval polling every 4 minutes. We are noticing duplicate ticket/email ingestion every 4 minutes, which causes our playbook (label_change) to error out with a validation error and fail to update the record (Re: SOAR Could not update record due to Validation... - Splunk Community). Appreciate your guidance in advance!
Hi, can you please help me find out how to count the events between two marker events in Splunk? For example, I have to find the count of events (RPWARDA, SPWARAA, SPWARRA) between the events IDJO20P and PIDZJEA. IDJO20P to PIDZJEA is considered a day, and I have to find the count of events (RPWARDA, SPWARAA, SPWARRA) in a day.

Splunk query to find the events:

index=events_prod_cdp_penalty_esa source="SYSLOG" (TERM(NIDF=RPWARDA) OR TERM(NIDF=SPWARAA) OR TERM(NIDF=SPWARRA) OR PIDZJEA OR IDJO20P)
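One sketch of an approach: treat each IDJO20P event as the start of a logical day via streamstats, then count the three event types per day. It assumes NIDF is extracted as a field and that events can be processed in time order:

index=events_prod_cdp_penalty_esa source="SYSLOG" (TERM(NIDF=RPWARDA) OR TERM(NIDF=SPWARAA) OR TERM(NIDF=SPWARRA) OR PIDZJEA OR IDJO20P)
| sort 0 _time
| streamstats count(eval(searchmatch("IDJO20P"))) AS day_id
| search NIDF IN (RPWARDA, SPWARAA, SPWARRA)
| stats count BY day_id, NIDF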
I have the following query that gives me a list of pods that are missing, based on a comparison with what should be deployed as defined in the pod_list.csv lookup:

index=abc sourcetype=kubectl importance=non-critical
| dedup pod_name
| eval Observed=1
| append [| inputlookup pod_list.csv | eval Observed=0 | eval importance=if(isnull(importance), "critical", importance) | search importance=non-critical]
| lookup pod_list pod_name_lookup as pod_name OUTPUT pod_name_lookup
| eval importance=if(isnull(importance), "critical", importance)
| stats max(Observed) as Observed by pod_name_lookup, importance
| where Observed=0 and importance="non-critical"

The data in pod_list.csv looks like this:

namespace  pod_name_lookup   importance
ns1        kafka-*           critical
ns1        apache-*          critical
ns2        grafana-backup-*  non-critical

This works as expected. I am now having difficulties creating a timechart with this data, to be able to see when a pod wasn't deployed, not just what is currently missing. Any help is greatly appreciated.
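One way to get a time dimension, sketched under the assumption that every deployed pod logs at least once per span: count events per expected pod pattern per time bucket, so zero-valued spans mark windows where the pod was absent.

index=abc sourcetype=kubectl importance=non-critical
| lookup pod_list pod_name_lookup AS pod_name OUTPUT pod_name_lookup
| timechart span=1h limit=0 useother=f count BY pod_name_lookup
| fillnull value=0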
I have defined the following sourcetype for a CSV file data input without headers:

[test_csv]
SHOULD_LINEMERGE = false
TRANSFORMS = drop_start_and_interim
INDEXED_EXTRACTIONS = csv
FIELD_NAMES = 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184,185,186,187
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIMESTAMP_FIELDS = 14
category = Structured
description = Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled = false
pulldown_type = true

When I index a test file, I see that one of the destination fields is not correctly extracted: this field is bounded by two double quotes and is extracted together with the next field as a single field. A sample raw event with the problem is the following (the problematic field is the one containing ""aaa://inf.tsa""):

2,"127.0.0.1",5060,"258670334_106281015@83.72.181.1","258670334_106281015@83.72.181.1","258670334_106281015@83.72.181.1","SIP",,,"<sip:+34765300391@83.72.181.1;user=phone>;tag=gK0a655dd7","<sip:+376826792@193.178.74.21;user=phone>",1,1611,"14:35:43.412 CET Jan 09 2024","14:35:52.884 CET Jan 09 2024","15:02:43.220 CET Jan 09 2024",1,"s0p2",53,"s0p0",52,"IMS","IX","localhost:154311320","PCMA","IX","83.72.181.97",40072,"193.178.74.21",20526,"IMS","10.12.162.20",16864,"10.12.45.10",25732,0,0,0,0,0,0,0,1,17551834,80513,9284,440,"localhost:154311321","PCMA","IMS","10.12.45.10",25732,"10.12.162.20",16864,"IX","193.178.74.21",20526,"83.72.181.97",40072,0,0,0,0,0,0,0,2,17552488,80516,9284,440,,,,"0.0.0.0",0,"0.0.0.0",0,,"0.0.0.0",0,"0.0.0.0",0,0,0,0,0,0,0,0,0,0,0,,,,"0.0.0.0",0,"0.0.0.0",0,,"0.0.0.0",0,"0.0.0.0",0,0,0,0,0,0,0,0,0,0,0,"bb6c6d3001911f060e83641d9e64",""aaa://inf.tsa"","SCZ9.0.0 Patch 2 (Build 211)","GMT-01:00",245,"sip:+376826792@193.178.74.21:5060;user=phone",,,,,"sip:+34765300391@83.72.181.1:5060;user=phone","193.178.74.21:5060","83.72.181.1:5060","10.12.193.4:5060","10.59.90.201:5060",,3,2,0,0,"sip:+376826792@FO01-vICSCF-01.ims.mnc006.mcc333.3gppnetwork.org:5060;user=phone",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,"15:02:43.220 CET Jan 09 2024","15:02:43.220 CET Jan 09 2024","00:00:00.000 UTC Jan 01 1970","00:00:00.000 UTC Jan 01 1970","audio","audio",,,17551834,80513,17552052,80514,0,0,0,0,19516010

The content of field 117 ends up as:

"aaa://inf.tsa","SCZ9.0.0 Patch 2 (Build 211)

That is fields 117 and 118 concatenated, and all the following fields are offset by one position.

I have tried to replace the two double quotes with one in two ways:

1. Adding the line SEDCMD = s/""/"/g as the first line of the sourcetype definition in props.conf, but it only changes _raw; the same issue extracting field 117, and the offset of the following fields, remains.

2. Overwriting _raw, replacing the two double quotes with one, with the following transform:

[rewrite_raw]
INGEST_EVAL = _raw:=replace(_raw, "\"\"", "\"")

applied in the sourcetype after the other transform, which drops certain rows based on the value of the first field:
TRANSFORMS = drop_start_and_interim, rewrite_raw And the result is the same, the _raw is changed but the issue extracting the filed 117 and offset of the followings persists I also have tried to rewrite the _raw with the following transform and it neither has solved the problem, the result has been the same: [remove_double_quotes] SOURCE_KEY = _raw REGEX = (?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)
(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*)(?:\"\"|\"|) FORMAT = 
"$1","$2","$3","$4","$5","$6","$7","$8","$9","$10","$11","$12","$13","$14","$15","$16","$17","$18","$19","$20","$21","$22","$23","$24","$25","$26","$27","$28","$29","$30","$31","$32","$33","$34","$35","$36","$37","$38","$39","$40","$41","$42","$43","$44","$45","$46","$47","$48","$49","$50","$51","$52","$53","$54","$55","$56","$57","$58","$59","$60","$61","$62","$63","$64","$65","$66","$67","$68","$69","$70","$71","$72","$73","$74","$75","$76","$77","$78","$79","$80","$81","$82","$83","$84","$85","$86","$87","$88","$89","$90","$91","$92","$93","$94","$95","$96","$97","$98","$99","$100","$101","$102","$103","$104","$105","$106","$107","$108","$109","$110","$111","$112","$113","$114","$115","$116","$117","$118","$119","$120","$121","$122","$123","$124","$125","$126","$127","$128","$129","$130","$131","$132","$133","$134","$135","$136","$137","$138","$139","$140","$141","$142","$143","$144","$145","$146","$147","$148","$149","$150","$151","$152","$153","$154","$155","$156","$157","$158","$159","$160","$161","$162","$163","$164","$165","$166","$167","$168","$169","$170","$171","$172","$173","$174","$175","$176","$177","$178","$179","$180","$181","$182","$183","$184","$185","$186","$187" DEST_KEY =_raw Is there any way to solve this problem? Thank you
Has anyone worked with ./splunk check-integrity, and if so, do you know how to interpret the results? This link does not explain how to interpret the results: https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/Dataintegritycontrol. I was given cursory information, but it still does not tell me enough to know when a compromise may have occurred, and where.
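For reference, the two documented invocations are below. Broadly, a hash mismatch on a bucket means the bucket's journal data changed after it was written, and the bucket's own time range bounds when the affected events were indexed; failures should also surface in splunkd.log.

./splunk check-integrity -bucketPath <path-to-bucket> [-verbose]
./splunk check-integrity -index <index-name> [-verbose]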
Hello there! After following these docs for SSL certificate installation: https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/Howtoself-signcertificates, https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/HowtoprepareyoursignedcertificatesforSplunk, and https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/SecureSplunkWebusingasignedcertificate, I receive an error message when I try to restart Splunk:

Cannot decrypt private key in "/opt/splunk/etc/auth/mycerts/myServerPrivateKey.key" without a password

web.conf:

[settings]
enableSplunkWebSSL = true
privKeyPath = /opt/splunk/etc/auth/mycerts/myServerPrivateKey.key
serverCert = /opt/splunk/etc/auth/mycerts/splunkCert.pem

Any solutions for this issue will be appreciated!
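Two common ways out, sketched: either give Splunk Web the key's passphrase via sslPassword, or strip the passphrase from the key with OpenSSL (assuming an unencrypted key file is acceptable in your environment):

# Option 1: web.conf
[settings]
enableSplunkWebSSL = true
privKeyPath = /opt/splunk/etc/auth/mycerts/myServerPrivateKey.key
serverCert = /opt/splunk/etc/auth/mycerts/splunkCert.pem
sslPassword = <the key's passphrase>

# Option 2: remove the passphrase, then point privKeyPath at the new file
openssl rsa -in myServerPrivateKey.key -out myServerPrivateKey-nopass.key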
How do I get access to IA4S, app #7186? I assume it is probably still in testing, but I am very interested.
Does anyone know of a list of component codes and their meanings, for at least _internal and _audit? I have asked instructors and Splunk directly, with no help so far.
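Pending an official reference, you can at least enumerate the component values your own deployment emits and compare them against the messages they produce. A sketch:

index=_internal sourcetype=splunkd | stats count BY log_level, component | sort - count

index=_audit sourcetype=audittrail | stats count BY action | sort - count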
Hello, I have a really basic question. I have a .csv file saved in Splunk, which I believe is indexed; this is not the output of a search but a file fed into Splunk from another source. I want to be able to open the file in Splunk search. Can you please advise what command I should use in Splunk search to see the content of the .csv? Thank you.
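Two sketches, depending on how the file entered Splunk (placeholder names throughout). If it was indexed as events, search its index and source; if it was uploaded as a lookup file, use inputlookup:

index=<your_index> source="*yourfile.csv"

| inputlookup yourfile.csv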
I've been trying to get a new Developer License for more than a week and getting the same error message. I've also sent an email to devinfo@splunk.com but have not heard back yet.  Is there any other way of getting a Developer License? Error: Developer License Request Error An error occurred while requesting a developer license. Please try again. If this error continues to occur, contact devinfo@splunk.com for assistance.    
"I installed splunkforwarder-8.2.9 on Oracle Linux 7.4 and added the Linux add-on to it through the Deployment Server. Although the logs from this server are being received by the HF (we verified thi... See more...
"I installed splunkforwarder-8.2.9 on Oracle Linux 7.4 and added the Linux add-on to it through the Deployment Server. Although the logs from this server are being received by the HF (we verified this using tcpdump), when we search the desired index, we don't see any logs in it. (Our Splunk version is 9.2.1 and UF version is 8.2.9, and due to certain reasons, we cannot use the latest version of UF.)"
I believe I have what is a very simple question, but with all my searching I have been unable to find an answer. I've made a simple dashboard to show successful and failed logins to our application.  I have created a dropdown/radio button panel with some static options shown below.  I can show all results with an asterisk and only successful logins with 0, but using "!=0" to get everything that doesn't equal 0 doesn't produce any results. I have tried some basic combinations of !=0, !="0", !=="0" here in the Static Options window. What am I missing?  The tutorials I've found don't specifically cover this type of syntax.  Thank you in advance!  
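The usual trick is to keep the comparison operator inside the token value and place the token directly against the field name in the search, so != survives substitution intact. A minimal Simple XML sketch with a hypothetical status field:

<input type="radio" token="status_filter">
  <label>Login result</label>
  <choice value="=*">All</choice>
  <choice value="=0">Successful</choice>
  <choice value="!=0">Failed</choice>
  <default>=*</default>
</input>

and in the panel's search: index=app_logs status$status_filter$ | ...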
What are the various methods to integrate third-party SaaS applications with Splunk?