All Posts

It works, thanks!
I have defined the following sourcetype for a CSV file data input without headers:

[test_csv]
SHOULD_LINEMERGE = false
TRANSFORMS = drop_start_and_interim
INDEXED_EXTRACTIONS = csv
FIELD_NAMES = 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184,185,186,187
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIMESTAMP_FIELDS = 14
category = Structured
description = Comma-separated value format. Set header and other settings in "Delimited Settings"
disabled = false
pulldown_type = true

When I index a test file I see that one of the destination fields is not correctly extracted: this field is bounded by 2 double quotes and is extracted together with the next field as a single field.
A sample raw with the problem is the following where I have marked the field in red: 2,"127.0.0.1",5060,"258670334_106281015@83.72.181.1","258670334_106281015@83.72.181.1","258670334_106281015@83.72.181.1","SIP",,,"<sip:+34765300391@83.72.181.1;user=phone>;tag=gK0a655dd7","<sip:+376826792@193.178.74.21;user=phone>",1,1611,"14:35:43.412 CET Jan 09 2024","14:35:52.884 CET Jan 09 2024","15:02:43.220 CET Jan 09 2024",1,"s0p2",53,"s0p0",52,"IMS","IX","localhost:154311320","PCMA","IX","83.72.181.97",40072,"193.178.74.21",20526,"IMS","10.12.162.20",16864,"10.12.45.10",25732,0,0,0,0,0,0,0,1,17551834,80513,9284,440,"localhost:154311321","PCMA","IMS","10.12.45.10",25732,"10.12.162.20",16864,"IX","193.178.74.21",20526,"83.72.181.97",40072,0,0,0,0,0,0,0,2,17552488,80516,9284,440,,,,"0.0.0.0",0,"0.0.0.0",0,,"0.0.0.0",0,"0.0.0.0",0,0,0,0,0,0,0,0,0,0,0,,,,"0.0.0.0",0,"0.0.0.0",0,,"0.0.0.0",0,"0.0.0.0",0,0,0,0,0,0,0,0,0,0,0,"bb6c6d3001911f060e83641d9e64",""aaa://inf.tsa"","SCZ9.0.0 Patch 2 (Build 211)","GMT-01:00",245,"sip:+376826792@193.178.74.21:5060;user=phone",,,,,"sip:+34765300391@83.72.181.1:5060;user=phone","193.178.74.21:5060","83.72.181.1:5060","10.12.193.4:5060","10.59.90.201:5060",,3,2,0,0,"sip:+376826792@FO01-vICSCF-01.ims.mnc006.mcc333.3gppnetwork.org:5060;user=phone",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,"15:02:43.220 CET Jan 09 2024","15:02:43.220 CET Jan 09 2024","00:00:00.000 UTC Jan 01 1970","00:00:00.000 UTC Jan 01 1970","audio","audio",,,17551834,80513,17552052,80514,0,0,0,0,19516010 The content of the field 117 is: "aaa://inf.tsa","SCZ9.0.0 Patch 2 (Build 211) It corresponds to the fields 117 and 118 concatenated, and the following fields are all offset one position I have tried to replace the 2 double quotes by 1 in 2 ways:  Adding the line SEDCMD = s/""/"/g at the first line in the sourcetype definition in the props.conf but it only changes the _raw and still have the same issue extracting the field 117 and the offset of the following fields I have tried to 
overwrite de _raw replacing the 2 double quotes by 1 with the following transforms: [rewrite_raw] INGEST_EVAL = _raw:=replace(_raw, "\"\"", "\"") Applied in the sourcetype after the other transform that drops some kind of rows based on the value of the first field TRANSFORMS = drop_start_and_interim, rewrite_raw And the result is the same, the _raw is changed but the issue extracting the filed 117 and offset of the followings persists I also have tried to rewrite the _raw with the following transform and it neither has solved the problem, the result has been the same: [remove_double_quotes] SOURCE_KEY = _raw REGEX = (?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"
|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)
(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""
|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*?)(?:\"\"|\"|)\,(?:\""|\"|)(.*)(?:\"\"|\"|) FORMAT = "$1","$2","$3","$4","$5","$6","$7","$8","$9","$10","$11","$12","$13","$14","$15","$16","$17","$18","$19","$20","$21","$22","$23","$24","$25","$26","$27","$28","$29","$30","$31","$32","$33","$34","$35","$36","$37","$38","$39","$40","$41","$42","$43","$44","$45","$46","$47","$48","$49","$50","$51","$52","$53","$54","$55","$56","$57","$58","$59","$60","$61","$62","$63","$64","$65","$66","$67","$68","$69","$70","$71","$72","$73","$74","$75","$76","$77","$78","$79","$80","$81","$82","$83","$84","$85","$86","$87","$88","$89","$90","$91","$92","$93","$94","$95","$96","$97","$98","$99","$100","$101","$102","$103","$104","$105","$106","$107","$108","$109","$110","$111","$112","$113","$114","$115","$116","$117","$118","$119","$120","$121","$122","$123","$124","$125","$126","$127","$128","$129","$130","$131","$132","$133","$134","$135","$136","$137","$138","$139","$140","$141","$142","$143","$144","$145","$146","$147","$148","$149","$150","$151","$152","$153","$154","$155","$156","$157","$158","$159","$160","$161","$162","$163","$164","$165","$166","$167","$168","$169","$170","$171","$172","$173","$174","$175","$176","$177","$178","$179","$180","$181","$182","$183","$184","$185","$186","$187" DEST_KEY =_raw Is there any way to solve this problem? Thank you
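The root of the extraction problem can be reproduced outside Splunk. Per RFC 4180, a doubled quote inside a quoted field is one escaped quote, so a field wrapped in "" ... "" is ambiguous: a strict parser never sees the field close before the comma and merges it with its neighbour, which matches the field 117/118 concatenation described above. A minimal Python sketch (the row is shortened to four fields; this illustrates the parsing ambiguity, not a Splunk configuration fix):

```python
import csv
import io
import re

# Shortened stand-in for the problematic event: the second field is
# wrapped in doubled quotes ("" ... ""), like field 117 in the real row.
raw = 'a,""aaa://inf.tsa"","SCZ9.0.0 Patch 2 (Build 211)",z'

# Under RFC 4180, "" inside a quoted field means one escaped quote, so a
# strict parser never sees the field close before the comma and merges it
# with the next field. Python's lenient csv reader garbles it differently,
# but either way the field is not extracted cleanly:
bad = next(csv.reader(io.StringIO(raw)))
print(bad)

# Collapsing each doubled quote to a single one before parsing restores a
# clean four-field split.
fixed = next(csv.reader(io.StringIO(re.sub(r'""', '"', raw))))
print(fixed)
```

This is consistent with the symptom that rewriting _raw after the fact does not help: the CSV field split has to happen on the already-cleaned text.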
Sounds like when you generated the key for your Splunk web server (privKeyPath = /opt/splunk/etc/auth/mycerts/myServerPrivateKey.key) you set a password/passphrase. So now you need to use this in web.conf (the password that protects the private key specified by 'privKeyPath'): sslPassword = (your privKeyPath key password/passphrase) Set your password with the above settings, under [settings], and restart - see if that works. 
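Combined with the poster's existing web.conf, the [settings] stanza would then look something like this (a sketch; the last line is a placeholder that must be replaced with the actual passphrase used when the key was generated):

```
[settings]
enableSplunkWebSSL = true
privKeyPath = /opt/splunk/etc/auth/mycerts/myServerPrivateKey.key
serverCert = /opt/splunk/etc/auth/mycerts/splunkCert.pem
sslPassword = <your private key passphrase>
```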
Please change the sourcetype and try
I do understand that "my_field" was just a placeholder since you did not know the name of my tokens. My actual field is status.errorCode, and creating a token "errorCode" from that does pull my results into the dashboard. The problem comes when I try to filter my token "errorCode" to show anything that isn't a value of 0. I posted my code in another reply here.
Has anyone worked with ./splunk check-integrity and, if so, do you know how to interpret the results? This link does not provide information on how to interpret the results - https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/Dataintegritycontrol I was provided cursory information, but it still does not tell me enough to know when a compromise may have occurred and where. Example
Thanks again! The btool command is new to me. The results are much bigger in Forwarder 2, so there might be something in there.
Well... it's kinda complicated because we're talking about CSV. Normally most of the data is just split into separate events (usually one event per line), some metadata is added, and the fields are extracted at search time. But in the case of CSV, the fields can be split at the moment of ingestion and indexed, becoming immutable after ingestion (so-called indexed extractions). So it depends heavily on your configuration.
Were you able to get past this error? If so, what was the resolution? I am facing the same issue now.
I tried the command you gave me, but nothing is displayed when adding _time in the BY clause. Additionally, I added other data, but I would like to display one user per line rather than grouping multiple users together because they share the same IP address. For instance, on a certain IP address, multiple services were used, but I don't know which service was used. So, if we display one user per line, I think it will be unnecessary to use earliest and latest and we can just display the correct _time, right?

(index="index1" Users =* IP=*) OR (index="index2" tag=1 )
| where NOT match(Users, "^AAA-[0-9]{5}\$")
| eval IP=if(match(IP, "^::ffff:"), replace(IP, "^::ffff:(\d+\.\d+\.\d+\.\d+)$", "\1"), IP)
| eval ip=coalesce(IP,srcip)
| stats dc(index) AS index_count values(Users) AS Users values(destip) AS destip values(service) AS service earliest(_time) AS earliest latest(_time) AS latest BY ip
| where index_count>1
| eval earliest=strftime(earliest,"%Y-%m-%d %H:%M:%S"), latest=strftime(latest,"%Y-%m-%d %H:%M:%S")
| table Users, ip, dest_ip, service, earliest, latest
Hello there! After following these docs for SSL certificate installation: https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/Howtoself-signcertificates, https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/HowtoprepareyoursignedcertificatesforSplunk, https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/SecureSplunkWebusingasignedcertificate I receive an error message when I try to restart Splunk:

Cannot decrypt private key in "/opt/splunk/etc/auth/mycerts/myServerPrivateKey.key" without a password

web.conf:

[settings]
enableSplunkWebSSL = true
privKeyPath = /opt/splunk/etc/auth/mycerts/myServerPrivateKey.key
serverCert = /opt/splunk/etc/auth/mycerts/splunkCert.pem

Any solutions for this issue will be appreciated!
Hi @HugheJass , what's the field you are using for the conditions *, 0, and !=0? In the shared code there isn't any field. my_field is a placeholder name that means the field you're using. Ciao. Giuseppe
How do I get access to IA4S, app #7186? I assume it is probably still in testing, but I am very interested.
Thanks, this is helping! I can see now that there are indeed separate events indexed in Splunk's own data format. Now... how can I ensure that the specific information within the events is used in a Splunk search? For example, one of the pieces of information within an event is the name of a parent group. How can I ensure that, when I run a search, it will look into these events and match my results with the corresponding parent group? Thank you for your patience and please bear with me while I try to work this out!
Does anyone know of a list of component codes and their meanings for at least _internal and _audit? I have asked instructors and Splunk directly with no help so far. 
Thank you for your formatting advice. This is my first detailed post as I dive into Splunk dashboards, so I will keep that in mind moving forward. The token works fine in general with a wildcard or 0, so I didn't add more detail on how it's used because I didn't think that part needed troubleshooting. The data is there in my results. I figured there is a simple syntax issue that's stopping me from filtering it properly. I'm pulling login events from Azure AD. The field I'm working with here is status.errorCode. I'm using two tokens - UserID and errorCode.

index="mscloud" userPrincipalName="$UserID$" status.errorCode="$errorCode$"
| spath userPrincipalName
| search userPrincipalName="*@company.com"
| spath status.errorCode
| search status.errorCode="*"
| sort _time + desc
| table _time createdDateTime userPrincipalName appDisplayName status.errorCode status.failureReason status.additionalDetails clientAppUsed conditionalAccessStatus
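For reference, one way the "everything except 0" filter could be expressed in SPL is as an explicit search clause after spath, rather than through the token. This is a hedged sketch, not a verified fix for the dashboard: the field and index names are taken from the query above, and whether the errorCode token should then be dropped from the base search is an open question:

```
index="mscloud" userPrincipalName="$UserID$"
| spath status.errorCode
| search status.errorCode!=0
| table _time userPrincipalName status.errorCode
```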
I have no idea what "Splunk explorer" you're talking about. Honestly. But it doesn't work the way you think it does. If it's ingested into an index, it's split into separate events and indexed into Splunk's own data format. There is no "csv file" of the data anymore on Splunk's side. Assuming you're indeed talking about indexed data, not the lookups.
I did verify it by comparing the inputs.conf and outputs.conf files. They are exactly the same. The files in etc/system/local (because that's where splunk add monitor creates entries, as far as I remember) might be identical, but you may be inheriting some settings from other configs. That's why I asked about btool. Do splunk btool inputs list --debug and splunk btool outputs list --debug to see the effective config on both forwarders. That's the first thing to check. Another thing is to verify the props/transforms to see if they - for example - only match a specific subset of data which is matched by events coming from one host but not from the other. It's hard to advise something specific without knowing your config and your data.  
Thank you for the tip to add the field into my condition, but this produces no results. Leaving in * and 0 shows those results, but even putting "my_field=0" shows none.
Hi PickleRick, thank you for your answer. I know where the file is and I can open it from Splunk explorer. This indexed file is used in one of my searches, but unfortunately the search has recently stopped providing correct information. When investigating the issue, I discovered that the data pulled from the indexed file is missing values in some columns (which are crucial), and therefore the search results are incorrect. When I open the .csv file directly from its location, the values in all columns are correct. I wanted to open the .csv file in Splunk search to see what it would look like, but if this is not possible, I will have to find another way of working this out.