vishenps
Path Finder

Hi folks, 
Happy new year to you all:-)

In my org the Splunk deployment is as follows:

Two heavy forwarders (HF1, HF2) collect data from directories and HTTP inputs and forward it to Splunk Cloud (2 search heads).

Case: We have an Active Directory add-on on HF1, which establishes a connection to AD, writes a CSV file under var/* on the host, and that file is then indexed to the cloud.

The admin said we have an input that writes data to index=asset_identity. I am not sure what the admin was referring to. Is it a conf file on the HF?
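
For context, an input like that on a HF would typically be a monitor stanza in an inputs.conf that picks up the CSV and routes it to that index. A minimal sketch of what I imagine it could be (the path and sourcetype are guesses, not our actual config):

# inputs.conf on HF1 (hypothetical path and sourcetype; point the monitor at wherever the add-on writes the CSV)
[monitor:///opt/splunk/var/run/splunk/csv/ad_identities.csv]
index = asset_identity
sourcetype = ad_asset_identity
disabled = 0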


vishenps
Path Finder

 

| ldapsearch domain="default" search="(&(samAccountType=000000000) (|(sAMAccountName=*)))" attrs="sAMAccountName, distinguishedName, userAccountControl, whenCreated, personalTitle, displayName, givenName, sn, mail, telephoneNumber, mobile, manager, department, co, l, st, accountExpires, memberOf"
| rex field=memberOf "CN=(?<memberOf_parsed>[^,]+)"
| eval memberOf=lower(replace(mvjoin(memberOf_parsed, "|"), " ", "_"))
| rex max_match=5 field=distinguishedName "OU=(?<dn_parsed>[^,]+)"
| eval category=lower(replace(mvjoin(dn_parsed, "|"), " ", "_"))
| eval priority=case(match(category, "domain_admin|disabled|hold|executive") OR match(memberOf, "domain_admins|enterprise_admins|schema_admins|administrators"), "critical",
  match(category, "contractor|service_account|external"), "high", match(category, "employees|training|user_accounts|users|administration"), "medium", 1==1, "unknown")
| eval watchlist=case(match(category,"disabled|hold"), "true", 1==1, "false")
| eval startDate=strftime(strptime(whenCreated,"%Y%m%d%H%M"), "%m/%d/%Y %H:%M")
| eval endDate=strftime(strptime(accountExpires,"%Y-%m-%dT%H:%M:%S%Z"), "%m/%d/%Y %H:%M")
| eval work_city=mvjoin(mvappend(l, st), ", ")
| rename sAMAccountName as identity, personalTitle as prefix, displayName as nick, givenName as first, sn as last, mail as email, telephoneNumber as phone, mobile as phone2, manager as managedBy, department as bunit, co as work_country
| fillnull value="unknown" category, priority, bunit
| table identity, prefix, nick, first, last, suffix, email, phone, phone2, managedBy, priority, bunit, category, watchlist, startDate, endDate, work_city, work_country, work_lat, work_long
| outputcsv xyz.csv

 

This is the search that is being used to generate the CSV file, and yes, it's the same add-on you mentioned.
I believe you're right: they're writing the CSV to a directory (on the same host as the HF), presumably via a scheduled saved search (see the sketch below), and ingesting it with an inputs.conf file.
That's because in Splunk Cloud we cannot monitor directories directly from the cloud instance.
Correct me if I'm wrong? Thanks.
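
For reference, a minimal savedsearches.conf sketch of how the CSV-generating search could be scheduled on the HF; the stanza name, schedule, and dispatch times are assumptions, not the actual config:

# savedsearches.conf on HF1 (hypothetical stanza name and schedule)
[Generate AD Identity CSV]
search = | ldapsearch domain="default" ... | outputcsv xyz.csv
enableSched = 1
cron_schedule = 0 */6 * * *
dispatch.earliest_time = -24h@h
dispatch.latest_time = now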


m_pham
Splunk Employee

Can you clarify which technical add-on you're using? Also, could you ask your admin to clarify your original question?

If you're using this add-on here, then you can write a search using the ldapsearch command and write the results to an index with the collect command. Otherwise, generating the CSV file and then using a file monitor input to ingest the CSV is the long way to do it.
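
For example, a minimal sketch of that approach, assuming an asset_identity index already exists in your Splunk Cloud stack and using a trimmed attribute list:

| ldapsearch domain="default" search="(objectClass=user)" attrs="sAMAccountName, displayName, mail, department"
| eval _time=now()
| collect index=asset_identity

That skips the intermediate CSV and the file monitor entirely; note that collected events may count against your license if you override collect's default stash sourcetype.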
