Activity Feed
- Karma Re: comparing fields to find identical values for gcusello. 12-15-2020 09:41 AM
- Got Karma for Re: comparing fields to find identical values. 12-15-2020 09:23 AM
- Posted Re: comparing fields to find identical values on Splunk Search. 12-15-2020 09:18 AM
- Posted Re: comparing fields to find identical values on Splunk Search. 12-15-2020 08:05 AM
- Posted Re: comparing fields to find identical values on Splunk Search. 12-15-2020 06:43 AM
- Posted Re: comparing fields to find identical values on Splunk Search. 12-15-2020 05:51 AM
- Posted comparing fields to find identical values on Splunk Search. 12-14-2020 04:45 PM
- Got Karma for Re: Filtering wmi events on a heavy forwarder. 06-05-2020 12:45 AM
- Posted Re: Parsing XML into fields is not working properly on Dashboards & Visualizations. 03-19-2020 02:58 PM
- Posted Re: Parsing XML into fields is not working properly on Dashboards & Visualizations. 03-18-2020 04:42 PM
- Posted Re: Parsing XML into fields is not working properly on Dashboards & Visualizations. 03-18-2020 11:26 AM
- Posted Re: Parsing XML into fields is not working properly on Dashboards & Visualizations. 03-18-2020 11:20 AM
- Posted Re: Parsing XML into fields is not working properly on Dashboards & Visualizations. 03-18-2020 09:46 AM
- Posted Re: Parsing XML into fields is not working properly on Dashboards & Visualizations. 03-18-2020 09:28 AM
- Posted Parsing XML into fields is not working properly on Dashboards & Visualizations. 02-21-2020 12:21 PM
- Tagged Parsing XML into fields is not working properly on Dashboards & Visualizations. 02-21-2020 12:21 PM
12-15-2020
09:18 AM
1 Karma
I tried creating a new index and imported a much smaller subset of data that I knew the results for, so it was easier to verify the Splunk search results. This worked:
index=easy_c
    [ search index=easy_c
    | dedup User_Full_Name
    | rename User_Full_Name AS Customer_Full_Name
    | fields Customer_Full_Name ]
| table _time User_Full_Name Customer_Full_Name
Thank you so much for your help in getting this figured out, @gcusello
12-15-2020
08:05 AM
So here's my modification of what you provided:
index=import
    [ search index=import
    | dedup Customer_Full_Name
    | rename Customer_Full_Name AS User_Full_Name
    | fields User_Full_Name ]
| table _time User_Full_Name Customer_Full_Name
I think it's working: there are far fewer results, 1,241 (the index holds around 60,000 events in total), and only 7 users are listed as having accessed records. If I reverse the field values like this:
index=import
    [ search index=import
    | dedup User_Full_Name
    | rename User_Full_Name AS Customer_Full_Name
    | fields Customer_Full_Name ]
| table _time User_Full_Name Customer_Full_Name
my results are different. Shouldn't I get the same results?
12-15-2020
06:43 AM
If I only want the events where they look up their own record, I can use this:
index=customers | where Customer_Full_Name=User_Full_Name
which is similar to the eval you posted, just without tags. On the other hand,
| search "Customer_Full_Name" = "User_Full_Name"
returns no results at all. The search needs to take the value of Customer_Full_Name in a single event, compare it against User_Full_Name in every event to see if it matches, and then return only those results. I'm hoping that I'm making at least some sense.
12-15-2020
05:51 AM
The first search works just fine, but it's not exactly what I'm looking for. The second option just ends up returning all of the events and doesn't break anything down. What I want to do is eliminate from the results all customers that aren't users.
12-14-2020
04:45 PM
I imported a CSV into Splunk and now I need to compare two of the fields to find identical values: compare the values of "Customer_Full_Name" and "User_Full_Name" to find who, if anyone, is both a customer and a user. I feel like eval should be able to help here, but I can't think of how to do it. Once I have that figured out, I need to see if there are users looking at the records of customers that happen to also be users, but I'll leave that for another question later.
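For what it's worth, one way to sketch this in SPL (assuming everything sits in a single index, here called customers as in a later post in this thread) is to collect each field's distinct values, tag which side each came from, and keep the names that show up on both sides:

```spl
index=customers
| dedup Customer_Full_Name
| fields Customer_Full_Name
| rename Customer_Full_Name AS name
| eval role="customer"
| append
    [ search index=customers
      | dedup User_Full_Name
      | fields User_Full_Name
      | rename User_Full_Name AS name
      | eval role="user" ]
| stats dc(role) AS roles BY name
| where roles=2
```

A name ending up with roles=2 appeared under both fields, i.e. that person is both a customer and a user.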
03-19-2020
02:58 PM
I ended up doing a custom field extraction for the fields I wanted. I had to write my own regex since the auto regex wasn't cooperating.
For username:
^(?:.*)<Username>(?P<username>[^<]+)
For source IP:
^(?:.*)<IpAddress>(?P<src_ip>[^<]+)
For the workstation that the user connects to:
^(?:.*)<Resource>(?P<workstation>[^<]+)
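If these extractions ever need to live outside the field extractor UI, they can also be written as search-time EXTRACT entries in props.conf; a sketch, assuming the events carry the sourcetype XmlWinEventLog from the original question:

```ini
# props.conf -- the stanza name is an assumption; match it to your actual sourcetype
[XmlWinEventLog]
EXTRACT-username    = ^(?:.*)<Username>(?P<username>[^<]+)
EXTRACT-src_ip      = ^(?:.*)<IpAddress>(?P<src_ip>[^<]+)
EXTRACT-workstation = ^(?:.*)<Resource>(?P<workstation>[^<]+)
```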
03-18-2020
04:42 PM
Tried it but no change. Still not parsing everything in the XML data.
03-18-2020
11:26 AM
Looking more at the live data: the following section of the XML gets parsed into a single field called UserData_Xml. Still no idea how to have it parse deeper.
<EventInfo xmlns="aag">
<Username>domain\username</Username>
<IpAddress>173.x.x.x</IpAddress>
<AuthType>NTLM</AuthType>
<Resource />
<ConnectionProtocol>HTTP</ConnectionProtocol>
<ErrorCode>0</ErrorCode>
</EventInfo>
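Since the block above lands in a single UserData_Xml field, one hedged alternative is to pull the inner elements out at search time with spath (the input field name UserData_Xml comes from this post; the output field names are illustrative):

```spl
sourcetype=XmlWinEventLog
| spath input=UserData_Xml path=EventInfo.Username output=username
| spath input=UserData_Xml path=EventInfo.IpAddress output=src_ip
| table _time username src_ip
```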
03-18-2020
11:20 AM
Looking at the live data again. <message> gets parsed.
03-18-2020
09:46 AM
Should my props.conf and transforms.conf look exactly as you have them? And is REPORT-xms_second a typo? I tried it as written and as REPORT-xml_second, but it didn't make a difference either way.
03-18-2020
09:28 AM
Tried it as shown and it worked. However, if I collapse the XML into a single line of text (like it is as it gets ingested), it breaks. Played with it a bit and it looks like the <Message> section is what breaks it because the makeresults parses fine when I remove it.
02-21-2020
12:21 PM
Splunk isn't completely parsing the XML into fields in search results, only sections. For example, in the sample event below, the System and UserData sections become fields, but the XML elements inside them (e.g. Username and IpAddress) are not parsed into fields.
Based on some of what I've read here in the forums, I've already edited my props.conf for sourcetype=XmlWinEventLog but haven't seen any change.
[source::XmlWinEventLog]
KV_MODE=xml
TRUNCATE = 0
I don't know what I'm missing and could use some help. (Hell, what I put in there, Splunk was probably already doing)
Here's a sample event (I added line breaks to make it easier to read; in the raw search results it's a single line):
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event" xml:lang="en-US">
<System>
<Provider Name="Microsoft-Windows-TerminalServices-Gateway" Guid="{4D5AE6A1-C7C8-4E6D-B840-4D8080B42E1B}" />
<EventID>200</EventID>
<Version>0</Version>
<Level>4</Level>
<Task>2</Task>
<Opcode>30</Opcode>
<Keywords>0x4020000001000000</Keywords>
<TimeCreated SystemTime="2020-02-21T18:54:19.913701800Z" />
<EventRecordID>1219</EventRecordID>
<Correlation ActivityID="{BEA11342-474B-47DE-907D-F2FBEBD40000}" />
<Execution ProcessID="5480" ThreadID="8416" />
<Channel>Microsoft-Windows-TerminalServices-Gateway/Operational</Channel>
<Computer>gatewayserver.domain.com</Computer>
<Security UserID="S-1-5-20" />
</System>
<UserData>
<EventInfo xmlns="aag">
<Username>domain\username</Username>
<IpAddress>173.x.x.x</IpAddress>
<AuthType>NTLM</AuthType>
<Resource />
<ConnectionProtocol>HTTP</ConnectionProtocol>
<ErrorCode>0</ErrorCode>
</EventInfo>
</UserData>
<RenderingInfo Culture="en-US">
<Message>The user "domain\username", on client computer "173.x.x.x", met connection authorization policy requirements and was therefore authorized to access the RD Gateway server. The authentication method used was: "NTLM" and connection protocol used: "HTTP".</Message>
<Level>Information</Level>
<Task />
<Opcode />
<Channel />
<Provider />
<Keywords>
<Keyword>Audit Success</Keyword>
</Keywords>
</RenderingInfo>
</Event>
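One detail worth double-checking in the props.conf snippet above: a bare stanza name in props.conf matches a sourcetype, while the source:: prefix matches the source path. If these events really carry sourcetype=XmlWinEventLog, the stanza would look like this instead:

```ini
[XmlWinEventLog]
KV_MODE = xml
TRUNCATE = 0
```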
08-25-2010
06:10 PM
1 Karma
I was completely off!
Process_Name can't be used (unless I want to get really hardcore with editing other conf files).
Here is the working config:
props.conf
[WinEventLog:Security]
TRANSFORMS-null = setnull
transforms.conf
[setnull]
REGEX=(?msi)^EventCode=(520.*netman\.exe|4656.*rtvscan\.exe)
DEST_KEY=queue
FORMAT=nullQueue
08-23-2010
06:59 PM
I even used the msi option.
08-23-2010
06:57 PM
bump! Help? Anyone?
08-21-2010
01:34 AM
Never mind. It still doesn't work!
08-20-2010
11:54 PM
I think I found it. There is a "." in the REGEX; I needed to put a backslash "\" before the ".". So the lines should have been:
REGEX=(?m)^Process_Name="C:\\Winpds\\Prismexe\\netman\.exe"
REGEX=(?m)^Process_Name="C:\\Program*\\Symantec\\Symantec Endpoint Protection\\Rtvscan\.exe"
Did I mention that I'm new to this whole REGEX thing?
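One more regex subtlety in the second line above: `Program*` means "Progra" followed by zero or more "m" characters, not a wildcard. If the intent was to match any folder name starting with "Program", a sketch of that piece (an illustration, not the fix the thread later settled on) would be:

```ini
REGEX=(?m)^Process_Name="C:\\Program[^\\]*\\Symantec\\Symantec Endpoint Protection\\Rtvscan\.exe"
```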
08-20-2010
06:29 PM
Well, apparently on the 2 problem systems, this is the solution:
I had to add the line
index = windows
to %SPLUNK_HOME\etc\system\local\inputs.conf
and removed:
- %SPLUNK_HOME\etc\apps\SplunkForwarder\local\inputs.conf (on the central heavy forwarder)
- %SPLUNK_HOME\etc\apps\SplunkLightForwarder\local\inputs.conf (on the Windows 7 light forwarder)
I still have no clue why it needed to be different for these 2 systems.
08-20-2010
06:18 PM
OK, I got the central forwarder fixed. I had to add the line
index = windows
to %SPLUNK_HOME\etc\system\local\inputs.conf and remove %SPLUNK_HOME\etc\apps\SplunkForwarder\local\inputs.conf.
08-20-2010
06:09 PM
OK, any idea as to why these 2 systems aren't cooperating then?
08-20-2010
05:54 PM
I don't think my syntax is correct though. These events are still being forwarded to the central indexer.
08-20-2010
03:03 AM
So on those 2 systems do you think placing the inputs.conf in %SPLUNK_HOME\etc\system\local will do the trick?
08-18-2010
09:14 PM
I have a bunch of light forwarders sending data to a central heavy forwarder which then sends the data to the main indexer.
On the central heavy forwarder and the main indexer I've created an index called windows.
When I run a search on the main indexer, it doesn't look like all of the data is going into the windows index, and I don't know why. In particular, one Windows 7 Professional system (a light forwarder) and a Windows XP Pro system (the central heavy forwarder) are the problem; the other systems are getting forwarded just fine (they consist of Win 2000, XP, Server 2003, and Server 2008 R2). Help?
On each light forwarder I've placed an inputs.conf at %SPLUNK_HOME\etc\apps\SplunkLightForwarder\local\inputs.conf:
[default]
index = windows
On the central heavy forwarder I've placed the inputs.conf in: %SPLUNK_HOME\etc\apps\SplunkForwarder\local\inputs.conf
[default]
index = windows
08-18-2010
08:24 PM
I have a bunch of light forwarders sending data to a central heavy forwarder which sends the data to the main indexer.
This is my props.conf and transforms.conf, located on the central heavy forwarder. The light forwarders and main indexer do not have a props.conf or transforms.conf.
Is this correct? What am I doing wrong, and is there a more efficient way to do this?
Thanks.
props.conf (located at %SPLUNK_HOME\etc\system\local\props.conf)
[wmi]
TRANSFORMS-wmi=wminull
transforms.conf (located at %SPLUNK_HOME\etc\system\local\transforms.conf)
[wminull]
REGEX=(?m)^Process_Name="C:\\Winpds\\Prismexe\\netman.exe"
DEST_KEY=queue
FORMAT=nullQueue
[wminull]
REGEX=(?m)^Process_Name="C:\\Program*\\Symantec\\Symantec*Endpoint*Protection\\Rtvscan.exe"
DEST_KEY=queue
FORMAT=nullQueue
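A likely culprit in the transforms.conf above, separate from the regexes themselves: both stanzas share the name [wminull], and duplicate stanza names in a .conf file are merged, so only one REGEX survives. A sketch with distinct names (the stanza names here are illustrative), wired together in props.conf:

```ini
# props.conf
[wmi]
TRANSFORMS-wmi = wminull_netman, wminull_rtvscan

# transforms.conf
[wminull_netman]
REGEX=(?m)^Process_Name="C:\\Winpds\\Prismexe\\netman\.exe"
DEST_KEY=queue
FORMAT=nullQueue

[wminull_rtvscan]
REGEX=(?m)^Process_Name="C:\\Program[^\\]*\\Symantec\\Symantec Endpoint Protection\\Rtvscan\.exe"
DEST_KEY=queue
FORMAT=nullQueue
```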