The log contains, say, 10,000 lines, each with a status of "SUCCESS" or "MAJOR". Currently the query contains two searches: a main query that finds the log lines marked "SUCCESS", then an appended subsearch that finds the log lines marked "MAJOR". The two searches run separately, contributing roughly 2n scans (n = number of lines in the log file).
Could you please let us know how to run the search in an optimized way, ideally with a single pass (n) over the data? The "SUCCESS" and "MAJOR" log lines contain the same information.
2014-01-09 02:04:09,121 [450d450d] MAJOR: FTP Inbound Agent : Directory [] File [] on Server [10.6.16.222] Service [FTP VPN Inbound_FINCEN] User [tony] could not be scanned: Could not connect because of error [org.apache.commons.net.ftp.FTPConnectionClosedException: FTP response 421 received. Server closed connection.]. The FTP configuration attributes are server [11.120.110.111:2442], username [tony@10.8.12.211:10021], account [], Site CMDs prior [], Site CMDs after []. BizManager filename [], datastore filename: [], Cascading message: [Could not connect because of error [org.apache.commons.net.ftp.FTPConnectionClosedException: FTP response 421 received. Server closed connection.]. The FTP configuration attributes are [ username [tony@10.8.12.211:10021], account [], Site CMDs prior [], Site CMDs after []]] [.io.agents.nftp.inbound.FtpInboundAgent]
20131220.dbg-11-trc-0.log:2013-12-20 09:37:24,652 [7e4e7e4e] SUCCESS: File successfully uploaded using SFTP. Filename was [BYGSFB20.F00]. File length was [1407178]. Connected to host [ftp.uuu.co.uk]. Key fingerprint is [4g:52:77:5v:5b:67:b4:fx:c8:a6:c6:33:74:77:f7:b1]. Bit length of key is [1024]. Connected to [CLIENT@ftp.uuu.co.uk:22] via HTTP proxy [11.60.120.322:34567]. Authenticated using password. Transfer Mode [BINARY]. CD to DIR [/Incoming] was successful. Final Filename was [BYGSFB20.F00] in directory [/Incoming] on server [ftp.uuu.co.uk:22]. The upload process took [5509] milliseconds. Upload Transfered at [249.45] kbps. Chmod 0644. Total time to upload file, including retries and encryption, if any, was [22442] milliseconds. [.io.agents.sftp.outbound.SFTPOutboundAgent]
The requirement is to extract useful information from the events listed above: user id, file name, file length/size, the status of FTP-ing the file, and so on. Status is either "SUCCESS" or "MAJOR" (MAJOR = failed). Both events contain the information to be extracted, and we are able to extract the fields with a query.
Our question is about optimizing that search query so it does not scan the data twice.
index=fxr file
| search object="*SUCCESS"
| regex _raw="#*loaded"
| search source="*.dbg-*trc*.log"
| rex "\s\[(?<filename>\S+)].\sFile length (was\s)?\[(?<Sizeoffile>\d+)]"
| rex field=_raw "(?:,\d+\s\S+\s|:\d+:\d+\s)(?<st>\S+):"
| eval Status=case(st=="SUCCESS","completed",st=="MAJOR","failed",st=="+MAJOR","failed")
| append [search index=fxr file
    | search object="*MAJOR"
    | search source="*.dbg-*trc*.log"
    | rex field=_raw "(?:,\d+\s\S+\s|:\d+:\d+\s)(?<st>\S+):"]
| eval Status=case(st=="SUCCESS","completed",st=="MAJOR","failed",st=="+MAJOR","failed")
| table filename Sizeoffile Status
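One way to avoid the second scan is to replace the `append` subsearch with a single base search that matches both statuses. This is a sketch, not a tested query: it assumes both event types come from the same `source`, and it drops the `regex _raw="#*loaded"` filter, which in the original applies only to the SUCCESS branch (re-add it conditionally if it is still needed):

```
index=fxr file (object="*SUCCESS" OR object="*MAJOR") source="*.dbg-*trc*.log"
| rex "\s\[(?<filename>\S+)].\sFile length (was\s)?\[(?<Sizeoffile>\d+)]"
| rex field=_raw "(?:,\d+\s\S+\s|:\d+:\d+\s)(?<st>\S+):"
| eval Status=case(st=="SUCCESS","completed", st=="MAJOR" OR st=="+MAJOR","failed")
| table filename Sizeoffile Status
```

Because the two status filters are ORed into the initial search, the index is read once. On events where a `rex` pattern does not match (for example, MAJOR events without a "File length" field), the corresponding fields are simply left unpopulated, which mirrors the rows produced by the appended subsearch in the original query.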
Note: in the query above we are currently fetching only the Status field. The other field extractions have been removed to keep the question readable and understandable.