One of the most important but tedious tasks a system administrator should perform regularly is to monitor several system log files for evidence of system trouble, security breaches, etc. Unfortunately, these system logs are not very exciting reading. You are to create a Perl program that reads one or more system logs and creates a summary report for an administrator to read, instead of the original log. I have placed links (under Assignment #4) on the course web page to copies of some log files (zipped) from our departmental Linux server.
Here are some ideas about possible summary reports:
File access_log: This is the log produced by the Apache web server of page requests. You might summarize the number of accesses by each distinct IP address, or by each distinct file. How many requests appear to come from the Google, Yahoo, or MSN search tools?
File error_log: This is the log produced by the Apache web server of errors that it detected. You might summarize the types of errors detected, the most frequent error, or the most frequently requested file that was not found.
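One way to approach this is to tally the bracketed severity tag and the "File does not exist" targets. The line shape assumed below (`[timestamp] [level] [client addr] message`) matches Apache 1.3/2.0 error logs but should be checked against a few real lines first.

```perl
#!/usr/bin/perl
# Sketch: tally error levels and missing files in an Apache error_log.
# Assumes the classic "[timestamp] [level] [client addr] message" shape.
use strict;
use warnings;

my (%level, %missing);
while (<>) {
    $level{$1}++   if /^\[[^\]]+\]\s+\[(\w+)\]/;        # e.g. error, notice, warn
    $missing{$1}++ if /File does not exist:\s+(\S+)/;
}

print "Messages by level:\n";
print "  $_: $level{$_}\n" for sort keys %level;

print "Most requested missing files:\n";
for my $f (sort { $missing{$b} <=> $missing{$a} } keys %missing) {
    print "  $missing{$f}  $f\n";
}
```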
File proftpd.log: This is the log file created by the ftp server, proftpd. Primarily it shows who opened and closed ftp sessions and from what IP address. How many sessions were opened by each user? What was the average session duration for each user? What errors were recorded?
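Average session duration requires pairing each login with the matching close, which the server's process id makes possible. The message wording below ("Login successful", "session closed") and the hard-coded year (syslog timestamps omit it) are assumptions about this particular log; compare against real lines and adjust the regexes.

```perl
#!/usr/bin/perl
# Sketch: per-user session counts and average session length from proftpd.log.
# Assumes syslog-style lines "Mon DD HH:MM:SS host proftpd[pid] ..." with
# "USER name: Login successful" and "session closed" messages; these message
# shapes, and the hard-coded year, are assumptions to verify against the log.
use strict;
use warnings;
use Time::Local qw(timelocal);

my %mon; @mon{qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec)} = 0 .. 11;

my (%user_of, %opened_at, %count, %total);
while (<>) {
    my ($m, $d, $hms, $pid) =
        /^(\w\w\w)\s+(\d+)\s+(\d\d:\d\d:\d\d).*proftpd\[(\d+)\]/ or next;
    my ($h, $min, $s) = split /:/, $hms;
    my $t = timelocal($s, $min, $h, $d, $mon{$m}, 2004);   # year not in syslog lines

    if (/USER (\S+): Login successful/) {                  # session opened by $1
        $user_of{$pid}   = $1;
        $opened_at{$pid} = $t;
        $count{$1}++;
    }
    elsif (/session closed/ && exists $opened_at{$pid}) {  # close the matching open
        $total{ $user_of{$pid} } += $t - $opened_at{$pid};
        delete $opened_at{$pid};
    }
}

for my $u (sort keys %count) {
    printf "%-12s %4d sessions, avg %5.0f s\n",
        $u, $count{$u}, ($total{$u} || 0) / $count{$u};
}
```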
File xferlog: This file is also created by the ftp server. It lists the files that were transferred to or from another host, the size of those files, and the user. How many files did each user transfer? How many total bytes did each user transfer?
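The per-user totals fall out of a simple split, assuming the standard xferlog(5) layout: a five-field timestamp, then transfer-time, remote-host, file-size, filename, and so on, with the username as the 14th whitespace-separated field.

```perl
#!/usr/bin/perl
# Sketch: files and total bytes transferred per user from an xferlog.
# Assumes the standard xferlog(5) field layout: five timestamp fields,
# then transfer-time, remote-host, file-size (field 8), filename, ...,
# with the username as field 14.
use strict;
use warnings;

my (%files, %bytes);
while (<>) {
    my @f = split;
    next unless @f >= 14;
    my ($size, $user) = @f[7, 13];
    next unless $size =~ /^\d+$/;   # skip malformed lines
    $files{$user}++;
    $bytes{$user} += $size;
}

for my $u (sort keys %files) {
    printf "%-12s %5d files, %12d bytes\n", $u, $files{$u}, $bytes{$u};
}
```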
File auth.log: This file is created by the authorization system. Most of the entries are generated by the cron daemon, but other entries are generated by su, sshd, proftpd, etc. These are usually more interesting. How many unsuccessful logins using sshd were recorded? By whom? How many ftp sessions were initiated by each user?
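Counting the failed sshd logins might look like the sketch below. The "Failed password for [invalid user] NAME from ADDR" wording is typical of OpenSSH but is an assumption here; grep the real log for "sshd" first to confirm the exact message.

```perl
#!/usr/bin/perl
# Sketch: failed sshd logins per user (and per source host) from auth.log.
# The "Failed password for [invalid user] NAME from ADDR" message shape
# is an assumption about this OpenSSH version; verify against the log.
use strict;
use warnings;

my %failed;                 # user => { source host => count }
while (<>) {
    next unless /sshd\[\d+\]/;
    if (/Failed password for (?:invalid user )?(\S+) from (\S+)/) {
        $failed{$1}{$2}++;
    }
}

for my $user (sort keys %failed) {
    my $total = 0;
    $total += $_ for values %{ $failed{$user} };
    print "$user: $total failed logins from ",
          scalar(keys %{ $failed{$user} }), " host(s)\n";
}
```

A similar per-user tally over the proftpd entries in the same file would answer the ftp-session question.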