
How to implement long-term statistics and a short-term log?

We are developing a fairly large database-backed web application with Perl Catalyst and PostgreSQL under Linux. Users can log in to upload and download data files (scientific measurements).

I wonder how to implement a logging/statistics system.

  1. We need to view general access trends, and we want to analyze the traffic caused by certain users/IPs and get access counts for certain files or topics. I was thinking about something like RRDtool for this, or writing the totals to another database table. It would be nice to get some visual graphs from the access data :-)

  2. Additionally, we need to analyze the activity over the last few days in detail. If problems or attacks occur, it must be possible to understand and undo them. In my opinion this calls for an action log in a database table (see the schema sketch after this list).
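
For point 2, a plain PostgreSQL table written to from the Catalyst application would do. Here is a minimal sketch using DBI; note that the table name action_log, its columns, and the connection details are assumptions for illustration, not a prescribed schema:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=myapp', 'myuser', 'secret',
                           { RaiseError => 1, AutoCommit => 1 });

    # Hypothetical schema -- one row per user action.
    $dbh->do(<<'SQL');
    CREATE TABLE IF NOT EXISTS action_log (
        id        bigserial   PRIMARY KEY,
        logged_at timestamptz NOT NULL DEFAULT now(),
        username  text,
        ip        inet,
        action    text NOT NULL,   -- e.g. 'login', 'upload', 'download'
        file_id   integer,         -- the affected file, if any
        detail    text
    );
    SQL

    # Record one event, e.g. from a Catalyst controller action.
    sub log_action {
        my ($user, $ip, $action, $file_id, $detail) = @_;
        $dbh->do(
            'INSERT INTO action_log (username, ip, action, file_id, detail)
             VALUES (?, ?, ?, ?, ?)',
            undef, $user, $ip, $action, $file_id, $detail,
        );
    }

    log_action('alice', '192.0.2.7', 'download', 42, 'measurement set 42');

Understanding (and undoing) an attack then becomes a matter of selecting the offending rows by user/IP and reversing each recorded action.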

Can you give me some inspiration on how to implement these things? I would love to use the same system both for logging and for long-term statistics. Maybe we could aggregate the log data after a period of, e.g., 7 days (a rollup sketch follows below). It's not that I have no idea how to do it; I'd just like to hear some opinions from somebody else.
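
One way to get the "same system for logging and long-term statistics" is the classic rollup: keep the detailed rows for a week, then fold anything older into a daily summary table and delete it. A sketch, run e.g. nightly from cron, again using the assumed table names from the sketch above:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=myapp', 'myuser', 'secret',
                           { RaiseError => 1, AutoCommit => 0 });

    # Hypothetical summary table: one row per day/user/action per run;
    # readers aggregate with SUM(hits).
    $dbh->do(<<'SQL');
    CREATE TABLE IF NOT EXISTS action_stats (
        day      date    NOT NULL,
        username text,
        action   text    NOT NULL,
        hits     integer NOT NULL
    );
    SQL

    # Fold detail rows older than 7 days into the summary ...
    $dbh->do(<<'SQL');
    INSERT INTO action_stats (day, username, action, hits)
    SELECT logged_at::date, username, action, count(*)
      FROM action_log
     WHERE logged_at < now() - interval '7 days'
     GROUP BY logged_at::date, username, action
    SQL

    # ... and drop them from the detail table. Both statements run in
    # one transaction, and now() is fixed at transaction start in
    # PostgreSQL, so the two cutoffs agree.
    $dbh->do(q{DELETE FROM action_log
                WHERE logged_at < now() - interval '7 days'});

    $dbh->commit;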

Hints to useful CPAN modules are appreciated. We know and already use Log::Log4perl, but its output is a bit too detailed to store for ~7 days...


Actually, I think you answered your own question: RRDtool is pretty good for the long-term part. I use it for half-hourly automatic meter readings from a communal boiler system, with a 3-year window. Nice graphs, too.
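
For the RRDtool side, the RRDs module that ships with RRDtool can create, update, and graph round-robin files directly from Perl. A minimal sketch, assuming a single hit counter fed every 5 minutes (the file name, step, and archive sizes here are arbitrary choices):

    use strict;
    use warnings;
    use RRDs;

    my $rrd = 'access.rrd';

    # 5-minute step; keep 5-min samples for a week and
    # daily averages for about 3 years.
    unless (-e $rrd) {
        RRDs::create($rrd, '--step', 300,
            'DS:hits:ABSOLUTE:600:0:U',
            'RRA:AVERAGE:0.5:1:2016',    # 2016 x 5 min = 1 week
            'RRA:AVERAGE:0.5:288:1095',  # 288 x 5 min = 1 day, 1095 days
        );
        die 'create failed: ' . RRDs::error if RRDs::error;
    }

    # Feed the number of hits counted since the last update.
    my $hits = 42;    # e.g. counted from the action log
    RRDs::update($rrd, "N:$hits");
    die 'update failed: ' . RRDs::error if RRDs::error;

    # Render a graph of the last week.
    RRDs::graph('access.png',
        '--start', '-1w',
        '--title', 'file accesses',
        "DEF:hits=$rrd:hits:AVERAGE",
        'LINE2:hits#0000ff:hits/s',
    );
    die 'graph failed: ' . RRDs::error if RRDs::error;

With an ABSOLUTE data source RRDtool stores rates (hits per second), which is what the graph then shows.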

However, assuming that all of this runs under a web server and the uploads and downloads generate (for example) Apache logfile entries, mod_log_config gives you a great many options: http://httpd.apache.org/docs/current/mod/mod_log_config.html.
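
For example, a custom format recording timestamp, client address, authenticated user, request line, status, and bytes transferred (the nickname "transfers" and the log path are made up):

    LogFormat "%t %h %u \"%r\" %>s %b" transfers
    CustomLog /var/log/apache2/transfers.log transfers

One caveat: %u only records users authenticated by Apache itself, so logins handled inside the Catalyst application will show up as "-" there and would have to come from the application-side action log instead.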

This would mean you could use Webalizer for "routine" reports and roll your own for the detail, maybe starting from: http://search.cpan.org/~ulpfr/Logfile-0.302/Logfile.pod

Hope that's a little bit helpful; it's a broad, broad question, though.
