
Basic awk help: want to scan a Rails log and find number of unique IPs per day

I have a Rails log that looks like this:

Started GET "[path-omitted]" for 166.137.138.210 at 2011-06-07 13:03:29 -0400
etc.

I'd like to write an awk script that extracts the number of unique IP addresses per date. I'm an awk newbie. How can I do this?


If the IP is always in the fifth column:

awk '{ ips[$5]++ } END { for (ip in ips) { print ip } }' your_log_file

If you want to output the count, you can change the print statement to:

print ips[ip] " " ip

or something similar.
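The one-liner above lists unique IPs across the whole file; to get the count *per date* as the question asks, you can key on the date field as well (in this log format the date is field 7). A sketch, using a hypothetical `sample.log` with made-up paths and IPs in the same format:

```shell
# Hypothetical sample data mirroring the log format in the question.
cat > sample.log <<'EOF'
Started GET "/a" for 166.137.138.210 at 2011-06-07 13:03:29 -0400
Started GET "/b" for 166.137.138.210 at 2011-06-07 14:10:02 -0400
Started GET "/c" for 10.0.0.1 at 2011-06-07 15:00:00 -0400
Started GET "/d" for 10.0.0.1 at 2011-06-08 09:00:00 -0400
EOF

# $7 is the date, $5 is the IP. The seen[] array de-duplicates
# date+IP pairs, so count[] ends up with distinct IPs per date.
awk '/^Started/ {
    if (!seen[$7, $5]++) count[$7]++
}
END {
    for (d in count) print d, count[d]
}' sample.log | sort
```

With the sample data above this prints `2011-06-07 2` and `2011-06-08 1`. Note this assumes the quoted path contains no spaces; a path with spaces would shift the field numbers.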


You can start with a quick-and-dirty script like the one below, and then apply awk to finish the task. In the script, sample.txt is the file containing the log data. I'm leaving the final awk step as an exercise for you :) It should be easy. If you're having trouble, let me know and I can post it in an hour or two, but it's worth trying yourself first.

  sed 's/^[^"]*.[^"]*..for.//'  < sample.txt  | sed 's/\([^ ]*\) at \([^ ]*\).*/\2 \1/' | sort -u
