
Locating most recently updated file recursively in UNIX

For a website I'm working on I want to be able to automatically update the "This page was last modified:" section in the footer as I'm doing my nightly git commit. Essentially I plan on writing a shell script to run at midnight each night which will do all of my general server maintenance. Most of these tasks I already know how to automate, but I have a file (footer.php) which is included in every page and displays the date the site was last updated. I want to be able to recursively look through my website and check the timestamp on every file, and if any of them were edited after the date in footer.php, update that date.

All I need is a UNIX command that will recursively iterate through my files and return ONLY the date of the last modification. I don't need file names or what changes were made; I just need a single date (and ideally time) when the most recently updated file was changed.

I know that using "ls -l" and "cut" I could iterate through every folder to do this, but I was hoping for a quicker and simpler command, preferably a single-line shell command (possibly with a -R parameter).


find prints the modification time of every matching file as a Unix timestamp, sort orders them numerically, and tail takes the largest. Converting it into whatever date format is wanted is left as an exercise for the reader:

find /path -type f -iname "*.php" -printf "%T@\n" | sort -n | tail -1
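
As a minimal sketch of that conversion step (the paths and the "Last updated:" marker text in footer.php are assumptions, and GNU date/sed are assumed), the epoch value can be formatted and written back into the footer like this:

# Newest modification time among the site's PHP files, excluding the footer
# itself so its own update doesn't count as a content change.
NEWEST=$(find /path -type f -iname "*.php" ! -name footer.php -printf "%T@\n" | sort -n | tail -1)
# Strip the fractional seconds that %T@ appends, then format with GNU date.
STAMP=$(date -d "@${NEWEST%.*}" '+%Y-%m-%d %H:%M')
# Rewrite the footer line in place (GNU sed -i).
sed -i "s/Last updated: .*/Last updated: $STAMP/" /path/footer.php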


GNU find

find /path -type f -iname "*.php" -printf "%T+\n"

Check the find man page to play with other -printf specifiers.
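
For example (the path is a placeholder), %T+ yields an ISO-style timestamp such as 2011-04-08+21:15:03.0000000000, which sorts correctly as plain text, so a textual sort is enough to pick out the newest one:

find /path -type f -iname "*.php" -printf "%T+\n" | sort | tail -1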


You might want to look at an inotify script that updates the footer every time another file is modified, instead of scanning the whole file system for new updates.
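
A minimal sketch using inotifywait from the inotify-tools package (the paths, the "Last updated:" marker text, and the event list are assumptions):

#!/bin/sh
SITE=/var/www/site            # assumed web root
FOOTER=$SITE/footer.php       # assumed footer location

# Watch the tree recursively and restamp the footer on every change,
# excluding footer.php itself so the update does not retrigger the watch.
inotifywait -m -r -e modify,create,delete,move --exclude 'footer\.php' "$SITE" |
while read -r dir events file; do
    NOW=$(date '+%Y-%m-%d %H:%M')
    sed -i "s/Last updated: .*/Last updated: $NOW/" "$FOOTER"
done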

