
Java max file caching solution

I'm trying to write many files to a directory, and when the directory reaches X number of files, I want the least recently accessed file to be deleted before writing the new file. I don't really want to roll my own solution to this because I'd imagine someone else has already done this before. Are there existing solutions to this? Note this is for a Windows application.

This is related to my question Java ehcache disk store, but I'm asking this question separately since now I'm focusing on a file caching solution.

Thanks, Jeff


I would roll my own, because the problem sounds so easy that writing it yourself is probably easier than trying to learn and adopt an existing library :-)

If it's a low number of files and/or your cache is accessed from multiple processes, call the following method before writing a file:

void deleteOldFiles(String dir, long maxFileCount) {
    while (true) {
        File[] list = new File(dir).listFiles();
        // listFiles() returns null if the directory doesn't exist;
        // also stop if the directory is empty or under the limit
        if (list == null || list.length == 0 || list.length < maxFileCount) {
            break;
        }
        // Find the file with the oldest modification time
        File oldest = null;
        long oldestTime = Long.MAX_VALUE;
        for (File f : list) {
            long m = f.lastModified();
            if (m < oldestTime) {
                oldestTime = m;
                oldest = f;
            }
        }
        oldest.delete();
    }
}

If you only access the cache from one process, you could write something more efficient using LinkedHashMap or LinkedHashSet.
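To illustrate the single-process case, here is a minimal sketch of an LRU file cache built on LinkedHashMap. The class name `FileLruCache` and its API are my own invention for this example; the key idea is that a LinkedHashMap constructed with `accessOrder = true` iterates entries from least to most recently accessed, and overriding `removeEldestEntry` lets you delete the least recently used file whenever the cache grows past its limit:

```java
import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch only, assuming a single process owns the directory.
// Tracks files in access order; once maxFiles is exceeded, the
// least recently used file is deleted from disk and dropped.
class FileLruCache extends LinkedHashMap<String, File> {
    private final int maxFiles;

    FileLruCache(int maxFiles) {
        super(16, 0.75f, true); // accessOrder = true gives LRU iteration order
        this.maxFiles = maxFiles;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, File> eldest) {
        if (size() > maxFiles) {
            eldest.getValue().delete(); // remove the evicted file from disk
            return true;                // and remove the map entry
        }
        return false;
    }
}
```

Call `put(name, file)` after each write and `get(name)` on each read so that reads refresh an entry's recency. Note this index lives only in memory, so it would need to be rebuilt from the directory listing on startup.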

Update

Check the number of files instead of the total file size.


You can try this before creating a new file:

void deleteOldFiles(String dir, int maxFiles) {
    File fdir = new File(dir);
    while (true) {
        // Check the number of files. Also do nothing if maxFiles == 0
        // (listFiles() returns null if the directory doesn't exist)
        File[] files = fdir.listFiles();
        if (files == null || maxFiles == 0 || files.length < maxFiles)
            break;

        // Delete the oldest file
        File oldest = files[0];
        for (int i = 1; i < files.length; i++) {
            if (files[i].lastModified() < oldest.lastModified()) {
                oldest = files[i];
            }
        }
        oldest.delete();
    }
}

This would not be efficient for a large number of files, though. In that case I would keep a list of files in the directory, sorted by creation time.
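The in-memory list idea can be sketched as follows. Since one process creates all the files, they enter the cache in creation order, so a plain FIFO queue already keeps them sorted by age and evicting the oldest is O(1) with no directory scan. The class name `FifoFileCache` is hypothetical, invented for this sketch:

```java
import java.io.File;
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch only, assuming a single writer process and eviction by
// creation time (not last access). Files are queued as they are
// created, so the head of the queue is always the oldest file.
class FifoFileCache {
    private final Deque<File> files = new ArrayDeque<>();
    private final int maxFiles;

    FifoFileCache(int maxFiles) {
        this.maxFiles = maxFiles;
    }

    // Call when writing a new file: evict the oldest files
    // until there is room, then record the new one.
    void addFile(File f) {
        while (files.size() >= maxFiles) {
            files.removeFirst().delete(); // oldest first
        }
        files.addLast(f);
    }
}
```

Like any in-memory index, the queue would have to be rebuilt from a directory listing (sorted by `lastModified()`) if the process restarts.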

Although all of this gets into the 'roll my own category'...


If you were using Cacheonix you could hook up to the cache events API and remove the files when receiving the notification that a cache entry was evicted by the LRU algorithm.
