Using shared files as a database

As a project for my database classes, I built a simple object-oriented database (coded in C++). The DB manages concurrency by using a gateway file, which grants read/write access to the entire DB. To access the same DB across different machines, you use shared folders.
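The gateway mechanism itself isn't shown here, but a minimal sketch of one way such a gateway file could work is below, assuming exclusive file creation (CreateFileA with CREATE_NEW) as the locking primitive; the class name, retry policy and paths are illustrative, not taken from the actual application:

    #include <windows.h>

    // Sketch of a gateway-file lock: the process that manages to create the
    // gateway file exclusively owns the database; everyone else retries.
    // CREATE_NEW fails with ERROR_FILE_EXISTS if the file is already there,
    // which is atomic on a local NTFS volume (but not guaranteed to behave
    // the same way over a network share).
    class GatewayLock
    {
    public:
        explicit GatewayLock(const char* path)
            : path_(path), handle_(INVALID_HANDLE_VALUE) {}

        bool acquire(DWORD retryDelayMs = 100)
        {
            for (;;) {
                handle_ = CreateFileA(path_, GENERIC_WRITE, 0, nullptr,
                                      CREATE_NEW, FILE_ATTRIBUTE_NORMAL, nullptr);
                if (handle_ != INVALID_HANDLE_VALUE)
                    return true;                      // we own the gateway
                if (GetLastError() != ERROR_FILE_EXISTS)
                    return false;                     // unexpected failure
                Sleep(retryDelayMs);                  // someone else holds it
            }
        }

        void release()
        {
            if (handle_ != INVALID_HANDLE_VALUE) {
                CloseHandle(handle_);
                DeleteFileA(path_);                   // let the next process in
                handle_ = INVALID_HANDLE_VALUE;
            }
        }

        ~GatewayLock() { release(); }

    private:
        const char* path_;
        HANDLE handle_;
    };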

I built a little quizzing application on top of that. Everything works fine on a single system with multiple users, as well as on a 3-computer network at my home. But when it's run on my university's network, I keep getting inconsistent data corruption in the form of bad CRCs (in my database, not on the disk), file headers that don't match the file data, and other weird errors which I'm unable to track down. The network itself is problematic: sometimes some nodes become unreachable, and sometimes copying a file across the network takes an inordinate amount of time.

Occasionally I get the error message 'Windows delayed write failed', so I'm thinking the problems are caused by file sharing across the network. From some analysis it seems that data is being cached, so I don't really know whether a disk write has actually succeeded.
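For illustration, assuming the application runs on Windows (as the 'delayed write failed' message suggests), a write can at least be forced through the local cache with FILE_FLAG_WRITE_THROUGH and FlushFileBuffers; the helper name and error handling below are just a sketch, and over an SMB share even these flags may not give end-to-end durability guarantees:

    #include <windows.h>

    // Write a buffer and force it toward stable storage before trusting the
    // result. FILE_FLAG_WRITE_THROUGH asks the OS to bypass lazy write-back;
    // FlushFileBuffers then forces any remaining cached data to disk.
    bool writeDurably(const char* path, const void* data, DWORD size)
    {
        HANDLE h = CreateFileA(path, GENERIC_WRITE, 0, nullptr,
                               OPEN_ALWAYS,
                               FILE_ATTRIBUTE_NORMAL | FILE_FLAG_WRITE_THROUGH,
                               nullptr);
        if (h == INVALID_HANDLE_VALUE)
            return false;

        DWORD written = 0;
        bool ok = WriteFile(h, data, size, &written, nullptr) && written == size;

        // Only after FlushFileBuffers succeeds can the write be considered flushed.
        ok = ok && FlushFileBuffers(h);

        CloseHandle(h);
        return ok;
    }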

Does anyone have experience using shared files as a database? I want to know whether using shared files is reliable, or whether I should be looking for bugs in my own code as the cause of these problems.

Thanks.


No, it's not reliable. It's the reason why CVS disabled the mode that used shared files to access a repository. The solution is to create a server (e.g. a simple TCP/IP server) that is the only process touching the database files and serializes access to them.
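A minimal sketch of such a server, using POSIX sockets for brevity (on Windows the Winsock equivalents apply after WSAStartup); handleRequest and the port number are placeholders for wiring in the existing database code:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdio>
    #include <string>

    // Hypothetical hook into the existing database code: applies one request
    // against the local DB files and returns the reply to send back.
    std::string handleRequest(const std::string& request)
    {
        return "OK: " + request;   // placeholder
    }

    int main()
    {
        int listener = socket(AF_INET, SOCK_STREAM, 0);
        if (listener < 0) { perror("socket"); return 1; }

        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(5000);               // arbitrary example port

        if (bind(listener, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0 ||
            listen(listener, 16) < 0) {
            perror("bind/listen");
            return 1;
        }

        // Single-threaded accept loop: requests are handled one at a time,
        // so only this process ever opens the database files.
        for (;;) {
            int client = accept(listener, nullptr, nullptr);
            if (client < 0) continue;

            char buf[4096];
            ssize_t n = recv(client, buf, sizeof(buf), 0);
            if (n > 0) {
                std::string reply = handleRequest(std::string(buf, n));
                send(client, reply.data(), reply.size(), 0);
            }
            close(client);
        }
    }

Because only this one process ever opens the database files, the consistency problems caused by network caching and delayed writes no longer apply; clients just talk to the server over TCP.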
