
Version control on a HUGE number of XML files

I am working on a system that will have several hundred thousand XML files, ranging from 2 KB to 1 MB in size. Does anyone have experience using version control with >300k files? Will SVN or git become problematic?

I am familiar with SVN but have no experience with any other version control system.

EDIT: I have tried both SVN and git with 120,000 XML files totalling 1.2 GB. git works much better; SVN becomes very slow with this many files. On a Mac, both SvnX and gitX choke on the repos, so it's command line all the way.


I'm working on a project that involves somewhere around 300K XML (and other) files. Subversion (hosted on a Linux VM) seems to handle it just fine. The only caveat is that commits involving changes to large subsets (around 50,000 files) can take a very long time. I have had to parcel them out (e.g. execute an svn commit for each subdirectory instead of the whole tree) in order to get them to work.
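A rough sketch of that parceling approach, assuming a bash loop over the top-level subdirectories of the working copy (the path and commit message are illustrative, not from the original setup):

    # Run from the root of the Subversion working copy (path is hypothetical)
    cd /path/to/working-copy
    for dir in */ ; do
        # Commit each top-level subdirectory on its own, so no single commit
        # has to process tens of thousands of changed files at once.
        svn commit -m "Bulk XML update: ${dir%/}" "$dir"
    done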


Windows or Unix? It's been my personal experience that single directories with so many files can cause some performance issues in Windows unrelated to source control. If possible, I'd separate those XML files into subdirectory groupings.
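One simple way to do that grouping, sketched in bash and assuming all the files currently sit in a single flat directory, is to bucket them by a filename prefix (the two-character bucket size is an arbitrary choice):

    # Move each XML file into a subdirectory named after the first two
    # characters of its file name, capping how many files share a directory.
    for f in *.xml ; do
        [ -e "$f" ] || continue      # skip the literal pattern if nothing matches
        bucket=${f:0:2}
        mkdir -p "$bucket"
        mv "$f" "$bucket/"
    done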

As far as source control goes, I haven't had any issues with either SVN or TFS repositories containing 10k+ files, so I'd guess either will handle 100k+ files.

Hope that helps.


How about just trying it? There are many factors involved (disk, memory, caches) and it depends on how you want to check the files out (all at once vs. just a subset)... On top of that, your definition of "acceptable performance" might be different. For example, you might be willing to wait 2 minutes for a checkout if it only happens every 6 months, but not if it happens every 5 minutes.

There's no substitute for trying it yourself...
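If you want a quick first measurement, simply timing a full checkout/clone of each candidate already tells you a lot (the repository URLs below are placeholders for your own):

    # Rough benchmark: compare wall-clock time of a full checkout vs. clone.
    time svn checkout https://svn.example.com/repo/trunk wc-svn
    time git clone https://git.example.com/repo.git wc-git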
