
What happens if a file I want to commit to SVN is updated so often I don't manage to do a merge quickly enough?

Consider this situation: I want to commit a changed file to SVN and see that someone else committed the same file after I checked it out, so I have to update and merge the changes. While I'm doing that, someone commits the same file yet again, so when I try to commit the merged file I have to update once more.

Now if other users commit often enough, it looks like I will never be able to commit my changes. Is that really so? How is this problem solved in real development environments?
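To make it concrete, the loop I keep hitting looks roughly like this (the file name and messages are just examples):

svn commit -m "my change"     # rejected: the file is out of date
svn update                    # pull in the other commit, merge by hand
svn commit -m "my change"     # rejected again: yet another commit landed meanwhile
svn update                    # ...and so on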


This is usually viewed not as a technical problem but as a "people" problem. It usually indicates a failure of communication within the team, and if you find yourself in this situation, you should discuss with your fellow developers how to better partition your work.

In most cases, as you are finding, it is not practical for developers to be working in the same area of code at the same time without good coordination between them.

If you actually have this happening at such a rate that you can't even commit your changes, you have moved beyond a problem to what sounds like a Denial-Of-Service attack :).


First, let us suppose it is not a game, and not done on purpose to make you angry so that someone can steal your nice chair while you storm out in a rage ... but that the commits are actually needed. ;-)


If this happens, the file must be complex to merge (big) and updated very often (so it will only get bigger).

Such a file carries too much responsibility; it should be split along logical lines.

For example, we had a single property file like this for the whole application. Comparing two versions of it would even crash Eclipse! :-) So some developers would not compare and merge it, but committed right over other people's changes! We split it into one property file per module, and the problems went away.

There are usually other problems associated with this one, such as developers losing time hunting for what they need in a huge file. Splitting the file solves all of these problems.


As a temporary solution, you could coordinate with the other people so that they give you a window to merge and commit. But the problem will typically keep reappearing until the team solves it for good. ;-)


Couldn't you just lock the file?

From the SVN book: http://svnbook.red-bean.com/en/1.5/svn.advanced.locking.html

svn lock banana.jpg -m "Editing file for tomorrow's release."
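By default, a successful commit releases any locks you hold on the committed files, so the whole dance is roughly as follows (the file name just follows the book's example, and the messages are made up):

svn lock banana.jpg -m "Editing file for tomorrow's release."
# merge and test locally while nobody else can commit this file
svn commit -m "Merged my changes"    # releases the lock automatically (pass --no-unlock to keep it)
svn unlock banana.jpg                # or release it explicitly if you decide not to commit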


You might try creating your own branch and developing against that. Once you have completed your feature, then merge your branch back into the head. This way you defer your merge activities until your work is complete. It doesn't solve your problem -- you still have to do the merge -- but it does defer it and let you get on with your work. Perhaps if everyone on the development team followed this practice -- or something similar with branches only for people working on the same areas -- you'd have fewer issues with the files changing all the time in the main development branch.
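A rough sketch of that workflow, assuming the conventional trunk/branches layout and a made-up repository URL:

svn copy http://svn.example.com/repo/trunk http://svn.example.com/repo/branches/my-feature -m "Create feature branch"
svn switch http://svn.example.com/repo/branches/my-feature    # point your working copy at the branch
# ...commit to the branch as often as you like...
svn merge http://svn.example.com/repo/trunk                   # periodically pull trunk changes into the branch
# when the feature is done, from an up-to-date trunk working copy:
svn merge --reintegrate http://svn.example.com/repo/branches/my-feature
svn commit -m "Merge my-feature back into trunk"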


You are looking at the problem as if the situation for those other users is somehow different from yours. It is not: the problem affects not only you, but all of those other users, too.

It's not as if those other users were members of some elite group that can cause the problem but never experience it themselves.

In other words:

If other users commit so often that you can never commit your changes because you constantly have to update, then those other users cannot commit their changes either, because they also constantly have to update. This slows down all commits to the point where, at any given time, somebody will be able to commit without having to update first, and eventually that somebody will be you.

tl;dr: the problem will fix itself.


Whilst it is fairly common for developers to be working on the same file, it hardly ever happens that you need to perform an update twice in a row because people committed before you had a chance to finish the merge.

And if that is the case, and you have 5 people working on the same file (crazy in its own right), making micro-commits every 5 minutes, I'd say you've got other problems to worry about: you need to restructure your code, as well as give your peers a right 'bollocking'.


OrbMan is right to say that this is a people problem, and you need to find a better way of working. However, it may also point to an underlying architectural issue. It is a bad idea to have a file which needs to be changed so often by so many different developers.


While I totally agree with @OrbMan, I can suggest using the "Get Lock" command, but only as a last resort.


Although I agree with OrbMan - if the file is being updated so quickly that you can't get your changes in, the underlying issue is one of developer communication - there is a technical solution.

SVN does support the concept of locking a file. You could try locking the file so that others cannot commit whilst you are merging your changes. When you commit your changes, you unlock the file and all should be OK.

That said, SVN does allow people to break locks, in case someone locked a file and forgot to unlock it before going on holiday, for example. So even if you do lock the file, someone could break the lock and commit their code before you're done merging. But if someone breaks a named user's lock and commits without checking with them first, that is an even worse failure of inter-developer communication and, again, more of a "people" problem.
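For reference, breaking or stealing a lock only takes one command (the repository URL here is made up), which is why a lock is a speed bump rather than a guarantee:

svn lock --force banana.jpg -m "Stealing the lock"           # steal a lock held by someone else
svn unlock --force http://svn.example.com/repo/banana.jpg    # break a lock directly in the repository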


If the file is being committed that often, maybe it shouldn't be under source control? My test for this is: "Can I always give a meaningful English description of each revision (i.e. a tag)?" - if not, it probably should not be controlled.


This is not a solution, but a workaround: use git-svn as a local client to the Subversion repository.
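A minimal sketch of what that looks like day to day (the repository URL is made up); you merge on your own schedule with a rebase instead of being forced to merge at commit time:

git svn clone http://svn.example.com/repo/trunk myproject
cd myproject
# edit, then commit locally as often as you like; nothing touches the SVN server yet
git commit -am "Work in progress"
git svn rebase     # replay your local commits on top of the latest SVN revisions; resolve conflicts here
git svn dcommit    # push each local commit to Subversion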
