Are coding standards very important for every program a developer writes?
I would like some questions answered by our fellow developers regarding coding standards.
- Is it true a developer is judged by his coding standard?
- What are the best practices of coding standards which could be followed in a web application?
- What are the common coding standards that budding developers fail to adopt in their applications?
In my first year of work, at a large and well-respected company with brilliant engineers, I got yelled at by an architect for writing if(foo) instead of if (foo). I asked him to explain, and he said, "IF is not a function. Don't do that."
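For illustration, here is roughly what that convention looks like in practice (a hypothetical snippet; the names are made up):

```java
// Minimal illustration of the spacing convention described above (hypothetical names).
public class StyleExample {
    static String describe(int n) {
        // Keyword followed by a space: "if" is a control statement, not a function.
        if (n < 0) {
            return "negative";
        }
        // Method call with no space before the parenthesis.
        return String.valueOf(n);
    }

    public static void main(String[] args) {
        System.out.println(describe(-3)); // prints "negative"
    }
}
```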
I'm glad that you're thinking about this stuff. Coding "standards" means different things depending on where you are: there's everything from straight-up "style" (where you put your curly braces) to patterns (this sort of thing should be a singleton, this sort of thing should be produced by a factory object) to guidance specific to the stuff you're working on ("Here in this office, we always use such-and-such a package to validate and sanitize web parameters, so you should use that.").
In general, the answer you'll hear from most people is "be consistent with the people already working on your codebase." If they're not being consistent themselves, then step up and make the rules yourself based on your instincts. If you're a real rookie, don't worry about it (we've all been there): ask others to review your code as you commit it, and you'll often find you learn a ton about how experienced developers think.
Yes, absolutely.
Coding standards are important for many reasons, but one of the most important is that when someone reads your code, once they understand your coding style they can make sense of what you write much more easily. If you keep changing your style, anyone else who reads your code will be hampered.
There are various coding styles on the web depending on the language you want to use, as what is a norm in one language may not be a norm in another.
It helps to follow standards that others already know, because your code will then be in a style other developers are familiar with, even if they don't follow it themselves.
Best practices in a web application depend heavily on the platform/language, as what works well in PHP5 will be different from what works well in ASP or ASP.NET MVC.
I think in any language, though, if you have a clean separation between the view and the lower levels, then make certain you unit test the controllers, down to the data access layer, as a minimum. Having the clean separation becomes a standard, and the unit testing is a rationale for doing it.
I think budding developers tend to blur layers, which makes their applications hard to change.
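As a rough sketch of that separation (all names here are hypothetical, and the data access layer is reduced to an interface so the controller can be unit tested with an in-memory fake):

```java
import java.util.List;

// Hypothetical sketch: the controller depends only on an interface, so it can be
// unit tested without a real database or any view code.
interface CustomerRepository {                 // data access layer boundary
    List<String> findNamesByCity(String city);
}

class CustomerController {                     // controller: no SQL, no HTML, just logic
    private final CustomerRepository repository;

    CustomerController(CustomerRepository repository) {
        this.repository = repository;
    }

    List<String> customersIn(String city) {
        if (city == null || city.isBlank()) {
            return List.of();
        }
        return repository.findNamesByCity(city.trim());
    }
}

// A unit test can supply an in-memory fake instead of a real repository.
class CustomerControllerTest {
    public static void main(String[] args) {
        CustomerRepository fake = city -> List.of("Ada", "Grace");
        CustomerController controller = new CustomerController(fake);

        if (controller.customersIn("  Oslo ").size() != 2) throw new AssertionError();
        if (!controller.customersIn("").isEmpty()) throw new AssertionError();
        System.out.println("controller tests passed");
    }
}
```

When the layers blur (SQL strings inside the controller, HTML generation inside the data access code), this kind of isolated test becomes impossible, which is usually the first sign the application will be hard to change.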
- Among other things, yes.
- As with any other application: document your assumptions, since they are the one thing that cannot be read from the code; a small sketch follows this list. (And of course follow all the common guidelines, like meaningful names, short methods, and testing, testing, testing.)
- Writing tests.
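Here is a minimal sketch of what documenting an assumption (and testing it) can look like; the names are hypothetical, and JUnit 5 is used purely as an example framework:

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

// Hypothetical example: the assumption is stated in the comment and enforced in code.
class PriceCalculator {
    /**
     * Applies a discount to a price.
     * Assumption: discount is a fraction between 0.0 and 1.0 (inclusive);
     * values outside that range indicate a caller bug.
     */
    static double applyDiscount(double price, double discount) {
        if (discount < 0.0 || discount > 1.0) {
            throw new IllegalArgumentException("discount must be in [0, 1]: " + discount);
        }
        return price * (1.0 - discount);
    }
}

// A small test that captures both the normal case and the documented assumption.
class PriceCalculatorTest {
    @Test
    void appliesDiscount() {
        assertEquals(80.0, PriceCalculator.applyDiscount(100.0, 0.2), 1e-9);
    }

    @Test
    void rejectsOutOfRangeDiscount() {
        assertThrows(IllegalArgumentException.class,
                () -> PriceCalculator.applyDiscount(100.0, 1.5));
    }
}
```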
Coding standards are definitely important. They make going through the code easy. They make it possible to generate parts of the code automatically through some template mechanism. They also make the code easy to describe to your project manager if he or she becomes interested in auditing it.
In terms of your first question, a developer is judged in many different ways, of which adherence to coding standards may be one. It is also true that developers are not always judged on this adherence. Some environments prefer to judge on actual productivity rather than some obscure measure which can usually be automated anyway :-)
Don't get me wrong - if you have coding standards, you should follow them. I tend to code to a standard (however informal) even for my own code that no-one else will ever see, simply because it makes my life a lot easier in future.
Our latest project at work actually separates the semantics of the language from its layout. What gets stored in source control is the semantic content of each file and, when it's checked out, it's converted to a developer-specific format based on local configuration. Then the developer gets the code delivered in their preferred format and can write their code however they want (even in a totally different way to their preferred format if they're feeling masochistic).
And, when it's checked in, it's converted back into semantic form. That way, there are absolutely no problems with each developer using their own style.
In terms of your other two questions, they're almost certainly a subjective thing and, since I don't do web applications, I can't help you there.
Coding standards are like good writing. What makes good writing (compelling, interesting, etc) differs from person to person but there are some things that are universally true.
Consistency is one of the big ones: a standard that changes from module to module and file to file is the hardest to follow, and probably the least helpful. Best practices generally follow hot on the heels of that.
Other things, like specific coding standards, naming, what gets tested, and so on, are the flavor that every developer and team adds to make the code distinctly their own and to reinforce the things that make the team good at what they do. That is, in my opinion, why there is so much variance from team to team: what each team needs is distinctly different.
Coding standards are very important in any kind of development environment. They bring far more positives than negatives, because they make code easier to understand. Following them may take a few extra minutes, but those extra minutes will definitely save hours during the maintenance or bug-fixing stage of the same program.