
When can a design pattern make your software worse? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.


Closed 4 years ago.


When can a design pattern make your software worse?

I have seen a program where they used the facade pattern between the GUI and the logic. They decided that no objects could be passed across the facade, so only primitive types were used, which made the code difficult to write.
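A minimal sketch of what such a primitives-only facade forces on callers (all names here are hypothetical, not from the program described above): instead of handing the GUI one object, every field needs its own call, and the GUI must stitch the pieces back together.

```python
class Customer:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

class CustomerFacade:
    """Facade whose contract allows only primitive types across it."""
    def __init__(self):
        self._customers = {1: Customer("Ada", 36)}

    # One call per field, keyed by a primitive id, because the
    # contract forbids passing the Customer object itself.
    def get_customer_name(self, customer_id: int) -> str:
        return self._customers[customer_id].name

    def get_customer_age(self, customer_id: int) -> int:
        return self._customers[customer_id].age

facade = CustomerFacade()
# The GUI side must reassemble the object field by field:
name = facade.get_customer_name(1)
age = facade.get_customer_age(1)
```

Every new field on `Customer` means another facade method and another call site in the GUI, which is exactly the friction the answer above describes.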


When you misuse it, or, sometimes worse, when you use a pattern unnecessarily. The latter drives me nuts, because when I analyze a piece of code, I naturally assume there's a reason for everything in there, and when I can't see a reason for something, the safest default assumption is that I'm missing something. And since I don't want to risk breaking code in unknown or unpredictable ways, I waste time trying to figure out why it's there before I mess with it.


For decades, programmers have spent time making software worse by applying practices they don't understand, rather than learning why those practices are good. The use of certain tools/keywords/frameworks can make a programmer feel sophisticated when they're just screwing things up. Some common examples:

  • Throwing in lots of try/catch/finally blocks can make a programmer feel like they have an error handling scheme, when in fact they don't.
  • Replacing SQL with stored procedures can make programmers feel like they have a data access layer that magically gives them efficiency, reusability, and encapsulation, when in fact they don't.
  • Using classes makes many programmers believe they're engaging in object oriented programming, when in fact they aren't (or are only scratching the surface of OO).
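The first bullet is easy to illustrate. A hedged sketch (the function names are made up for this example) of a try/except that looks like error handling but merely hides failures, next to a version that keeps them visible:

```python
def parse_quantity_bad(text: str) -> int:
    try:
        return int(text)
    except Exception:
        return 0  # any failure silently becomes a plausible value

def parse_quantity(text: str) -> int:
    # Catch only the error we expect, and re-raise with context
    # so the failure stays visible to the caller.
    try:
        return int(text)
    except ValueError as exc:
        raise ValueError(f"invalid quantity: {text!r}") from exc
```

The first version "handles" every error by manufacturing a quantity of 0, which is precisely an error-handling scheme that doesn't exist.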

This list could go on forever. This tendency has always been around and always will be, because we humans always look for shortcuts. We see this a lot with patterns now because patterns are currently popular. The root problem is the same: developers not having a respect for how complicated software development is, and thinking a blindly implemented pattern or practice is a substitute for learning and humility.

The solution? Difficult to say. You can't go too wrong, though, handing a junior developer Code Complete or some such, to help them understand that this is all about applying principles to particular situations, and that great developers keep it simple.


When its name is Singleton.
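The usual complaint behind this quip is easy to demonstrate: a Singleton is global state in disguise, so every client quietly shares it. A minimal sketch (the `Config` class is hypothetical):

```python
class Config:
    """Classic singleton: __new__ always returns the one shared instance."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.values = {}
        return cls._instance

a = Config()
a.values["debug"] = True
b = Config()  # looks like a fresh object...
# ...but it's the same hidden state, which is why singletons make
# tests and otherwise-unrelated callers interfere with each other.
assert b is a
assert b.values["debug"] is True
```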


A design pattern is simply a way of naming and documenting a common code structure. As such, you should not conclude that all patterns are good ones. The people who do a good job of documenting design patterns (e.g. the GoF) always include a section in the pattern description describing when you should, and should not, use the pattern. Half the point of most design pattern books, then, is to answer the question "When can a design pattern make your software worse?"

Many inexperienced developers never bother to learn the appropriate uses section and end up applying patterns where they are inappropriate. I think this is the cause of a lot of the negativity surrounding design patterns.

In conclusion, the answer to this question should really be: go and read the pattern's description, and it will tell you.


A design pattern can make your software worse if you misuse it, applying it in a wrong situation. Common sense should be the perfect companion for a design pattern.

The misuse of patterns is sometimes associated with a lack of understanding of the forces associated with the problem and the pattern's proposed solution to it. The intent of the pattern can also be misunderstood, leading to misuse.

For a better comprehension of patterns, you also need to learn about the anti-pattern concept and read a lot more than the GoF book. One suggestion is to read Pattern Hatching to complement the GoF's concepts.


It's interesting that, over the past year, I've invested a fair amount of time learning Design Patterns and their implementations - just as there seems to be a growing blow-back against Design Patterns. Design Patterns have, fairly or not, joined the ranks of processes and methodologies that were supposed to make software a quantifiable skill - i.e., the fallacy that using them would make you write better code. I think the problem with Design Patterns is that less skilled programmers use them as a "Golden Hammer". They learn the Strategy Pattern - which is a valuable thing - but then they use it everywhere, over-complicating their code and adding unnecessary features and "expandability".

I believe there is immense value in refactoring to patterns, because you're cleaning up existing code. Patterns are also valuable communication tools that let you describe your code in higher-level terms. But stuffing design patterns into your software because they're Cool and Marketable (a.k.a. "Resume-Driven Development") is a bad idea.


Anything that makes the easy difficult (e.g., EJB).


When it's the wrong pattern for your problem.


In general, when a coder takes the attitude that patterns are the Lego-like bricks one builds software out of, you get some really odd code.


The software will get worse when you use design patterns as absolute building blocks, instead of as blueprints to help solve coding problems.

They weren't meant to be the blocks in which the problem is framed. They shouldn't show up in the Object naming conventions. Objects shouldn't be called *Factory or *Singleton, or any other pattern name. They shouldn't be locked to a specific pattern.

They are just really good "starting points" to help one build complex code more rapidly. You can (and should) mix and match the ideas as needed. Documentation? That's what the comments are for, not the Object namespace ...
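A small naming contrast in that spirit (all names hypothetical): both versions have the same structure, but the second keeps the pattern out of the object namespace and leaves it as a comment-level implementation detail.

```python
# Pattern-centric naming leaks the blueprint into the API:
class ReportRendererStrategyFactorySingleton:
    ...

# Domain-centric naming: callers see the domain vocabulary; the fact
# that a lookup registry is used stays an implementation detail.
class ReportFormats:
    _renderers = {
        "csv": lambda rows: "\n".join(",".join(row) for row in rows),
    }

    @classmethod
    def render(cls, fmt: str, rows) -> str:
        return cls._renderers[fmt](rows)

output = ReportFormats.render("csv", [["a", "b"], ["c", "d"]])
```

Renaming is cheap here, but in a real codebase pattern-named classes lock the design to one pattern, which is the problem the answer above warns about.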

Paul.


It is called an anti-pattern; see the book AntiPatterns: Refactoring Software, Architectures, and Projects in Crisis for examples.


You can read about antipatterns here, for example.


I think the strategy pattern can make software worse if it's not needed. The strategy pattern basically makes a piece of functionality "pluggable", meaning we can switch out different implementations on the fly. But this pluggable functionality often adds a lot of complexity (and sometimes extra overhead) with configuration, and if it's not needed, it's a waste. I see this pattern abused quite often.
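A hedged sketch of that abuse (the `DiscountStrategy` names are invented for illustration): a full Strategy setup for functionality that only ever has one implementation, next to the plain function it replaces.

```python
from abc import ABC, abstractmethod

# Three types and an injection point...
class DiscountStrategy(ABC):
    @abstractmethod
    def apply(self, price: float) -> float: ...

class StandardDiscount(DiscountStrategy):
    def apply(self, price: float) -> float:
        return price * 0.9

class Checkout:
    def __init__(self, strategy: DiscountStrategy):
        self._strategy = strategy

    def total(self, price: float) -> float:
        return self._strategy.apply(price)

# ...versus the plain function that does the same job, which is all
# that's needed until a second discount implementation actually exists:
def total(price: float) -> float:
    return price * 0.9

assert Checkout(StandardDiscount()).total(100.0) == total(100.0)
```

The pluggability costs an abstract base class, a concrete class, and a wiring step, and buys nothing until there is a second strategy to plug in.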


Levi, I would only use a specific pattern if it made my life (and those that are responsible for maintaining the code) easier. I believe that proper design pattern selection is just as important as comments - maybe comments are more important? You'll find that design patterns, for the most part, formally describe techniques that we as developers have used for years.

TheEruditeTroglodyte


Any pattern that is misunderstood or misapplied will tend to make the code worse, not better. The same goes for any pattern applied just for the sake of the pattern, e.g. "We're SOA now!". Generally, if a pattern introduces a "code smell", then its use should be suspect.


I'd say whenever you choose a pattern based on solving one problem, and not all of the problems your software intends to solve.

Conversely, trying to over-engineer a solution to fit all possible eventual use cases can lead you to using a design pattern that is too bulky to fit the majority of your use cases.

So the ideal pattern/level of adherence to that pattern would be something that falls between the two extremes.


I can think of a lot of ways a design pattern can go wrong. The GoF book explains the problem, solution, and drawbacks for every pattern it covers:

  • A pattern is a solution to a problem. If that problem is not your problem, then it's probably going to become an anti-pattern.
  • A pattern has drawbacks, and you should be aware of them.
  • A pattern may imply a lot of classes, and you may be trying to solve a problem you might have in the future but don't have yet.
  • If you don't understand how a pattern works and you implement it wrong, it becomes an anti-pattern as well. You're going to end up with a lot of useless classes and code.
  • When you start to get addicted to patterns, hierarchies that are too deep, bad use of polymorphism or singletons, etc. can pose a problem.

There are probably other reasons as well...


When people analyze the code and spend more time trying to understand the pattern than the business logic. Patterns are good, but only if everyone involved knows how to read them.
