Bleeding edge vs field tested technology. How will you strike a balance [closed]
I have been pondering this for some time. How do you pick a technology (I am not talking about Java vs .Net vs PHP) when you are planning a new project or maintaining an existing project in an organization?
Arguments for picking the latest technology
- It might overcome some of the limitations of the existing technology (think NoSQL vs RDBMS when it comes to scalability). Sometimes the latest technology is backward compatible, so you only gain the new features without breaking the old functionality
- It will give a better user experience (maybe HTML5 for video, just a thought)
- It will cut down development time/cost and make maintenance of the code base relatively easy
Arguments for picking field tested technology/against picking a bleeding edge technology
- It has not stood the test of time. There can be unforeseen problems; convoluted solutions might lead to more problems during the maintenance phase, and the application might become a white elephant
- Standards might not yet be in place. Standards might change, and significant rework might be needed to make the project adhere to them. Choosing the field tested technology saves this effort
- The new technology might not be supported by the organization. Supporting a new (or, for that matter, a different) technology would require additional resources
- It might be hard to find qualified people for a bleeding edge technology
From a developer's perspective, I do not see a reason not to get your hands dirty with some new technology (in your spare time), though you might be limited to open source/freeware/developer editions.
From an organization's perspective, it looks like a double-edged sword. Sit too long with a "field tested" technology and good people might move away (not to mention that there will always be people who prefer familiar technology and refuse to update their knowledge). Try an unconventional approach and you risk overrunning the budget/schedule, not to mention the unforeseen risks.
TL;DR
Bottom line: when do you consider a technology mature enough that it can be adopted by an organization?
Most likely you work with a team of people and this should be taken into consideration as well. Some ways to test/evaluate technology maturity:
Is your team and management receptive to using new technology at all? This might be your biggest barrier. If you get the sense that they aren't receptive, you can try big formal presentations to convince them... or just go try it out (see below).
Do some Googling around for problems people have with it. If you don't find much, that is also all the help you will find when you run into problems yourself.
Find some new small project with very low risk (e.g. something just you or a couple of people would use) and apply the new technology in skunkworks fashion to see how it pans out.
Try to find the most mature of the immature. For instance, if you're thinking about a NoSQL type of data store: all of the NoSQL options are immature compared with an RDBMS like Oracle that has been around for decades, so look at the most mature of those options, one with support available, whether from a commercial vendor or from user groups.
The easiest project to start with is re-writing an existing piece of software. You already have your requirements: make it just like that. Just pick a small piece of the software to re-write in the new technology, preferably something you can hammer at with unit/load testing to see how it stands up (see the sketch below). I'm not advocating re-writing an entire application to prove it out, just a small, measurable chunk.
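As a purely hypothetical illustration of that last point, here is a minimal sketch of what "hammer at it with unit/load testing" could look like when the rewritten chunk is a small function you can call side by side with the legacy version. legacy_lookup, rewritten_lookup and hammer are made-up placeholders, not anyone's real API:

    import random
    import string
    import time

    def legacy_lookup(key, store):
        # Placeholder for the existing, field tested implementation.
        return store.get(key)

    def rewritten_lookup(key, store):
        # Placeholder for the same behaviour re-implemented on the new technology.
        return store.get(key)

    def hammer(fn, store, keys, iterations=100_000):
        # Call fn repeatedly with random keys and return the elapsed time in seconds.
        start = time.perf_counter()
        for _ in range(iterations):
            fn(random.choice(keys), store)
        return time.perf_counter() - start

    if __name__ == "__main__":
        keys = ["".join(random.choices(string.ascii_lowercase, k=8)) for _ in range(1_000)]
        store = {k: i for i, k in enumerate(keys)}

        # Correctness first: both implementations must agree on every key.
        assert all(legacy_lookup(k, store) == rewritten_lookup(k, store) for k in keys)

        # Then a crude load comparison.
        print("legacy   :", round(hammer(legacy_lookup, store, keys), 3), "s")
        print("rewritten:", round(hammer(rewritten_lookup, store, keys), 3), "s")

The point is only to get a small, measurable, apples-to-apples comparison (correctness first, then behaviour under load) before committing the whole project to the new technology.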
A few rules of thumb.
Only use one "new" technology at a time. The more new things you use, the greater chance of there being a serious problem.
Make sure there is an advantage to using it. If that cool, new technology does not give you some advantage, why are you using it?
Plan for the learning curve. There will be aspects of the new technology that you do not know about. You and your team will have to spend more time learning them than you think you will.
If possible, try the new technology on a small, less important project first. Your company's accounting system is not the best place to experiment.
Have a backup plan. New technologies don't always turn out to be worth it. Know when you are in a "coffin corner" and it is time to bail out.
There's a difference between "field tested" and "out-of-date." Developers (myself included) generally prefer bleeding edge stuff. To some extent, you have to keep your development staff happy and interested in their jobs.
But I've never had a customer unhappy with field tested technology. They are generally unaware or unconcerned about the technology that is used in producing a product. Their number one priority is how it works in their daily interactions with it.
When starting a new project, two questions come to mind when evaluating whether I should move to a new platform:
1) What benefits do I get from going to the new platform? If it offers me dramatically reduced development time or significant performance increases for the users, I will consider a semi-bleeding edge technology.
2) What risks are associated with the new platform? Is it likely that there are some scenarios I will encounter that aren't quite worked out in the new platform? Is it likely that support for this new platform will fizzle out and I'll be left holding the bag on supporting a deprecated environment? Are there support channels in place that I can use if I get stuck at a critical juncture of my project?
Like everything, it's a cost/benefit analysis. Generally speaking, while I always learn and train on new technologies, I won't build something for a client using a technology (environment, library, server platform, etc) that hasn't been widely adopted by a large number of developers for at least 6-12 months.
That depends on the context. Each organisation has to make its own decisions. The classic literature on this topic is Crossing the Chasm by Geoffrey A. Moore.
If the company/community developing the product is known for good work, then I'm quite happy to place a safe bet on their new products.
For example I would be quite happy to develop on Rails 3 or Ruby 1.9, as I am quite sure they will be fine when finalized.
However, I wouldn't write much code in superNewLang until I was convinced that it was a great, well supported product, or that it had a feature I couldn't live without.
I will tend to get the most trusted product that suits all my needs.
- You've got to ask yourself only one question... do I feel lucky?
- Where's the money?
  - Do you get to profit big and fast enough even if tech X is a flop?
  - If not, does the new tech bring higher performance for a long time?
    - Like 64-bit CPUs, Shader Model 4, heavy multi-threading
- Do you see a lot of ideological trumpeting around it?
  - "paradigm shift" blurbs etc. - wait 2-8 yr till it cools off and gets replaced :-)
- Is it bulky and does it require 2x of everything just to run?
  - Let your enemy pay for it first :-)
- Can you just get some basic education and a trial project without risking anything?
  - Might as well try, unless it looks like a 400 lb lady who doesn't sing :-)
- There is no general answer to such a question - please go to #1