How to evaluate "enterprise" platforms?
I'm tasked with evaluating an "enterprise" platform for the next-gen version of a product. We're currently considering two "types" of platforms - RAD (workflow engine, integrated UI, small cores of "technology plugins" to the workflows, automatic persisting of state...) like SalesForce.com / Service-Now.com, and "cloud based" (EC2 / AppEngine...).
While I have a few ideas on where to start, I'd like your opinions - how would you evaluate platforms for an enterprise suite of products? What factors would you consider? How would you eliminate weak options quickly enough to be able to concentrate on the few strong ones?
Also interesting: how would you compare enterprise RAD (proven technology, quick to develop, but tends to look "the same as the competition") to cloud-based technology (lots of "buzz", not that many competitors, easy to justify to management, but probably lacking (?) enterprise tooling and experience)?
As I said, I have a few ideas, but I'd like to see some answers before I post mine, so I don't steer the discussion in a particular direction.
RB.
Creating a software evaluation scorecard will go a long way toward a technology assessment of your enterprise solutions.
Optionally, you can add a weight to each criterion and have multiple people fill the scorecard in, so you can take an average and get a less biased analysis.
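For illustration, here is a minimal sketch of such a weighted, multi-evaluator scorecard in Python; the criteria, weights, and scores are invented examples, not recommendations:

```python
# Hypothetical criteria and weights (higher weight = more important).
criteria_weights = {
    "security": 3,
    "integration": 2,
    "usability": 2,
    "licensing": 1,
}

# Each evaluator rates each platform per criterion (say, 1-5).
evaluations = {
    "Platform A": [
        {"security": 4, "integration": 3, "usability": 5, "licensing": 2},
        {"security": 5, "integration": 3, "usability": 4, "licensing": 3},
    ],
    "Platform B": [
        {"security": 3, "integration": 5, "usability": 3, "licensing": 4},
        {"security": 3, "integration": 4, "usability": 3, "licensing": 5},
    ],
}

def weighted_score(scorecards, weights):
    """Average each criterion across evaluators, then apply the weights."""
    total_weight = sum(weights.values())
    avg = {c: sum(s[c] for s in scorecards) / len(scorecards) for c in weights}
    return sum(avg[c] * w for c, w in weights.items()) / total_weight

for platform, scorecards in evaluations.items():
    print(platform, round(weighted_score(scorecards, criteria_weights), 2))
```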
--- Adding more info to help with evaluation criteria ---
There are a variety of research and information sources on which you can base your evaluation criteria. These resources can provide up-to-date information to use when performing a technology assessment of your enterprise platforms.
- Industry-focused groups
- Technology/industry publications (e.g., InformationWeek and CIO magazine)
- Technology vendors
- Peer companies
Technology industry-focused web sites:
- CIO Magazine: www.cio.com
- CMP Media: www.techweb.com
- Computer Business Review: www.cbronline.com
- E-Week: www.eweek.com
- Tech Republic: www.techrepublic.com
- Darwin magazine: www.darwinmag.com
- Analyst Views: www.analystviews.com
- Venture Beat: www.venturebeat.com
End-user web logs:
- Slashdot.org
- Techdirt.com
- techrepublic.com
- hardocp.com
- anandtech.com
- zdnet.com
Technology research firms:
- Aberdeen Group
- AMR Research
- Forrester Research
- Gartner Group
- Giga Information Group
- International Data Corp (IDC)
- META Group
CodeToGlory is right - a "scorecard" is useful, but the big trick to get right is the evaluation criteria that the scorecard is based on.
Generic scorecards are probably a good place to start if you have nothing else; in any case, you need to carefully consider your own position and context.
If you have an Enterprise Architect (or someone who fills that role), they are the person to talk to; ideally your evaluation criteria will align with what they are doing.
TOGAF and Zachman are both Enterprise Architecture Frameworks that you might find useful.
Once you have your criteria, you might also want to weight them; it's quite likely you'll do this subconsciously anyway.
The basic idea is that you want to look at candidates from different viewpoints; a solution that is strong in one will probably be weak in another, and by closely examining the views that are relevant to you, you'll be able to make an informed decision (a small view-by-view comparison sketch follows the list below).
Examples of views / viewpoints to consider:
- Security
- Data
- Integration
- Usability (for end users, support staff, developers, etc.)
- The size of the local community (is there strong vendor support, and is that relevant to you?)
- Operational view (how easy will it be to support in an operational sense?)
- Licensing, etc.
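As a rough illustration of comparing candidates view by view rather than by a single aggregate number, here is a small Python sketch; the candidate names and ratings are purely hypothetical:

```python
# Views to compare on; ratings are hypothetical 1-5 scores per view.
views = ["security", "data", "integration", "usability", "community",
         "operations", "licensing"]

scores = {
    "RAD platform":   {"security": 4, "data": 4, "integration": 3,
                       "usability": 4, "community": 2, "operations": 4,
                       "licensing": 2},
    "Cloud platform": {"security": 3, "data": 3, "integration": 4,
                       "usability": 3, "community": 4, "operations": 3,
                       "licensing": 4},
}

# For each view, show which candidate leads - the strengths and
# weaknesses become visible instead of being averaged away.
for view in views:
    leader = max(scores, key=lambda c: scores[c][view])
    detail = ", ".join(f"{c}={scores[c][view]}" for c in scores)
    print(f"{view:12} leader: {leader:15} ({detail})")
```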
Yes, the other two answers are good starting points to build on.
Step 1. The way I normally do this is a "high level solutions appraisal".
| High Level Requirement*   | System A | System B | System C | MoSCoW Rank |
|---------------------------|----------|----------|----------|-------------|
| 1. Must enable multi-user |          |          |          |             |
| 2. Single sign-on         |          |          |          |             |
| 3. Development tools/IDE  |          |          |          |             |
| 4. Deployment tools       |          |          |          |             |
| 5. Commercial licensing   |          |          |          |             |
| 6. Support layering       |          |          |          |             |
| TOTAL                     |          |          |          |             |

*Example requirements

MoSCoW = Must do: score 300 - Should do: score 200 - Could do: score 100
You can then total these up to get a high-level "score" for each system.
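A small Python sketch of this scoring, assuming a system earns the full MoSCoW score for each requirement it satisfies; the requirements and tick marks below are hypothetical:

```python
# MoSCoW scores as given above.
MOSCOW = {"Must": 300, "Should": 200, "Could": 100}

requirements = [  # (requirement, MoSCoW rank) - example ranks only
    ("Multi-user",            "Must"),
    ("Single sign-on",        "Must"),
    ("Development tools/IDE", "Should"),
    ("Deployment tools",      "Should"),
    ("Commercial licensing",  "Could"),
    ("Support layering",      "Could"),
]

meets = {  # which requirements each system satisfies (hypothetical)
    "System A": {"Multi-user", "Single sign-on", "Deployment tools"},
    "System B": {"Multi-user", "Development tools/IDE", "Commercial licensing"},
    "System C": {"Single sign-on", "Support layering"},
}

for system, satisfied in meets.items():
    total = sum(MOSCOW[rank] for req, rank in requirements if req in satisfied)
    print(f"{system}: {total}")
```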
Step 2.
Gap Analysis
| High Level Requirement*   | System A | System B | System C |
|---------------------------|----------|----------|----------|
| 1. Must enable multi-user |          |          |          |
| 2. Single sign-on         |          |          |          |
| 3. Development tools/IDE  |          |          |          |
| 4. Deployment tools       |          |          |          |
| 5. Commercial licensing   |          |          |          |
| 6. Support layering       |          |          |          |
| TOTAL                     |          |          |          |
Simple tick boxes: count the ticks per system - more ticks means fewer gaps.
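In code, Step 2 reduces to counting ticks; the tick marks here are hypothetical:

```python
# One boolean per requirement, in the order of the table above.
ticks = {
    "System A": [True, True, False, True, False, True],
    "System B": [True, False, True, True, True, False],
    "System C": [False, True, False, False, True, True],
}

for system, marks in ticks.items():
    print(f"{system}: {sum(marks)} ticks, {marks.count(False)} gaps")
```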
Step 3. Trade-off

| High Level Requirement*   | System A | System B | System C |
|---------------------------|----------|----------|----------|
| 1. Must enable multi-user |          |          |          |
| 2. Single sign-on         |          |          |          |
| 3. Development tools/IDE  |          |          |          |
| 4. Deployment tools       |          |          |          |
| 5. Commercial licensing   |          |          |          |
| 6. Support layering       |          |          |          |
| TOTAL                     |          |          |          |
For each requirement, compare the systems pairwise: is the gap between A and B on requirement 1 >= the gap to C? Is B >= A on requirement 1? And so on - this shows you where one system's weakness is offset by strength elsewhere.
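One way to mechanise the trade-off is to compare per-requirement scores pairwise, as in this sketch; the scores are hypothetical, carried over from the Step 1 example:

```python
from itertools import combinations

requirements = ["Multi-user", "Single sign-on", "Dev tools/IDE",
                "Deployment tools", "Licensing", "Support layering"]

scores = {  # per-requirement MoSCoW scores from Step 1 (hypothetical)
    "System A": [300, 300, 0, 200, 0, 100],
    "System B": [300, 0, 200, 200, 100, 0],
    "System C": [0, 300, 0, 0, 100, 100],
}

# For every pair of systems, report who leads on each requirement.
for a, b in combinations(scores, 2):
    for req, sa, sb in zip(requirements, scores[a], scores[b]):
        if sa != sb:
            better = a if sa > sb else b
            print(f"{req}: {better} leads by {abs(sa - sb)}")
```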