Is hard drive RPM really significant for the Visual Studio development experience?
Check to see where your bottleneck lies. Press Ctrl+Shift+Esc to bring up Task Manager and watch the Performance tab. If your computer is very slow during a compile but CPU usage is low, you are probably HD (IO) bound. If the CPU is at 100%, you are likely CPU bound.
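If you want something more repeatable than eyeballing Task Manager, here is a rough sketch (assuming Python with the third-party psutil package installed) that samples CPU and disk activity while a build runs; low CPU alongside climbing disk counters points at an IO bottleneck:

    import psutil

    # Sample once a second for ~30 seconds while the build is running.
    for _ in range(30):
        cpu = psutil.cpu_percent(interval=1)  # % CPU over the last second
        io = psutil.disk_io_counters()        # cumulative disk read/write stats
        print("CPU %5.1f%%  reads %d  writes %d"
              % (cpu, io.read_count, io.write_count))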
If the HD is your bottleneck in VS development, the two most likely culprits are the HD speed itself or a lack of memory (with the resulting excess paging and poor HD caching).
For better HD performance, you can get a fast, reliable SSD such as one of the Intel drives, or a Western Digital VelociRaptor. The VelociRaptor is about $230 for 300GB and the Intel SSD is around $500 for 160GB. Either one should be large enough to handle most development projects.
For RAM issues, buy as much RAM as you can afford without exceeding what your OS can address (32-bit Windows can usually see only about 3GB). Windows will use the extra RAM for HD caching, and it will also avoid the paging to disk that would happen on a RAM-limited machine. If possible, run a 64-bit version of Windows with 6 or 8 GB of RAM. You can still use VS to compile and build 32-bit projects under Win64, and the OS will do a much better job of HD caching.
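To check whether you are actually RAM limited, a quick sketch along the same lines (again assuming the third-party psutil package) compares RAM and swap usage; heavy swap use alongside nearly full RAM is a sign that paging is hurting your builds:

    import psutil

    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()
    # High vm.percent together with nonzero swap usage suggests paging.
    print("RAM used:  %.0f%% of %.1f GB" % (vm.percent, vm.total / 2**30))
    print("Swap used: %.0f%% of %.1f GB" % (sw.percent, sw.total / 2**30))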
It depends... if you have a really slow hard drive, it will be a problem.
Take a look at: http://weblogs.asp.net/scottgu/archive/2007/11/01/tip-trick-hard-drive-speed-and-visual-studio-performance.aspx
Get an SSD. You will not be disappointed.
VS2003 with Visual SourceSafe was crazy sluggish, with lots of disk hits. I currently use VS2005, which still seems a little more sluggish than I would like, but it is a marked improvement. I have also noticed better performance since we switched from Visual SourceSafe to Team Foundation Server. Hope I don't sound like a M$ hater here, but I have come to expect bloat from Microsoft.
What matters most is how quickly the hard drive can access data (usually called the access time). The faster your drive can reach the hundreds of tiny files the compiler touches as it builds your project, the better. Higher RPMs improve the access time a bit, but not as much as switching to drives that use different mechanisms and interfaces.
Unfortunately, most hard drive manufacturers don't publish the average access time. Instead, they promote the seek time (the time it takes to move the read/write head from track to track) and the latency (the time it takes the platters to rotate 1/2 revolution). The access time should be roughly equal to seek+latency.
Most consumer 3.5" SATA drives have a seek time of 8-12 ms, a figure which has remained relatively unchanged even as the RPMs went from 5400 to 7200.
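To put numbers on that: rotational latency is half a revolution, so latency_ms = 0.5 * 60000 / RPM, and access time is roughly seek + latency. A back-of-the-envelope sketch in Python (the 9 ms seek is just an assumed mid-range consumer figure from the range above):

    # latency_ms = 0.5 * 60000 / rpm; access time ~= seek + latency
    SEEK_MS = 9.0  # assumed typical consumer-drive seek time
    for rpm in (5400, 7200, 10000, 15000):
        latency_ms = 0.5 * 60000.0 / rpm
        print("%5d RPM: latency %.1f ms, access ~%.1f ms"
              % (rpm, latency_ms, SEEK_MS + latency_ms))

Going from 5400 to 7200 RPM shaves only about 1.4 ms off a ~14 ms access time, which is why seek time and drive class matter more than RPM alone.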
By contrast, SCSI and SAS drives have seek times of 3.5-4.5 ms. They've always been faster than SATA/ATA drives, due to their more robust actuators and better interfaces.
SSDs have near-zero access time.
You will likely not notice a huge difference between RPMs, but if you can install a second hard drive, run VS and your project files off of it so Windows is freer to hit its system and paging files on the main drive.
I'm sure everyone knows this already, but what the heck... fragmentation is evil. It's surprising how many developer systems have badly fragmented NTFS drives. Adding a nightly scheduled defrag task is an easy way to mitigate this, at least on XP systems.
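For the scheduled task itself, here is a minimal sketch (Python calling the Windows schtasks command; the task name and 2 AM start time are placeholders, and on XP schtasks may also want a /ru user and an HH:MM:SS start time):

    import subprocess

    # Register a daily defrag of C: with the Windows task scheduler.
    subprocess.check_call([
        "schtasks", "/create",
        "/tn", "NightlyDefrag",   # task name (placeholder)
        "/tr", "defrag.exe C:",   # command the task runs
        "/sc", "daily",           # schedule: every day
        "/st", "02:00",           # start time
    ])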
Also, when purchasing a drive, another reason to get a really big one is to avoid filling the drive up to the point where defragging starts to fail.
Personally I'm a fan of Jeff's suggestion: use an SSD for the boot drive and key applications, and a more cost-effective drive for actual data storage. I'm going to implement this myself once I eventually find some time to back up data and pull apart hardware.