Testing a Java library using 100 - 1000 GB of memory
I have an open source library which is designed to support very large collections efficiently. I have tested it on my PC with 24 GB, but I would like to test it on a much larger system, up to 1 TB. Most hosting solutions don't support memory sizes like this, and I only need access for brief periods of time.
What I have tested is a collection of 500 million objects with 12 fields each; the full GC time is below 0.11 seconds. I have another test where it stores 128 billion elements of just one bit each.
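For context, a minimal sketch of the kind of one-bit-per-element test described above (the class and method names here are hypothetical, not the library's actual API): a packed bit collection backed by chunked long[] arrays, so the element count can exceed Integer.MAX_VALUE while the GC sees only a handful of large, long-lived arrays.

```java
// Sketch of a chunked bit store: each element is one bit, packed 64 per long,
// with a two-level array so capacity can exceed a single array's 2^31 limit.
public class BitStoreDemo {
    static final int CHUNK_BITS = 1 << 26;           // bits per chunk (64 Mi)
    static final int WORDS_PER_CHUNK = CHUNK_BITS / 64;

    final long[][] chunks;

    BitStoreDemo(long capacityBits) {
        int nChunks = (int) ((capacityBits + CHUNK_BITS - 1) / CHUNK_BITS);
        chunks = new long[nChunks][WORDS_PER_CHUNK];
    }

    void set(long index) {
        chunks[(int) (index / CHUNK_BITS)][(int) ((index % CHUNK_BITS) / 64)]
                |= 1L << (index % 64);
    }

    boolean get(long index) {
        return (chunks[(int) (index / CHUNK_BITS)][(int) ((index % CHUNK_BITS) / 64)]
                & (1L << (index % 64))) != 0;
    }

    public static void main(String[] args) {
        // Small capacity here; scale capacityBits into the billions on a big box.
        BitStoreDemo bits = new BitStoreDemo(1L << 27);
        bits.set(0);
        bits.set(100_000_000L);
        long t0 = System.nanoTime();
        System.gc();  // rough probe of full-GC pause for this heap shape
        long gcMillis = (System.nanoTime() - t0) / 1_000_000;
        System.out.println("get(0)=" + bits.get(0) + ", full GC took ~" + gcMillis + " ms");
    }
}
```

Because the bits live in a few large primitive arrays rather than millions of small objects, the GC has almost nothing to trace, which is what keeps full-GC pauses low even at very large heap sizes.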
The library/test is small, so I don't require much other than a lot of main memory.
Do you have any suggestions on how I could do this testing without buying my own server with 96 or 192 GB?
EC2 has high-memory instances with up to 68.4 GB each, and they charge by the hour. Granted, that is not 100 GB of memory, but if you stack a few of them together...
Contact Contegix -- they may be able to help you for free, since you have an open source library you need to test. I reached out to them recently with the same need; they followed up within a day via email, called the day after, and were very open to the possibility of using large amounts of memory spanned across several servers for scale-out testing.
Their Advocates for Innovation page describes who they already help. Just fill out their contact form at http://www.contegix.com/contact/ and they should get back to you soon.
Perhaps you could use a "virtualized" JVM like Zing -- it has a maximum heap size of 512 GB. Maybe if you contact them they will let you perform this test for free -- it would be a great showcase for their garbage collector.
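Whatever JVM you end up on, a launch line along these lines is a reasonable starting point for a large-heap run (the jar and class names are placeholders). Setting -Xms equal to -Xmx together with -XX:+AlwaysPreTouch makes the JVM commit the whole heap up front, so you fail fast if the machine can't actually back it:

```shell
# Hypothetical large-heap launch (HotSpot-style flags; Zing accepts -Xms/-Xmx too).
java -Xms512g -Xmx512g \
     -XX:+AlwaysPreTouch \
     -Xlog:gc \
     -cp mylib-tests.jar com.example.HugeCollectionTest
```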
You should take a look at Amazon EC2 or Google App Engine.