How often should applications be stress or load tested?
Is there a rule for how often an application should be stress or load tested? I normally do it before putting a new version into production, when the hardware changes, or when the expected number of users is known to change.
But today I was asked whether this should be a standard practice for an application that is already in production, even if no changes are introduced. If so, how often?
It really depends on how you want to address it for your company's needs. Personally, we load test our integration (test) builds daily, just as the builds go out. After the build runs at approximately 1 a.m., it is scripted to be load tested as well. Our goal is specifically to look for build-over-build changes in performance. Even if we do not introduce changes into the code, the servers the code is load tested on still receive updates/patches/hotfixes/service packs/etc. At worst, once automated, it provides additional historical data.
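A rough sketch of what such a scripted run can look like, in plain Java (the integration-server URL, user count, and request count here are made up, not anything from the original post): it replays a fixed amount of concurrent traffic against the freshly deployed build and prints a few latency figures to append to the build's history.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class NightlyLoadRun {
        public static void main(String[] args) throws Exception {
            // Hypothetical endpoint on the freshly deployed integration build.
            String target = "http://integration-server/health-check";
            int concurrentUsers = 25;
            int requestsPerUser = 40;

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(target)).GET().build();

            // Each simulated "user" fires a batch of requests and reports its latencies.
            Callable<List<Long>> user = () -> {
                List<Long> latencies = new ArrayList<>();
                for (int i = 0; i < requestsPerUser; i++) {
                    long start = System.nanoTime();
                    client.send(request, HttpResponse.BodyHandlers.discarding());
                    latencies.add((System.nanoTime() - start) / 1_000_000);
                }
                return latencies;
            };

            ExecutorService pool = Executors.newFixedThreadPool(concurrentUsers);
            List<Future<List<Long>>> futures = new ArrayList<>();
            for (int i = 0; i < concurrentUsers; i++) {
                futures.add(pool.submit(user));
            }

            // Collect every latency sample from all users.
            List<Long> all = new ArrayList<>();
            for (Future<List<Long>> f : futures) {
                all.addAll(f.get());
            }
            pool.shutdown();

            Collections.sort(all);
            // Append these figures to a history file so build-over-build drift stands out.
            System.out.printf("requests=%d median=%dms p95=%dms max=%dms%n",
                    all.size(),
                    all.get(all.size() / 2),
                    all.get((int) (all.size() * 0.95)),
                    all.get(all.size() - 1));
        }
    }

The absolute numbers only mean something relative to the previous night's run on the same lab hardware, which is exactly the build-relativity idea described above.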
We are going this route (build relativity) because it is cost-prohibitive to try to replicate our production hardware environment. In the event that we see a sudden change (or gradual changes) in key performance monitors, we can look at which changesets were introduced at that time and isolate potential code changes that adversely impacted performance.
From the sound of it, you are testing against a lab that replicates production? That is a different approach than ours, because we work under the assumption that most of our bottlenecks would be code-induced and not directly dependent on hardware. We use VMs to approximate, but not duplicate, our production environment.
One thing that affects system performance, even though the code is unchanged, is data.
An example is the performance of a database query. As data is added to a table, the cost of maintaining its indexes goes up. Page splits in an index can degrade performance. As an index grows, the number of 'levels' in it must every so often be increased; when that happens you see a sudden, apparently inexplicable change in performance.
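A rough way to see the effect for yourself, assuming an in-memory H2 database on the classpath (the table, row counts, and customer values are arbitrary): the query and the code never change, only the amount of data behind the index does, yet the timings drift.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;

    public class DataGrowthDemo {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection("jdbc:h2:mem:demo")) {
                try (Statement st = con.createStatement()) {
                    st.execute("CREATE TABLE orders(id INT PRIMARY KEY, customer INT)");
                    st.execute("CREATE INDEX idx_customer ON orders(customer)");
                }
                PreparedStatement insert =
                        con.prepareStatement("INSERT INTO orders VALUES (?, ?)");
                PreparedStatement query =
                        con.prepareStatement("SELECT COUNT(*) FROM orders WHERE customer = ?");

                int id = 0;
                for (int batch = 1; batch <= 5; batch++) {
                    // Grow the table (and its index) by another 200,000 rows.
                    for (int i = 0; i < 200_000; i++) {
                        insert.setInt(1, id++);
                        insert.setInt(2, id % 1000);
                        insert.addBatch();
                    }
                    insert.executeBatch();

                    // Same query, same code - only the data volume has changed.
                    long start = System.nanoTime();
                    query.setInt(1, 42);
                    query.executeQuery().close();
                    System.out.printf("rows=%,d query=%d us%n",
                            id, (System.nanoTime() - start) / 1_000);
                }
            }
        }
    }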
Running stress tests in a production environment is not always possible - it affects your day-to-day business. More often, systems are instrumented to provide ongoing feedback about performance, perhaps using something like Ganglia. The data are used to detect issues and for capacity planning.
I think that whenever you change something in the application - code, data files, use cases - and that includes, but is not limited to, the expected number of users, you should test it.
My two cents: sometimes you won't have changed anything and your site's performance could still suffer. It could be the app handling too much data (e.g. caches overflowing). It could be third-party advertisements on the site slowing things down. Heck, it could be faulty RAM!
The main thing is that while it's generally advised to test after any known change, it's also not a bad idea to run occasional performance tests to check for unknown changes that could affect performance.
My company, BrowserMob, provides a free website monitoring service that runs real Selenium scripts every X minutes against your site. While it's not a load test, it definitely can help you identify trends and bottlenecks in your production site.
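For anyone who wants to roll something similar by hand, here is a minimal sketch of the idea (not BrowserMob's service itself): a Selenium WebDriver script against a placeholder URL that logs how long a key page takes to load, rerun on whatever schedule you like (cron, CI, etc.). It assumes the Selenium Java bindings and a ChromeDriver install.

    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class SyntheticCheck {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                long start = System.nanoTime();
                driver.get("https://www.example.com/");   // placeholder page to monitor
                long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                // Append to a log; rerunning every X minutes builds up a trend line.
                System.out.printf("%tF %<tT title=%s loadMs=%d%n",
                        new java.util.Date(), driver.getTitle(), elapsedMs);
            } finally {
                driver.quit();
            }
        }
    }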
This depends on how mission-critical your system is. If it is just a small tool that people can do without: once, when you put it into production. If your life depends on it: after every single build.
As far as I'm concerned, that is.
Depends on how much effort the testing requires. If it is easy enough, more testing never hurts. However, I see no reason to test if no changes are expected. If that would leave the software untested for a long time, then it might be appropriate to run the tests from time to time in case there are unexpected changes.
Needless to say, it also depends on whether your software is running a nuclear reactor or a bulletin board.
If no changes are introduced in your product and the load test simply repeats the same process over and over at some interval, I don't see the benefit of re-running it.
I used to have stress tests as part of my Ant file, so I could run them every night if I wanted. I would also run them whenever I was making a change, or testing a possible new change, that would in one way or another affect what I was testing, to see if there was any improvement.
I think how often would depend on your environment.
If you can't really stress test in development then you may have to wait until you get to QA, where you are testing just before you go to production.
I think the sooner you do it the better, as you can find and fix problems faster: if it worked two nights ago but failed last night's test, then some change made during that day caused the failure.
I used JUnitPerf for many of my stress tests; a minimal example is sketched below.
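For reference, a minimal sketch of how a JUnitPerf stress test can be wired up (the class, method, and scenario here are hypothetical, not from the original answer); the resulting suite is just another JUnit test, so an Ant junit task can run it nightly alongside the build.

    import com.clarkware.junitperf.LoadTest;
    import com.clarkware.junitperf.TimedTest;
    import junit.framework.Test;
    import junit.framework.TestCase;
    import junit.framework.TestSuite;

    public class SearchStressTest extends TestCase {

        public SearchStressTest(String name) {
            super(name);
        }

        // The ordinary functional test; the stress wrappers below reuse it.
        public void testSearch() {
            // call the code under test here, e.g. a hypothetical searchService.search("foo")
        }

        public static Test suite() {
            Test single = new SearchStressTest("testSearch");
            // 20 concurrent users, each running the test 5 times...
            Test load = new LoadTest(single, 20, 5);
            // ...and the whole run has to finish within 10 seconds.
            Test timed = new TimedTest(load, 10000);
            TestSuite suite = new TestSuite();
            suite.addTest(timed);
            return suite;
        }
    }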
I think we don't want to stress test Notepad. It's a joke, never mind =).
Stress testing is not for all applications. :-)
If your software can kill someone, I think you should.
Imagine someone dying because the blood didn't arrive due to some timeout in the system.
"TimeoutException: your heart is not coming anymore. Try again later."