
Contractual Performance metrics?
We're getting into a unique scenario wherein prospective clients are looking for SPECIFIC performance metrics of our application (Java, XML/XSL-based, JDBC/JNDI services) under load for specific platforms. For instance, we're being pressed to commit to certain percentiles under load for specific application servers (WebSphere 4.x, for example) on AIX. Moreover, we're increasingly being pressed to commit to these metrics within the context of the legality of a Software License Agreement.
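To be concrete about what "percentiles under load" means in these requests: take the per-request response times from a load run, sort them, and report, say, the p95/p99 values. A minimal sketch using the nearest-rank method; the class name and sample numbers are illustrative only, not our harness:

    // Minimal sketch: nearest-rank percentiles over response times
    // gathered from a load run. Sample values are made up.
    import java.util.Arrays;

    public class PercentileReport {
        // p-th percentile (0 < p <= 100) of an already-sorted array
        static long percentile(long[] sortedMillis, double p) {
            int idx = (int) Math.ceil((p / 100.0) * sortedMillis.length) - 1;
            return sortedMillis[Math.max(idx, 0)];
        }

        public static void main(String[] args) {
            // pretend these came from a load run against the app
            long[] responseMillis = { 120, 85, 430, 95, 2010, 150, 88, 310 };
            Arrays.sort(responseMillis);
            System.out.println("p50 = " + percentile(responseMillis, 50) + " ms");
            System.out.println("p95 = " + percentile(responseMillis, 95) + " ms");
            System.out.println("p99 = " + percentile(responseMillis, 99) + " ms");
        }
    }

The trouble, of course, is that a client wants us to guarantee those p95/p99 numbers on hardware and middleware we don't control.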
As you might imagine, we cannot possibly test for specific benchmarking metrics for all possible combinations of OS, application servers, database types/versions, etc. Yet we're continually pressured by prospective clients (and subsequently, Senior Officers who know nothing of the mercurial nature of load testing) to draft contractual agreements that outline a hard commitment to benchmarking metrics under specific environments.
Any help or opinions would be appreciated.
adk
Tuesday, January 27, 2004
It is impossible to know how any system will perform without testing it.
However, a few years ago IBM (you mentioned AIX) had, I believe, some capacity-planning metrics available. You could plug your criteria into their model, specify the desired load, and out would pop the "answer".
Of course, it's only a guess, but it is an informed guess. I suspect it was being used to decide how big a server to buy, but it may be useful.
Could be a good time to spend a bit of time talking to an IBM consultant. Even better if you can get a written report from them that says that, for your kind of application, server, infrastructure, load, etc., it should work. This will be useful evidence in any pending PI/E&O case, should the worst happen.
Steve Jones (UK)
Tuesday, January 27, 2004
If yours is a Web-based application (and it sounds like one), then it is generally a good idea to have some performance metrics at hand. Clients will definitely want to know how well the system scales under load (basically, how response time is affected as the load on the server increases), what the ideal hardware configuration should be to support, say, x simultaneous users, etc.
Check-out:
http://www.softwareqatest.com/qatweb1.html
Some points to consider:
1. Most of the tools are quite automated, and you should not have to spend much time taking the readings. Once you have the framework for taking performance readings ready (how many users, for how long, with what test data), it's only a matter of setting up the environment and taking the readings on it (there's a bare-bones sketch of such a framework after this list).
2. You do not need to load-test all the supported operations. You should have data only for the main (frequently used) operations.
3. It is not necessary to take readings for all the platforms available in the market. While testing, you must have used some platforms as a sample base; take readings only on those (if you've already got them set up, taking readings is no big deal). As and when a client requests readings for a specific platform you have not tested on, you can take them at that time.
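To make point 1 concrete, here is roughly what such a framework parameterizes underneath whatever tool you use: N simulated users hitting the application concurrently for a fixed duration while response times are recorded. This is an illustrative sketch only; the URL, user count, and duration are hypothetical placeholders, and a real tool gives you ramp-up, think times, error accounting, and reporting on top:

    // Illustrative sketch only: N simulated users hammering one URL for a
    // fixed duration, recording per-request response times.
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class MiniLoadTest {
        static final List times = Collections.synchronizedList(new ArrayList());

        public static void main(String[] args) throws Exception {
            final URL url = new URL("http://testserver/app/main"); // hypothetical
            final long durationMs = 60000;  // "for how much time"
            int users = 50;                 // "how many users"
            Thread[] workers = new Thread[users];
            for (int i = 0; i < users; i++) {
                workers[i] = new Thread(new Runnable() {
                    public void run() {
                        long end = System.currentTimeMillis() + durationMs;
                        while (System.currentTimeMillis() < end) {
                            try {
                                long t0 = System.currentTimeMillis();
                                HttpURLConnection c = (HttpURLConnection) url.openConnection();
                                InputStream in = c.getInputStream();
                                while (in.read() != -1) { /* drain the response */ }
                                in.close();
                                times.add(new Long(System.currentTimeMillis() - t0));
                            } catch (Exception e) {
                                // a real harness would count this as a failed request
                            }
                        }
                    }
                });
                workers[i].start();
            }
            for (int i = 0; i < users; i++) workers[i].join();
            System.out.println(times.size() + " requests completed");
        }
    }

Once a run like this has filled in the response times, the percentile numbers the clients keep asking for fall straight out of the recorded samples.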
Getting these metrics into the License Agreement/contract is another thing entirely. I consider these readings to be more presentation material than contract terms. But if you do have confidence in your readings (i.e., you haven't fudged them), then there should be no issue in entering these figures in your contract. You have tested them, right?
Moreover, I'm sure your legal advisor can put enough ifs and buts in the agreement to protect you from any unnecessary bother.
T-90
Tuesday, January 27, 2004
As an addendum: yes, I'm talking about a specific web application that we build, and have benchmarked/profiled (ACT, WAST, JProbe, OptimizeIt, etc.) mostly on WebSphere AS 4. The major problem is that everyone presses (contractually, lately) for a commitment to performance metrics within the bounds of a contract. We're pretty savvy right now about what our software can do within OUR environment; beyond that, there are just so many factors that can affect the application that it is nearly impossible to feel comfortable with a contractual benchmark obligation without including a TON of caveats around it. A hefty server running Red Hat Linux is one thing, but not everybody can afford to buy that $100k AIX server just to reassure everyone that the application can perform well under production-type loads. Moreover, many high-profile clients totally discount our Linux benchmarking metrics and demand commitments specifically for WAS on AIX.
As most of you know, even fine nuances such as JVM command-line argument changes can drastically affect the performance of a J2EE application. Moreover, JDBC/JNDI performance is very much a slave to environmental factors and to database types/versions as well.
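For example (flags and numbers are illustrative only; exact options and their effects vary by JVM vendor and version), the very same deployment can post markedly different numbers under these two invocations:

    java -Xms128m -Xmx128m ...     (small heap: frequent collections, latency spikes under load)
    java -Xms1024m -Xmx1024m ...   (large heap: fewer but longer pauses)

If a client's operations team changes a setting like that after the contract is signed, our "guaranteed" percentiles can evaporate through no fault of the application.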
I'm really looking for some reasonable advice that I can use to formulate a standard policy on this stuff.
adk
Tuesday, January 27, 2004
Ask your client to provide an AIX server for testing and benchmarking.
Evgeny Gesin /Javadesk.com/
Tuesday, January 27, 2004