Fog Creek Software
Discussion Board




Quality Assurance for Image Development

Hello All,

I used to work on the team that developed scripted images and maintained the image on existing systems for our corporation. We did not write programs, but we did write a lot of code to install and maintain images. We never had an independent QA team or process; it was just handled as part of the development process. Upper management recently decided we needed to improve quality, so they moved some of us into a separate group under a different manager, labeled us QA, and told us to come up with a QA strategy. They then broadened the charter a bit and said we would grow to be responsible for the quality of all products and services from our division, which handles most facets of the desktop.

We have already outlined most of what we feel is the best strategy, but I am still seeking any suggestions and information I can gather. If anyone reading this has been involved in similar functions, I would really appreciate it if you would contact me and let me ask you some questions. If no one really wants to carry on a dialog but has some suggestions, I would be happy to hear them. In a moment I will also be posting a question about functional and technical specifications, so if you can help with that, please look for and respond to that post.

Thanks in advance,
Richard

Richard
Monday, January 27, 2003

Image outputs are hard to test. A couple of ideas:

Have some standard images (manually verified) and compare newer outputs against them bit by bit (don't compare dates); a rough sketch of the idea is below.

P.S. This test will be almost machine-specific, as output will vary by resolution and, I believe, other display settings.

Some other possibilities:

verify file sizes
verify colors
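
Something quick like this would do the bit-by-bit check against a verified reference, size first, then bytes (the paths are made up, just to show the idea):

    Option Explicit
    Dim fso, shell, refFile, newFile, rc
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set shell = CreateObject("WScript.Shell")

    refFile = "C:\Golden\output.bmp"   ' manually verified reference
    newFile = "C:\Build\output.bmp"    ' new output to check

    ' Cheap check first: different sizes means different files
    If fso.GetFile(refFile).Size <> fso.GetFile(newFile).Size Then
        WScript.Echo "FAIL: sizes differ"
        WScript.Quit 1
    End If

    ' fc /b exits 0 only when the files are byte-for-byte identical
    rc = shell.Run("cmd /c fc /b """ & refFile & """ """ & newFile & """", 0, True)
    If rc = 0 Then
        WScript.Echo "PASS: outputs identical"
    Else
        WScript.Echo "FAIL: contents differ"
    End If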

Daniel Shchyokin
Monday, January 27, 2003

Doh! Thank you for taking the time to reply, but I guess I should have been more specific.

When I said "image", I meant it more in the sense of a Ghost image. We use scripted images as opposed to a Ghost or Drive Image type method, but the product is the same. "Image" is shorthand for the OS (Windows XX), the apps (email client, MS Office, host emulation, and so on), and the configuration (network configuration, application configuration, and so on), all wrapped up into one neat, tidy install from CD or network. These installs usually require very minimal intervention; ours is set up so that a 10-year-old could install one without messing it up. The result is lower installation and reimage costs as well as a very standard environment: if the help desk gets a call, they know what they are dealing with, because everybody's computer looks the same no matter when they got it or what brand it is.

"Image" is really not an appropriate word. I think it came into use with products like Ghost and Drive Image because they basically take a snapshot of the hard drive and re-apply that snapshot to other hard drives. What we do achieves a similar result, but instead of a snapshot it uses answer files and silent install scripts to install everything without stopping and without prompting the installer for install options.
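
To give a flavor of what a "silent install script" means in practice, a typical step boils down to something like this (the package name and path are made up for illustration, not our actual code):

    Option Explicit
    Dim shell, rc
    Set shell = CreateObject("WScript.Shell")

    ' Install one app from the network with no prompts and no reboot
    rc = shell.Run("msiexec /i \\server\apps\SomeApp.msi /qn REBOOT=ReallySuppress", 1, True)
    If rc <> 0 Then
        WScript.Echo "SomeApp install failed, exit code " & rc
        WScript.Quit rc
    End If

String a hundred or so of those together, in the right order, and you have a scripted image.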

Anyway, thanks again for trying to help though...

Richard
Monday, January 27, 2003

I know what you mean (unattended installs), not from experience, but from all the training CDs and books for the MCP/MCSE exams.

As MS itself now recommends using imaging software, presumably your management has decided your old job is redundant, and if it can reuse snippets of code, why not snippets of its development team?

The kind of scripting you've done doesn't have much direct relevance to coding, though it is obviously more relevant than painting the pictures on the labels.

What you haven't stated is where management feels the quality is deficient. Does the software have a lot of bugs, or is it user-unfriendly?

If you were more specific, others would be better able to help you.

Stephen Jones
Tuesday, January 28, 2003

Hi Stephen,

Thanks for the reply.

As regards the imaging process, I did not give a full explanation of our process, but since you point this out, I will. I agree with you, but first let's get some background for those who might not understand the process.

Ghost images have numerous advantages when used properly. They dramatically reduce install times. They are compressed and contain only the files needed, unlike the original media, so they are smaller and consume less bandwidth or media (CDs).

With Ghost or Drive Image, you first have to install the OS and all the apps, configure everything, and then take your snapshot. Most companies achieve this in one of two ways. The first is to install and configure everything manually and then take the snapshot. The second is to use a previous Ghost image: load it on a system, make your changes, and create a new image from that.

The problem with the second method is that you end up with a Ghost of a Ghost of a Ghost of a... well, you see where I am going with that. I know what some of you are thinking: this is not like copying a video or audio tape, where the information is analog and you see degradation. But the truth is that a Ghost image is not an exact reproduction of the original. If you take a computer and create a Ghost image, put that image back on the computer, and create a new Ghost image, the new image will not be identical to the previous one, in size or in content. Some will still want to argue that, but even if it were identical, it is still not clean. What if you have one image with Outlook and another with Lotus Notes? I am sure everyone would agree that uninstalling a product does not remove all traces of it; there will still be remnants. Some might still want to argue this point, but it doesn't matter: the cleanest way is to reinstall everything for each image, and I am not going to convince everyone involved in this at my company otherwise.

What about the first method, you say? Just reinstall everything from scratch. Well, that too is easier said than done. Take any average technical person and tell them to do some process (say, a 15-step process) exactly the same way twice, and I bet they will do it slightly differently the second time. What about the third, fifth, or fifty-eighth time? What if the process called for 100 steps? Even if you document the steps, people will forget, not read, or accomplish the same task in a different way. The result? You ask them to make a new image with a patch for Internet Explorer, and you get an image where something you didn't ask them to touch (like Lotus Notes) has been installed or configured incorrectly. That could be a really painful way to handle things in a small company.

We, however, are not a small company. We are a rather large one, with approx. 80,000 employees. Because of the various needs within the corporation, we occasionally have to build new images. On top of that, the frequency with which computers change (requiring new drivers to be added) and the never-ending need to include new patches for, or new versions of, applications mean we have to continually create new versions of those images. Also, as an image goes through testing, if a problem is found, the same version of the same image may be rebuilt numerous times. With hundreds of rebuilds necessary within a relatively short time frame, and thousands over a longer period, the above two processes are just not feasible. No one wants a thousandth-generation Ghost of a Ghost of a..., nor does anyone want that many manually installed images.

What is the solution for a company with these kinds of needs? Simple: we combine the old (Windows NT) style of scripted (unattended) installation and configuration of everything possible, and then at the end of the process we create a Ghost image. Now we have the reliability and consistency of scripted installs between images and versions, coupled with the convenience and speed of a Ghost-based deliverable.

To make the entire process easier to maintain, we had been writing some rather elaborate scripts (mostly VBScript) that let us treat applications and configurations like modules: if you want an image with a different mail client, pull out the Lotus Notes module and put in the Outlook module. (A rough sketch of the idea is below.) To make the final install easier, we also perform many post-installation customizations via various scripts: installing hardware-specific applications (such as laptop management apps), applying environment-specific settings or applications (such as DHCP vs. static IP, or the Novell client or not), automatically joining the domain, and various other things that can only be done after install.

All of this adds up to a lot more code than one might suspect. In addition to the scripts, the team also writes some compiled applications in VB and C++ to accomplish specific tasks that cannot be performed from a script. I am not trying to compare any of this to the complexity of a real, retail application, but it is certainly more than an XP unattend script and a few batch files.
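
In caricature, the top of the build driver looks something like this (the folder layout and module names are made up for illustration, not our actual code):

    Option Explicit
    Dim fso, shell, modules, modName, scriptPath, rc
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set shell = CreateObject("WScript.Shell")

    ' Swapping mail clients is just editing this list,
    ' e.g. replace "LotusNotes" with "Outlook"
    modules = Array("BaseSettings", "Office", "LotusNotes", "HostEmulation")

    For Each modName In modules
        scriptPath = "C:\Build\Modules\" & modName & "\Install.vbs"
        If Not fso.FileExists(scriptPath) Then
            WScript.Echo "Missing module: " & modName
            WScript.Quit 1
        End If
        rc = shell.Run("cscript //nologo """ & scriptPath & """", 1, True)
        If rc <> 0 Then
            WScript.Echo "Module " & modName & " failed, exit code " & rc
            WScript.Quit rc
        End If
    Next

    ' Everything installed and configured; this is the point at which
    ' we would take the Ghost snapshot of the finished system.
    WScript.Echo "Build complete - ready for snapshot"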

Now, where are those quality issues? Well, they are mostly not with the applications themselves; if you manually installed the application, you would not see the problem. We might see problems like these:

- a patch appears to install but does not
- something is configured incorrectly and does not work as expected
- because the various parts are installed in a specific sequence, the user is prompted with messages they should not see on their first login
- something works for an administrator but not for an ordinary user
- a DLL installed by application "G" breaks or changes the behaviour of application "D"

The problems are too many and too varied to even begin to touch on them all. It is usually not a problem with the application itself; it is usually that something was installed incorrectly or in an improper order, that a portion of the install failed and went unnoticed by the image developers, or that a specific app, setting, or configuration does not work in an environment where it needs to. The underlying problem is that these specific requirements have never really been recorded. That is part of what we are trying to do now: implement a process where the criteria are clearly defined and tested against. Under the old process, an image might be developed, tested, and approved, but once it was used in a different state we would discover that it did not change the server it pointed to for host emulation or for virus pattern files. If we had specifications that stated "these criteria should affect these settings," the image developers would have a guide by which to develop, and the newly formed QA team would have something against which to test. (A sketch of what such a test might look like is below.)
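
For example, once a criterion like "images for this region must point host emulation at this server" is written down, it becomes a mechanical pass/fail check instead of a judgment call. Something like this, with a made-up registry key and server name:

    Option Explicit
    Dim shell, expected, actual
    Set shell = CreateObject("WScript.Shell")

    ' The expected value comes straight from the specification
    expected = "emhost-east.ourcorp.com"

    On Error Resume Next
    actual = shell.RegRead("HKLM\Software\OurCorp\HostEmu\Server")
    On Error GoTo 0

    If actual = expected Then
        WScript.Echo "PASS: host emulation server"
    Else
        WScript.Echo "FAIL: expected """ & expected & """, got """ & actual & """"
    End If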

Whew! Another really long one. I really need to learn to write shorter, more to-the-point posts... 'Til then, I hope this clears up what I am talking about.

Thanks,
Rich

Richard
Tuesday, January 28, 2003

Thanks for the explanation.

I hadn't realized that you were such a large company, and thus that you were actually working on QA for the images of the scripted installs. I thought you worked for a software company and that they had pulled you off the job because they decided their products needed QA.

I doubt there are many people in your position. The only thing that resembles it a little is the testing of applications on all OSes with different service packs, language packs, and other configurations. Someone who does that job for a large software company should be able to help you a little.

Stephen Jones
Tuesday, January 28, 2003
