SLTF Consulting
Technology with Business Sense



Although testing assures a product works, a test document defines what "works" means

Scott Rosenthal
June, 1996

In embedded development, a never-ending concern is how to establish and maintain a healthy line of communication between yourself and a client. My previous column (see reference) addressed this issue with a discussion of invaluable documentation that assures that what I build is what the customer wants. However, with any development project, at the end of the day you still must prove to someone that what you designed actually works. No one cares whether a design technically functions the way you intended; what really matters is how well it works for the end user.

In theory, if a design meets the specifications as set out in a requirements document, then it should be acceptable. Unfortunately, because a requirements document sometimes doesn't cover every nuance of system operation, a huge opening often exists for ambiguity and misunderstanding.

Test documentation

To remove this potential problem area, I write a test document that precisely defines each project's acceptance criteria. It simply explains how to tell whether the device is working or not. Above all, you don't want to get to the end of a development cycle and find out that you and the client have different definitions of "working."

This document describes the tests necessary to prove that a design meets the criteria in the requirements document. Therefore, to keep things simple, the test document should follow the outline of the requirements document. Writing a test document that skips around through the requirements document is an open invitation to gaps in test coverage. Further, it makes it tough to prove to the client that a given test really indicates how well the design satisfies system requirements. The test document is the mechanism that closes the loop in the development process. Without it, the development process is an open-loop system, with all the stability problems such systems inherently suffer from.

Because you derive the test document from the requirements document, I find that test documents really aren't difficult or time-consuming to create. For the display-controller example in the previous column, the test document is roughly 800 words and took an hour to write. As with all the previous documents, I made sure the client approved this paper because it forms the critical link in getting me paid for my development work.

All told, I spent 5 hrs writing the two documents I covered last time, including the test document. With them, I know exactly what it is I’m designing, how I’m going to design it and how I’m going to prove to someone that the design works. Just as important, though, my client knows these things, too. Everything is clear and well defined—just what you need to ensure a lasting relationship.

Some of you probably still don't agree with me about the need for these documents. However, I want my productivity to be as high as possible. Productivity to me is related to the number of lines of usable debugged code I can create in an hour. If the code created is excessively buggy or unusable because I misunderstood what the client wanted, I've had zero productivity, regardless of how much code I ground out. So instead of just blindly writing code and grumbling about all the changes the client or management sends down, I invest a bit of time up front to make sure I know what I’m doing. As in other aspects of life, here brains should win out over brawn.

Doing the deed

At this point in the display controller project, all the documentation was completed, and I was ready to start coding. So I went ahead, fired up my favorite editor and proceeded to write the program. My technique isn't to write a bunch of code at once, but instead to do it in little pieces. This way I get positive feedback that something's working. In addition, if a problem arises, I’m still quite familiar with what I had just written.

As with all development projects, a few unexpected problems popped up. First, the only C compiler for the 8051 that we had was a shareware program that didn't even include a linker and thus would've restricted me to writing all my software in one module. Hence, I had to investigate other compilers on the market. In the end I picked one from BSO Tasking (Dedham, MA (617) 320-9400). However, when the compiler came in, I found to my dismay that this "Windows-based" program actually ran in DOS. The Windows interface was a demo program that only worked for 30 days! As it turns out, buying the real Windows front-end software more than doubles the price of the complete package (from $900 to $1900).

The next glitch came after I compiled some code and tried running it on a simulator we had in the office. Unfortunately the simulator's user interface was in German! After a few tries, I managed to get some useful information out of it. The biggest problem I had was trying to exit. I would press the exit key and it would ask me to select "yes" if I truly wanted to exit. So of course, I pressed "Y" with no reaction. Now I'll never forget to press "J" (for "Ja") in a German program.

Does it work?

Once I got the program running under the simulator, the next step (lacking an emulator) was to burn the code into an EPROM and begin testing. One of the first hurdles you must overcome with this technique is knowing whether or not the microcontroller is executing the software. To assist in this effort, I stuck in a few lines of code to toggle an output line each time the program reached a certain point in its operation. When I found this "heartbeat" signal with a scope, I immediately knew that the program was running and that the microcontroller was correctly wired.
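The heartbeat idea can be sketched in a few lines of C. This is a host-runnable illustration, not the article's actual code: the variable `port1` and the function `toggle_heartbeat` are names I've made up, and on real 8051 hardware the write would go directly to a spare bit of a port SFR such as P1.

```c
#include <stdio.h>

/* Host-side stand-in for the 8051 output port; on the real hardware
 * this would be a spare bit on an SFR such as P1. Names here are
 * illustrative, not from the article. */
static unsigned char port1 = 0;

#define HEARTBEAT_BIT 0x01u

/* Toggle the heartbeat line once per pass through the main loop, so a
 * scope probe on the pin shows a square wave whenever the program is
 * alive and reaching this point. */
static void toggle_heartbeat(void)
{
    port1 ^= HEARTBEAT_BIT;
}
```

Calling this at a fixed spot in the main loop gives the scope a steady square wave; if the wave stops, you know immediately which region of code the program died in.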

To fully test the display controller, I wrote a test program on a PC that allowed me to interactively exercise various commands and features in the microcontroller's software. With this infrastructure in place, I quickly wrote a test procedure that allowed me to verify the display controller's operation against the test document. Now I could exercise all the software's features and validate the display controller's operation against its requirements.
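A PC-side test program of this kind is essentially a command dispatcher. The sketch below shows one minimal way to structure it, assuming hypothetical command names ("clear", "text") and stubbed handlers; the article doesn't give the display controller's real command set, and the serial-link code is omitted.

```c
#include <stddef.h>
#include <string.h>

/* Table-driven dispatcher: each entry maps a typed command name to a
 * handler that would build and send a packet to the display controller.
 * Command names and handlers here are invented for illustration. */
typedef struct {
    const char *name;
    int (*handler)(void);
} command_t;

static int cmd_clear(void) { /* would send a clear-screen packet */ return 0; }
static int cmd_text(void)  { /* would send a draw-text packet */    return 0; }

static const command_t commands[] = {
    { "clear", cmd_clear },
    { "text",  cmd_text  },
};

/* Look up a command by name and run its handler; -1 means unknown. */
static int dispatch(const char *name)
{
    size_t i;
    for (i = 0; i < sizeof commands / sizeof commands[0]; i++)
        if (strcmp(name, commands[i].name) == 0)
            return commands[i].handler();
    return -1;
}
```

Driving `dispatch` from a simple read-a-line loop gives you an interactive tool, and the table makes it trivial to add a test for each new requirement as you go down the test document.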

The last step of the checkout was characterizing the interface speeds between the PC and the display controller. For this task I modified the software to toggle various I/O bits while it was executing key code sections. By monitoring these signals with a scope I could determine the proper timings for the interface signals to the display controller. This technique also helped me tune the code for faster operation in some critical sections. With testing and characterization finished, the display controller as a completed unit was ready for integration into the rest of the design.
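The bracketing trick above is easy to package as a pair of macros. Again this is a hedged sketch with invented names (`debug_port`, `SECTION_ENTER`/`SECTION_EXIT`); on the 8051 the writes would go to spare port bits, and the scope's measurement of the pin's high time is the section's execution time.

```c
/* Host-side stand-in for a spare 8051 output port used only for
 * timing instrumentation; names are illustrative. */
static unsigned char debug_port = 0;

/* Raise a chosen bit on entry to a code section and drop it on exit.
 * On a scope, the bit's high time equals the section's duration. */
#define SECTION_ENTER(bit) (debug_port |= (unsigned char)(1u << (bit)))
#define SECTION_EXIT(bit)  (debug_port &= (unsigned char)~(1u << (bit)))

/* Example: characterize a hypothetical command-processing routine
 * by bracketing it with bit 2. */
static void process_command(void)
{
    SECTION_ENTER(2);
    /* ... time-critical interface code would run here ... */
    SECTION_EXIT(2);
}
```

Because each section gets its own bit, several sections can be watched at once on a multi-channel scope, which is also handy for spotting which section to optimize first.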

In closing, the purpose of my past few columns wasn't just to walk through a simple development project. I also wanted to share some techniques I've found useful for producing a quality design. In addition, you can see from this example that if thought goes into the process at the beginning, quality doesn't have to increase development costs. In the final analysis, the display controller required 1700 lines of code, the PC test program another 400 lines, and the complete process from start to finish—including all the documentation I've talked about—took 40 hrs to complete. In terms of productivity, that averages out to 420 lines/day, or approximately 50 lines/hr, of debugged, documented code. Obviously, different development efforts will give different results, but the point is that quality development needn't be slow development. PE&IN


Rosenthal, S., "A requirements document before a project pays big dividends in time and money," PE&IN, April 1996, pp. 65-66.


Copyright © 1998-2014 SLTF Consulting, a division of SLTF Marine LLC. All rights reserved.