Validating data acquisition saves debugging time

Scott Rosenthal
May, 1991

One of the more difficult aspects of designing an embedded system is proving that it works as intended. Part of the problem is that "working" takes on different meanings depending on whether you're the designer, the salesperson or the customer. However, as the designer you're responsible when something doesn't meet expectations. Therefore, you must do everything possible not only to ensure that the system functions properly but also to satisfy the other parties involved in the project. Luckily, numerous techniques can quantify system performance so that others, who might not be fluent in the field's jargon, can also understand that what you've been slaving over for the past six months actually performs the intended task.

For an embedded measurement system to do its job, it obviously needs a sensor or transducer such as a thermocouple or thermistor for measuring temperature, strain gauges for weighing items and photodetectors for measuring light. After suitable signal conditioning, an instrument can use these signals for monitoring or control. Unfortunately, in the real world lots of little problems can creep in and render the instrument useless. For example, too much current through a thermistor causes self-heating and distorts the measurement. Similarly, a strain gauge is sensitive to line-frequency interference, and ambient light can destroy photodetector readings. In this regard, remember that the goal here isn't to isolate sensors from such error sources; instead, you want to prove to yourself and the customer that the software interface and its interaction with the hardware are functioning properly.

Feed a voltage in

A simple way to demonstrate that a system is working properly is to feed it a known voltage in place of the sensor and make sure the software correctly reads and converts it (Fig 1). This technique removes the uncertainties associated with sensors and concentrates your effort on verifying the software. It came in handy for me a while back when a customer asked me to bail out a design in which nothing was working and everyone was screaming that the photodetector had to be the culprit. I substituted a precision voltage source for the photodetector, showing that not only did the software have a bug in it, but the A/D converter wasn't functioning correctly either. Here, questions about the detector only muddied the waters and drew attention away from the real problems.
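To make the idea concrete, here's a minimal sketch in C of what such a check might look like. Everything in it is an assumption for illustration: the 12-bit scaling, the 2.500V precision source and the read_adc() stub standing in for the real converter driver. The only thing that matters is that counts_to_volts() is the same conversion routine the instrument uses in normal operation.

/*
 * Sketch of a "known voltage" self-test: inject a precision reference
 * in place of the sensor and confirm the firmware's conversion path.
 * read_adc() is stubbed here; on real hardware it would return raw
 * converter counts. Names and scale factors are illustrative only.
 */
#include <stdio.h>
#include <math.h>

#define ADC_FULL_SCALE   4095      /* 12-bit converter assumed   */
#define VREF_VOLTS       5.0       /* converter reference        */
#define TEST_VOLTS       2.500     /* precision source fed in    */
#define TOLERANCE_VOLTS  0.005     /* allowable end-to-end error */

/* Stub: pretend the A/D saw the 2.500V reference. On target hardware,
   replace this with the real driver call. */
static int read_adc(void)
{
    return (int)(TEST_VOLTS / VREF_VOLTS * ADC_FULL_SCALE + 0.5);
}

/* The same conversion routine the instrument uses in normal operation. */
static double counts_to_volts(int counts)
{
    return (double)counts * VREF_VOLTS / ADC_FULL_SCALE;
}

int main(void)
{
    double measured = counts_to_volts(read_adc());

    if (fabs(measured - TEST_VOLTS) <= TOLERANCE_VOLTS)
        printf("PASS: read %.4f V against a %.4f V source\n",
               measured, TEST_VOLTS);
    else
        printf("FAIL: read %.4f V against a %.4f V source\n",
               measured, TEST_VOLTS);
    return 0;
}

If the test fails with a precision source connected, you know the problem lies somewhere between the input connector and the conversion routine, and the sensor is off the list of suspects.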

Unfortunately, many people fail to use this technique because it's so simple. In complicated systems where many interactions happen at once, measuring a known signal can isolate potential problem areas. For example, consider how to debug a fluorescence detection system in which the instrument illuminates a fluorescent dye with pulsed ultraviolet light and measures the amount of light the dye emits. Not only must the system determine the amount of light, but the measurement also depends on whether the dye system is functioning properly. Thus you must first break the connection between illumination and detection and feed a known voltage into the system. That works? Great. Now take a step back and, using suitable filters and light levels, channel the UV light directly to the detector. Does the system still work? If so, any remaining problems are most likely in the dye system.

You can easily demonstrate this technique to any layman. Just remember that as the system's programmer you have no control over sensor measurements. Scratching the deepest recesses of my mind brings back an old word: GIGO---Garbage In, Garbage Out. Don't confuse an instrument's garbage readings with a software problem. Isolate the two.

A similar process is also valuable for proving out the mathematics in an instrument. Too often I've seen embedded systems where, during the debug phase, no one knew whether the bugs were hardware or software problems. As before, simply feed known data into the program to check the mathematics. If generating test cases with a calculator is too cumbersome, use a spreadsheet program. Not only is it helpful to see all the data and formulas in a spreadsheet, but this method also provides an easy way to document how you debugged the system.
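One way to wire the spreadsheet into the check is a small test harness like the sketch below. All the names here are hypothetical: the spreadsheet's test cases are assumed to be saved as comma-delimited ASCII, one input,expected pair per line in a file called testcases.csv, and instrument_math() stands in for whatever calculation the firmware actually performs.

/*
 * Sketch of checking instrument math against spreadsheet-generated
 * test cases saved as comma-delimited ASCII. instrument_math() is a
 * placeholder for the real calculation under test.
 */
#include <stdio.h>
#include <math.h>

/* Placeholder calculation: say, converting raw counts to degrees C. */
static double instrument_math(double x)
{
    return 0.1 * x - 40.0;
}

int main(void)
{
    FILE *fp = fopen("testcases.csv", "r");
    double input, expected;
    int cases = 0, failures = 0;

    if (fp == NULL) {
        perror("testcases.csv");
        return 1;
    }
    while (fscanf(fp, "%lf,%lf", &input, &expected) == 2) {
        double got = instrument_math(input);
        cases++;
        if (fabs(got - expected) > 0.01) {
            printf("case %d: input %g, expected %g, got %g\n",
                   cases, input, expected, got);
            failures++;
        }
    }
    fclose(fp);
    printf("%d case(s) checked, %d failure(s)\n", cases, failures);
    return failures ? 1 : 0;
}

Because the expected values come straight out of the spreadsheet, the same file serves as both the test input and the written record of what you verified.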

A tip for process tuning

Now that you've got the instrument measuring properly, the next task is to control a process based on those measurements. How do you prove that the control portion works properly? Just because it appears to do so doesn't mean it's really working. A classic example is a PID controller for dynamically managing a process in real time. Compare it to a home thermostat, which is a primitive version of the classic bang-bang controller that's either full On or full Off. A PID version of a thermostat, in contrast, would gradually reduce the amount of heat the furnace puts out as the ambient temperature approaches the desired setting; then it would trickle in just enough heat to keep the temperature from varying by more than a set amount. The system is always on guard for perturbations such as a door opening or a breeze sucking air through the chimney.
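For readers who haven't coded one, the discrete PID update itself is only a few lines. The sketch below is the textbook form wrapped around an illustrative toy room model; the gains, limits and model constants are made-up numbers for demonstration, not tuned values from any real instrument.

/*
 * Minimal discrete PID update for the thermostat example: the heater
 * output tapers off as the room approaches the setpoint instead of
 * slamming full On or full Off. All constants are illustrative.
 */
#include <stdio.h>

struct pid_state {
    double kp, ki, kd;     /* proportional, integral, derivative gains */
    double integral;       /* accumulated error                        */
    double prev_error;     /* error at the previous sample             */
};

static double pid_update(struct pid_state *c, double setpoint,
                         double measured, double dt)
{
    double error = setpoint - measured;
    double derivative = (error - c->prev_error) / dt;

    c->integral += error * dt;
    c->prev_error = error;
    return c->kp * error + c->ki * c->integral + c->kd * derivative;
}

int main(void)
{
    struct pid_state ctrl = { 2.0, 0.1, 0.5, 0.0, 0.0 };
    double room = 15.0, setpoint = 20.0, dt = 1.0;
    int i;

    for (i = 0; i < 30; i++) {
        double heat = pid_update(&ctrl, setpoint, room, dt);

        if (heat < 0.0)  heat = 0.0;    /* the furnace can't cool        */
        if (heat > 10.0) heat = 10.0;   /* and has a maximum output rate */

        /* toy room model: heat raises the temperature, losses pull it down */
        room += 0.05 * heat - 0.02 * (room - 10.0);
        printf("t=%2d  heat=%5.2f  room=%5.2f\n", i, heat, room);
    }
    return 0;
}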

If you look at a graph of the ambient temperature, the result for an efficient PID loop should be reasonably flat. In that case, it's simple to prove to an unsophisticated customer or even your boss that your miraculous new design is properly controlling the process. Here, rather than supply reams of tabular data to prove the point, a graph showing the control system's reactions to external stimuli is clearer proof (Fig 2).

When this situation comes up in my work, I use the system's serial port to transmit the PID input variables, such as the room temperature and the ventilator or heater control positions, to an external computer. The computer collects this data using either a custom program that samples data every N sec or the communications module of Microsoft Works. The custom program saves the data as a comma-delimited ASCII list. After collecting the data, you can easily transfer it into a spreadsheet program such as the one in Works for manipulation and graphing. In addition, because the collection process is automatic, you avoid manual transcription and the errors that come with it. The spreadsheet lets you try various mathematical functions such as IIR or FIR filtering, and the graphics built into most spreadsheets today make it easy to generate plots that demonstrate the effects of outside perturbations on the system.
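The instrument side of that arrangement can be as simple as printing one comma-delimited record per sample period. In the sketch below, printf() stands in for the serial-port driver and the two read_*() stubs stand in for the real PID inputs; both are assumptions for illustration, not the actual code behind the setup described above.

/*
 * Sketch of the instrument-side data dump: one comma-delimited record
 * per sample period, sent out the serial port for a PC to capture and
 * later pull into a spreadsheet. printf() stands in for the UART
 * driver; the read_*() stubs stand in for the real PID inputs.
 */
#include <stdio.h>

static double read_room_temp(void)       { return 20.0; }  /* stub */
static double read_heater_position(void) { return 35.0; }  /* stub */

int main(void)
{
    unsigned long t;

    /* Header row so the spreadsheet columns are self-describing. */
    printf("seconds,temp_C,heater_pct\n");

    for (t = 0; t < 10; t++) {   /* one record per sample period */
        printf("%lu,%.2f,%.1f\n",
               t, read_room_temp(), read_heater_position());
        /* on target hardware, wait here for the next sample tick */
    }
    return 0;
}

Capture that stream on the development computer, and importing it into the spreadsheet becomes a one-step operation.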

The result is quick proof of how the system works, presented in an easy-to-understand manner. You can even use such a custom collection program to help hardware designers look for drift and stability problems by evaluating data collected over several days. Software helping the hardware, what will they think of next?

* * *

Last January this column addressed my uses of serial EEPROMs, and one example showed such a device housed in a connector to store probe configuration data. Subsequent to that article's appearance I received a letter from A Rather Large Company (ARLC) indicating that it had patented the idea in 1985 and asking whether I would consent to licensing its technology. I was appalled, not only because I've been using this technique since before the patent date, but also because of the way American businesses are trying to squeak out profits. For instance, last year a major semiconductor firm spun off a litigation department that reportedly became one of its most profitable divisions. That group's task was to track down companies infringing on patents. Apparently ARLC is trying to become profitable by following the same route.

Obviously, ARLC has the right to pursue patent infringements. My complaint, though, is that by scouring magazines and going after the writers, it's shutting down the free flow of information. A major difference between patents and copyrights is that a copyright is explicitly labeled, whereas a patent isn't known unless you research it or someone brings it to your attention. But how can an author in a technical field take the risk that something that seems so intuitive and non-novel might actually have been patented? I've talked to several publishers and authors, and none have heard of technical writers being targeted like this before. I must now restrict myself to obvious examples and remove personal experiences so that I don't step on anyone else's toes. It's then up to you to investigate whether a concept is patentable or has already been patented. Once again, John Q Public gets abused for the sake of one company's bottom line. PE&IN


