Test & Measurement


What to look for in a logic analyser - Part 1

6 August 2008

A logic analyser is an essential instrument for debugging digital circuits. This document is intended to help users decide which logic analyser to buy and to avoid some common pitfalls.

1. Quality and capability

Why is it that a logic analyser from one manufacturer costs R20 000, while another manufacturer sells their product, which at first glance seems to have the same specifications (or even better), for just R4000?

Whether you buy apples, a logic analyser or a car, there is always a distinction between good and bad quality.

If anything, quality in electronic test instruments is more important than in most other products. If an apple has a small bad spot you can cut it out and enjoy eating the rest, but if your logic analyser is the source of intermittent glitches, you could spend days trying to sort out problems that actually originate in your test instrument and not in the hardware you are trying to debug.

A good logic analyser is not a simple instrument to design and manufacture. It connects to the hardware under test via many channels, using relatively long probe leads and ground returns, and it has to cope with fast-switching buses that can generate considerable noise.

If 'cheap' is a measuring instrument's main design objective, you will pay extra for it later by chasing bugs that actually originate in the instrument. A quality product gives you measurements you can trust.

Buying cheap: save now, pay later

* Obsessive cost saving leads to poor quality and unreliability. Poor quality leads to poor signal integrity. Poor signal integrity means you will have a hard time debugging your hardware.

Low quality PCBs, testing omitted at the PCB manufacturer, poor workmanship, cheap connectors, leads and capacitors, no burn-in QA testing, etc, are all aspects that can contribute to a low quality end product.

A product that is supposed to measure high-frequency signals must be able to cope with the problems that come with high frequencies; for example, poor quality ground leads are ineffective at high frequencies, where skin effect comes into play. A very high quality ground lead can easily cost 50 times more than a very poor quality lead, but as good grounding is essential, it is well worth paying those extra few Rands.

An instrument with many channels that is specified to handle high frequencies must be able to do so on many of its channels simultaneously, without being overwhelmed by the switching noise those inputs generate.

Poor connectors and probe leads are a major problem. Poor connectors corrode, whereas gold plated connectors will give many years of excellent connectivity. Good quality probe leads not only improve signal integrity, but are usually also very flexible, making the experience of using the instrument that much more convenient.

All this means that you may be debugging problems that arise from your test instrument and not the hardware that you are actually trying to debug. It may take you many hours before you realise that those glitches are generated by your cheap instrument and not from your development hardware. This can quickly wipe out any 'savings' that you made by buying a cheap measuring instrument. In the end you normally get what you paid for.

* Obsessive cost saving leads to more costly, but necessary, features being left out. A typical cost-cutting shortcut is the use of a small capture buffer.

Some manufacturers implement a small buffer from the available RAM in their low cost PLD chip and try to convince customers that the 4 KByte (or even smaller) buffer they provide is enough and that you do not need the 1 MByte buffer supplied by other manufacturers. They may point out that they use hardware compression to increase the effective size of the buffer. Do not be fooled by this. If hardware compression were the ultimate solution to buffer size problems, all manufacturers would be using it. The reality is that hardware data compression is limited in its use and can in fact reduce the effective buffer size in some very common test circumstances.
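The practical difference between a 4 KByte and a 1 MByte buffer is easy to put into numbers. This sketch (with a 100 MS/s sampling rate assumed purely for illustration) computes the capture window each buffer gives at one bit per channel per sample:

```python
# Capture window (seconds) = buffer depth (samples per channel) / sampling rate.
# At 1 bit per channel per sample, a 4 KByte buffer holds 32768 samples.

def capture_window_s(buffer_bytes, sample_rate_hz, bits_per_sample=1):
    samples = buffer_bytes * 8 // bits_per_sample
    return samples / sample_rate_hz

rate = 100e6  # 100 MS/s, an assumed example rate

print(capture_window_s(4 * 1024, rate))     # ~0.000328 s (about 328 us)
print(capture_window_s(1024 * 1024, rate))  # ~0.0839 s (about 84 ms)
```

At the same sampling rate, the larger buffer captures a window roughly 256 times longer, which is the difference between seeing one bus transaction and seeing the whole exchange around it.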

Note that low cost PLDs have many I/O pins and can easily provide many channels, if no external RAM is used and inferior input buffering and threshold detection is used.

* Obsessive cost saving leads to bad design practice. Bad design practice leads to bad signal integrity, reduced reliability, poor or more expensive repair support, etc.

Below are a few typical results if saving cost is put above all other design considerations:

* Using double layer boards, despite the chip manufacturer's strong recommendation of using at least 4-layer PCBs.

* Insufficient power decoupling at chip level.

* Insufficient or no bulk decoupling.

* Insufficient or no onboard power regulation. There is, for instance, a strong reliance on the regulation of the PC's power supply. A good PC power supply may help a bit, but even the best cannot replace even basic power regulation inside the instrument. Proper power filtering is expensive, so manufacturers of cheap products simply do not implement it.

* Self-calibration and self-diagnostic built-in test (BIT) functions are simply left out of the design. These functions improve accuracy and make support easier, both before and after any hardware failure.

* Protection circuitry such as over-voltage and -current protection is simply omitted.

* Obsessive cost saving leads to essential parts not being supplied. Check what is supplied in the packaging before you buy.

Test clips are expensive. After buying a logic analyser you may find that test clips are not included. So now you will have to shop around for test clips and buy a small quantity at excessive cost. Go back to the manufacturer and they will try to convince you that it is your own fault for not adding test pins to your design/PCB onto which their crimp-terminated test leads can fit. They will of course supply test clips to you - not so cheaply though.

Good test clips are important, even if you use a BGA on your board. Be sure that they are included in the package; if not, add their cost to your comparison.

* Obsessive cost saving leads to parts that actually cost very little, not being supplied.

Do not be surprised if your cheap logic analyser comes without any packaging material, not even an outer shipping box. Some do not even include the software on CD, a manual, etc. True, you can download the software from the Internet and print your own manual, and your logic analyser may still work even if heavy objects were shipped on top of it in the cargo carrier, but is this really what you wanted, and how you wanted it? If you see yourself as a professional, then buy the professional tools of your trade.

* Obsessive cost saving leads to bad after sales service.

A product that is supposed to have sophisticated features has a reasonable chance of generating technical queries for the manufacturer. If the manufacturer makes little profit on the product, that profit can be wiped out by having to deal with problems, or simply general questions, from the field. Such manufacturers will normally refuse to accept that there could be anything wrong with their product and will simply blame the user for 'not using the product correctly'. Do not expect any help if you buy an inferior product.

* Obsessive cost saving sometimes leads to product piracy.

Hardware and software developments are expensive. If the hardware protection of a product is insufficient, it is possible for fraudulent companies to copy the hardware and use the original manufacturer's software on their illegal hardware.

Some cheap products are actually illegal copies of products from authentic developers. If you have problems with such a product, you can forget about getting support. In most cases the copiers could not help you anyway, since they have no detailed knowledge of the product.

Some manufacturers build in sophisticated hardware protection, which can in fact be activated as soon as they become aware of clones appearing on the market. You may find yourself stuck with a product in which the security hardware has been triggered, disabling the hardware.

2. Affordability and cost effectiveness

Cost effectiveness: what you need is an instrument of good quality that offers the features that you actually need at a reasonable price.

Some instruments offer features that are very sophisticated and expensive to produce and support, but that you may never need. An example is processor code disassembly, whereby you capture data synchronously with the processor read and write signals and disassemble it to see exactly what code the processor has executed. This function is generally provided for specific target processors. It is very handy for users who require this functionality, but most engineers will never use it, so do not pay R5000 extra for something you are never going to use.

3. Sampling rate

A high sampling rate is required for high-resolution data capture. The higher the sampling rate, the more accurate the representation of the captured signals on the screen. With higher sampling rates, more accurate measurements can be made between edges on different channels.

In analog electronics, people often refer to the Nyquist theorem, which states that to reconstruct a captured signal you need to sample at more than twice the highest frequency component present in the analog signal. A logic analyser, of course, captures square waves. An ideal square wave contains frequency components extending to infinity; these are what create its sharp edges.

So at what frequency should a signal be captured relative to the base frequency of the square wave?

A logic analyser simply distinguishes between a high (signal higher than the input threshold) and a low (signal lower than the input threshold). Let us say a 1 kHz square wave is captured at 4 kHz sampling rate. This means that the incoming signal could be sampled twice while the signal is high and twice while it is low. This will result in the signal being displayed with a 50-50 mark-space ratio.

If there is distortion on the signal, or the threshold is not set correctly for the incoming signal, you could easily end up sampling it, say, once while high and three times while low, and the signal will be displayed with a 1-3 mark-space ratio. This means you need to push up the sampling rate.

At a 5 kHz sampling rate, even with everything set correctly, you will capture the signal say 3 samples while high and 2 while low, and later 3 while low and 2 while high, so that a perfectly regular incoming signal is displayed with irregular mark-space features. In short, you should push up the sampling rate even further.
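The sampling arithmetic above is easy to reproduce. This sketch (hypothetical helper names, with the sample clock phase-locked to the signal for simplicity) samples an ideal 1 kHz square wave at the rates discussed and reports the run lengths of high and low samples the analyser would display:

```python
# Sample an ideal square wave with an integer phase accumulator (exact for
# integer rates) and report the run lengths of the displayed high/low levels.

def sample_square(signal_hz, sample_hz, n_samples):
    # Sample i falls in the high half-period when
    # floor(2 * signal_hz * i / sample_hz) is even.
    return [1 - (2 * signal_hz * i // sample_hz) % 2 for i in range(n_samples)]

def run_lengths(samples):
    runs, count = [], 1
    for prev, cur in zip(samples, samples[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs

for rate in (4000, 5000, 10000):
    print(rate, run_lengths(sample_square(1000, rate, 40)))
# 4000 Hz: runs of 2 (looks 50-50, but only 4 samples per period)
# 5000 Hz: runs of 3 and 2 (a 50-50 input displayed as 60-40)
# 10000 Hz: runs of 5 (a faithful 50-50 display)
```

Even in this best-case, phase-locked sketch the 5 kHz capture misrepresents the mark-space ratio; with any drift between signal and sample clock the displayed runs wander as described above.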

If you work on relatively low frequency circuits, you do not need to buy the logic analyser with the highest sampling rate on the market.

It is recommended that you sample at the highest rate your buffer depth allows; a deep sampling buffer is also important in this regard. In general, a sampling rate of 10 times the frequency of the incoming signal is sufficient. This also explains why logic analyser manufacturers do not limit the sampling rate to, say, 2 times the input bandwidth.

4. Large buffer

Why is a large sampling depth (buffer) important?

A large buffer allows longer captures without lowering the sampling frequency. To capture high and relatively low frequency signals simultaneously, both a high sampling rate and a large buffer are needed for a meaningful measurement. A high sampling rate without a large buffer is of little practical value if your signals have both low and high frequency components.

For example, say you need to measure a very high frequency serial data stream which is accompanied by a low frequency strobe that denotes a frame of 64 Bytes. To be able to capture the high frequency data meaningfully you need to sample at a sufficiently high frequency. If your data buffer is too small the low frequency strobe would not be completely captured before the buffer is full. To capture the low frequency strobe you need to bring down your sampling rate, but now the sampling rate is too low to get a meaningful capture on the high frequency data. So now it becomes difficult to capture and view your data.

With a large buffer you would have enough depth to set a high sampling rate and capture both the high frequency data and the low frequency strobe. Now, to view the bigger picture, you would simply zoom out and to see the high frequency details, zoom in.
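Putting rough numbers to this example (the figures below are assumptions for illustration, not from the text): the depth needed to span one strobe period at a sampling rate fast enough for the serial data is simply sampling rate times strobe period.

```python
# Required depth (samples per channel) = sampling rate x capture window.
# Assumed figures: 50 MHz serial data sampled at 10x (500 MS/s), and a
# strobe marking one 64-Byte frame every millisecond.

sample_rate_hz = 500e6    # 10x the assumed 50 MHz data rate
strobe_period_s = 1e-3    # one frame per millisecond (assumed)

depth_samples = int(sample_rate_hz * strobe_period_s)
depth_kbytes = depth_samples / 8 / 1024   # 1 bit per sample per channel

print(depth_samples)        # 500000 samples per channel
print(round(depth_kbytes))  # ~61 KBytes per channel
```

Under these assumptions a 4 KByte buffer falls far short of covering even one strobe period, while a 1 MByte buffer captures the whole frame with room to spare.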

Would a very small buffer with data compression solve the above problem?

If a logic analyser line remains low and never changes state, it is easy to see that the data can easily be compressed to a few Bytes, which simply indicates that the channel never changed state. This requires very little memory depth.

If the signal changed state once, all you need to record is the initial state and the sample number where it changes state. This takes a few more Bytes, but you are still saving a lot of memory.

If the signal you capture has a relatively high frequency, you still need a few Bytes per transition. As the signal frequency approaches the sampling rate, the compressed representation soon requires much more memory than a plain capture, where every sample needs only one bit per channel. This means that, in the presence of high frequency signals, compression may in fact decrease the effective memory depth.
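The break-even point is easy to demonstrate with simple run-length encoding, one basic form of transition-based compression (a sketch assuming a 5-Byte record per run; real analysers use their own schemes):

```python
# Compare raw storage (1 bit per sample per channel) with run-length
# encoding that stores ~5 Bytes per run (initial state + 32-bit count).

def raw_bytes(n_samples):
    return n_samples // 8           # 1 bit per sample

def rle_bytes(samples):
    transitions = sum(1 for a, b in zip(samples, samples[1:]) if a != b)
    return 5 * (transitions + 1)    # one record per run

quiet = [0] * 100000                      # line never changes state
busy = [i % 2 for i in range(100000)]     # toggles on every sample

print(raw_bytes(len(quiet)), rle_bytes(quiet))  # 12500 vs 5: compression wins
print(raw_bytes(len(busy)), rle_bytes(busy))    # 12500 vs 500000: compression loses
```

For the idle line the compressed record is a few Bytes, exactly as described above; for the fast-toggling line the 'compressed' data is 40 times larger than the raw capture.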

Another point on hardware data compression is that the compression circuitry is situated between the inputs and the memory buffer and causes propagation skew between channels. This skew is difficult to remove. When capturing straight into memory the data path is straightforward, resulting in little channel to channel skew.

The conclusion is that hardware data compression has advantages when capturing slow signals, but has severe limitations in the presence of high frequencies and cannot replace 'real' deep memory.

For more information contact Janatek Electronic Designs, +27 (0)21 887 0993.





