NEW FOREST ELECTRONICS
Introduction to PC-based logic analysers

How could a PC-based logic analyser help me?

A logic analyser helps debug and validate digital electronic circuits. It is a tool that allows numerous digital waveforms to be acquired and viewed simultaneously. An oscilloscope can typically capture only two or four signals at once, whereas the logic analysers we sell capture up to 8, 16 or 32 signals at the same time. A logic analyser can also interpret data across buses, something a scope can't do.

Why PC-based analysers? In the past, logic analysers were large, expensive "stand-alone" instruments that only big companies could afford. With universal access to PCs, a new type of instrument has been developed using the "common resources" of the PC, i.e. its display, keyboard and processing power. An important example of these cheaper instruments is the logic analyser.

The advent of USB ports has improved these new instruments. Older PC-based logic analysers typically used the legacy parallel printer port. USB ports are now more common, faster, can be physically more accessible, and provide automatic recognition when the analyser hardware unit is connected. Perhaps most useful of all, they provide operating power, avoiding the need for a separate power source.

This all means that now small companies and even individuals can afford to benefit from the diagnostic power of the logic analyser.

FAQ: What to look for in a PC-based logic analyser

This section is a collection of some important aspects to consider while comparing the various PC-based logic analyzers available on the market today. (Text © Janatek Electronic Designs, author E.J. Theron).

 
1. What are the most important aspects to consider when buying a PC-based logic analyzer?

    1.1 Quality and Reliability
    1.2 Affordability and Cost Effectiveness

2. Some common pitfalls of buying a cheap logic analyzer.

    2.1 Poor Quality and Unreliability
    2.2 Necessary features being left out
    2.3 Bad Design Practice
    2.4 Essential parts not supplied
    2.5 Expected things left out
    2.6 Bad after sales service
    2.7 Product Piracy

3. How fast should the sampling rate be?

4. Why is a deep sampling buffer important?

5. Would a small buffer with data compression be good enough?

6. How many channels do I need?

7. Other criteria to consider?

    7.1 Sufficient input bandwidth
    7.2 Input Impedance
    7.3 Variable Threshold Voltage
    7.4 Versatile Trigger Options
    7.5 External Clock Input
    7.6 Easy to use software, manual and help
    7.7 Nice Extras
    7.8 After sales support from dealer and manufacturer
    7.9 CE and other compliance standards
    7.10 Complete Package

1. The most important aspects to consider

1.1 Quality and Reliability

Why is it that a logic analyzer from one manufacturer costs $2000, while another manufacturer sells their product, which at first glance seems to have the same specifications (or even better), for just $400? Whether you buy apples, a logic analyzer or a car, there is always a distinction between good and bad quality.

If anything, quality in electronic test instruments is more important than in most other products. If an apple has a small bad spot you can cut it out and enjoy eating the rest, but if your logic analyzer is the source of intermittent glitches, you could spend days trying to sort out problems that actually originate in your test instrument and not in the hardware you are trying to debug.

A good logic analyzer is not a simple instrument to design and manufacture. It connects to hardware by means of many channels using "lengthy" probe leads and ground(s). It has to cope with fast-switching buses which can generate considerable noise.

If "cheap" is a measuring instrument's main design objective, you will pay extra for it by solving bugs that actually originate in the instrument. A quality product gives you measurements you can trust.

1.2 Affordability and Cost Effectiveness

Cost effectiveness: What you need is an instrument of good quality that offers the features that you actually need at a reasonable price.

Some instruments offer features that are very sophisticated and expensive to produce and support, but which you may never need. An example of this is processor code assembly features, whereby you can capture data synchronously with the processor read and write signals and assemble the code to see exactly which software code the processor has executed. This function is generally provided for specific target processors; it is very handy for users who require this functionality, but most engineers would never use it, so don't pay $500 extra if you are never going to use it.

If you work on relatively low frequency circuits, you do not need to buy the logic analyzer with the highest sampling rate on the market.

2. Some common pitfalls of buying a cheap logic analyzer

2.1 Obsessive cost saving leads to poor quality and unreliability

Poor quality leads to poor signal integrity. Poor signal integrity leads to you having a hard time debugging your hardware.

Typical cost-cutting measures include low-quality PCBs, omitting PCB testing by the PCB manufacturer, low-quality workmanship(!), poor-quality connectors, leads and capacitors, no burn-in QA testing, etc.

A product that is supposed to measure high frequencies must be able to cope with the problems that come with high frequencies: poor-quality ground leads, for example, are not effective against high-frequency skin effects. A very high quality ground lead can easily cost 50 times more than a very poor quality one, but as good grounding is very important it is well worth paying those extra few dollars.

An instrument with many channels that is specified to handle high frequencies must be able to handle high frequencies on many of its channels simultaneously, without being overwhelmed by the switching noise generated by those inputs.

Poor connectors and probe leads: poor connectors corrode, whereas gold-plated connectors will give many years of excellent connectivity. Good quality probe leads not only improve signal integrity, but are usually also very flexible, making the experience of using the instrument that much more pleasant.

All this means that you may end up debugging problems that arise from your test instrument and not from the hardware that you are actually trying to debug. It may take you many hours before you realize that those glitches are generated by your cheap instrument and not by your development hardware. This can quickly wipe out any "savings" made by buying a cheap, low-cost measuring instrument. In the end you normally get what you paid for.

2.2 Obsessive cost saving leads to more costly, but necessary, features being left out

Typically the product would not have a large buffer size, but rather a very small one. Stay clear of those!

Some manufacturers will implement a small buffer from the RAM available in their low-cost PLD chip and try to convince you that the 4K (or even smaller!) buffer they provide is enough and that you do not need the 1M buffer supplied by other manufacturers. They may try to point out that they use hardware compression that increases the effective size of the buffer. Don't be fooled by this. If hardware compression were the ultimate solution to buffer size problems, all manufacturers would be using it. The reality is that hardware data compression is limited in its use and can in fact reduce the effective buffer size in some very common test circumstances (see section 5 below).

Note that low-cost PLDs have many IO pins and can easily provide many channels if no external RAM is used and inferior input buffering and threshold detection are used. DO NOT be overly impressed by the number of channels provided. Buying a good-quality 8-channel logic analyzer is much better than buying a cheap 36-channel one.

2.3 Obsessive cost saving leads to bad design practice

Bad design practice leads to bad signal integrity, reduced reliability, poor (more expensive) repair support, etc.
Below are a few typical results if saving cost is put above all other design considerations:

  • Using double-layer boards, despite the chip manufacturer's strong recommendation to use at least 4-layer PCBs

  • Insufficient power decoupling at chip level.

  • Insufficient or no bulk decoupling.

  • Insufficient or no onboard power regulation. For instance, heavy reliance may be placed on the regulation of the PC power supply. A good PC power supply may help a little, but even the best PC power supply cannot replace basic onboard power regulation inside the instrument. Proper power filtering is expensive, so manufacturers of cheap products simply don't do it.

  • Self-regulation/calibration and self-diagnosis built-in test (BIT) functions are simply left out of the design. These functions affect accuracy and improve support, after and even before any hardware failure.

  • Protection circuitry, such as over-voltage/over-current protection, is simply left out.

2.4 Obsessive cost saving leads to essential parts not being supplied

Check what is supplied in the packaging before you buy.

Test clips are expensive. After you have bought your logic analyzer you may find that test clips are not included! So now you will have to shop around for test clips and buy a small quantity at excessive cost. Going back to the manufacturer, they will try to convince you that it was your own fault for not adding test pins to your design/PCB onto which their crimp-contact test leads can fit. They will of course supply test clips to you, though not so cheaply.

Good test clips are important, even if you use a BGA on your board. Be sure that they are included in the package; if they are not, add their cost to the comparison.

2.5 Obsessive cost saving leads to parts that actually cost very little not being supplied

Don't be surprised if your cheap logic analyzer comes without any packaging material, not even an outer shipping box.

Some don't even include the software on CD, a printed manual, etc. True, you can download the software from the internet and print your own manual, and your logic analyzer may still work even if heavy objects were shipped on top of it in the cargo carrier, but is this really what you wanted and how you wanted it? If you see yourself as a professional, then buy the professional tools of your trade.

2.6 Obsessive cost saving leads to bad after sales service

A product that is supposed to have sophisticated features has a reasonable chance of generating technical queries for the manufacturer.

If the manufacturer makes little profit on the product, that profit can be wiped out by having to deal with problems, or simply with general questions from the field.

Such manufacturers will normally refuse to accept that there could be anything wrong with their product and will simply blame the user for "not using the product correctly". Do not expect any help if you buy an inferior product.

2.7 Obsessive cost saving sometimes leads to product piracy

Hardware and software developments are expensive. If the hardware protection of a product is insufficient, it is possible for fraudulent companies to copy the hardware and use the original manufacturer’s software on their illegal hardware.

Some cheap products are actually illegal copies of products from authentic developers. If you run into problems with such a product, you can forget about getting support. In most cases they will in fact not be able to help you, since they have no detailed knowledge of the product.

Some manufacturers build in sophisticated hardware protection, which can in fact be activated as soon as they become aware of clones appearing on the market.

You may find yourself stuck with a product for which the security hardware has been triggered, disabling the hardware.

3. Sampling Rate

A high sampling rate is required for high capture resolution.

The higher the sampling rate the more accurate the representation of the captured signals on the screen.

With higher sampling rates, more accurate measurements can be made between edges on different channels.

In analogue electronics people often refer to the Nyquist theorem, which states that to recreate a captured signal you need to sample it at at least twice the frequency of the maximum frequency component present in the analogue signal.

A logic analyzer of course captures square waves. An ideal square wave contains frequency components of arbitrarily high frequency, which are needed to create its sharp edges. So at what rate should a signal be sampled relative to the base frequency of the square wave?

A logic analyzer simply distinguishes between a high (signal above the input threshold) and a low (signal below the input threshold). Say a 1 kHz square wave is captured at a 4 kHz sampling rate. The incoming signal could then be sampled twice while it is high and twice while it is low, so the signal is displayed with a 50-50 mark-space ratio. If there is distortion on the signal and, say, the threshold is not set correctly for the incoming signal, you could easily end up sampling the signal once while it is high and three times while it is low, and the signal will be displayed with a 1-3 mark-space ratio. This means you need to push up the sampling rate.

At a 5 kHz sampling rate, even with everything set correctly, you will most likely capture the signal 3 times on high and 2 times on low in one period, and 3 times on low and 2 times on high in another, so that a perfectly regular incoming signal may be displayed with irregular mark-space features. In short, you should push up the sampling rate even further. We recommend that you sample as fast as your buffer depth allows. A deep sampling buffer is important (more on that in the next section). In general, a sampling rate of 10 times the frequency of the incoming signal is sufficient. This also explains why logic analyzer manufacturers do not limit the sampling rate to, say, 2 times the input bandwidth.
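To make this concrete, here is a minimal Python sketch (my own illustration, not part of the original article). It assumes a 1 kHz signal whose true high time is 0.43 ms, for example because of distortion or a threshold that is not centred, and measures how wide the high pulse can appear for different sampling rates and an arbitrary clock phase. At 4 times the signal frequency the displayed width can be wrong by tens of percent, while at around 10 times and above the error becomes small.

    # Sketch: displayed pulse width vs sampling rate (assumed example values).
    TRUE_HIGH_S = 0.43e-3      # true high time of the 1 kHz signal (assumed)
    PERIOD_S = 1.0e-3          # 1 kHz signal period

    def displayed_width(sample_hz, phase_s):
        """Width of the high pulse as displayed when the first sample of a
        period lands phase_s seconds after the rising edge."""
        dt = 1.0 / sample_hz
        t = phase_s
        high = 0
        while t < PERIOD_S:
            if t < TRUE_HIGH_S:     # sample lands on the high part of the pulse
                high += 1
            t += dt
        return high * dt

    for rate in (4_000, 10_000, 50_000):
        # Sweep the (unknown) phase of the sample clock relative to the signal
        widths = [displayed_width(rate, p * 1e-6) for p in range(250)]
        print(f"{rate/1000:>4.0f} kHz sampling: displayed high time "
              f"{min(widths)*1e3:.2f} to {max(widths)*1e3:.2f} ms (true 0.43 ms)")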

4. Why is a deep sampling buffer important?

A large buffer allows longer captures without lowering the sampling frequency.

To capture high and relatively low frequency signals simultaneously, both a high sampling rate and a large buffer are needed for a meaningful measurement. A high sampling rate without a large buffer is of little practical value if your signals have both low and high frequency components. For example, say you need to measure a very high frequency serial data stream which is accompanied by a low frequency strobe that denotes a frame of 64 bytes. To capture the high frequency data meaningfully you need to sample at a sufficiently high frequency. If your data buffer is too small, the low frequency strobe will not be completely captured before the buffer is full. To capture the low frequency strobe you need to bring down your sampling rate, but then the sampling rate is too low to get a meaningful capture of the high frequency data. So it becomes difficult to capture and view your data. With a large buffer you would have enough depth to set a high sampling rate and capture both the high frequency data and the low frequency strobe. Then, to view the bigger picture, you simply zoom out, and to see the high frequency details, zoom in.
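As a rough back-of-the-envelope sketch (my own numbers, not the article's: assume the fast data needs a 100 MS/s sampling rate and the slow strobe marks a 5 ms frame), the following shows how long a capture window a 4K versus a 1M sample buffer gives, and how deep the buffer must be to hold one complete frame at full speed.

    # Sketch: capture window vs buffer depth (assumed example values).
    def capture_window_s(buffer_samples, sample_hz):
        """Length of the capture window in seconds."""
        return buffer_samples / sample_hz

    def required_depth(frame_s, sample_hz):
        """Samples needed to hold one complete frame of duration frame_s."""
        return int(frame_s * sample_hz)

    SAMPLE_HZ = 100e6     # assume 100 MS/s is needed for the fast serial data
    FRAME_S = 5e-3        # assume the slow strobe marks a 5 ms frame

    for depth in (4_096, 1_048_576):
        window = capture_window_s(depth, SAMPLE_HZ)
        print(f"{depth:>9} samples at 100 MS/s -> {window*1e6:8.1f} us window")

    print(f"depth needed to span one {FRAME_S*1e3:.0f} ms frame at 100 MS/s: "
          f"{required_depth(FRAME_S, SAMPLE_HZ):,} samples")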

5. Would a small buffer with data compression solve the problem above?

If a logic analyzer line remains low and never changes state, it is easy to see that the data can be compressed to a few bytes, which simply indicate that the channel never changed state. This requires very little memory depth.

If the signal changed state once, all you need to record is the initial state and the sample number where it changed state. This takes a few more bytes, but you are still saving a lot of memory.

If the signal you capture has a relatively high frequency, you still need a few bytes per transition. If the frequency becomes comparable with the sampling rate, you soon reach the situation where the compression requires much more memory than simply capturing the signal without any compression, where every sample requires only one bit per channel. This means that in the presence of high-frequency signals, compression may actually decrease the effective memory depth.
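The effect is easy to quantify. The sketch below (my own illustration; the 32 bits per transition record is an assumed figure) compares raw storage of one bit per sample with a simple run-length "transition" encoding of the kind hardware compression typically uses: an idle line compresses to almost nothing, while a line that toggles near the sampling rate needs far more memory compressed than raw.

    # Sketch: raw storage vs run-length "transition" encoding (assumed sizes).
    BITS_PER_TRANSITION_RECORD = 32   # assumed size of one (state, run-length) record

    def raw_bits(n_samples):
        return n_samples              # 1 bit per sample for one channel

    def rle_bits(samples):
        transitions = sum(1 for a, b in zip(samples, samples[1:]) if a != b)
        return (transitions + 1) * BITS_PER_TRANSITION_RECORD

    N = 1_000_000
    idle = [0] * N                    # line never toggles
    fast = [i & 1 for i in range(N)]  # line toggles on every sample

    for name, sig in (("idle line", idle), ("fast line", fast)):
        print(f"{name}: raw {raw_bits(N):,} bits, RLE {rle_bits(sig):,} bits")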

Another point on hardware data compression is that the compression circuitry is situated between the inputs and the memory buffer and causes propagation skew between channels. This skew is difficult to remove. When capturing straight into memory the data path is straightforward, resulting in little channel to channel skew.

The conclusion is that hardware data compression has advantages when capturing slow signals, but has severe limitations in the presence of high frequencies and cannot replace "real" deep memory.

6. Number of channels

The number of channels determines how many signals can be captured at the same time.
When buying a logic analyzer, do not be overly impressed if a large number of channels are offered at a low price. The quality of the channels, i.e. whether they are backed by an adequate buffer and have a quality input stage, is more important. It is better to buy a logic analyzer with a few good channels than one with many poor channels.

Many channels combined with a very small buffer are of value only for viewing short data sequences, such as a single read/write to a RAM chip; if you wish to zoom out to get the bigger picture, you will be disappointed and left wishing you had bought a more professional instrument.

With the many IO lines available on today's PLDs, it is quite easy to provide many channels at low cost. All you need is a DAC to create a threshold reference for all channels, and then a few small surface-mount resistors and capacitors per channel. The PLD provides a small amount of RAM which can be used as a data buffer. In this way you can make a "logic analyzer" that costs very little, and whether you provide 16 or 128 channels does not add much to the price. To create a top-quality logic analyzer, the cost per channel is much more than just adding a few capacitors and resistors and fitting a bigger external connector. This is why a good 8-channel logic analyzer will most likely cost more than a 64-channel product as described above, and why it makes sense to buy the quality 8-channel product instead.

It is surprising how far you can go with just a few logic analyzer channels, once you become "wise" in the usage of logic analyzers. Personally I think that 16 channels are more or less the optimum required for debugging most circuits. This includes debugging boards with processors with many address and data lines. If you have gone through the pain of connecting 128 test clips of a large logic analyzer to your 32-bit processor with many address lines, you will most likely never do it again and will instead start using your logic analyzer in a "smart way". You will be surprised what you can do with an 8-channel logic analyzer.

7. Other Criteria to Consider

7.1 Sufficient Input Bandwidth

This specification indicates the maximum frequency that can be measured.
A logic analyzer input acts as a low-pass filter, and the "bandwidth" normally indicates the -3dB point where the input signal amplitude has dropped to about 70% (half power) of its low-frequency value. The logic analyzer thresholds can be adjusted to cope with the signals getting smaller. Independent threshold voltages on different input ports allow different thresholds for different signals.

The maximum sampling rate should always be at least four times higher than the maximum input bandwidth. This factor of four is needed for a reasonable representation of the signals after capture.

If the input bandwidth is too high compared to the sampling rate, external switching noise may be introduced into the logic analyzer circuitry by signal frequency components that are in any case too high to capture effectively. Such noise only serves to degrade the capture integrity.
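As a rough illustration (my own sketch, assuming an idealized first-order low-pass response, which real analyzer front-ends only approximate, and an assumed 100 MHz -3dB input bandwidth), the following shows how the relative amplitude falls off around the bandwidth limit, together with the 4x sampling-rate rule of thumb that goes with it.

    # Sketch: first-order low-pass roll-off around an assumed 100 MHz bandwidth.
    import math

    BANDWIDTH_HZ = 100e6                      # assumed -3 dB input bandwidth

    def relative_amplitude(f_hz, bw_hz=BANDWIDTH_HZ):
        """First-order low-pass amplitude response relative to DC."""
        return 1.0 / math.sqrt(1.0 + (f_hz / bw_hz) ** 2)

    for f in (25e6, 50e6, 100e6, 200e6):
        a = relative_amplitude(f)
        print(f"{f/1e6:6.0f} MHz: {a*100:5.1f}% of low-frequency amplitude "
              f"({20*math.log10(a):5.1f} dB)")

    print(f"Suggested minimum sampling rate: {4*BANDWIDTH_HZ/1e6:.0f} MS/s")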

7.2 Input Impedance

The ideal measuring instrument can pick up information from the unit under test without influencing the functioning of the unit under test at all.

The input impedance should be as high as possible (high resistance and low capacitance), so that the instrument does not add excessive load to the unit under test.

7.3 Variable Threshold Voltage

The variable input threshold is needed to measure signals of different amplitudes. Ideally, an independently variable threshold voltage should be provided for every eight inputs. Independent thresholds allow different logic technologies to be measured at the same time.

It is becoming increasingly common to have components operating at 5V, 3.3V, 2.5V, etc., all on the same PCB. If you buy a logic analyzer with many channels and only one threshold for all of those channels, you are likely to have problems measuring mixed-technology boards, especially when the signal frequencies become high.

When input signal frequencies are relatively low, the actual threshold voltage is not all that important, because a perfect 2.5V logic signal will be a square wave varying between 0V and 2.5V and a 5V signal will vary between 0V and 5V. This means that a threshold of, say, 2V will display both logic types correctly. In the real world, though, the square waves will look more like sine waves at relatively high frequencies, may have a DC offset, and will diminish in size as a result of input bandwidth limitations. This means that, especially at "high" frequencies, independent threshold adjustment becomes more important.
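The sketch below (my own idealized numbers, not from the article) shows the point: once the swing is attenuated to roughly 70% near the bandwidth limit, a single fixed 2.0V threshold never sees a 2.5V-logic signal go high, while a per-port threshold set to half the attenuated swing still reads both 2.5V and 5V logic correctly.

    # Sketch: fixed vs per-port threshold with an attenuated swing (assumed values).
    ATTENUATION = 0.707     # assumed amplitude near the -3 dB input bandwidth

    def reads_correctly(swing_v, threshold_v):
        """True if a 0..swing square wave is still seen toggling at this threshold."""
        high_level = swing_v * ATTENUATION
        low_level = 0.0
        return low_level < threshold_v < high_level

    for swing in (2.5, 5.0):
        fixed = reads_correctly(swing, 2.0)
        per_port = reads_correctly(swing, swing * ATTENUATION / 2)
        print(f"{swing} V logic: fixed 2.0 V threshold ok? {fixed}, "
              f"per-port mid-swing threshold ok? {per_port}")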

7.4 Versatile Trigger Options

This enables you to capture the exact data you want to see.

You need to trigger on edges, patterns, and sequences of these, e.g. an edge condition followed by a pattern. Deep sequencing is seldom necessary and is limited to 2 or 3 stages by some manufacturers to keep the user interface easy to use.
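As a simple illustration of what a two-stage trigger sequence means (my own sketch; real analyzers implement this in hardware, and the channel numbers and pattern are invented), stage 1 below arms on a rising edge on channel 0 and stage 2 then triggers when the 4-bit bus shows the pattern 1010.

    # Sketch: two-stage trigger (edge, then pattern) over a captured sample list.
    def find_trigger(samples):
        """samples: list of 4-bit integers (ch0 = bit 0). Returns trigger index."""
        stage = 0
        prev = samples[0]
        for i, s in enumerate(samples[1:], start=1):
            if stage == 0 and not (prev & 1) and (s & 1):   # rising edge on ch0
                stage = 1
            elif stage == 1 and s == 0b1010:                # then match the pattern
                return i
            prev = s
        return None

    capture = [0b0000, 0b0001, 0b0011, 0b1010, 0b0000]
    print("trigger at sample", find_trigger(capture))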

7.5 External Clock Input

Synchronous capture is usually used to capture clocked data, using a clock signal from the hardware under test.

For example, you could capture the data read by a processor by using the processor read signal as the clock input to the logic analyzer. As every sample corresponds to a distinct clocked moment, it is usually best displayed as a text listing.
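A minimal sketch of the idea (my own, with invented clock and data values): the data bus is latched only on rising edges of the external clock, and the captured values are printed as a text listing rather than a waveform.

    # Sketch: synchronous (state) capture on an external clock (invented data).
    def synchronous_capture(clock, data_bus):
        """Return (sample_number, data) pairs latched on each rising clock edge."""
        listing = []
        for n in range(1, len(clock)):
            if clock[n - 1] == 0 and clock[n] == 1:      # rising edge of ext. clock
                listing.append((n, data_bus[n]))
        return listing

    clock = [0, 1, 0, 1, 0, 1, 0]
    data  = [0x00, 0x3A, 0x12, 0x7F, 0x55, 0xC3, 0x00]
    for n, value in synchronous_capture(clock, data):
        print(f"sample {n}: data = 0x{value:02X}")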

7.6 Easy to use software, manual, help

Easy-to-use software and manual: Of course!

A logic analyzer is an instrument that you may use intensively for a while and then not need again for 6 months or a year. It is therefore very important that the software is easy to use, so that you do not have to go through a heavy learning curve each time you return to it.

It is easy for manufacturers to pack a lot of features into the onboard programmable chips, but the problem is that this can lead to overcomplicated user interfaces, in which the user has trouble finding the commonly used features among the many, often completely unnecessary, features that clutter the setup dialog boxes.

For manufacturers, the secret is balance: provide the important features clearly and in an easily understandable way, and do not allow features that users will never use to overcomplicate the user interface.

7.7 Nice Extras

How about an onboard pattern generator to provide signals to your unit under test, while the logic analyzer simultaneously captures the response?

You can use this to set up communication protocols to send to your unit under test, or to simply create a controlled clock to your circuit. (Have a look at the Janatek La-Gold-36!)

Some manufacturers provide assembler features. This may add quite a bit to the price of the logic analyzer and should only be considered if you really need it.

7.8 After sales support from dealer and manufacturer

Both the manufacturer and dealer need to provide good before and especially after-sales service. Capable sales personnel should know the basics of working with a logic analyzer.

Difficult technical questions regarding the usage and details of the specific instrument should be referred to the manufacturer, who should answer them promptly.

Most PC-based logic analyzer manufacturers supply the latest software updates on their websites for free.

7.9 CE and other compliance standards

This indicates that the product meets certain electromagnetic emissions, susceptibility and safety standards.

7.10 Complete package

If everything you need is not included in the package (e.g. test clips), you will waste time finding it and pay more for it. Quality test clips are expensive; not including them in the package is one way in which some products appear much cheaper than they really are. Good test clips are important, and without them the usability of the instrument is severely limited.


© New Forest Electronics   Tel. +44 (0) 1425 650089   Issue 3.108   12 March 2024