

super precision Volt ref.

Status
Not open for further replies.
I can relate to the "Write Only Boilerplate Specs", which I once did for 5.25" OEM disk drives (30 pages) and never again... well, almost: I contributed a few pages to the few hundred in our AMR two-way ISM meter-reader radio, system and network design spec. I can also relate to your pains. What, me worry?

When I did something like this for a low-cost Doppler tracking station for sounding rockets, getting a perfect 50.0% duty cycle and low phase noise was my challenge with a super-fast sawtooth S&H mixer. I discovered the non-symmetrical delays in the chips, and tuning the delays for symmetry until the 2nd harmonic was suppressed by >>60 dB got me the solution, since it was hard to read a 0.01% duty-cycle error on a scope. Getting a 1e-11 OCXO to survive 15 g vibe and 100 g shock on all three axes during flight was another issue, so that the apparent Doppler error stayed minimal. I even tried CMOS foam for additional shock isolation, since I was allotted only 0.25 cm of clearance in the payload, and I tried to emulate the fluid damping of the human brain and ended up with foam and Solithane for a non-linear, low-Q spring. (circa '76 for the Black Brant programs)
 
Scope calibrator circuit with the current steering corrected

01CS03_SCOPE_CALIBRATOR_ISS05.00_2015_11_07_crop.png


ERRATA
(1) connection from S2 top to 12V supply should be deleted
 

Yes, specifications: love 'em, hate 'em!

Sounds like you have worked on some pretty leading-edge projects! I'd be interested to hear more, although I expect it's all hush hush.

Talking about specs: I did part of a test set for a Navy destroyer trial once. My bit involved PECL and a 100 MHz 12-bit ADC and DAC, both pre-production samples from Analog Devices and all pretty advanced stuff at the time. Anyway, as well as going to the system, they required the DAC output via a BNC on the front panel. The spec was fantastic: 5 V into 50 Ω ±0.5%, 10 ns rise and fall, 2% flatness, 2% ground bounce and so on (not the actual figures).

We thought this output must be for something pretty special, but did wonder about the BNC socket. I struggled for weeks to match that spec, working nights and weekends because of the short project timescale, and the hand-over date could not be missed because sea trials started on a particular day. Finally, I got there, as long as you didn't look too closely at the waveform, that is. As it was all 'need to know', we were told nothing about the application.

But some time later, I got talking to one of the tech operators and asked him what the output was used for. "Oh, that: it's just a test point; sometimes we plug a scope into it to make sure we're getting some sort of return." The scope they used was about 40 MHz too. Later I threw up over the stern.

It turned out that this output was an afterthought and the procurement boys had just cut and pasted the spec from a scope manual.
 
None of the 6 or so companies I worked for are still around or doing the same thing. No military secrets either.

I once had Burr Brown MIL-STD-883B ADC hybrids, 12-bit, with all the pre-cap X-ray inspection, with a 100% failure rate for missing codes. It would skip codes, often near a boundary (...xxx1111x to ...xx1000x). Then I ordered a non-Mil-Spec industrial part. It worked. I reasoned that internally there was a flaw, with ground shift from the logic current affecting Vref, and hence some ADC results near the ...0111 to ...1000 boundaries, with the TTL inside. ('77)

I never heard back from them acknowledging the problem when I alerted them. Not many people test for this...

I used a super simple method with an analog scope and a DAC to measure to 0.1% accuracy on the scope.

1. Connect a binary counter in parallel to a DAC to generate a staircase sawtooth; AC-couple to the scope to verify no missing codes. OK.

2. Connect a sig gen to the Burr Brown ADC, the ADC output to the Burr Brown DAC, and the DAC output to scope Ch2, set to INVERT.

3. Then use A+B on the scope vertical channels to get the difference: AC-coupled at first, to see the dots and staircase loops of the missing codes, then DC-coupled with a slow DC ramp to figure out why. Then call BB, get no answers, and order the industrial part.

Conclusion
A 12-bit 10 MHz SAR-type ADC with 4 bits of noise internally on ground or Vref, but only near some (not all) thresholds, where several comparators inside the SAR ADC change state at once.
Poor digital/analog ground bond-wire ESR inside. It was monotonic; it just had hysteresis. Later, companies like BB started including specs for no missing codes, etc.
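The three-step loopback test above can be sketched as a simulation. This is a hypothetical model, assuming a 12-bit ADC whose codes get disturbed just below the ...0111 to ...1000 carries; the slow ramp plays the role of the sig-gen sweep, and the input-minus-reconstruction difference plays the role of the A+B trace:

```python
# Hypothetical model: an ideal DAC reconstructs the flawed ADC's output, and
# sweeping a slow ramp through every input level reveals which codes never
# appear, i.e. the "staircase loops" seen on the scope.

def flawed_adc(v, bits=12, vref=10.0):
    """Quantize v, but disturb codes ending ...0111, emulating internal
    ground bounce on Vref at a major-carry boundary."""
    code = min((1 << bits) - 1, max(0, int(v / vref * (1 << bits))))
    if code & 0x0F == 0x07:      # just below a ...0111 -> ...1000 carry
        code += 2                # comparator upset: the code gets skipped
    return code

def ideal_dac(code, bits=12, vref=10.0):
    return code * vref / (1 << bits)

def find_missing_codes(bits=12, vref=10.0, steps=1 << 14):
    seen = set()
    for i in range(steps):       # slow DC ramp covering the full range
        seen.add(flawed_adc(i * vref / steps, bits, vref))
    return sorted(set(range(1 << bits)) - seen)

missing = find_missing_codes()

# Worst loop error (input minus DAC reconstruction), as the A+B trace shows:
worst = max(abs(i * 10.0 / 16384 - ideal_dac(flawed_adc(i * 10.0 / 16384)))
            for i in range(16384))
print(f"{len(missing)} missing codes, worst loop error {worst * 1000:.1f} mV")
```

Every code ending in binary 0111 is absent from the sweep, which is exactly the signature that would show up as gaps in the staircase difference trace.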


Got a million more stories like that.
 
:happy:

Shame when these companies disappear. The place where I worked has been demolished now and is just a heap of rubble, although the parent company is still going.

I know what you mean: some semi manufacturers just don't care and others couldn't be more helpful. Altera once sent an application engineer over from the States, complete with a bag of FPGAs and all the programming gear, to work on one of our projects. He stayed for a couple of weeks too. Analog Devices are also particularly good, as are their chips, which always outperform the spec, unlike some manufacturers'. Their data sheets and application notes are good too.

In the early days we had a batch of Texas Instruments transistors arrive. They were MIL-STD-883B and had all the paperwork: release notes, dates, batch codes, etc. The only problem was that instead of being PNP they were NPN. By the way, I think Texas are one of the good companies.

Having said that Analog Devices are good, they did cause havoc on the project I mentioned in my previous post.

I have a thing about schematics and PCB layouts and always try to have the data and address buses in order so that, for example, D0 might go to pin 1 of a connector, D1 to pin 2, and so on. We had a great guy from the drawing office do the layout of the DAC board: it was a work of art. When I fired it up, though, the DAC output didn't make any sense. Of course, being a pre-production part, I thought it was faulty, but apart from the weird output everything seemed to be working fine. I just couldn't figure it out, so I asked another engineer to have a look at the schematic, board, and DAC data sheet. He said everything was OK.

In the end I found the problem. This is representative of what was in the data sheet:

AD_DAC_pinout.png


Would you believe it: D0 is the MSB and D11 the LSB, the opposite of what you would expect. Analog Devices put it in brackets now, but they still use the same back-to-front notation. So bang went the nice layout and, even though the wiring did a neat job of reversing the connections, the circuit diagram looked like someone vindictive had been at work, with the bus lines crossing and snaking all over the place.
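A small illustration of the mismatch: if the board is wired with D0 as the LSB but the part defines D0 as the MSB, the correction is a bit-reversal across the bus. `reverse_bus` is a hypothetical helper, not from any datasheet:

```python
# Hypothetical helper: bit-reverse a word across the bus width, which is
# exactly what the crossed-over wiring had to do (D0 <-> D11, D1 <-> D10...).

def reverse_bus(word, width=12):
    out = 0
    for i in range(width):
        if word & (1 << i):
            out |= 1 << (width - 1 - i)
    return out

print(hex(reverse_bus(0x001)))   # LSB-convention value 1 lands on the D11 pin -> 0x800
print(hex(reverse_bus(0x800)))   # and the mapping is its own inverse -> 0x1
```

Applying it twice gets the original word back, which is why a single swap in either the wiring or the firmware fixes the whole bus.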

Like you, I could tell many tales but I think I had better get back to the scope calibrator now.
 
Well, good grounding is critical for any precision work.
I just spent about an hour tracing a 0.10V error where I expect 0.02V err max.
The circuit worked when the relays weren't active... so guess what: 4" of (Chinese) ribbon cable introducing 0.03 V of ground bounce between circuit boards from the 100 mA relay drain.
That shrunk the Vsense signal AND the 5.12 V reference by 0.03 V. Since Vsense (8x oversampled) was divided by 2 to get the result, the 0.03 V bounce error got turned into a 0.12 V error. Reduce that by 1% for the Vref drop and we have 0.118 V of cumulative error. Tag that onto the 0.02 V system tolerance and the 0.10 V error was the outcome.
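As a hedged back-of-envelope check on the I*R mechanism, using the post's 100 mA and 0.03 V figures, and a typical 18 AWG resistance of about 21 mΩ/m as an assumption:

```python
# Back-of-envelope check; the post's figures are taken as assumptions.
I_RELAY = 0.100                    # relay drain through the shared ground, A
V_BOUNCE = 0.030                   # observed ground shift, V
r_ground = V_BOUNCE / I_RELAY      # implied ribbon + connector resistance
print(f"implied ground-path resistance: {r_ground * 1000:.0f} milliohm")

# Swapping in 18 AWG hookup wire (~21 milliohm per metre, a typical figure):
R_PER_M_18AWG = 0.021
LENGTH_M = 0.1                     # roughly the 4 inches of cable
v_new = I_RELAY * R_PER_M_18AWG * LENGTH_M
print(f"bounce with 18 AWG: {v_new * 1e6:.0f} microvolt")
```

The implied 0.3 Ω in the ground path is enormous compared with the couple of milliohms a short run of hookup wire presents, which is why the cable swap buries the error.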

So now I went ahead and over-spec'd the design, with an added pair of ground jumpers to strap the uC ground tight, and upgraded the ribbon cable spec (from whatever it is) to 18 AWG hookup wire for the Molex friction-lock supply input.

Sometimes it's easy to overlook a part like a short bit of 'stock' ribbon cable as a serious spec item.
 
I looked over the design for a friend. He had given it to me earlier and complained that it didn't work the way it was supposed to, though it was usable (i.e. it worked without the designed-in functionality). He thought he had defective or counterfeit chips, so he obtained a few more. Same issues. To make a long story short, some inputs were TTL levels and others were CMOS. In a "normal" design with these parts, one would typically strap the chip (this pin to Vcc, this to GND). He fixed it with a couple of discrete level shifters/inverters and a corresponding software change.

So he was using TTL levels to control CMOS inputs. CMOS levels were not allowed on the TTL inputs.
The chip had both TTL and CMOS I/O. I think it had both supplies too, like +5 V and +12 V.
 
Hi Mosaic,

How right you are. It's the little things that are very often the killer, especially in precision stuff like the scope calibrator. Apparently, in critical areas the commercial calibrator is built using microwave techniques, with cans, feed-throughs and all that.

High speed is another area. Often, if you break the rules with ECL, for example, it simply won't work. It starts with a good ground, but the supply lines matter too. It's interesting, with a rack full of TTL cards, to put a DVM across the actual supply pins of the chips to measure what the chip really sees. For nominal 5 V TTL I have seen as low as 4 V. That's why you often see the old dodge of cranking the power supply up to 5.25 V. All this makes the difference between a system that works most of the time and one that is a solid, reliable performer.
 
Hello Keep,

You make me smile. This is a great area that has a load of facets, some of them hilarious. Apart from newbie designers, who you can have sympathy with, there seem to be three broad types who go in for this sort of thing: the application report* cut-and-paste type, the emulator* addicts, and the plain cussed. I don't want to sound arrogant, especially as I have made some gross errors along the way, but here are a couple of things that crawled out of the woodwork: a circuit where the input current of an op amp was critical to the circuit function, and a slow sine wave being fed into a non-Schmitt TTL gate.

At the sub-system level we had a situation where one card struggled to feed a 10 V signal into another card, where it was immediately attenuated to 5 V. We even had a complete synchro resolver card feeding nothing: the synchro requirement had been deleted by the customer months before. I'll stop now.

By the way, I have opened up a new thread to discuss areas like this. I hope you, Mosaic and all the others that I have met on ETO will share some more of your experiences and views there.

* both vital tools
 


HCMOS uses Vcc/2 for its input threshold; LVCMOS uses a different level closer to TTL's, which is two diode drops, or about 1.3 V, the same voltage a TTL input floats to when left open.
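The threshold mismatch can be sketched with commonly quoted worst-case input levels at a 5 V supply; the figures below are typical datasheet values assumed for illustration, not taken from the post:

```python
# Assumed worst-case input thresholds at Vcc = 5 V (typical datasheet
# figures): plain HC needs 0.7*Vcc to guarantee a high, while TTL and
# TTL-compatible HCT accept 2.0 V.
VIH = {"TTL": 2.0, "HC": 3.5, "HCT": 2.0}   # minimum guaranteed-high, volts
VIL = {"TTL": 0.8, "HC": 1.5, "HCT": 0.8}   # maximum guaranteed-low, volts

def reads_as(family, v):
    if v >= VIH[family]:
        return "high"
    if v <= VIL[family]:
        return "low"
    return "undefined"               # the band where the friend's bug lived

# A typical TTL-level high of 2.8 V is a legal '1' for TTL/HCT inputs but
# sits in the undefined band of a plain CMOS (HC) input:
print(reads_as("TTL", 2.8), reads_as("HC", 2.8))
```

That undefined band is exactly why driving CMOS inputs with TTL levels "sort of works" until it doesn't.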
 

Ground shift and I*R drops are easy to miss. Ensure termination is ideal for fast clocks. Current-mode logic is ideal, as are current loops for analog. Differential loads keep the current constant, which is why ECL is noise-free. Precision voltage sources can be sent long distances by converting them to precision current sources. Braided wire works well for common grounds; otherwise, flexible copper conduit for microwave.
 
That does help; the instrument looks fairly useful to 700 MHz based on its stated spec, just 4% at full scale.
I suppose I'll have to change out the electrolytics and tantalums given its age, but the construction should be old-school serviceable!
Price isn't bad really.
But this one looks good....with all the accessories
**broken link removed**
 
The critical voltages are the vertical and horizontal sweep. If you capture the high voltage (~100 V) of the horizontal sweep with a precise S&H and the required waveform, then any accurate DMM and an uncalibrated signal source could be used. The time-base accuracy is easy with crystal accuracy, and the vertical gain and offset accuracy is easy with a good DMM and any uncalibrated voltage source. This is essentially how the Tektronix PG506 cal unit works, I think, with its built-in DMM. Deflection error is measured in % relative to 100 V. Tek's DMM uses a modified dual-slope integration method, which has better noise immunity than the older SAR (successive approximation) types with their S&H errors, though modern ADCs have improved this with less crosstalk in special CMOS switches. The DMM is only 200 counts per ms. But the calibration of the calibration unit itself was done by Tektronix using a John Fluke Model 8375A with 0.025% accuracy. The 50 Ω termination is accurate to within 1000 ppm. Rise-time aberrations were measured on the DUT scope with a 10 ps/div scope, or at least one 7x faster than the typical mainframe scope.

Output amplitudes of 100 V, 50 V and 20 V originate directly from a precision voltage divider composed of 4 resistors and a 5 mA current loop.
The 50 Ω source section operates only when amplitude settings of 10 V or lower are selected.
The standard square-wave output goes up to 100 V(?) in roughly 1/3-octave steps with a vernier: 1, 3, 5, ... 10, 30, 50...
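Ohm's law with the 5 mA loop fixes the divider values once the tap voltages are chosen. A quick sketch, assuming taps at 100, 50, 20 and 10 V down the string (the 10 V tap is a guess, chosen only to account for the fourth resistor; it is not from a Tek schematic):

```python
# Assumed taps down a series string carrying the 5 mA precision loop.
I_LOOP = 0.005                                # precision current source, A
taps = [100.0, 50.0, 20.0, 10.0, 0.0]         # node voltages, top to bottom
resistors = [(hi - lo) / I_LOOP for hi, lo in zip(taps, taps[1:])]
print(resistors)                              # total: 100 V / 5 mA = 20 kOhm
```

Driving the string from a current source means the tap accuracy depends only on resistor ratios and the loop current, not on a supply voltage.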


The fast-rising pulse source is essential for calibrating the scope's low-frequency and high-frequency phase and its high-frequency droop compensation for flatness.
 
I think the most important part of obtaining 0.25% accuracy is using the scope in differential mode with both channels matched, to get maximum common-mode rejection across the entire spectrum: you do A+B (B inverted) to sense any high-voltage current or differential high-speed signal riding on a common-mode DC offset. I always found this very handy in my line of work.
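A toy numerical illustration of the A+B (B inverted) trick, with made-up figures: a 1 kHz differential signal riding on a 60 V common-mode offset plus mains pickup.

```python
import math

def channels(t):
    """Both probes see the same common-mode voltage; the wanted signal is
    the small difference between them (illustrative figures only)."""
    diff = 0.5 * math.sin(2 * math.pi * 1e3 * t)      # wanted signal, +/-0.5 V
    cm = 60.0 + 5.0 * math.sin(2 * math.pi * 50 * t)  # DC offset + mains pickup
    return cm + diff / 2, cm - diff / 2               # Ch A, Ch B

t = 2.5e-4                     # quarter period of the 1 kHz signal
a, b = channels(t)
recovered = a + (-b)           # A + B with channel B set to INVERT
print(round(recovered, 6))     # the 60 V common mode cancels, leaving 0.5
```

With perfectly matched channels the common mode cancels exactly; in practice the residual is set by how well the two channels' gain and frequency response track, which is why the matching matters.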

But after you use a Tek scope you appreciate the precision of the triggering on noisy signals, and the phase lock so there is no flicker on the screen, such as when sweeping at a subharmonic. This is really clever in chop mode: it does not chop the signal synchronously, and yet both channels stay coherent.

But then after you use a 1GHz+ DSO you won't want to go back to analog.
 
Ah well, the 1 GHz DSO's $9k barrier... maybe in a few years I'll get there, when it's a $2k barrier.
 
I've got my eyes on one of those six-instruments-in-one scopes.

64808-split-scope-board-showing-original.jpg
 
