
Programmable Circuit Board Chip


Ganz

New Member
Hello Everyone,
I'm just starting to get into electronics.
I was wondering if there is a chip that I can program using my computer. I need it to take 2 or 3 analog voltage signals and use them to do a bunch of math (just adding, subtracting, multiplying, and dividing), and then give an analog or digital output. Does such a thing exist? If you could help me out that would be great!
 
Since you're brand new:

Voltages need to be converted into smaller voltages that the computer can digest. Common ranges were 0-5 V, 1-5 V, 0-10 V, and 4-20 mA; now 2.5 V full scale is common, and sometimes 2.048 V full scale. So engineering units, using an example like temperature from -32 to 100 deg F, have to be mapped to a voltage that the A/D (Analog-to-Digital Converter) can digest.
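
To make that concrete, here is a minimal C sketch of that linear scaling. The 0-5 V span and the -32 to 100 deg F range are just the example numbers from above, not anything standard:

#include <stdio.h>

/* Linear scaling between a converter's voltage span and engineering
   units. Example numbers are assumptions: a temperature signal that
   covers -32 to 100 deg F mapped onto a 0-5 V input range. */
double volts_to_degF(double volts)
{
    const double v_min = 0.0,   v_max = 5.0;   /* converter input span  */
    const double t_min = -32.0, t_max = 100.0; /* engineering-unit span */
    return t_min + (volts - v_min) * (t_max - t_min) / (v_max - v_min);
}

int main(void)
{
    printf("2.5 V -> %.1f deg F\n", volts_to_degF(2.5)); /* midpoint: 34.0 */
    return 0;
}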

There are different types of converters; a few examples:
Flash A/D's, not to be confused with flash memory. These convert their input almost instantaneously.
Successive approximation A/D's, which home in on the input one bit at a time (see the toy model after this list).
Run-of-the-mill D/A's.
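
If you're curious how successive approximation actually works, here is a toy C model. The comparator that a real chip implements in analog hardware is simulated in software, and the 8-bit resolution and 5 V reference are just assumptions for the demo:

#include <stdio.h>

#define N_BITS 8
#define VREF   5.0

/* Toy model of a successive-approximation A/D: try each bit from the
   MSB down and keep it if the trial voltage stays at or below the
   input. Real hardware does the comparison with an analog comparator. */
unsigned sar_convert(double v_in)
{
    unsigned code = 0;
    for (int bit = N_BITS - 1; bit >= 0; bit--) {
        unsigned trial = code | (1u << bit);
        double v_trial = VREF * trial / (1 << N_BITS); /* the DAC inside the ADC */
        if (v_trial <= v_in)
            code = trial; /* comparator says "still below input": keep the bit */
    }
    return code;
}

int main(void)
{
    printf("3.3 V -> code %u of %d\n", sar_convert(3.3), (1 << N_BITS) - 1);
    return 0;
}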

Because the computer internally is a bunch of 0's and 1's, base 2 plays a dominant role. So, resolution depends on the number of bits. 16 bits can represent the integers 0-65535, or -32768 to +32767 signed. 16-bit converters are now common.

So, for a unipolar signal (say 0-5 V), 1 bit is 5/65535 volts.

So yes, there are unipolar and bipolar A/D converters. The same holds for D/A converters, which go the other way: digital code in, voltage out.
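
Here is a small C sketch of that counts-to-volts arithmetic for both cases, assuming a 16-bit converter and a 5 V full scale:

#include <stdio.h>
#include <stdint.h>

#define VFS 5.0 /* full-scale voltage, assumed 5 V for the example */

double unipolar_volts(uint16_t count)   /* 0..65535 -> 0..5 V */
{
    return VFS * count / 65535.0;       /* 1 LSB = 5/65535 V, about 76 uV */
}

double bipolar_volts(int16_t count)     /* -32768..32767 -> about -5..+5 V */
{
    return VFS * count / 32768.0;
}

int main(void)
{
    printf("count 1      -> %.6f V\n", unipolar_volts(1));
    printf("count 65535  -> %.6f V\n", unipolar_volts(65535));
    printf("count -16384 -> %.3f V\n", bipolar_volts(-16384)); /* -2.5 V */
    return 0;
}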

Issues always creep up:
1. Monotonicity - e.g. whether the output always increases with the input, often specified as ±1 bit.
2. Linearity - how linear the device is.
3. Resolution - how much you're willing to pay.
4. Quantization error - harder to describe, but suppose our A/D can only count in whole units to 100. A reading can be off by up to half a unit, which is a 50% error if the value was 1. As the signal gets bigger, the relative error gets smaller (see the sketch after this list). This is one of the reasons you see 1-5 V converters.
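
Here is the sketch mentioned in item 4. It just prints the worst-case relative error (half a count) for a few signal levels on that count-to-100 converter:

#include <stdio.h>

/* Worst-case quantization error for a converter that counts in whole
   units: the reading can be off by up to 0.5 count, which is enormous
   relative to a small signal and tiny relative to a large one. */
int main(void)
{
    const double values[] = { 1.0, 2.0, 10.0, 50.0, 100.0 };
    for (int i = 0; i < 5; i++)
        printf("value %6.1f -> worst-case error %5.1f%%\n",
               values[i], 100.0 * 0.5 / values[i]);
    return 0;
}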

Yes, there are SBCs (single-board computers) that you can program, and programmable chips themselves. You can also buy external floating-point units to do the math more like a PC does.

Integer arithmetic is what the computer does best; a programming-language library supplies the floating-point operations. Here is an example floating-point unit: https://www.picaxe.com/Hardware/Add-on-Modules/uM-FPU-Floating-Point-Coprocessor/
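
If you'd rather avoid a co-processor, fixed-point is the usual middle road: scale everything to integers (millivolts, say) and stay in the integer unit the chip is fast at. A rough C sketch, with the millivolt scaling being my own assumption:

#include <stdio.h>
#include <stdint.h>

/* Fixed-point sketch: keep voltages as integer millivolts so the
   add/subtract/multiply/divide stays in native integer arithmetic. */
int main(void)
{
    int32_t a = 2500;   /* 2.500 V stored as millivolts */
    int32_t b = 1250;   /* 1.250 V                      */

    int32_t sum  = a + b;                              /* 3750 -> 3.750 V */
    int32_t diff = a - b;                              /* 1250 -> 1.250 V */
    /* multiply in 64 bits, then rescale back to the millivolt scale */
    int32_t prod = (int32_t)(((int64_t)a * b) / 1000); /* 3125 -> 3.125   */
    int32_t quot = (int32_t)(((int64_t)a * 1000) / b); /* 2000 -> 2.000   */

    printf("sum %ld  diff %ld  prod %ld  quot %ld (all in mV units)\n",
           (long)sum, (long)diff, (long)prod, (long)quot);
    return 0;
}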

I'll just point you here https://www.picaxe.com/ for a little introduction. Natively, it's just integer arithmetic. Want floating point? You need a co-processor and probably a fair amount of work.

You didn't lay down your entire requirements, and I didn't go through an exhaustive list of options. Usually the microcontroller is programmed in C or BASIC, and for die-hards who have to be close to the hardware, machine language.

For now.
 
Another approach: get a ready-made data-acquisition input/output interface for a desktop or laptop computer. Some of these plug into a USB port. They come with software libraries to do the basic analog input / conversion / analog output. You get to write the code that operates on the data after it is digitized...
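
The loop you end up writing looks roughly like this C sketch. daq_read_volts() and daq_write_volts() are hypothetical stand-ins, stubbed out here so the sketch runs on a plain PC; every vendor's real library is different:

#include <stdio.h>

/* Hypothetical stand-ins for a vendor DAQ library - not a real API.
   Stubbed so the sketch compiles and runs without hardware. */
static double daq_read_volts(int channel)  { return 1.0 + channel; }
static void   daq_write_volts(int channel, double v)
{
    printf("AO%d = %.3f V\n", channel, v);
}

int main(void)
{
    for (int i = 0; i < 3; i++) {           /* a few passes for the demo */
        double v1 = daq_read_volts(0);      /* the 2 or 3 analog inputs  */
        double v2 = daq_read_volts(1);
        double v3 = daq_read_volts(2);

        double out = (v1 + v2) * v3 / 2.0;  /* "a bunch of math"         */

        daq_write_volts(0, out);            /* analog output             */
    }
    return 0;
}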
 
