Hi, I was wondering: what would be the best device or way to get very fast analog-to-digital readings? I want up to 50 readings per second (that should be the sampling rate), converting an analog voltage to a digital value. How can I go about doing this? I think microcontrollers are too slow. Anyone know of a device I can buy that can do this?
Any ideas?
Your average microcontroller has an A-to-D that runs at 10 to 15 kS/s. That's 10 to 15 thousand samples every second. Even with a lot of software overhead you should have no problem sampling at 50 S/s.
As bmculla has already said, 50 samples per second is extremely easy with a microcontroller. To do 50 samples per second with a PIC you would take a sample, call a delay routine for 20 ms, then take the next sample. Almost all of the program's time is spent sitting in the delay routines, waiting until it's time to take the next sample.
If you check my analogue PIC tutorials, they do 10 or 20 samples per second (20 when reading two channels), simply because of the update speed of LCD displays and to prevent excessive flicker on them. To change it to 50 samples per second, just change the 'Call Delay100' to 'Call Delay20'; the delay routine is already in the code.
Basically, 50 samples per second is a very slow sampling rate; you could do it with a clockwork microcontroller :lol: