4800 Baud UART on internal 8 MHz oscillator (AT89LP51ED2) ??

Status
Not open for further replies.

Anduril

New Member
Two questions really:

1) Is it even advisable to run a 4800 Baud UART using the internal baud rate generator and operating on the internal 8 MHz RC oscillator (which is specified at +/- 2.5%)?
My 4800 baud rates are all over the place (generally within 5%), but vary from one part to the next.
This design originally used an external crystal oscillator, and the 4800 was spot on.
I'm concerned about other equipment being able to understand what we're sending out on the serial port (which is converted to RS-485 after it leaves the µC).
I'll most likely go back to a crystal (for other reasons), but would like to know if slow-ish serial baud rates are "do-able" with the internal RC oscillator, and still be reliable (or do you have to use a crystal?)
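
(For reference, the back-of-envelope I ran on the PC looks like the sketch below. It just picks the nearest integer divisor for 4800 from an 8 MHz clock and shows what the divisor rounding alone costs; the divide-by-16 factor is the generic async-UART assumption, not necessarily what the LP51ED2's internal baud rate generator really does, so treat the arithmetic as illustrative.)

```c
/* Rough baud-error desk check (not chip-specific): assumes a generic
 * divide-by-16 async UART clocked straight from the nominal 8 MHz RC.
 * The AT89LP51ED2's internal baud rate generator may prescale
 * differently, so the divisor math is only illustrative. */
#include <stdio.h>

int main(void)
{
    double fosc    = 8000000.0;                 /* nominal internal RC clock */
    double target  = 4800.0;                    /* desired baud rate         */
    double ideal   = fosc / (16.0 * target);    /* ideal divisor = 104.17    */
    long   divisor = (long)(ideal + 0.5);       /* nearest usable integer    */
    double actual  = fosc / (16.0 * divisor);
    double round_err = (actual - target) / target * 100.0;

    printf("divisor %ld -> %.1f baud, %+.2f%% from rounding alone\n",
           divisor, actual, round_err);
    /* On top of this, the +/-2.5% RC tolerance applies directly to the
     * baud rate, and TX and RX errors add in the worst case. */
    return 0;
}
```

So the divisor rounding itself is tiny; it's the RC tolerance stacking up between two parts that does the damage.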

2) Supposedly, address 0180h in the User Signature Array is the internal RC clock oscillator offset/adjustment value. The datasheet doesn't say much about this, only that the value is inverse to the frequency: higher values mean a slower internal clock. The value in the array is a copy of one in the Atmel bootloader, not that it matters. My problem is I can't seem to reliably write a value to this location. How do you adjust the internal clock?! Somehow, I managed to get a 00h written to this location, and apparently that just bricks the part. (Well, OK, the part still works fine as far as I can tell, but there's no way to undo the 00h value, and that threw my whole 4800 baud off so far I could no longer decode it; the part went to a much slower baud rate, probably something around 4350 Baud or so.)

Frustrated after two days of tinkering with this.
I'm very familiar with the part generally, but don't usually run designs on the internal osc.
My setup is Keil uVision (C & Assembler), and a B&K 867C programmer.

Thanks in advance for any guidance. :)
 
What Micro are you using? I have run internal OSC micros up to 57600 without any issues.

Alternatively, you could implement an auto-baud detector in your software, so the exact RC speed wouldn't matter much.

The caveat to auto-baud detection is that the ASCII value of the first BYTE in your data packet needs to be ODD ... i.e. A = 65, C = 67 ... this way the START bit of each packet can be timed and the measurement applied to the following bits in the data packet.
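
In C it boils down to something like this sketch. read_rxd() and ticks() are placeholders for whatever pin read and free-running timer you have (declare them against your own hardware); the only point is to time the low start bit of that known odd lead-in byte and keep the count as your bit period.

```c
/* Auto-baud sketch: time the start bit of a known lead-in byte whose
 * LSB is 1 (any odd ASCII value, e.g. 'A' = 0x41), so the line goes
 * low for exactly one bit time and then returns high.
 * read_rxd() and ticks() are hypothetical placeholders: read_rxd()
 * returns the RXD pin level, ticks() a free-running timer count. */
extern unsigned char read_rxd(void);   /* hypothetical: RXD pin level      */
extern unsigned int  ticks(void);      /* hypothetical: free-running timer */

unsigned int measure_bit_time(void)
{
    unsigned int t_start, t_end;

    while (read_rxd() != 0)    /* wait for the falling edge (start bit)    */
        ;
    t_start = ticks();

    while (read_rxd() == 0)    /* start bit ends where the '1' LSB begins  */
        ;
    t_end = ticks();

    return t_end - t_start;    /* one bit period, in timer ticks */
}
```

From that count you can derive the timer reload for the hardware UART (or the delay for a bit-banged one); the receiver's own RC error cancels out because the bit time is measured in its own ticks.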
 
AT89LP51ED2 in the 40-pin DIP package (-20 AU, I think?)
All this code runs on a "private" known wired network consisting of one "Master" and up to maybe a dozen remotes.
The protocol is half-duplex RS-485 fixed at 4800,N,8,1 with CRC-8 (CCITT)
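
(For completeness, the CRC-8 itself is just the usual bitwise loop, roughly as below. The 0x07 polynomial, zero init and MSB-first ordering are the common defaults rather than a statement of exactly what our firmware uses, so match them to the real link before copying anything.)

```c
/* CRC-8 over a packet buffer.  Polynomial 0x07 (x^8 + x^2 + x + 1),
 * init 0x00, MSB-first, no reflection, no final XOR - these parameters
 * are assumptions; adjust to match the actual CRC variant on the bus. */
unsigned char crc8(const unsigned char *buf, unsigned char len)
{
    unsigned char crc = 0x00;
    unsigned char i, bit;

    for (i = 0; i < len; i++) {
        crc ^= buf[i];
        for (bit = 0; bit < 8; bit++) {
            if (crc & 0x80)
                crc = (unsigned char)((crc << 1) ^ 0x07);
            else
                crc = (unsigned char)(crc << 1);
        }
    }
    return crc;
}
```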

I didn't think this would be a problem either, and never bothered to check the datasheet.
It "claims" (nebulously, I might add) that the +/- 2.5% internal RC clock accuracy is suitable for UART use.

However, if I program up, say, 50 chips, measure the bit time on an oscilloscope, and then take the highest- and lowest-value parts, I can't reliably get them to talk to each other. Without making more exact measurements, I'd say the communications start to fall apart at about 5.3% error (worst-case clock error difference between parts in my sample).

To put a finer edge on it, my Rigol scope (w/ RS232 decoder built-in) can reliably decode up to about 8% error.
But with PuTTY on a PC, I suspect it's a lot tighter than that - maybe under 3%? Maybe even 2%.

Now, our design won't talk to PuTTY or a Rigol scope under normal use.
But you know... we might want to make some add-ons later on that will sit on the 485 bus, so I'd like the clock (baud rate) to always be close enough to be properly decoded by [who knows what down the road?].

One more thing I'll just add:
Data is sent in fairly small packets (under 10 ASCII characters per packet), and the return packets from the remotes (if any) are even smaller than that.
So, for giggles, I inserted a small one millisecond delay between characters to see if that would help any. It doesn't.
And now thinking about it, of course it wouldn't!!
The clock accuracy isn't changing, no matter how much I want to dance around that.

But like I said, I'm pretty much going to revert this design back to a crystal regardless.
The only reason we went with the internal clock to begin with is that we needed that proverbial 'one more pin' to control an added function.
Well, that led to a couple of unintended downstream consequences, which we'll unwind on the next set of boards.

One last thing:
The AT89LP51ED2 datasheet doesn't say anything about the drive levels of the crystal pins when using them as GPIO. It turns out they are very wimpy - unable to source even 1 or 2 milliamps! (Which meant we had to add a FET, etc., and it snowballed from there.)

I have since decided I can rob (i.e., do "double-duty" on) an existing Port-0 pin and reclaim my crystal pins for use with an external crystal or oscillator.
 
With 1 start bit and 8 data bits, each bit is only about 11% of the frame, so an error of 5.5% will always cause problems - by the last bit you're half a bit out. Does the AT chip have any kind of clock tuning like the PICs do? My only experience with internal clocks is on PICs with a 1% max error, which has always worked for me.
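
To put numbers on it: the receiver re-syncs on the start edge and then samples each bit at its nominal centre, so the furthest sample (the stop bit) sits roughly 9.5 bit times out. A quick sketch of that budget, assuming the idealised half-bit margin:

```c
/* 8N1 error-budget sketch: the receiver re-syncs on the start edge and
 * samples each bit at its nominal centre.  The furthest sample is the
 * stop bit at ~9.5 bit times, so the combined TX+RX clock mismatch must
 * keep that sample inside its bit:  9.5 * error < 0.5 bit.
 * The half-bit margin is the idealised assumption; real UARTs have less. */
#include <stdio.h>

int main(void)
{
    double last_sample = 9.5;                  /* bit times from start edge */
    double max_error   = 0.5 / last_sample;    /* combined TX+RX mismatch   */

    printf("max combined clock mismatch ~ %.1f%%\n", max_error * 100.0);
    return 0;
}
```

Two parts each 2.5% out in opposite directions already eat nearly all of that, which lines up with pairs of chips starting to fail around 5.3%.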

Sounds like the crystal path is the right decision.

Mike.
 
With 1 start bit and 8 data bits, each bit is only about 11% of the frame, so an error of 5.5% will always cause problems - by the last bit you're half a bit out. Does the AT chip have any kind of clock tuning like the PICs do? My only experience with internal clocks is on PICs with a 1% max error, which has always worked for me.

Yes, PICs are fine at 9600 using their internal clock; I use them all the time - not so good at 115200 though, you tend to get intermittent errors. So basically I stick to 9600 unless there's a need for anything faster.

As the OP is getting different results from different individual chips, it's obviously not a very accurately calibrated device? I'm presuming he's not overwriting a factory calibration value? Some of the old PICs had a calibration value stored in program memory (you called a subroutine to set the oscillator using that value), so you had to ensure you read it, stored it, and rewrote it every time you erased or reprogrammed the device.
 
Even with a 1% error, it can accumulate. However, if you use two stop bits on the transmitter then it re-syncs every byte.
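
I don't know offhand whether the LP51ED2's UART has a direct two-stop-bit setting, but the usual trick on a classic 8051 UART is to transmit in 9-bit mode with TB8 held at 1 - the extra '1' just looks like more stop time to an 8N1 receiver. A sketch, assuming the standard SCON/TB8 bits and a Keil-style REG51.H header (check against the LP51ED2 datasheet):

```c
/* Sketch: fake a second stop bit by transmitting in 9-bit mode with the
 * ninth bit fixed at 1.  An 8N1 receiver reads the '1' as its stop bit
 * and then sees extra idle time after each byte, giving it room to
 * re-sync.  Assumes the standard 8051 SCON/TB8/TI/SBUF bits via Keil's
 * REG51.H; the AT89LP51ED2 header/registers may differ slightly. */
#include <REG51.H>

void uart_init_9bit(void)
{
    SCON = 0xD8;        /* mode 3 (9-bit, variable baud), REN = 1, TB8 = 1 */
    /* ... existing baud rate / timer setup stays the same ... */
}

void uart_putc(unsigned char c)
{
    TB8  = 1;           /* ninth bit = 1: acts as the extra stop bit */
    SBUF = c;
    while (!TI)         /* wait for the byte to finish shifting out  */
        ;
    TI = 0;
}
```

Only the transmitter needs this; the receivers can stay in plain 8-bit mode 1 and will just see a longer stop period.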

Mike.
 