char lcd[33] = { " lcd buffer demo HH:MM:SS" };
#define line1 0
#define line2 16
void interrupt() // 500 usec (1000 cycle) interrupts
{ static char line = 0xC0; // DDRAM address command; b7 doubles as "new line" flag
  static char bndx = 0; // display buffer index, 0..31
pir1.TMR2IF = 0; // clear TMR2 interrupt flag
if(line & 0x80) // if "new line"
PutCMD(line ^= 0x40); // toggle DDRAM address (0xC0 or 0x80)
else // not "new line" so
PutDAT(lcd[bndx++]); // refresh display, bump index
bndx &= 31; // pseudo %32, 0..31 inclusive
if((bndx & 15) == 0) // if new line (0 or 16)
line ^= 0x80; // toggle b7 pseudo "new line" flag
}
//
...
I posted a simple method for updating an LCD display from a 32 character display buffer in the background via an ISR so I'm a bit confused by your comment. Is your comment and concern about how your main program would write an eight character ADC value into the display buffer? If so, wouldn't you just write the new eight digit ADC value, buffered or not, all at once into the display buffer three or four times per second? Your LCD writes to the display buffer in 'main' would be perceived as instantaneous since the ISR refreshes the entire LCD display from the buffer every 17 msecs (when using 500-usec interrupts).
I'm sorry, Roman. I still don't see the relevance of your comment to my post. What exactly do you mean when you say "a display interrupt occurring between those two ADC bytes being updated"? Being updated in what way? Are you talking about a display interrupt occurring between reading the lo and hi bytes from the ADC result registers? Or are you talking about a display interrupt that occurs while you're doing the math on those bytes? Or are you talking about an interrupt occurring while converting the binary number into an ASCII string in preparation for display? None of these operations should be affected by a display interrupt or cause trash in the 32 byte display buffer or on the LCD screen.

Mr RB said:
Sorry Mike, I posted hurriedly with a minimal and poor explanation.
The problem is not just display, also in the example with the ADC the value is derived from the ADC in 2 bytes, which are then combined in a later math process to give a 10bit ADC value. If the display interrupt occurs between those 2 ADC bytes being updated the value may be written wrong and cause trash on the display.
Based on your comments, I wonder if you misunderstand how my driver works. This LCD driver method uses a single 32 byte display array (buffer). When you want to write a character onto the LCD display in your main program, you simply stuff the ASCII value for that character into the array at index 0 through 15 for line 1 on the display, or at index 16 through 31 for line 2. The ISR driver writes one character from the array (buffer) to the LCD during each interrupt, and after 34 interrupts (17 msecs) the entire display has been updated from the array (buffer).

Mr RB said:
Double buffering is generally used so the ADC read and math can be done first and placed in a buffer; then, at some point guaranteed not to conflict with the interrupt, the value is copied from buffer1 to buffer2, and buffer2 is used within the int for display purposes.
#define line1 0
#define line2 16
unsigned char lcd[32] = { "                                " }; // init to 32 spaces
/*
* pseudo ADC code
*/
unsigned int result; // 16-bit ADC work variable
unsigned char string[4]; // four ASCII digit characters
result = readADC(0); // read ADC channel 0
result *= mymultconst; // result = 0..8191
bin2ascii(result, string); // 0..8191 in, "0000".."8191" out
lcd[line1+4] = string[0]; // huns
lcd[line1+5] = string[1]; // tens
lcd[line1+6] = string[2]; // ones
lcd[line1+7] = '.'; // manually insert decimal pt.
lcd[line1+8] = string[3]; // tenths
//
/*
* flash "standby" at 2 Hz rate
*/
if(msctr % 250 == 0) // if 250 msec interval
{ lcd[line2+2] ^= ('s'^' '); // toggle between 's' and ' '
lcd[line2+3] ^= ('t'^' '); //
lcd[line2+4] ^= ('a'^' '); //
lcd[line2+5] ^= ('n'^' '); //
lcd[line2+6] ^= ('d'^' '); //
lcd[line2+7] ^= ('b'^' '); //
lcd[line2+8] ^= ('y'^' '); //
}
//
/*
* flash RTC colon chars (HH:MM:SS)
*/
if(msctr % 500 == 0) // if 500 msec interval
{ lcd[line1+10] ^= (':'^' '); // flash the colon char
lcd[line1+13] ^= (':'^' '); //
} //
Mike said: ... What exactly do you mean when you say "a display interrupt occurring between those two ADC bytes being updated"? Being updated in what way? Are you talking about a display interrupt occurring between reading the lo and hi bytes from the ADC result registers? Or, are you talking about a display interrupt that occurs while you're doing the math on those bytes? Or, are you talking about an interrupt occurring while converting the binary number into an ASCII string in preparation for display? None of these operations should be affected by a display interrupt or cause trash in the 32 byte display buffer or on the LCD screen.
I'm sorry, Roman. I still don't see the relevance of your comment to my post.
...
Mr RB said:
Yep, that's the point. ALL of those operations would cause a trashed display if you don't double buffer.

But, double buffer what, Roman? You do realize that the display array is being used as a buffer in this method, right? A display interrupt during any of the ADC operations mentioned will NOT trash the display.
Mr RB said:
Take a simple example where a var byte is converted into a 3 digit decimal number 0-255 and displayed on the LCD once a second. Let's say the var value changed from 199 to 200 and was being processed into 3 decimal digits; the hundreds digit was processed (changed from 1 to 2), then the int occurs and the digits are written to the display. It will write 299 instead of the orig value 199 or the next value 200.

After you write the ASCII "200" characters into the display array (overwriting the old "199" characters), which shouldn't take more than a few microseconds at most, you'll see a nice smooth glitch-free transition from "199" to "200" on the display. With a 17 msec refresh interval, ANY display update appears to be instantaneous. You'd actually have to work pretty hard to force the display to glitch as you describe above, like deliberately using long delays between writing the '2', '0', and '0' characters into the display array.
Mr RB said:
I apologise if my original post "No double buffering?" gave the impression of being overly critical, as that was not the intent. It was more a surprised (and low effort, sorry) response that you had not used one step in the cyclic sequence for a buffer copy.

No apology necessary. Your original comment WAS a little cryptic and vague, but it's been an interesting discussion.
...
Mike said: After you write the ASCII "200" characters into the display array (overwriting the old "199" characters), which shouldn't take more than a few microseconds at most, you'll see a nice smooth glitch free transition from "199" to "200" on the display.
...
(outside interrupt)
* read ADC
* convert 2 ADC bytes to a 16bit var
* convert 16bit var to 5 decimal digits, in array blah[]
* (at a time when interrupt can not occur!) copy blah[] to blah_buffer1[]
(inside interrupt)
* on count0-4 display one digit from blah_buffer2[] to LCD
* on count5 copy blah_buffer1[] to blah_buffer2[]
Mr RB said:
I'm beginning to think we are talking about different things?

I've been thinking that all along (lol). I suspect you simply didn't take time to understand some basic concepts behind my buffered LCD method, which is why your original and subsequent comments about double buffering something (you haven't said exactly what) and mysterious "ADC trashing display interrupts" have been so puzzling.
Mike said: ... After you write the ASCII "200" characters into the display array (overwriting the old "199" characters), which shouldn't take more than a few microseconds at most, you'll see a nice smooth glitch free transition from "199" to "200" on the display. ...

If the 3 chars are written to the [display] buffer outside the interrupt, it is totally possible for the interrupt to occur between writing the 0 and 1 digits or between the 1 and 2 digits, unless you disable interrupts while writing or ensure the interrupt will not occur in some other way.
From what I understand so far, you have no double buffering, and rely on the fact the trashed display will be cleaned up 17mS later (or 3*17mS later with a 3 digit display), which is probably too fast to have seen.
But since the thread is about displaying four multi-digit variables ...
Hi Roman,

Mr RB said:
I wasn't criticising the concept of writing one digit to the display each interrupt. What I was surprised at is that someone of your coding skill did not double buffer!

I was a little surprised by your original comment. In your subsequent posts you haven't really told me what it is you think I should double buffer, nor have you come up with any compelling premises to support a conclusion that I should be double buffering something. I've explained as best I can that neither the lack of double buffering nor the display interrupts will cause the trashed ADC values or the trashed displays you've mentioned.
Mr RB said:
You can live with trashed display data if the display is constantly being updated very fast from the original variable, but living with a problem because it's "usually too fast to see" is not an optimal procedure.

Wow, you've contrived an "optimal procedure" complaint based on a problem and solution built into the HD44780 design. That's impressive.
Mr RB said:
In the event that the code is changed, and the display may be updated once (your 32 chars over your 17mS), then the micro goes into sleep for a second, or does any task where interrupts are turned off for a second, you can easily get a trashed display which would remain for the whole second or for the entire sleep period.

This method does rely on interrupts, so I'm sure everyone realizes it's not the end-all solution for all your LCD applications. However, if you do want to use this method in these situations, then you can just as easily avoid a trashed display simply by waiting for the end of a full display refresh cycle before suspending interrupts or going to sleep. Possible solutions, which don't require double-buffering (sorry), might be:
while(bndx); // wait for end of refresh cycle ('bndx' must be a volatile global)
sleep(); // then sleep

while(bndx); // wait for end of refresh cycle
intcon.GIE = 0; // then suspend interrupts
Double buffering is professional and ensures that after writing all 32 chars, they will ALWAYS be correct, fully independent of any timing issues between when the ADC task is done compared to when the interrupt is done.
...
I'd be happy to continue the discussion if you'd like, but it seems pretty obvious at this point that you are unable or unwilling to back up your claims with anything more substantive than opinion or deflection, and you're simply inventing new problems that don't really exist, without any premises to support your conclusions or claims.
Mr_RB said:...
In the event that the code is changed, and the display may be updated once (your 32 chars over your 17mS) then the micro goes into sleep for a second, or does any task where interrupts are turned off for a second, you can easily get a trashed display which would remain for the whole second or for the entire sleep period.
...
Mr RB said:
Mike said: I'd be happy to continue the discussion if you'd like but it seems pretty obvious at this point that you are unable or unwilling to back up your claims with anything more substantive than opinion or deflection, and, you're simply inventing new problems, that don't really exist, without any premises to support your conclusions or claims.
I have already backed up my "claims";
Mike said: This method does rely on interrupts and so I'm sure everyone realizes it's not the end-all solution for all your LCD applications. However, if you do want to use this method in these situations, then you can just as easily avoid a trashed display simply by waiting for the end of a full display refresh cycle before suspending interrupts or going to sleep. Possible solutions, which don't require double-buffering (sorry), might be;
Code:
while(bndx); // wait for end of refresh cycle
sleep(); // then sleep
I think you should have a look through the Microchip appnotes on double buffering since you seem to be acting like this is some problem I just "made up".

Actually, you haven't. One only need read the thread to see a pattern emerge. The solutions are simple, intuitive, and obvious to anyone willing to take time to understand the method.
Code:
while(bndx); // wait for end of refresh cycle
intcon.GIE = 0; // then suspend interrupts
Anyway this is getting argumentative so I'm done here.
Hi Eric,
I'm sorry. I just wanted to contribute something useful. I'm not sure what the other member was trying to do.
Kind regards, Mike