Micro processor/controller core discussion.

Status
Not open for further replies.

3v0

Coop Build Coordinator
Forum Supporter
It seems most everyone has their favorite core. Rather than expound on it in a thread where some newbie is searching for his own feet, let's do it here.

This is a chance to blow your horn regarding your favorite. Other than keeping it civil, I see no need to set limits. Feel free to compare A to B, or tell us why a feature is great or blows.

My only caution is to compare apples to apples. For example, do not compare the long-obsolete PIC16F84 to a current processor.
 
Originally Posted by 3v0
Just like other corporations which support education by providing product.

Microchip could easily discontinue old parts but chooses to continue offering them at an increased price. It makes it possible to build older designs. In my book that is better than obsoleting them.

Mostly what I see here is bias.
Dan said:
Nope, practicality. It is impractical to pay twice as much for half the product in industry, and it is impractical for someone to put the time in to learn how to use something that is impractical in industry if they are looking to get a job using it.

One would be foolish to start a new design with these old chips.

You have the learning thing all wrong. We could be teaching people to program on most any processor. It would not be wasted, because the person is learning to think in a way that will allow him to learn other processors, each a bit faster. Anyway, the chance of landing a job programming on the processor you were trained on is slim. For this reason some schools teach asm on obsolete processors.

A fellow who can only use one processor family or language is very limited. In industry you most often work with what is traditionally used, or what the EE or others have spec'd for the project. It is rare for the firmware designer to be given processor choice. Maybe a choice within a vendor's offering.
 
Ok, so now you guys have piqued my curiosity. About two years ago, wanting to start learning about MCUs and ASM, I found Microchip to be the most popular with the most support, so I started with PICs. After a while I found ASM to be time consuming and wanted to get my projects done faster, so I moved on to PBP with an investment of almost $300 and another $50 for a PICkit 2 programmer.

Now I hear from professionals that PICs are too expensive for what they are, so I would like for someone to point out the options, indicating advantages and disadvantages.

Mike
 
I did not say PICs were expensive, just the obsolete ones, which no one should be using for new designs. But since there are a lot of old tutorials and designs using them, we should be happy that we can get them even if there are better buys out there.

Would you rather spend $2 more for a chip, or spend hours updating the design to use a newer chip?
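As a rough back-of-the-envelope sketch of that trade-off (all figures hypothetical, not from the thread), the porting effort only pays for itself past a surprisingly large production run:

```python
# Back-of-the-envelope comparison (hypothetical numbers): paying a small
# premium for an old-but-still-available chip versus spending engineering
# hours porting the design to a newer, cheaper part.

def break_even_units(premium_per_chip, port_hours, hourly_rate):
    """Number of units at which the porting effort pays for itself."""
    return (port_hours * hourly_rate) / premium_per_chip

# A hobby or small-run build: $2 premium per chip, ~10 hours to port
# the design, valued at $50/hour.
units = break_even_units(premium_per_chip=2.0, port_hours=10, hourly_rate=50.0)
print(f"Porting only pays off beyond {units:.0f} units")  # 250 units
```

For a one-off build or a classroom exercise, the $2 premium is clearly the cheaper option.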

If PICs were such a bad deal it would put Microchip out of business. They seem to be alive and well.

Most PIC haters (and we have one here) always seem to talk about obsolete chips when they make their point, as I noted in the intro to this thread.
 
I must support 3v0's position here. There is value in the amount of material available on older PIC designs. Students can develop on them and simulate them for free now, perhaps even build a few real devices; thus the $2.00 extra for an 'obsolete' device is a good investment given the simplicity and the availability of resources for noobies, like me.
I am able to make good progress on more complex devices because I used to do assembly on the C-64 (I even sold assembly programs commercially), and I have been trained as an engineer in controls and automation.

Some people told me to start with the 18F PIC series since the 16F is old hat and not very suitable for high-level languages. I am learning MCUs to be an expert, so I intend to start from the ground up. That way I don't miss anything, and I will know when a high-level language is required as opposed to a bit of super-efficient code and a tiny PIC. I am also a bit of an environmentalist, aka minimalist. Therefore I will look at smaller PICs first and move to larger ones as required on a per-project basis.
 
Most modern microcontrollers use the Harvard architecture. As far as the instruction set goes you have two big sides and two very different ways of doing things: PICs are CISC, AVRs are RISC. Modern PICs can do some really high-end specific tasks; AVRs tend to be more useful for overall CPU efficiency across a wider range of tasks, for any specific device. To compete with the raw processing power of AVRs, PICs have to use PLL modules to multiply the base clock; on a clock-per-clock basis the AVR has more processing cycles yet fewer instructions.

There honestly is no clear-cut winner, nor should there be; they both have their places, otherwise both products wouldn't exist. If one were clearly better than the other, then only one would exist.

Personally I lean towards a rethinking of existing architectures and using what is efficient and works well, which is why I don't like PICs: they've been around forever, and it's one thing added to another to another with very few major 'OMG' changes, and in the microcontroller market compatibility is a BIG issue. Atmel started in the game late, so to speak, but with a really rock-solid core, and now they're trying (as can be seen by their product releases) to move into the niche markets while maintaining their core architecture.

Take a look at the modern trend of PCs in general. The x86 architecture is so outdated someone should have hit the reboot key YEARS ago, yet we keep tacking instruction sets onto it. There is a growing trend, primarily in the netbook area, of non-standard devices running standard software, which I think is the market saying "CISC isn't perfect, but it's better than the bloat we have now." These specific instruction sets like MMX and whatnot have proved they simply don't add anything; they're just crutches to avoid the basic architecture rethink that modern microcontroller/processor design needs.


Sony, I think, was on the right track with their Cell processor, and the multiprocessor market nowadays proves that: more cores doing distributed, massively parallel processing using RISC-style cores, rather than a single major CISC core. The CISC cores of old fractured into the current multiprocessor RISC-oriented architectures we see becoming more common. The way we use computers and devices in general makes sense for task-basing things, so why should we have hordes of code to do multitasking rather than cores dedicated to a single task?

Although not incredibly popular, the Parallax Propeller chip is a good example: it's a multicore microcontroller. In theory it can have massive advantages over even the fastest of RISC-based machines if more than two things are going on at once, especially if they're not symmetrical, and it has an advancement over CISC in that instead of having to add special instructions to maintain the same processor output for a given task, it can simply toss more simple cores at the task.

The pain in the rear is for the coders. It's not an easy transition to multi-core for high-level programmers, because they have to thread their applications, and it's not easy for ASM coders either, because it's separate programs running concurrently, with massive issues for timing. All in all it's a nightmare.
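That coordination burden shows up even in a toy example. The sketch below (plain desktop Python, not MCU code) shows the simplest version of the problem: the moment two workers share state, the programmer has to think about synchronization or updates get silently lost.

```python
# Toy illustration of the multi-core coordination burden: two workers
# updating shared state must synchronize, or increments can interleave
# and overwrite each other.
import threading

counter = 0
lock = threading.Lock()

def worker(iterations):
    global counter
    for _ in range(iterations):
        with lock:        # remove this lock and the final count is
            counter += 1  # often LESS than expected: lost updates

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 200000 with the lock held around each increment
```

On a microcontroller the same issue appears as shared registers or mailboxes between cores, with the added twist that timing is often hard real-time.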

I personally think the world will continue to lean towards RISC-based Harvard architectures.
 
I'll stick my £0.02 in here.

Due to the plethora of firm-grounding tutorials and projects available online for the '84, I'd say it was a wise move for Microchip to continue production of this device, albeit at an increased cost per unit. Who, as a noob/hobbyist, wants to pore over migration documents for a more recent device and spend time porting code when the original is still in production? Most want to get and briefly study a listing, compile and flash it to see that it works, and then dissect/digest it in more detail.
 
And that's where things get rough Mickster, it's impossible to innovate while staying the same =)
 
And that's where things get rough Mickster, it's impossible to innovate while staying the same =)

But doesn't a level playing field produce some buds which sprout much higher than others, given the same watering?

We all have to start somewhere, whether that is with a silver spoon in our mouth or a plastic one in our hand....

There are, most likely, engineers from impoverished nations looking at '84 tutorials, not only seeking ways to elevate their own status and secure funding, but also hoping to improve the comforts of their fellow countrymen and women.

They mostly innovate not from a position of potential capital gain, but from one of necessity....
 
There is no level playing field, Mickster.
I started from about as level a playing field as can be imagined: access to the Internet and a lot of time.
I chose AVR after about 4 months. I looked at PICs, AVRs, 8051s, Z80s, and at least a half dozen others that had so little a following that they weren't even considered. PICs maintained their core architecture up until recent years, and they only changed it because you can't compete by continually adding onto a single idea forever. I chose Atmel AVRs because they started from scratch and are trying to add on from there, a few generations later.
 
Horses for courses. I have four chip architectures now, and I try to use whichever is best suited for the job. The PIC 16 series was about the only one I tried and really didn't like, so I gave up on it early. That was a long time ago, though.
 
We, in the 'developed' nations, have plenty of tools at our disposal in relation to our monthly income, to satisfy our interests/hobbies/whims of fancy.

What do some of these other guys on the forum have? Practically bugger-all and they are simply trying to sustain a pump for a well, or some other basic necessity...

What is already out there on the net can be make or break for them.
 
And that's where things get rough Mickster, it's impossible to innovate while staying the same =)

Gotta agree with that. But you gotta also LEARN before you can innovate. There's the rub. You learn from established, proven data, methods and tools. Then you develop your own approaches, determine weaknesses or opportunities, innovate, and add to our knowledge base (hopefully).
 
Chapter 4.9 (What Others Do) of this book may be helpful to people wanting to join this thread. PIC microcontrollers - Google Books

The 16F core has only 35 single-word instructions. I do not see why people here are calling that CISC.

Some decades ago RISC computers took fewer clock cycles to execute a single instruction. However, it took more RISC instructions to do the same job, so the advantage was mostly an illusion. The true advantage of the two shifted as the ability to increase the clock speed favored one or the other.
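The trade-off is easy to put in numbers. The figures below are invented for illustration, not taken from any real datasheet; the point is only that instruction count and cycles-per-instruction pull in opposite directions:

```python
# Illustrative (made-up) figures: a CISC part needs fewer instructions,
# each taking more cycles; a RISC part needs more instructions, each
# taking fewer cycles. The totals can come out close to identical.

def total_cycles(instruction_count, cycles_per_instruction):
    """Total clock cycles for a task under a simple linear model."""
    return instruction_count * cycles_per_instruction

cisc = total_cycles(instruction_count=100, cycles_per_instruction=4)  # 400
risc = total_cycles(instruction_count=320, cycles_per_instruction=1)  # 320

print(f"CISC: {cisc} cycles, RISC: {risc} cycles")
```

Under this toy model the deciding factor is whichever side can raise the clock rate, exactly as the paragraph above argues.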

These days we can build little uCs with fantastic clock speeds, so we simply buy one fast enough for the application. When programming in a high-level language you do not see the instruction set, so there is no reason to care if it is CISC or RISC.

Personally I love orthogonal instruction sets. But since I mostly only look at the code to see what a compiler is doing (or to compare what compilers do), I do not much care.

Having said that, you might ask why I started this thread. Because although it is mostly an academic question for 8- and 16-bit uCs, it is still an interesting one. At some point it would be fun to talk about the 32-bit cores, in that they are used for high-demand tasks where every bit of speed can count.
 
I think all the major uC companies will be releasing some very advanced products in the next cycle. There is a lot of very advanced 200mm equipment in the field that is very cheap now, due to the big boys upgrading everything to 300mm. The PIC K series with low-k dielectrics is an early example. I think a large amount of the effort is in optimizing the 8/16-bit internals for higher-level language structures, mainly C. The R&D guys are going home late with smiles on their faces.


 
Wikipedia gives a good working definition:
Orthogonal instruction set is a term used in computer engineering. A computer's instruction set is said to be orthogonal if any instruction can use data of any type via any addressing mode.
Instead of having several different instructions to load and store data to various registers or memory, an orthogonal computer like the PDP-11 uses a MOVE instruction. This one instruction can replace all the CISC or RISC instructions that move data.

The MOVE instruction has source and destination fields that can specify any resource: register, memory, PC, stack. You can move anything to anything in one instruction. The same is true for all of the small set of instructions.
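To make "anything to anything" concrete, here is a toy PDP-11-flavoured sketch in Python (the machine state and operand encoding are invented for illustration): a single MOVE routine handles register, memory, and stack operands uniformly, where a non-orthogonal design would need a separate opcode for each source/destination pairing.

```python
# Toy model of an orthogonal MOVE: one instruction, any operand kind.
# The machine state and the (kind, where) operand encoding are invented
# purely to illustrate the idea.

state = {
    "regs": {"R0": 0, "R1": 7, "PC": 0},
    "mem": {0x100: 42},
    "stack": [],
}

def read(operand):
    kind, where = operand
    if kind == "reg": return state["regs"][where]
    if kind == "mem": return state["mem"][where]
    if kind == "pop": return state["stack"].pop()
    raise ValueError(f"unknown operand kind: {kind}")

def write(operand, value):
    kind, where = operand
    if kind == "reg":    state["regs"][where] = value
    elif kind == "mem":  state["mem"][where] = value
    elif kind == "push": state["stack"].append(value)
    else: raise ValueError(f"unknown operand kind: {kind}")

def move(src, dst):
    """One MOVE covers every source/destination pairing."""
    write(dst, read(src))

move(("mem", 0x100), ("reg", "R0"))   # memory  -> register
move(("reg", "R1"), ("push", None))   # register -> stack
move(("pop", None), ("mem", 0x200))   # stack   -> memory
print(state["regs"]["R0"], state["mem"][0x200])  # 42 7
```

A non-orthogonal machine would expose the same three transfers as three differently named instructions, each with its own quirks, which is exactly the "painful set of special cases" described below.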

I think the reason we do not see more of these machines is that it is complex to create the data paths needed to make all the addressing modes work. And given that little production work is done in ASM, the beauty would be hidden anyway.

Once you have programmed ASM on an orthogonal machine, others seem to be a painful set of special cases. Which they are.

Digital Equipment was on the right track with their LSI-11 series of microcomputers. They used a microcoded processor to create the instruction set.
 
Originally Posted by 3v0


One would be foolish to start a new design with these old chips.

You have the learning thing all wrong. We could be teaching people to program on most any processor. It would not be wasted, because the person is learning to think in a way that will allow him to learn other processors, each a bit faster. Anyway, the chance of landing a job programming on the processor you were trained on is slim. For this reason some schools teach asm on obsolete processors.

A fellow who can only use one processor family or language is very limited. In industry you most often work with what is traditionally used, or what the EE or others have spec'd for the project. It is rare for the firmware designer to be given processor choice. Maybe a choice within a vendor's offering.

That is not entirely true: pretty much everyone supplies ARM cores these days; the variation is in the peripheral sets. That said, I did get burned by Atmel. They did replace 50 $10 chips, but that was a small fraction of the damage done.

While I am the exception, having programmed in many processors' assembly languages, most programmers will be using C or some other high-level language.

It is also worth noting that it was not my idea to compare the new ARM to the ancient version of the PIC; I only suggested using the ARM.
 
Wikipedia gives a good working definition:
Instead of having several different instructions to load and store data to various registers or memory, an orthogonal computer like the PDP-11 uses a MOVE instruction. This one instruction can replace all the CISC or RISC instructions that move data.

The MOVE instruction has source and destination fields that can specify any resource: register, memory, PC, stack. You can move anything to anything in one instruction. The same is true for all of the small set of instructions.

I think the reason we do not see more of these machines is that it is complex to create the data paths needed to make all the addressing modes work. And given that little production work is done in ASM, the beauty would be hidden anyway.

Once you have programmed ASM on an orthogonal machine, others seem to be a painful set of special cases. Which they are.

Digital Equipment was on the right track with their LSI-11 series of microcomputers. They used a microcoded processor to create the instruction set.

I used to have an old PDP-11 rack... wonderful language, PDP assembly :)

The Motorola 68K, ARM, and others (remember the TI-99/4?) are similar.

Dan
 
