hardware is still coded with machine/assembly language for efficiency, speed, etc.

Status
Not open for further replies.
I trust you are not being flippant:)) but you are correct.

I was a little sarcastic, but what I said is still true. I could have said that "C compilers have bugs because they are written using assembly", but that is not true. Only the first compilers were programmed in machine language.

I see this topic coming up about twice a year and it is always the same "fight" between ASM and C. I get the feeling that there are more experienced assembly programmers on this forum than experienced C programmers. And way more PIC users than AVR. But I think we all agree that we need both languages. I won't miss Java, C# or C++ in the world of microcontrollers, but C and ASM are both essential languages in embedded software. You just have to pick the one that suits your needs. Here is a short list of pros and cons of both languages (as I see them). Add to the list please :)

C:
+ Extremely reusable and modular.
+ Portable from platform to platform (standardized language).
+ Large projects relatively easy to manage.
- Generates considerable overhead.

ASM:
+ Gives full control to the user.
+ Extremely efficient in code size and performance.
- Large projects difficult to manage.
- Not portable between architectures.

Code lasagne, not spaghetti!
 
+ Portable from platform to platform (standardized language).

This is something the pro-C brigade always bring up, yet it's completely untrue - C seems no more portable than any other HLL (not even PC versions) - obviously it's more portable than assembler :D but that's about it.

Certainly as far as micro-controllers go, the different C compilers are extremely non-portable.
 
This is something the pro-C brigade always bring up, yet it's completely untrue - C seems no more portable than any other HLL (not even PC versions) - obviously it's more portable than assembler :D but that's about it.

Certainly as far as micro-controllers go, the different C compilers are extremely non-portable.

Depends on how well you design the software. I write most of my software modules in ANSI C and carefully separate the platform-dependent code. I understand your point, but saying that C portability is "completely untrue" or that "(embedded) C compilers are extremely non-portable" is ridiculous. I have an extensive collection of software modules for data structures, fixed point math, digital filters etc. and I can easily use them without modification on any platform, whether it is PIC, AVR, ARM, Embedded Linux, PC, etc.

Well written C is the most portable language there is. What other language ports from microcontrollers to desktop computers and between different CPU architectures more easily? Can you think of one?
 
Well written C is the most portable language there is. What other language ports from microcontrollers to desktop computers and between different CPU architectures more easily? Can you think of one?

Any language which is written that way; there's nothing special about C that makes it more portable. Standard BASIC is pretty well as portable as C.

But I notice you're only using a sub-set of the C language; picking only small parts of any language obviously makes it more portable, but much less versatile.
 
there's nothing special about C that makes it more portable
The number of available cross-compilers make it more portable than any other language. It is not a language feature, but a real world fact.

But I notice you're only using a sub-set of the C language

What makes you think that? What is this subset.. or what part of C I'm not using?
 
The number of available cross-compilers make it more portable than any other language. It is not a language feature, but a real world fact.

There are probably not far off as many BASIC compilers as C ones, with the basic features mostly portable between them.

I think we have a different opinion of what 'facts' are - if C is 'portable' you should be able to easily change from one device or compiler to another, and this is FAR from the case. A program written in Microsoft C won't usually work under Borland C, for example, without considerable modifications.

And as we're in a PIC forum, none of the PIC C compilers seem to be very interchangeable at all.

What makes you think that? What is this subset.. or what part of C I'm not using?

You said you're only using the basic ANSI C features, which is only a sub-set of most compilers.
 
And as we're in a PIC forum, none of the PIC C compilers seem to be very interchangeable at all.

They do not need to be, since they are all compilers for the same platform. Just choose the "best" one and stick with it. I believe they all can compile standard ANSI C; if not, the compiler is crap.

You said you're only using the basic ANSI C features, which is only a sub-set of most compilers.

That "ANSI C sub-set" happens to be the C language. Separation of plain and simple ANSI C code from platform specific code is the heart of good software design.
 
They do not need to be, since they are all compilers for the same platform. Just choose the "best" one and stick with it. I believe they all can compile standard ANSI C; if not, the compiler is crap.

So, not portable then?

That "ANSI C sub-set" happens to be the C language. Separation of plain and simple ANSI C code from platform specific code is the heart of good software design.

Crippling your coding to antique standards is probably poor software design, not good.

But now we seem to have moved from 'portable code' to 'separating out portable and non-portable parts' - like I said, it's nowhere near as 'portable' as C fans like to claim.
 
Crippling your coding to antique standards is probably poor software design, not good.

I wouldn't talk about antique standards if your choice happens to be assembly.

But now we seem to have moved from 'portable code' to 'separating out portable and non-portable parts' - like I said, it's nowhere near as 'portable' as C fans like to claim.

That "portable part" is usually around 90% of an average software project. It would be stupid not to separate it into reusable modules.
 
BASIC is a nut case.

All languages have issues with the representation of integers: 8 bit, 16 bit, 32 bit, 64 bit. You might not even have 64 bit integers on an Intel 4004 processor. ASM also has these same issues. So, word length is a big issue. Some BASICs have a conditional preprocessor.

The IEEE floating point system did a lot of nice things, but then there are BASICs with integer sine and cosine functions.

As an aside, I once wrote a BASIC application to "pretty print" a BASIC program and it was extremely fast. It also let me see the nuances that occur in the compiler/interpreter. I was also lucky to have access to the interpreter sources. It could very easily become the basis for a compiler/interpreter. One thing I didn't work out was removing string concatenation, e.g. Print "A"+"B"; PRINT "A";'B'; print "A";'"B" and issues such as these. I did it two ways. The second was elegant and very fast.

It consisted of a set of options:
1) A pre-processor
2) A scanner
3) At this point things could be turned into tokens.
4) The pretty printer itself.

Creating "tokens" would have been so easy. It would not have been the traditional way of compiler writing. Yea, I took a class in it.
 
I totally agree with MisterT, C is the most portable programming language you can program in – and calling ANSI C antiquated when you are a supporter of assembly is a bit odd to be quite frank :p.

Writing good modular code allows C to be very portable. I made the move from a PIC18F2550 to an ATMEGA328 in the space of about 10 minutes. It was just a case of including different libraries, different peripheral setup and changing my peripheral function calls (like for putting a byte out over SPI), and the rest of the code remained unchanged. I would like to see the assembly fans make such a move that quick.

Plus Nigel, anyone that has problems porting code from Borland C to Visual Studio or whatever should not really call themselves a C programmer as far as I am concerned (I originally learnt C in Borland C++ Builder 3). Firstly, they have not used compile-time directives to enable compilation on different compilers (something I do during writing, and most professionally written code would also do the same), and secondly the biggest problem most Borland C users have is their reliance on conio.h or other Borland-specific headers and libraries, which is again a problem of sloppy coding practices and not C itself.

I am a big supporter of (Allman/ANSI styled) ANSI C. I think BASIC should have gone out with shoulder pads to be honest (I used to write it as a child on a Dragon 32). Assembly has its uses (optimisation) but it’s not something I would recommend people using for an entire project of any respectable size, much like in the Python community (who class C as low level) – people use C where there are bottlenecks which cannot be overcome with Python. I think the same should be done in embedded systems with C and Assembly - working projects in the quickest development time.
 
"A";'B'; print "A";'"B" and issues such as these. I did it two ways. The second was elegant and very fast.

It consisted of a set of options:
1) A pre-processor
2) A scanner
3) At this point things could be turned into tokens.
4) The pretty printer itself.

Creating "tokens" would have been so easy. It would not have been the traditional way of compiler writing. Yea, I took a class in it.

The formal/traditional way of using a "scanner" seems at first to waste time and resources, but the first time you need to change an input stream specification or backtrack on a possible bug it's worth its weight in gold (or pulled-out hair). I programmed a very simple version for a "Tiny Pilot" compiler for the 8080 on a home brew board. A VIC 20 PILOT interpreter: https://www.atarimagazines.com/compute/issue40/vic_pilot.php

The old Oberon page is full of links to information about "Thinking low level, Programming high level"
**broken link removed**

This is a must read to understand the difference between computer programming and computer science.
https://www.electro-tech-online.com/custompdfs/2012/04/AD.pdf
 
There is an old saying among managers: if you want to save your job, fire the programmer that insists on programming in ASM. There are good reasons for that.

Today in many (most?) cases time to market is far more critical than code size.

Because higher-level languages hide the processor, the code is easier to read and maintain. That is not to say that knowing the processor is unimportant.

Structure and stronger data typing tend to reduce the bug rate.


But if you are a hobby type or your own boss, use what turns your crank! Toggle in hand written machine code from a front panel if it makes you happy.

Knowing a language is only one step in becoming a decent embedded programmer. Let's not compare embedded newbies to people who have worked in the field long term.
 
ASM efficiency however goes beyond embedded programming.
 
There is an old saying among managers: if you want to save your job, fire the programmer that insists on programming in ASM. There are good reasons for that.

Today in many (most?) cases time to market is far more critical than code size.

Because higher-level languages hide the processor, the code is easier to read and maintain. That is not to say that knowing the processor is unimportant.

Structure and stronger data typing tend to reduce the bug rate.


But if you are a hobby type or your own boss, use what turns your crank! Toggle in hand written machine code from a front panel if it makes you happy.

Knowing a language is only one step in becoming a decent embedded programmer. Let's not compare embedded newbies to people who have worked in the field long term.

My issue is that I wish there were more embedded newbies. Sadly there aren't. Just script kiddies who can string a bit of Python together. These are what are being churned out into industry from our so-called Higher Education system. Most are pretty useless.

Much the same with discrete electronics. Many can simulate something using Matlab but have never actually made anything. Again useless.
 
ASM efficiency however goes beyond embedded programming.

Yes it does, when the program detail is at the level of direct control of controller resources in a time-critical manner, like a context switch or a synchronous I/O function. Some of us who feel that an HLL approach should be the preferred method really do understand the need for it at times, and have learned and forgotten several machine languages over the years, with the scars on the forehead to prove it. The ability to look at code written many years ago that you don't even really remember writing, but still be able to "grok" the meaning and data flow because it is expressed at a human level of abstraction, is truly a good thing.
 
My issue is that I wish there were more embedded newbies. Sadly there aren't. Just script kiddies who can string a bit of Python together. These are what are being churned out into industry from our so-called Higher Education system. Most are pretty useless.

Much the same with discrete electronics. Many can simulate something using Matlab but have never actually made anything. Again useless.

Not all the kids are like that. We mentor some classes at the local high schools and help them compete in the "First Robotics" programs. These kids love working at the nuts-and-bolts level with electronics and controllers.
**broken link removed**
 
First Robotics is a good program.

One of our former members was in First Robotics. He was driven from ETO in part because he was a kid. Now he is doing an internship with Bosch and attending a good engineering university.

Another young member who is still with us has an internship with a business radio company and he is still in high school.

There are bright kids out there but they are still kids and we need to understand that if we are going to help them.
 
3v0 said:
Today in many (most?) cases time to market is far more critical then code size.
And efficiency isn't even usually considered; as long as the machine can run it, it doesn't matter how wasteful the programming is. Often they'll simply raise the specs of the machine required to run it rather than try to improve the software.

This is a crime against proper programming technique if you ask me, and is FAR too prevalent. The entire current explosion of tablet PCs based on compact Linux or iOS operating systems on limited hardware has required a complete reboot of software programming methodology, which I think is a good thing. The bloat of the PC today is absurd.
 
And efficiency isn't even usually considered; as long as the machine can run it, it doesn't matter how wasteful the programming is. Often they'll simply raise the specs of the machine required to run it rather than try to improve the software.

Shiny blinking things on the screen sell products today. Deep inside the product the efficient OS is using 1% of the processor, with the "Bling" wanting 200%.

Example:
**broken link removed**
Bling is a C#-based library for easily programming images, animations, interactions, and visualizations on Microsoft's WPF/.NET. Bling is oriented towards design technologists, i.e., designers who sometimes program, to aid in the rapid prototyping of rich UI design ideas. Students, artists, researchers, and hobbyists will also find Bling useful as a tool for quickly expressing ideas or visualizations. Bling's APIs and constructs are optimized for the fast programming of throw away code as opposed to the careful programming of production code.

But it's not thrown away but kept in the code as bloat.
 