Why is everyone so adamant on coding in C?

wrt fortran, regardless of whether they are going to do any embedded development or not, imho fortran is a must for any electrical or mechanical engineering student. Here, fortran used to be in the first year of both ee and me at university (can't say how it is now; they have since adopted the "Bologna model", which greatly reduces the work students need to put in), and asm was only in the third year (out of 5.5 years to become bscee, then 2 more for a master's, and then you go for a phd), while HLLs were in the second... so it probably has to do with "thinking in the abstract"...
 
wrt fortran, regardless of whether they are going to do any embedded development or not, imho fortran is a must for any electrical or mechanical engineering student. Here, fortran used to be in the first year of both ee and me at university (can't say how it is now; they have since adopted the "Bologna model", which greatly reduces the work students need to put in), and asm was only in the third year (out of 5.5 years to become bscee, then 2 more for a master's, and then you go for a phd), while HLLs were in the second... so it probably has to do with "thinking in the abstract"...
In 1974 FORTRAN was very relevant. :p A few terms later they had us writing simulators for stack machines (no registers) on the IBM 360 in FORTRAN, with keypunches and card decks.

As I said, they are now starting freshmen with MATLAB after a term of digital logic.
 
Please note that posts regarding RNGs (random number generation) from post #44 onward have been moved to

Mr RB's RNG and RNGs in general

moderator
 
3V0-
Optimizing compilers can compact code in ways that no rational human would. Optimized code can be smaller than hand-coded assembly.

There's no way in hell any compiler can optimise a small, fast task to the level that a good human assembly coder can. Compilers can optimise large, complex code to be better than what an assembly programmer can do easily. Those are two very different concepts. :)

MisterT-
The RNG I posted (the code is missing the initialization) is from the book "Numerical Recipes: The Art of Scientific Computing, 3rd edition". In the book it says: "It has a period of “only” 1.8e19, so it should not be used by an application that makes more than 10^12 calls. With that restriction, we think that it will do just fine for 99.99% of all user applications".

Check your code again; you are only adding the prime number:
return v + 2685821657736338717LL;
which is an essentially useless step, since adding ANY constant after the algorithm itself has run changes nothing about the sequence. But if you blended the prime number into the algorithm:
return v += 2685821657736338717LL;
it would start to be usable as an RNG. I'm guessing either your code or the book had a typo.
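
To make the difference easy to try, here is only a rough sketch of the kind of 64-bit xorshift generator we are talking about - this is NOT the book's listing, and the seed and shift amounts are just placeholders - with both return variants shown:

#include <stdint.h>

/* Illustrative sketch only, not the book's code: a 64-bit xorshift
   generator (period about 1.8e19). Seed and shifts are placeholders. */

static uint64_t v = 4101842887655102017ULL;   /* placeholder seed, must be nonzero */

uint64_t rng_next(void)
{
    /* xorshift scramble of the internal state */
    v ^= v >> 21;
    v ^= v << 35;
    v ^= v >> 4;

    /* Variant A (as quoted): the constant only offsets the output;
       the internal state is untouched, so the sequence is unchanged. */
    /* return v + 2685821657736338717LL; */

    /* Variant B (as suggested): the constant is folded back into the
       state, so it actually affects every following call. */
    return v += 2685821657736338717LL;
}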

LTX71CM-
Computers don't do "random". I don't think Mr RB suggested the Black RNG was pure random, just that it had better entropy.

"Pure" random is in the eye of the beholder. Any process that produces a new number that cannot be predicted by knowing all the numbers before it meets my definition of random. There's no reason that that goal cannot be produced algorithmically. My process uses an algorithm, a 1Mbit cache and values held within the RNG engine. The algorithm may be known but what the algorithm does with the cache has random factors determined by the cache and by the engine. The result is *sufficiently* random that the next number in the sequence can't be predicted by the observer.

MisterT-
How is it impossible to predict something that your algorithm just generated?
To predict the next number generated you would need to know the entire contents of the 1Mbit cache, the position in the cache, and the state of the RNG engine. It is not possible for the observer to know those things from previously generated numbers, so to predict you would have to solve for every possible combination of cache contents, cache index and engine values, and that brings you back to equal odds, i.e. no better than guessing.

It doesn't take long to code my RNG engine up in C; feel free to play with it and run the data through your tests. Or use it in your commercial product - I don't care. It won't do the really fast megabits-per-second generation you asked for in your other thread, but with enough passes it is sufficiently random if you need a high-quality RNG implemented solely in software, without resorting to hardware tricks.
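
To give a feel for the general shape of the thing, here is only an illustrative sketch, NOT my actual engine - the cache size is the only number taken from above; the seed, the stir step and the jump rule are just placeholders:

#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch: a big bit cache plus hidden engine state, where
   the cache contents steer how the next output is produced. */

#define CACHE_BITS   (1UL << 20)                /* the "1Mbit cache" */
#define CACHE_WORDS  (CACHE_BITS / 64)

static uint64_t cache[CACHE_WORDS];             /* assumed already filled with seed data */
static size_t   pos;                            /* hidden cache index */
static uint64_t engine = 0x9E3779B97F4A7C15ULL; /* hidden engine state, placeholder seed */

uint64_t next_random(void)
{
    uint64_t w = cache[pos];
    engine ^= w;                                  /* cache feeds the engine */
    engine = engine * 6364136223846793005ULL
                    + 1442695040888963407ULL;     /* placeholder stir step */
    cache[pos] ^= engine;                         /* engine writes back: the cache keeps evolving */
    pos = (pos + (size_t)(w | 1)) % CACHE_WORDS;  /* data-dependent jump to the next word */
    return engine;
}

An observer who only sees the outputs never sees pos, engine or the cache contents, which is the point being argued above.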
 
There's no way in hell any compiler can optimise a small, fast task to the level that a good human assembly coder can. Compilers can optimise large, complex code to be better than what an assembly programmer can do easily. Those are two very different concepts. :)

No one is saying that if you need to fit 512 bytes of code on a 4-bit CPU, a C compiler will do a better job. The point is that the average program uses much more than 512 bytes, is developed over much longer than a few days, and does much more than a single function. And even that "better optimized" code that fits in a $1 MCU with 512 bytes could be done faster in C - the compiler would generate slower and larger code, but on a $1.10 MCU the code would still fit and would run faster than the original "compact" asm on the $1 device. So for a $0.10 difference you wrote code that is not portable, is hard to maintain, and requires a genius to fix if a small change is needed two years from now. The hand-written 512 bytes are still smaller and faster than the 1024 bytes generated by the C compiler - so what?! And that is true *only* for those 512-byte programs... take any serious MCU application and there is no way you can make it shorter or faster than the C compiler will, even if you spend only five times as long on it as the average C developer would...

The fact that a "small fast task" cannot always be fully optimized by the compiler is the reason inline assembly exists, btw :) For example (just a sketch, assuming gcc/avr-gcc; the port and mask names are made up), you drop to asm only for the one spot that has to be exact and leave everything else to the compiler:
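
#include <stdint.h>

/* Illustrative only: gcc extended inline asm, assuming an AVR target.
   The one instruction that must be cycle-exact is written in asm;
   the surrounding code stays in C. */
static inline void pulse_pad(volatile uint8_t *port, uint8_t mask)
{
    *port |= mask;                 /* plain C for the non-critical part */
    __asm__ volatile ("nop");      /* one cycle on AVR, never optimized away */
    __asm__ volatile ("nop");
    *port &= (uint8_t)~mask;
}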
 
Nigel, I am not sure how you want to count "most". If one counts the number of projects coded, I expect you will find that high-level-language projects are far more numerous.

Look at the code that's out there - most of it is in assembler, as are almost all technical bulletins and application notes.
 
Look at the code that's out there - most of it is in assembler, as are almost all technical bulletins and application notes.

It is time to put your bias on the shelf.

The code we can see is not a representative sample. The vast majority of it is proprietary. The few of us here who have pounded code for a living have a somewhat better view.

The tech docs seem to do the simple stuff in asm and the more complex stuff in C. If there is any statistical significance, it is telling us about the technical documentation departments. The preponderance of asm in the documentation is more likely due to tradition than to what language their customers are using.

Many chip vendors depend on others to write compilers. Picking one compiler for use in documentation would be a strong endorsement they would rather avoid. This favors using asm in documentation.
 
It is time to put your bias on the shelf.

The code we can see is not a representative sample. The vast majority of it is proprietary. The few of us here who have pounded code for a living have a somewhat better view.

By definition 'proprietary' code isn't 'out there'; it's kept private and secret, for obvious reasons. I'm referring to code you can download for free.

As you say, most professional coding is done in HLLs; that's why we now need gigahertz processors and gigabytes of memory :p
 
As you say, most professional coding is done in HLLs; that's why we now need gigahertz processors and gigabytes of memory :p

Ain't that a nasty truth :D :D :D ... (I still can't believe how slow development can get - I know very usable and fast apps that ran on an 8MHz 8088, while their modern equivalents are slow and unusable on GHz boxes :( ) ... Anyhow, that laziness (and some other factors) got us to the point where gigahertz and gigabytes are cheaper than hertz and bytes were back in the day... also, an engineer-day has become more expensive...
 
By definition 'proprietary' code isn't 'out there'; it's kept private and secret, for obvious reasons. I'm referring to code you can download for free.
Again we see what can come of our differing perspectives.

As you say, most professional coding is done in HLLs; that's why we now need gigahertz processors and gigabytes of memory :p
The worst problem is with rapid development systems like C#, Java, etc., but they have their place. A case in point is the logic analyzer (see the attached screenshot).

Its creator built the product using C# and .NET. His customers wanted it on the Mac and Linux, so he started over using more conventional coding with the aid of a cross-platform toolkit. It took two years! The story here is that, in spite of the code bloat and speed requirements, there is a place for a language that lets you rapidly develop a product and get it to market. In this case, money from the C# version funded the development of the cross-platform code. The C# version has the more polished GUI. One could argue he may have been better off staying with the C# code and spending the two years on additional post-processors or new products. My bias here tells me the Linux people are too cheap to buy one, and how many embedded types use Macs anyway?

C is more of a mid-level language. But the guys who code uCs in C do not need gigahertz processors and gigabytes of memory. The C code is about the same size and runs at about the same speed as the asm would - in all but the tiny apps pointed out by RB.
 

Attachments

  • LogicSoftware_03.jpg
Hi,

My first real language was asm, and I guess that's why I still like it (for uCs, that is, not for Windows programming for example). That was back in the late 1970s, but before that I learned to program in bits (raw numbers) and didn't even have a compiler to begin with. Where I worked at the time, we had to program by setting toggle switches, one for each bit, and then pressing a button to get the machine to accept that binary word. That was on the 8004 and 8008 uPs, and also on specially designed, house-built custom microprocessors that took up a whole PC card just for the uP itself, not including the other application circuitry. I later moved to the 8080, and I loved the Z80 because it had so many different instructions to use. Now uCs like the PIC do so much that we don't really need that stuff anymore.

Also, I worked with programming through punch cards. The way you arranged the cards determined what the mainframe computer was going to do that day. One card would cause a computer in Texas, for example, to be polled for data; that data would be printed out by a big line printer, and we would go through it to determine whether that system needed repair. Strange, but you could do this from anywhere in the world, and that was way back in the mid-1970s, well before the internet.
 
Hi,

My first real language was asm, and I guess that's why I still like it (for uCs, that is, not for Windows programming for example). That was back in the late 1970s, but before that I learned to program in bits (raw numbers) and didn't even have a compiler to begin with. Where I worked at the time, we had to program by setting toggle switches, one for each bit, and then pressing a button to get the machine to accept that binary word. That was on the 8004 and 8008 uPs, and also on specially designed, house-built custom microprocessors that took up a whole PC card just for the uP itself, not including the other application circuitry. I later moved to the 8080, and I loved the Z80 because it had so many different instructions to use. Now uCs like the PIC do so much that we don't really need that stuff anymore.

Also, I worked with programming through punch cards. The way you arranged the cards determined what the mainframe computer was going to do that day. One card would cause a computer in Texas, for example, to be polled for data; that data would be printed out by a big line printer, and we would go through it to determine whether that system needed repair. Strange, but you could do this from anywhere in the world, and that was way back in the mid-1970s, well before the internet.

Hi MrAl,

I was interested in reading this; it's very intriguing to learn how it all got started. I think we often take contemporary advantages and resources for granted. My grandfather used punch cards too when he was at IBM, and I have to say my patience wouldn't last long! I like the historical side of programming; it's fun to learn about.
 