
Problems in Modern Embedded Dev

country2304

New Member
I'm thinking about making an open-source project related to some problems I've experienced building commercial IoT products and I'm interested to hear other people's thoughts on the stumbling blocks they have encountered in embedded device development.

  • I would like to be able to use higher-level languages with nicer syntactic sugar. Using MicroPython in the past has sped up dev time, especially during rapid prototyping. However, solutions like MicroPython don't seem viable in more constrained environments.
  • When I've built CI pipelines in the past, it has been difficult to integrate the device firmware into end-to-end tests of our entire system, backend included. I would like as much as possible of the actual code that runs on the device to be exercised in those tests, without having to write mock code for every hardware function (see the sketch after this list). (Of course, this kind of test is more useful in scenarios without critical timing components etc. that would require proper simulation of the hardware.)
  • More generally, I've found that the embedded world has fewer tools for managing projects, such as package managers comparable to what exists in the JS world.
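
To make the second point concrete, the structure I have in mind is a thin hardware-access layer between the application logic and the silicon, so the same application sources compile for both the target and a host-side test build, and CI can drive the real logic without per-test mocks. A minimal sketch in C; the names (hal_gpio_write, hal_millis, heartbeat_task) are invented for illustration, not from any particular vendor HAL:

C:
/* hal.h - hardware interface shared by target and host builds */
#include <stdint.h>
#include <stdbool.h>

void     hal_gpio_write(uint8_t pin, bool level);
uint32_t hal_millis(void);

/* app.c - pure logic, no register access, so it links into host tests as-is */
#include "hal.h"

void heartbeat_task(void)
{
    static uint32_t last;
    static bool state;

    /* toggle a status pin once per second */
    if (hal_millis() - last >= 1000u) {
        last  = hal_millis();
        state = !state;
        hal_gpio_write(13, state);
    }
}

/* hal_host.c - host-side implementation linked by the end-to-end test build */
#include <stdio.h>
#include "hal.h"

static uint32_t fake_time_ms;

void     hal_gpio_write(uint8_t pin, bool level) { printf("pin %u -> %d\n", (unsigned)pin, (int)level); }
uint32_t hal_millis(void)                        { return fake_time_ms; }
void     test_advance_time(uint32_t ms)          { fake_time_ms += ms; }

The target build would link an equivalent hal_target.c that touches the real registers; the CI build links hal_host.c and calls heartbeat_task() directly from the test harness alongside the backend services.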
 
Programming dedicated MCU-based systems is fundamentally very different to writing programs to run on a general-purpose computer with an operating system.

Trying to use the same approach as for programming Windows or Linux-based systems will only result in long-term problems, or having to use much larger & more expensive target devices than are actually needed.

That may be OK for a one-off hobby project or a small run of some device, but for mass production the per-unit parts cost (and likely power consumption) are critical factors in profitability and market share.


You (the programmer) own the entire system and every resource in it. The only person you have to share resources with or give up CPU cycles for is yourself!


Any interpreted language is wasting finite resources and invaluable CPU time; everything should be compiled / assembled machine code, using integer or fixed-point math as far as practical and avoiding trig functions like the plague.
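
To make the fixed-point idea concrete, here's a minimal sketch of Q16.16 arithmetic plus a table-driven sine, as one way to avoid floating point and libm trig on a small MCU (the table size and angle unit are arbitrary choices for illustration):

C:
#include <stdint.h>

typedef int32_t q16_16_t;                      /* signed 16.16 fixed point        */

static inline q16_16_t q16_mul(q16_16_t a, q16_16_t b)
{
    return (q16_16_t)(((int64_t)a * b) >> 16); /* widen first so the product fits */
}

/* quarter-wave sine table: sin(0..90 deg) in 16 steps, scaled by 65536 */
static const q16_16_t sin_tab[17] = {
        0,  6424, 12785, 19024, 25080, 30893, 36410, 41576, 46341,
    50660, 54491, 57798, 60547, 62714, 64277, 65220, 65536
};

/* sine of an angle given in 1/64ths of a full turn - no floats, no libm */
q16_16_t q16_sin(uint8_t angle64)
{
    uint8_t a = angle64 & 63u;
    if (a < 16) return  sin_tab[a];            /* first quadrant, rising          */
    if (a < 32) return  sin_tab[32 - a];       /* second quadrant, mirrored       */
    if (a < 48) return -sin_tab[a - 32];       /* third quadrant, negative        */
    return             -sin_tab[64 - a];       /* fourth quadrant                 */
}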

Most serious long-term embedded / real-time device programmers would either fall about laughing or make the warding-off-evil cross sign when someone mentions such things as an RTOS or interpreted languages in an MCU-based device :)

It's not that unusual to resort to optimising critical routines by cycle-counting the machine instructions to find ways to tweak the code.
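
On Cortex-M3/M4 class parts you don't even need external equipment for that: the DWT cycle counter lets the firmware time its own critical routines. A sketch assuming CMSIS headers for an STM32F4 (swap in the device header for your own part; run_filter_step is a placeholder for the routine under test):

C:
#include "stm32f4xx.h"   /* assumption: CMSIS device header for the target */

extern void run_filter_step(void);                   /* placeholder routine under test */

/* one-time enable of the cycle counter (DWT is present on M3/M4/M7, not M0) */
static void cyccnt_init(void)
{
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;  /* enable the trace block */
    DWT->CYCCNT = 0;
    DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;            /* start counting cycles  */
}

/* read the counter either side of the routine you want to optimise */
uint32_t cycles_for_filter(void)
{
    uint32_t start = DWT->CYCCNT;
    run_filter_step();
    return DWT->CYCCNT - start;     /* unsigned subtraction handles wrap-around */
}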

Your most important tools are a really good compiler and hardware debugging interface!


If you are basing things on other people's code / libraries, then you are stuck with whatever system constraints they have - but, if practical, you can write your own replacements to fit in with the rest of your own code.

For your overall "pipeline" debugging, analyse the inter-device communication - capture data packets and verify the content.

That's where any problems at a device level will be visible. It's pretty obvious where a problem is if a device has valid inputs but produces incorrect outputs!
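
As a trivial example of the kind of check that's worth scripting into the pipeline: suppose the devices exchange frames laid out as [length][payload...][checksum], where the checksum is the two's complement of the 8-bit sum of everything before it (a made-up format for illustration). Verifying a captured frame then takes only a few lines:

C:
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

/* hypothetical frame: [len][payload bytes][checksum]                   */
/* checksum is chosen so that the 8-bit sum of the whole frame is zero  */
bool frame_is_valid(const uint8_t *frame, size_t captured_len)
{
    if (captured_len < 2 || frame[0] != captured_len)
        return false;               /* truncated frame or bad length byte */

    uint8_t sum = 0;
    for (size_t i = 0; i < captured_len; i++)
        sum += frame[i];

    return sum == 0;                /* valid frames always sum to zero */
}

Run that over a logged capture from each link and you can tell immediately which device produced a bad frame, and whether the fault lies in the sender or in whatever parsed it downstream.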
 
