Why should embedded applications be deployed as containers?

Debom Ghosh
4 min read · Oct 15, 2020

There are several ways to execute applications on a computer. The three most prominent are running them directly on top of the operating system, running them inside a virtual machine (VM) powered by a hypervisor (the piece of software that allocates resources and runs the VM) and, more recently, running them encapsulated within a container.

Running applications on hypervisors has been the traditional go-to approach for system engineers for the past 10–15 years, and it’s what turned cloud computing into what it is today. Hypervisor-based virtualization, however, has drawbacks related to its complexity and resource requirements that cannot be neglected, especially on small, embedded computing units. Moreover, VMs can have a significant impact on application performance, and they take some time to start and stop.

A more modern alternative is known as containerization. Containers can be seen as lightweight environments that still isolate the execution of applications but, unlike VMs, share the kernel of the underlying operating system and can therefore use its resources much more efficiently.

There’s plenty of material on containers and, if you are interested, we recommend carefully going through these amazing articles:

Article 1 — https://www.cio.com/article/2924995/what-are-containers-and-why-do-you-need-them.html

Article 2 — https://www.iotforall.com/containers-on-the-edge

Article 3 — https://embeddedbits.org/using-containers-on-embedded-linux/

Containers have been around for some time now and have proven their worth to web developers, mainly thanks to Docker. What’s still in its infancy is the use of containerization technologies within embedded IoT development, and this is precisely our motivation for writing this post.

If you’re an embedded developer who is not familiar with containers, you might be asking what’s in it for you. Imagine you have built a new app for a smart home solution that uses AI to control the lights and temperature based on user behavior, and you are now ready to deploy it to thousands of devices.

To achieve that, you need to worry about several low-level aspects of your application that shouldn’t necessarily be your concern. To mention only a few: how would you cross-compile your application for the target hardware? How would you package your application so all devices run the same version? How would you add new parts to your application that rely on different technologies but run together on the device, like a sensor plugin and a web connection service? And how would you resolve conflicts between the libraries these different parts depend on? All of these points become easy to manage with container technology.
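To make the cross-compilation and packaging points concrete, here is a minimal sketch of a multi-stage Dockerfile. The file names, base images and tag are illustrative assumptions, not part of any specific product:

```dockerfile
# Stage 1: build the application (a hypothetical C program, app.c)
FROM debian:bullseye AS build
RUN apt-get update && apt-get install -y gcc
COPY app.c /src/app.c
RUN gcc -O2 -o /src/app /src/app.c

# Stage 2: minimal runtime image — every device pulls exactly this version
FROM debian:bullseye-slim
COPY --from=build /src/app /usr/local/bin/app
CMD ["/usr/local/bin/app"]
```

Built with `docker buildx build --platform linux/arm/v7 -t myapp:1.0 .`, this produces an ARM image on an x86 workstation, and the image tag pins the exact version that every device in the fleet runs.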

For IoT applications, especially those that require more capable devices (think of a Raspberry Pi Zero) due to data-processing requirements or a need for general flexibility, container technology is an especially good fit thanks to its low performance footprint and great versatility. Beyond answering the questions above, IoT development teams can benefit from adopting containers in several ways:

  1. Containers make it very simple to onboard new developers — you can use them to create well-defined, isolated environments and share those with your team. Additionally, if you truly embrace the concept, you’ll find it easy to break the application down into a loosely coupled set of heterogeneous “components”, making collaboration more natural and allowing developers to employ the right tools in the right places.
  2. Deal gracefully with legacy applications — different containers can run different versions of the same library, allowing you to update one part of the application at a time while continuing to support legacy apps. Shipping a patch for a much older version of the software becomes feasible: developers can work in a container with the older toolchain while mainline development of the latest version happens in a separate container.
  3. Distributed deployment — with containers, building and deploying a distributed application with several components, quite typical for an IoT device, is much easier. Independently deploying several components with modern DevOps practices, over a long period of time, is also well within reach.
  4. Increasing the talent pool for IoT/embedded development — with containers in place, a web or cloud developer can start programming for an IoT application out of the box.
  5. Reduce the coupling between software and hardware — the same container image runs on any device with a compatible kernel and CPU architecture, so the application no longer has to be tailored to a single board.
  6. Easier maintenance and updates — once the app is developed in a containerized manner, rolling out OTA updates or the latest security patches is a cakewalk for developers, independent of the production life cycle of the hardware fleet.
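As a sketch of points 2 and 3 above, a compose file can run components with conflicting dependencies side by side on one device. The service names, registry, image tags and environment variable below are hypothetical:

```yaml
# docker-compose.yml — two loosely coupled components on one device.
# Each service ships its own libraries, so version conflicts disappear.
services:
  sensor-plugin:
    image: registry.example.com/sensor-plugin:2.1   # newer library stack
    devices:
      - /dev/i2c-1:/dev/i2c-1    # pass the sensor bus into the container
    restart: unless-stopped
  web-connector:
    image: registry.example.com/web-connector:1.4   # legacy app, older libs
    environment:
      - CLOUD_ENDPOINT=https://example.com/ingest
    restart: unless-stopped
```

Updating one component then amounts to pushing a new image tag and re-running `docker compose up -d`; the other container keeps running untouched.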

At Seashell, we believe containerized applications are on their way to becoming the de facto standard in the embedded world as well. That is why we have built our platform to natively support this technology. The Seashell platform takes all the advantages containers have to offer and brings that power straight into your embedded software development workflow. With Seashell you can manage, orchestrate and provision containerized applications using several battle-hardened open-source components, while leveraging the very same tools, technologies and practices you are already familiar with!


Debom Ghosh

A Product Manager working on IoT, Edge Computing and Machine Learning topics