With AWS Greengrass, IoT apps become seamless edge to cloud
Find out how the various applications and infrastructure pieces work together when building an IoT app on top of AWS Greengrass.
A new generation of IoT platforms promises to make it easier to build, deploy and manage apps that span the cloud, edge servers and connected devices. These apps can be challenging to implement because developers need to ensure congruence between code and configurations at every level. AWS Greengrass IoT has been getting attention lately because it lets developers create applications on top of the AWS Lambda function-as-a-service platform that communicate with edge devices and embedded devices. It is already being baked into technologies like AWS Snowball Edge, as well as a new class of cellphone base stations built by Nokia.
The AWS Greengrass software lets enterprises run local compute, messaging, data caching, sync and machine learning (ML) inference capabilities for connected devices in a secure way. AWS has started adding new capabilities onto the platform, such as Greengrass ML Inference, which allows machine learning models to be deployed directly to devices. These models can make decisions quickly, even when the devices are not connected to the cloud.
Marko Hokkanen, end-to-end solution architect at Nokia, said some of the main benefits of the Greengrass IoT service include a large AWS Greengrass code ecosystem, straightforward deployment and the ability to work offline.
Start with the cloud
It's important to understand the use of AWS microservices for intelligence at the cloud level before engaging Greengrass at the edge. Scott Nelson, vice president of product at Digi International Inc., said, "The power of Greengrass is that it enables the distribution of intelligence within a system and the ability to maintain that intelligence offline."
This provides a significant benefit over traditional AWS services by making it possible to push intelligence to devices. A complete system integrating Greengrass should consider how splitting intelligence between the AWS Cloud and the device can best meet mission requirements. There are many tradeoffs in determining where this split will occur. One may place a high value on acquiring a large amount of data even if it does not have immediate value for the application at the edge. So, despite having the capability of filtering data on the device, the system would push as much data as possible to the AWS Cloud.
"If the device at the edge has a low data rate and/or unreliable connections with limited buffer space, small deltas in data values are insignificant," Nelson said. "In this situation, the system could heavily filter data in Greengrass, so it never has to be processed in the cloud."
Devices can also operate offline using Greengrass IoT, allowing them to maintain execution of scheduled or responsive operations in the local environment despite having lost connection. On the other hand, having intelligence in the device that continues to operate could put the system in a state where stale data may lead to undesirable outcomes. The system architect must anticipate these situations and guard against the distributed intelligence becoming an effectively uncontrolled device, Nelson said.
Plan security and management
AWS has done a great job extending cloud programming best practices to the embedded development world via Greengrass and Lambda services. Justin Yang, senior director of software engineering at NXP Semiconductor, said, "To build a practical and scalable IoT solution, however, one must also solve secure device onboarding, [device lifecycle management] and automate the AWS Greengrass deployment to these devices."
Such a system needs an end-to-end security infrastructure, and security must be anchored in hardware. The cloud-based device integration and management infrastructure must provide a common abstraction layer to unify the deployment of AWS Greengrass core and Lambda functions within. Device registration must be automated to support large-scale deployment.
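Registration can be scripted against the AWS IoT control plane. The boto3 sketch below shows the basic shape of automated onboarding; the thing name, policy name and the way credentials reach the device are illustrative assumptions, and a hardware-anchored scheme of the kind Yang describes would add secure element provisioning on top of this.

```python
# A minimal sketch of automating device registration with the AWS IoT APIs via boto3.
# The thing name, policy name and credential delivery mechanism are assumptions.
import boto3

iot = boto3.client('iot')

def register_device(thing_name, policy_name):
    """Create a thing, issue a certificate and bind them so the device can connect."""
    thing = iot.create_thing(thingName=thing_name)

    # Generate a certificate and key pair for the device and activate it.
    cert = iot.create_keys_and_certificate(setAsActive=True)

    # Attach the certificate to the thing and authorize it with an IoT policy.
    iot.attach_thing_principal(thingName=thing_name, principal=cert['certificateArn'])
    iot.attach_policy(policyName=policy_name, target=cert['certificateArn'])

    # The certificate PEM and private key must be provisioned onto the device out of band.
    return thing['thingArn'], cert['certificatePem'], cert['keyPair']['PrivateKey']
```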
Updating and upgrading IoT devices and gateways will be challenging. Venkat Ramasamy, COO at FileCloud, which implements its services on AWS, said, "Abstracting the actual devices could help in easy integration and upgrading devices in later stages. Developers must assume networks will be unreliable in the production environment. Hence, they have to design appropriately by building applications to detect network reliability and using lighter protocols."
Low-end IoT devices
Devices in a Greengrass deployment run different types of software depending on their capabilities and their operational constraints. The most constrained devices are microcontrollers. Their computing power and memory are very limited (RAM is usually in the range of tens of kilobytes), which makes them a good fit for devices like lightbulbs, smoke detectors and conveyor belts.
Amazon FreeRTOS is an operating system for microcontrollers that makes small, low-power edge devices easy to program, deploy, secure, connect and manage. It is based on the FreeRTOS kernel, a popular open source operating system for microcontrollers, which Amazon extends with software libraries that make it easy to securely connect these devices to AWS cloud services, like AWS IoT Core, or to more powerful edge devices running AWS Greengrass, which in turn connect to AWS IoT Core. Developers can build their applications on top of Amazon FreeRTOS using the C programming language. Low-end devices (for example, sensors) may communicate with each other and with a secure gateway using legacy (e.g., Modbus) or low-power wireless protocols without using the AWS IoT Device SDK, said NXP's Yang.
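On the gateway side, polling such a legacy device is ordinary protocol code rather than anything AWS-specific. The sketch below assumes the open source pymodbus library (2.x API) and an invented register layout, purely to show how a Greengrass core might pull readings from a Modbus sensor before handing them to a Lambda function.

```python
# A minimal sketch of a gateway-side poller reading a legacy Modbus device, assuming
# the pymodbus library (2.x API). The device address, register map and scaling are
# illustrative assumptions; real equipment defines its own register layout.
from pymodbus.client.sync import ModbusTcpClient

def read_temperature(host='192.168.1.20', unit=1):
    """Poll one holding register from a Modbus TCP slave and convert it to a reading."""
    client = ModbusTcpClient(host, port=502)
    try:
        client.connect()
        # Read a single holding register at address 0 from the given unit (slave) ID.
        result = client.read_holding_registers(0, 1, unit=unit)
        if result.isError():
            return None
        return result.registers[0] / 10.0   # hypothetical scaling: tenths of a degree
    finally:
        client.close()
```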
Digi's Nelson said it is also possible to use programmable logic controllers (PLCs) through the Open Platform Communications Unified Architecture (OPC UA) feature provided by the most recent version of Greengrass. OPC UA is a machine-to-machine communication protocol for industrial automation developed by the OPC Foundation. Greengrass OPC UA allows ingestion and processing of messages from industrial equipment with delivery to devices in a Greengrass group or to the cloud based on rules. For example, PLCs can now connect through an OPC UA server to an OPC UA adapter Lambda function in the Greengrass core.
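AWS supplies its own OPC UA adapter for Greengrass, so the following Python sketch is meant only to illustrate the pattern: read a node from a PLC's OPC UA server and republish it into the Greengrass group. It assumes the open source python-opcua client library, and the endpoint URL, node ID and topic name are hypothetical.

```python
# An illustrative sketch of OPC UA ingestion at the edge. It assumes the python-opcua
# client library; the endpoint, node ID and topic are hypothetical, and AWS's own
# OPC UA adapter Lambda would take this role in a real Greengrass group.
import json

import greengrasssdk
from opcua import Client

gg_client = greengrasssdk.client('iot-data')

def poll_plc(endpoint='opc.tcp://192.168.1.30:4840', node_id='ns=2;i=2'):
    """Read one node from the PLC's OPC UA server and publish it on a local topic."""
    opc = Client(endpoint)
    try:
        opc.connect()
        value = opc.get_node(node_id).get_value()
        gg_client.publish(topic='plc/telemetry',
                          payload=json.dumps({'node': node_id, 'value': value}))
    finally:
        opc.disconnect()
```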
Edge servers
Devices that possess more compute power and memory (in the range of hundreds of megabytes) can be used as edge gateways to locally aggregate and process data coming from more constrained devices connected to them. Being more capable, these devices can run an operating system such as Linux. These devices can run AWS Lambda functions, keep device data in sync and communicate with other devices securely, even when not connected to the internet. Developers can deploy AWS Lambda functions written in Python 2.7, Node.js 6.10 and Java 8.
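The sync mechanism is the device shadow, which a local Lambda function can read and update whether or not the core currently has a connection. The sketch below is a minimal illustration; the thing name and the reported state fields are assumptions.

```python
# A minimal sketch of keeping device state in sync through a local device shadow.
# The thing name and the reported state fields are assumptions for illustration.
import json

import greengrasssdk

client = greengrasssdk.client('iot-data')

def report_state(thing_name='freezer-01', temperature=None, door_open=False):
    """Update the local shadow; Greengrass syncs it to the cloud when connectivity returns."""
    payload = {'state': {'reported': {'temperature': temperature, 'door_open': door_open}}}
    client.update_thing_shadow(thingName=thing_name, payload=json.dumps(payload))

def read_desired_state(thing_name='freezer-01'):
    """Read the shadow document, including any desired state set from the cloud."""
    response = client.get_thing_shadow(thingName=thing_name)
    return json.loads(response['payload'])
```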
Developers can use both Amazon FreeRTOS and AWS Greengrass to send data to AWS IoT Core on the AWS Cloud. From there, they can make use of the rest of the AWS Cloud to develop applications in any programming language. AWS Lambda is a popular choice because it is natively integrated with AWS IoT Core.
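On the cloud side, an AWS IoT rule can route the edge-published messages straight into a Lambda function. The sketch below assumes a hypothetical DynamoDB table and payload fields; it simply shows the shape of the handler that receives what the rule forwards.

```python
# A minimal sketch of a cloud-side AWS Lambda function invoked by an AWS IoT rule.
# The DynamoDB table name and the payload fields are assumptions for illustration.
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('sensor-readings')   # hypothetical table

def lambda_handler(event, context):
    """The IoT rule delivers the MQTT payload as the event; persist it for analysis."""
    table.put_item(Item={
        'sensor_id': event['sensor_id'],
        'timestamp': event['timestamp'],
        'value': event['value'],
    })
    return {'status': 'stored'}
```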
Greengrass IoT in action
Nokia has demonstrated AWS Greengrass IoT applications running on Nokia Mobile Edge (multi-access edge computing) gateways. These provide virtual cores running AWS Greengrass, machine learning, AI and other applications. The cores securely connect and process end device and sensor data, and can apply some basic analytics to that data.
"There are plenty of cases where you have a fire hose worth of constant data locally, which would not make sense to send up to the cloud," Nokia's Hokkanen said. The company has example applications, such as cold chain logistics, that send the data to an edge gateway to analyze the results and either, one, trigger events locally, and/or, two, send data up to the cloud. For example, if ice cream in a truck was getting too warm, the gateway could trigger a thermostat to adjust the temperature locally. The anomalies and other less periodic data could be sent to the cloud.
Another great example is edge video analytics. The typical paradigm for video analytics is to send the video streams across a WAN -- spanning a city, multiple campuses or stores -- to centralized analytics servers. Cameras produce a significant amount of uplink network data toward these video analytics servers. A more efficient approach is to terminate those streams at the edge, run the analytics there and send a significantly smaller amount of data -- essentially the 1% that is relevant -- to the cloud for further processing.
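A rough sketch of that edge termination pattern: capture frames locally, run inference and publish only the detections. It assumes OpenCV for the capture loop, and detect_objects() is a hypothetical stand-in for whatever locally deployed model the system runs.

```python
# A minimal sketch of terminating a video stream at the edge and forwarding only the
# analytics results. It assumes OpenCV for frame capture; detect_objects() is a
# hypothetical placeholder for local ML inference, and the topic name is assumed.
import json

import cv2
import greengrasssdk

client = greengrasssdk.client('iot-data')

def detect_objects(frame):
    """Hypothetical placeholder for a locally deployed inference model."""
    return []   # e.g., a list of {'label': ..., 'confidence': ...} dicts

def process_stream(rtsp_url='rtsp://camera.local/stream'):
    capture = cv2.VideoCapture(rtsp_url)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        detections = detect_objects(frame)
        if detections:
            # Send only the small, relevant results upstream -- never the raw video.
            client.publish(topic='video/detections', payload=json.dumps(detections))
```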
AWS is a good choice for enterprises that want to build IoT apps around the growing Greengrass IoT ecosystem. But it is probably also a good idea to plan for the day when something better comes along. FileCloud's Ramasamy said, "Design for portability. If your solution uses many native APIs from AWS, moving to a new platform will be very hard. Hence, designing your implementations with adequate decoupling is important in case you want to move to another platform in the future."