In the deluge of new tech terms constantly hitting our eyeballs, “serverless” (or “serverless compute”) is one that comes up more and more frequently. The term is a bit of a misnomer, since we still need servers, or at least computers, to run code. The concept, though, is that a developer writes small pieces of code and deploys them to a framework that automatically scales up and down in real time in response to requests.
The speed at which we can interact, learn, and give our selfies cat masks is all due to the wide-ranging cloud-based services developers have available. Cloud providers have continuously focused on creating services that let developers build smarter, more scalable solutions while offloading the need to maintain the underlying infrastructure. Serverless compute is one of the newer services cloud providers have made available. Developers can deploy functions to their cloud provider and expose them much like an API. More interestingly, these functions can also be designed as event-based solutions. The events are typically produced by other services in the cloud provider’s ecosystem, which allows for interesting trigger-based functions to be developed.
A typical example: an image is uploaded to a file store, and this event calls a serverless function that creates a thumbnail, stores the thumbnail back into the file store, and records the thumbnail’s location in a NoSQL database. The insert into the NoSQL database could then trigger further functions. The thumbnail function runs only on demand, and cost is incurred only per invocation. The function scales out in real time when many people are uploading images, then scales back to zero once they are done.
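The chain of triggers described above can be sketched in a few lines of provider-agnostic Python. The file store, NoSQL table, and event names below are all hypothetical stand-ins (plain dicts and callbacks), not any real cloud SDK; a real function would use an image library and the provider's storage APIs.

```python
# Simulated infrastructure: object store, NoSQL table, and event bus.
file_store = {}    # path -> bytes
thumb_table = {}   # image path -> thumbnail path
subscribers = {"object_created": [], "record_inserted": []}

def emit(event, payload):
    """Deliver an event to every function subscribed to it."""
    for fn in subscribers[event]:
        fn(payload)

def on(event):
    """Register a function as a handler for an event type."""
    def register(fn):
        subscribers[event].append(fn)
        return fn
    return register

@on("object_created")
def make_thumbnail(payload):
    # Stand-in for real image resizing (e.g. with Pillow): we just
    # truncate the bytes, then record the thumbnail's location.
    path = payload["path"]
    thumb_path = path.replace("images/", "thumbs/")
    file_store[thumb_path] = file_store[path][:8]
    thumb_table[path] = thumb_path
    # The NoSQL insert fires the next event in the chain.
    emit("record_inserted", {"image": path, "thumb": thumb_path})

@on("record_inserted")
def log_insert(payload):
    print(f"thumbnail for {payload['image']} stored at {payload['thumb']}")

# An "upload" fires the first event, which triggers the whole chain.
file_store["images/cat.png"] = b"\x89PNG fake image bytes"
emit("object_created", {"path": "images/cat.png"})
```

Nothing runs until an event arrives, which mirrors the scale-to-zero billing model: no uploads, no invocations, no cost.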
The benefits are numerous: no infrastructure to maintain, a true and very inexpensive pay-per-use model, the capacity to scale a project to enterprise levels without an enterprise-sized team, and functions that scale elastically with bursty traffic. However, the downsides are still lurking in the shadows. Monitoring and debugging become difficult and depend on whatever logging the service exposes; vendor lock-in occurs, especially when leveraging events from other services; while the serverless compute itself may be cheap, the other services may not be; and deployment tooling and IDEs are not typically prepared to manage serverless functions.
Given the limited resources available to IoT and industrial IoT devices, serverless IoT has started to gain traction. Serverless IoT extends the cloud serverless compute offering: IoT devices connect to a cloud service that responds to the devices and their data based on code uploaded by the developer. Similar to the example above, an IoT device can send its data to the cloud IoT service, where the data can be further processed, moved to a central database, and an event sent to another IoT device to perform further work. Connectivity is a major hurdle to overcome here. In an edge computing scenario, access to the cloud can be intermittent, as with an autonomous vehicle, or non-existent, as with smart agriculture devices. New technologies are emerging to overcome limited network access.
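One common way to cope with intermittent connectivity is a store-and-forward buffer on the edge device: readings queue locally while the cloud is unreachable and flush when the link returns. A minimal sketch, with a plain list standing in for the cloud IoT endpoint (all names here are hypothetical):

```python
import collections

class StoreAndForward:
    """Edge-side buffer: readings queue locally while offline and are
    flushed to the (simulated) cloud when connectivity returns."""

    def __init__(self):
        self.queue = collections.deque()
        self.online = False
        self.cloud = []  # stand-in for the real cloud IoT service

    def send(self, reading):
        self.queue.append(reading)
        if self.online:
            self.flush()

    def flush(self):
        # Drain the local queue into the cloud in arrival order.
        while self.queue:
            self.cloud.append(self.queue.popleft())

    def connectivity_restored(self):
        self.online = True
        self.flush()

edge = StoreAndForward()
edge.send({"sensor": "soil-7", "moisture": 0.18})  # offline: buffered
edge.send({"sensor": "soil-7", "moisture": 0.17})  # offline: buffered
edge.connectivity_restored()                        # flushed on reconnect
```

Real edge runtimes add durability (persisting the queue to disk) and deduplication on top of this basic pattern.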
Apache OpenWhisk allows for the same event-driven managed infrastructure, but in a self-hosted environment. There is still the cost of managing your own servers; however, OpenWhisk performs the orchestration of spawning containers to handle the overhead of executing your code at scale. In an industrial IoT scenario, a warehouse could run a closed network with serverless compute providing communication between all of its IoT databases.
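An OpenWhisk Python action is just a function named `main` that takes a dict of parameters and returns a dict. A sketch of a warehouse-style action (the device fields and the alert rule are hypothetical):

```python
# Minimal OpenWhisk-style Python action: the platform invokes `main`
# with a dict of parameters and expects a dict back.

def main(params):
    device = params.get("device", "unknown")
    reading = params.get("reading", 0)
    # Flag anything outside the (hypothetical) expected range so a
    # downstream action or device can react to the event.
    alert = reading < 0 or reading > 100
    return {"device": device, "reading": reading, "alert": alert}
```

Deployed with something like `wsk action create route route.py`, the platform spawns a container per burst of invocations and tears it down when traffic subsides.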
IoT databases are also emerging in the technology landscape, where the database-as-a-microservice paradigm holds. In this scenario the IoT devices run a distributed database with a very small footprint that, at rest, consumes little to no compute or power resources. On a device event, the database is invoked and can perform full SQL or NoSQL CRUD operations; once complete, the device drops back into low-power mode. Because the devices form a mesh network, they interact much like a large cloud-based serverless infrastructure.
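The wake-on-event pattern can be illustrated with SQLite, an embedded database commonly used on constrained devices. In this sketch the database is opened only for the duration of one event and closed again immediately, standing in for the device returning to low power:

```python
import os
import sqlite3
import tempfile

# Hypothetical on-device database file.
db = os.path.join(tempfile.mkdtemp(), "sensor.db")

def handle_event(sql, params=()):
    """Open the embedded database for a single event, run one CRUD
    statement, and close again so the device can drop back to
    low-power mode between events."""
    conn = sqlite3.connect(db)
    try:
        cur = conn.execute(sql, params)
        conn.commit()
        return cur.fetchall()
    finally:
        conn.close()  # "back to low power"

# Each call models one device event waking the database.
handle_event("CREATE TABLE IF NOT EXISTS readings (ts INT, val REAL)")
handle_event("INSERT INTO readings VALUES (?, ?)", (1, 21.5))
rows = handle_event("SELECT * FROM readings")
print(rows)  # [(1, 21.5)]
```

Between calls, no connection or cache is held open, which is what keeps the at-rest footprint near zero.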
One final instance of serverless compute is the notion of a serverless database. In actuality, serverless databases have been around for a while in the form of DBaaS (Database as a Service). However, the pricing model has not reflected this, and the technology has not always aligned with what we have discussed regarding serverless compute: databases have been designed to be always on, and as such need to live inside always-on infrastructure. Going back to the database as a microservice, though, you have essentially decoupled the state of the database application from the storage of the database itself. Once these two pieces are separated and the database logic runs on a microservice architecture, it is relatively simple to containerize that logic and run it serverless. That handles scale on the application side; then, with predetermined sharding and replication mechanisms, the storage piece can horizontally scale itself.
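The predetermined sharding mentioned above can be sketched with a simple hash-based router: the stateless database logic hashes each key to pick one of N storage shards, so compute and storage scale independently. The shards here are plain dicts standing in for separate storage nodes; everything named below is hypothetical.

```python
import hashlib

class ShardRouter:
    """Stateless routing layer: hashes each key to one of N storage
    shards, so the compute tier can be containerized and scaled
    separately from the storage tier."""

    def __init__(self, num_shards=4):
        # In a real system each shard would be a separate storage node.
        self.shards = [dict() for _ in range(num_shards)]

    def _shard_for(self, key):
        # A stable hash guarantees the same key always maps to the
        # same shard, no matter which compute instance handles it.
        digest = hashlib.md5(key.encode()).hexdigest()
        return self.shards[int(digest, 16) % len(self.shards)]

    def put(self, key, value):
        self._shard_for(key)[key] = value

    def get(self, key):
        return self._shard_for(key).get(key)

router = ShardRouter()
router.put("user:42", {"name": "Ada"})
print(router.get("user:42"))  # {'name': 'Ada'}
```

Because the router holds no state of its own, any number of copies can run behind a load balancer (or as serverless functions) against the same shards, which is the decoupling the paragraph describes.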