For the most part, when I have provided customers with API endpoint solutions, I’ve hosted them on a central Linux instance running on Google Cloud Compute Engine. I’ll continue to use that setup for my own deployments, but now that Google supports Python in Cloud Functions, I’m deploying customer code to this serverless environment for a few reasons:
- Survivability. I want anything I provide to a customer to survive me. A customer should be able to cut ties at any time without feeling trapped because something is embedded in my own infrastructure.
- Scalability. These cloud functions automatically scale up as needed.
- Cost. Because the customer pays only for actual processing time, rather than paying for a service to sit idle, these functions are a great fit for my customers, who typically are not pushing huge volumes of anything.
- Reliability. With Cloud Functions, there is nothing to manage but the code. No dealing with Linux, hardware, operating systems, etc. Just deploy the code and watch it work.
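An HTTP-triggered function of the kind described above is just a Python function that accepts a Flask request object. The sketch below is illustrative, not one of my actual customer deployments; the function name and JSON payload are assumptions.

```python
import json


def handle_request(request):
    """Hypothetical HTTP-triggered Cloud Function entry point.

    Args:
        request: a flask.Request object supplied by the Cloud Functions
            runtime (any object with an `args` mapping works for testing).
    Returns:
        A (body, status, headers) tuple that Flask turns into a response.
    """
    # Read an optional query parameter, e.g. ?name=Alice
    name = request.args.get("name", "world")
    body = json.dumps({"message": "Hello, {}!".format(name)})
    return (body, 200, {"Content-Type": "application/json"})
```

Deploying it is then a matter of pointing `gcloud functions deploy` at the file containing this function and naming the entry point.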
These functions are not triggered only by HTTP requests; they can also run as event-driven background functions in response to events such as Cloud Storage changes. For example, today I deployed a solution that runs every time a new file is created in a GCS bucket.
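A background function for that kind of GCS trigger takes an event payload and a context object instead of an HTTP request. This is a minimal sketch, not the solution I deployed; the function name and the print-only body are assumptions, though `bucket` and `name` are standard fields in the Cloud Storage event payload.

```python
def on_file_created(event, context):
    """Hypothetical background function for a GCS object-finalize event.

    Args:
        event (dict): the Cloud Storage event payload; includes keys
            like "bucket" and "name" for the new object.
        context: event metadata (event ID, type, timestamp) supplied
            by the Cloud Functions runtime.
    Returns:
        The object name, purely so the behavior is easy to test.
    """
    path = "gs://{}/{}".format(event["bucket"], event["name"])
    print("Processing new object {}".format(path))
    return event["name"]
```

The runtime invokes this automatically on each new object, so there is no polling loop to write or host.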
I’ve previously experimented with Lambda on AWS, but I just do not know my way around that ecosystem, so I was very happy to see that Google has now added Python 3.7 to the list of runtimes for its functions. I now have a repeatable process to deploy customer solutions quickly and transfer ownership directly to them once testing is complete.