Monday, 4th of June 2018
Monolithic Serverless Hybrid
Serverless technologies can be plugged into a traditional legacy monolithic platform with little effort, and produce powerful results. Recently, Franklin reviewed a legacy SaaS platform, migrated some of the trouble areas to the Google Cloud Platform, and used its serverless features to roll out a solution that brought the client immediate relief and a significant cost reduction in forecasting models.
The client has several thousand customers whose IoT devices in the field send data back home, resulting in around a million transactions a month.
The legacy application is a fairly simple LEMP [Linux, Nginx, MySQL & PHP] setup. It's made up of a production server which handles customer traffic. Customers can log in to the platform, run reports and interact with their IoT devices in the field.
The IoT devices send data back home to the ingestion server, which receives two or three requests per second on average during business hours. The ingestion server pushes some data straight into the database, while other data is transformed and pushed to third-party cloud services for further ingestion.
Given the volume of traffic the single server had to ingest, NGINX struggled to handle it all. The ingestion platform isn't highly spec'd, and NGINX quickly ran out of threads, so the IoT devices either received a timeout while waiting for a free thread, or saw the request fail due to a lack of server resources. When an IoT device receives a bad response it queues the data and attempts a resend every fifteen minutes until delivery succeeds.
The application wasn't originally designed to run on multiple servers, so cloning servers and splitting traffic wasn't an ideal solution. More resources could have been thrown at the ingestion platform, but the peaks aren't predictable and the overall traffic is constantly increasing, so it would have been a constant battle of raising specs as traffic grew. Instead, we found a more scalable solution.
The serverless solution was built in Node.js and utilised the Serverless Framework. Pulling apart the above solution, let's focus on the Production Environment group.
The IoT devices now push data straight to an HTTPS Cloud Function, which either pushes the data straight to the database or publishes it to a Pub/Sub topic, depending on the contents. Over 90% of the data is pushed straight to the database, freeing up most of the legacy ingestion resources. The 10% of data that's published to the Pub/Sub topic is forwarded to the legacy ingestion server, transformed by the legacy business logic, and then pushed to a third-party cloud provider or stored in the database.
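The routing step can be sketched as below. This is a minimal illustration only: the payload shape, the `vendorFormat` routing rule, and the injected `db`/`pubsub` clients are all assumptions standing in for the real database and Pub/Sub client libraries, which the article doesn't detail.

```javascript
// Hypothetical rule: packets carrying a `vendorFormat` field (~10% of
// traffic) need the legacy business logic before reaching third parties.
function needsLegacyTransform(payload) {
  return Boolean(payload.vendorFormat);
}

// The HTTPS Cloud Function body: route each packet to the database or
// to the Pub/Sub topic feeding the legacy ingestion server.
async function ingest(payload, { db, pubsub }) {
  if (needsLegacyTransform(payload)) {
    // Publish to the topic consumed by the legacy server.
    await pubsub.publish('legacy-ingest', payload);
    return 'queued';
  }
  // The other ~90% goes straight into the database.
  await db.insert('readings', payload);
  return 'stored';
}
```

Injecting the clients keeps the routing decision a pure function, so it can be exercised without touching GCP.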
As the NGINX ingestion server can still occasionally come under high load, the Pub/Sub Cloud Function has a simple queuing solution. If a push to the legacy ingestion server fails, the function stores the data packet in a Firestore database. A Cloud Scheduler task is set up to trigger every 5 minutes, grab all messages more than 5 minutes old, and push them back to the Pub/Sub topic to repeat the process.
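A sketch of that retry loop is below. Again this is illustrative only: the `forwardToLegacy` call, the Firestore-like `store` interface, and the topic name are assumptions, with the real Firestore and Pub/Sub clients injected in their place.

```javascript
// Pub/Sub-triggered function: try the legacy server; on failure, park
// the packet in Firestore for the scheduled retry to pick up.
async function forwardWithFallback(message, { forwardToLegacy, store, now = Date.now }) {
  try {
    await forwardToLegacy(message);
  } catch (err) {
    await store.save({ message, failedAt: now() });
  }
}

// Cloud Scheduler fires this every 5 minutes; anything more than
// 5 minutes old is pushed back onto the Pub/Sub topic to repeat the cycle.
async function retryStale({ store, pubsub, now = Date.now }) {
  const cutoff = now() - 5 * 60 * 1000;
  const stale = await store.olderThan(cutoff);
  for (const entry of stale) {
    await pubsub.publish('legacy-ingest', entry.message);
    await store.delete(entry.id);
  }
}
```

Because the retry interval lives in one scheduled function rather than on each device, adjusting it is a single code change.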
By utilising this serverless hybrid solution the IoT systems see a much lower average response time for their outbound requests, so no local queues build up. Any data that requires queuing now rests in GCP, where it's easier to review and manage, and the retry intervals can be adjusted in one place instead of managing each IoT queue independently.
Cost forecasting has shown a large reduction. Without the serverless ingestion environment, the legacy system would have needed code changes so it could be duplicated and placed behind a load balancer, more than doubling the ongoing cost of the infrastructure. Instead, the GCP costs come to only a few dollars each month.
Follow Franklin to read more about the auto-builds, deployments and metrics gained through the migration to the Serverless Hybrid Model.