How to Make a Serverless Real-Time Chat App
A lot has changed since I started building apps. Fifteen years ago we practically had to build servers ourselves; then came cloud infrastructure services, which gave us pre-configured, connected servers to start from. That changed the whole custom software development process.
However, it still required us to write a lot of code (and rightly so).
What are you going to learn in this blog?
- Why I Chose Serverless for a Real-Time Chat App
- What are Serverless Architectures?
- Baseline Chat App Architecture
- Best Practices for Improving the Chat App
Then, around 2016, something called serverless started gaining traction.
Serverless is a cloud model in which we no longer have to write much infrastructure code, define communication protocols, or manage server configurations. We will take a short tour of serverless a few sections later, but for now I want to emphasize why it is so important.
So, I was working with a Fortune 500 client that wanted to build a chat solution and make it production-ready within 25 days.
Writing a chat service and integrating other services with it is just one part; we could do that within 15 days. What was actually difficult was ensuring that we met the service-level agreements and had everything set up to serve 100 business executives collaborating across the globe, with good performance.
Now, if you are in my shoes and want to build a chat app without waiting 3-4 months, this blog post is for you.
Why I Chose Serverless for a Real-Time Chat App
So far we have talked about the ease and speed of deployment with serverless, which is far faster than a traditional server environment.
When I was working with that Fortune 500 client, cost overruns and time were the vital, deciding factors in building the app. A time overrun can double or triple the cost of a project.
Why isn't this a concern with serverless?
Serverless gives you the liberty to work on managed infrastructure, which drastically reduces labour cost and frees you to deploy your key people to other tasks.
There are no servers to manage any more, so you can focus on building new features for your app.
Another concern for startups building an MVP is scaling. Scalability is a race against time.
GoChat, a chat app for Pokémon Go fans, is a classic example of scalability failure. It was built as an MVP without scalability in mind.
It grew to a million users in five days and went down the very next day.
Building a scalable app requires experience as well as resources.
With serverless you get a pay-as-you-go model that can scale past a million users in under 24 hours, cost-efficiently.
What Are Serverless Architectures?
When building and running an application, there is a lot of undifferentiated heavy lifting: installing software, managing servers, coordinating patch schedules, and scaling to meet demand.
Serverless architectures allow you to build and run applications and services without having to manage infrastructure.
Your application still runs on servers, but all the server management is done for you by AWS.
Over time, many triggers have been added to AWS Lambda. Using tools such as Amazon CloudWatch Events, you can respond to almost any AWS API call by invoking a Lambda function. Leveraging this, we can easily enrich our application with other ready-to-use functionality.
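At its core, a trigger-driven Lambda is just a function that receives an event object. Here is a minimal sketch in Python; the event fields shown are illustrative, since real payloads vary by trigger type:

```python
import json

def lambda_handler(event, context):
    """Minimal Lambda handler sketch: inspect whichever event invoked us."""
    # CloudWatch Events deliveries carry "source" and "detail" fields;
    # other triggers (API Gateway, S3, ...) use their own event shapes.
    source = event.get("source", "unknown")
    detail = event.get("detail", {})
    print(f"Invoked by {source}: {json.dumps(detail)}")
    return {"statusCode": 200, "body": json.dumps({"handled": source})}
```

The same function shape serves every trigger type; only the contents of the event differ.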
Baseline Chat App Architecture
The chat application we will build is a complete serverless architecture that delivers a baseline chat app upon which additional functionality can be layered.
At a high level, the components launched automatically are an S3-hosted front end, API Gateway, Lambda functions, and DynamoDB.
These Lambda functions are designed to be stateless and use a persistence tier to read and write data. To keep the solution fully managed, we use DynamoDB for this.
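As an illustration, a hypothetical "post message" function might shape each chat message and write it to a DynamoDB table. The table name and key schema below are assumptions for the sketch, not the stack's actual definitions:

```python
import json
import time
import uuid

TABLE_NAME = "ChatMessages"  # hypothetical table name

def build_message_item(room, sender, text):
    """Shape a chat message for a table keyed on (room, sent_at)."""
    return {
        "room": room,                        # partition key
        "sent_at": int(time.time() * 1000),  # sort key: ms timestamp
        "message_id": str(uuid.uuid4()),
        "sender": sender,
        "text": text,
    }

def lambda_handler(event, context):
    """Stateless handler: all persistence lives in DynamoDB, not the function."""
    body = json.loads(event["body"])
    item = build_message_item(body["room"], body["sender"], body["text"])
    # boto3 imported here so the pure logic above stays testable offline
    import boto3
    boto3.resource("dynamodb").Table(TABLE_NAME).put_item(Item=item)
    return {"statusCode": 200, "body": json.dumps({"message_id": item["message_id"]})}
```

Because the function holds no state of its own, any number of concurrent copies can run against the same table.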
There are exceptions to this standard architecture. You can use Amazon API Gateway to proxy a native AWS API call, so that your REST API maps straight to a service operation, such as adding data to a Kinesis stream.
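For reference, here is a sketch of the parameters such a proxy integration would map onto the Kinesis PutRecord operation. The raw Kinesis REST API expects the record data base64-encoded; the stream name and payload fields below are hypothetical:

```python
import base64
import json

STREAM_NAME = "ChatEvents"  # hypothetical stream name

def to_put_record(body: dict) -> dict:
    """Parameters an API Gateway service proxy would map onto Kinesis PutRecord."""
    return {
        "StreamName": STREAM_NAME,
        "PartitionKey": body["room"],  # shard messages by chat room
        # the Kinesis REST API requires base64-encoded record data
        "Data": base64.b64encode(json.dumps(body).encode()).decode(),
    }
```

With this mapping in API Gateway itself, no Lambda function sits between the client and the stream at all.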
With the CloudFormation stack launched and the components built, the result is a fully functioning chat app hosted in S3, using API Gateway and AWS Lambda to process requests and DynamoDB as the persistence layer for our chat messages.
With this baseline chat application in place, we can add extra functionality, including:
- Integration of SMS/MMS via Twilio, to send messages into the chat from SMS.
- A "help me" panic button built with IoT.
- Integration with Slack, to bring in messages from another platform.
- A typing indicator, to see which users are typing.
With the add-ons completed, the chat app's architecture looks a bit more sophisticated.
Best Practices for Improving the Chat App
For the most part, the design patterns you see in a server-ful environment you will also find in a serverless one. That said, it never hurts to revisit best practices while learning new ones, so let's review some key patterns we incorporated into our serverless chat application.
Decoupling your app
In our chat application, Lambda functions serve our business logic. Since users interact with Lambda at the function level, it serves you well to split up logic per function, so you can scale each piece independently of the sources and destinations it serves.
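A sketch of what that split looks like: rather than one monolithic handler, each operation gets its own small function that can be deployed and scaled independently. The names and persistence calls here are placeholders:

```python
import json

# Hypothetical split: one small handler per operation, deployed as
# separate Lambda functions so each scales on its own traffic.

def post_message_handler(event, context):
    """Handles only writes; scales with message volume."""
    body = json.loads(event["body"])
    # ...persist `body` to the message store here...
    return {"statusCode": 201, "body": json.dumps({"accepted": True})}

def list_messages_handler(event, context):
    """Handles only reads; scales with reader volume."""
    room = event["queryStringParameters"]["room"]
    # ...fetch recent messages for `room` here...
    return {"statusCode": 200, "body": json.dumps({"room": room, "messages": []})}
```

If reads dwarf writes (as they usually do in chat), only the read function's concurrency grows.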
Separating your data stores
Treat each data store as an isolated component of the service it supports. One common pitfall when working with microservices is to forget about the data layer. By keeping a data store specific to each service, you can manage resources at the data layer per service.
Leverage Data Transformations up the Stack
When designing a web application you need to care about data transformation and compatibility. How will you handle data from different clients, systems, and users?
Are you going to run a different flavor of your environment for each kind of incoming request?
Absolutely not!
With API Gateway, transformations become a lot easier thanks to its built-in mapping templates. With these you can build data-transformation and mapping logic into the API layer for requests and responses. This results in less work, as API Gateway is a managed service.
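Mapping templates are written in VTL, but the transformation they perform is easy to picture. Here is the same idea sketched in Python, reshaping a hypothetical client payload into our internal message format:

```python
def map_request(client_payload: dict) -> dict:
    """Reshape a raw client request into the internal message format --
    the kind of transformation an API Gateway mapping template performs
    at the API layer (normally written in VTL, sketched here in Python)."""
    return {
        "room": client_payload.get("channel", "general"),  # default room
        "sender": client_payload["user"],
        "text": client_payload["msg"].strip(),
    }
```

Pushing this reshaping into the API layer means every backend function sees one canonical format, regardless of client.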
Security Through Service Isolation and Least Privilege
It should be general practice to apply the principle of least privilege and to isolate components of your application to prevent over-broad access. In this app I used a permission-based model via AWS Identity and Access Management (IAM). IAM is integrated into every service on the platform and lets each service be granted only the least-privilege access it needs.
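As an illustration, a least-privilege policy for the message-posting function might allow only the DynamoDB actions it actually uses, scoped to a single table. The ARN and action list below are assumptions for the sketch:

```python
import json

# Hypothetical table ARN for the chat app's message store
TABLE_ARN = "arn:aws:dynamodb:us-east-1:123456789012:table/ChatMessages"

# Policy granting the chat function only the two DynamoDB actions it
# uses, scoped to its single table -- nothing else on the account.
CHAT_FUNCTION_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:PutItem", "dynamodb:Query"],
            "Resource": TABLE_ARN,
        }
    ],
}

print(json.dumps(CHAT_FUNCTION_POLICY, indent=2))
```

Attached to the function's execution role, this means a compromised or buggy function cannot touch any other table or service.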
This is a change from how we built server-side applications before, and it comes with significant restructuring. Working serverless will also require us to restructure our monitoring, but we will get there in time.
Auto-scaling and auto-provisioning save labour cost, because there are many resource-management activities we no longer need to perform ourselves. This is particularly valuable when we want to ship a product with a shorter time-to-market.
Auto-scaling combined with usage-based pricing performs fairly well in terms of infrastructure cost for serverless.
Although a serverless architecture does not expose performance controls such as host size or count, it offers alternative ways to configure for performance requirements, and I expect such configuration options to expand in the future.