As companies embark on their digital transformation journeys, pressure on IT organizations has been mounting to unprecedented levels. Businesses have the following expectations, among others, of their IT organizations:
- Applications need to be developed, deployed and enhanced at a rapid pace
- Applications must always be available, resilient and performant
- Features are expected to match or exceed those of competitors
- Applications need to run on different form factors, including PCs and mobile devices
- All applications delivered must be secure and compliant
To meet those expectations, companies not only need the capabilities to build these kinds of solutions, but also have to build them faster than ever before. This is why many organizations are rethinking how they architect and build solutions so that they can better respond to the demands and expectations businesses place on them. IT organizations are also constantly on the lookout for ways to enhance agility, optimize resource consumption and minimize time to market. One way businesses are achieving those goals is by embracing the cloud. Trends show that organizations of all sizes have either moved toward scaling down their on-premises data centers in favor of the cloud or are contemplating cloud adoption.
A common concern most organizations have when contemplating cloud adoption is how to approach their legacy applications. When moving a legacy application to the cloud, a decision has to be made whether to move the application as is (a strategy called “lift-and-shift”) or to redesign and transform it to become cloud native. Furthermore, organizations that choose to transform their applications often take the opportunity to modernize the application architecture as well as embrace a DevOps mindset to enhance agility, reliability and productivity.
Recently I went through an effort to modernize a monolithic n-tier eCommerce application. At a high level, the application architecture is depicted below:
This effort included moving the application to the cloud as well as redesigning it to follow a microservices architecture. Part of this effort was also to modernize the application components and take advantage of Azure PaaS offerings wherever applicable. The final design we decided upon is depicted below.
In this blog I will go through each application layer, describe the aspects that need to be considered to implement it properly, and offer a solution that addresses them using Azure.
This component is the entry point to the system. Because it is external facing, the following aspects have to be considered when implementing this capability:
- Availability: users should be able to access the system at any time. The system should be resilient enough to handle both application and infrastructure failures, and there should be no single point of failure. If one component fails, the system should continue to respond to requests that target other components.
- Scalability: as the user base grows, the system should continue to honor its SLAs.
- Security: only authorized users should be allowed to log in and access the system, and data should only be accessible to users who have the proper permissions to view and manipulate it.
- Maintainability: because this component is the entry point to the system, it is paramount that mechanisms be put in place to enable easy deployments and near-zero-downtime updates.
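One generic pattern that supports the availability goal above is retrying transient failures with exponential backoff, so a brief hiccup in one component does not surface as an outage to the user. The sketch below is illustrative and not tied to any Azure SDK; `flaky_service` is a hypothetical dependency used only for the demonstration.

```python
import random
import time

def call_with_retries(operation, max_attempts=4, base_delay=0.1):
    """Invoke `operation`, retrying transient failures with exponential backoff.

    Each retry waits roughly twice as long as the previous one, plus a little
    jitter so many clients do not retry in lockstep.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, base_delay)
            time.sleep(delay)

# Hypothetical flaky dependency: fails twice, then succeeds.
calls = {"count": 0}

def flaky_service():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(call_with_retries(flaky_service))  # "ok" after two retries
```

In a production system you would typically bound the total retry budget and combine this with a circuit breaker, but the core idea is the same.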
The following Azure services can be used to address these aspects of the solution:
- Azure Web App
- Traffic Manager
- Azure Service Fabric
- Azure Container Service
- Azure Active Directory (for authentication and authorization)
- Content Delivery Network (CDN)
Azure Active Directory B2C is a cloud-based consumer identity and access management service that is highly available and scales to hundreds of millions of identities. It can be easily integrated across mobile and web platforms. Users can log on to all your applications through fully customizable experiences by using their existing social accounts or by creating new credentials.
Furthermore, Azure AD B2C addresses all the aspects listed above and more. The following are some of its capabilities:
- Multi-factor Authentication
- Self-service Password Management
- Role Based Access Control
- Application Usage Monitoring
- Rich auditing and Security Monitoring and Alerting
A Content Delivery Network (CDN) is a group of distributed servers used to improve website performance by serving website content (e.g. images, scripts, videos, etc.) from locations that are geographically closest to where requests are made. If users are expected to be geographically dispersed, a CDN should be used to enhance the responsiveness and usability of the site. Furthermore, using a CDN reduces traffic sent to the origin, since a subset of the requests will be handled by the CDN edge servers.
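The mechanism by which an edge server reduces origin traffic is essentially a time-to-live cache. The following is a minimal, deliberately simplified sketch of that idea (real CDNs also honor cache-control headers, vary keys, purging, and much more); `EdgeCache` and `fetch_from_origin` are illustrative names, not a real API.

```python
import time

class EdgeCache:
    """Tiny time-to-live cache sketch, mimicking how a CDN edge server
    answers repeat requests locally instead of forwarding them to the origin."""

    def __init__(self, ttl_seconds, fetch_from_origin):
        self.ttl = ttl_seconds
        self.fetch = fetch_from_origin   # called only on a cache miss
        self.store = {}                  # url -> (content, expiry timestamp)
        self.origin_hits = 0

    def get(self, url):
        content, expires = self.store.get(url, (None, 0))
        if time.time() < expires:
            return content               # served from the edge; origin untouched
        self.origin_hits += 1
        content = self.fetch(url)
        self.store[url] = (content, time.time() + self.ttl)
        return content

cache = EdgeCache(ttl_seconds=60, fetch_from_origin=lambda url: f"<body of {url}>")
cache.get("/img/logo.png")
cache.get("/img/logo.png")   # second request is a cache hit
print(cache.origin_hits)     # 1 -- only the first request reached the origin
```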
Azure CDN can be used for this component. It offers a global solution for delivering high-bandwidth content hosted in Azure or any other location. Azure CDN caches content at strategically placed locations to provide maximum bandwidth for delivering content to users.
Azure API Management is a turnkey solution to publish APIs to external, partner and internal developers to enhance agility, efficiency and usability. It accomplishes that by offering the following capabilities:
- Expose all APIs behind a single static IP and domain
- Get near real-time usage, performance and health analytics
- Automate management and integrate using REST API, PowerShell, and Git
- Provision API Management and scale it on demand in one or more geographical regions
- Self-service API key management
- Auto-generated API catalog, documentation, and code samples
- OAuth-enabled API console for exploring APIs without writing code
- Sign in using popular Internet identity providers and Azure Active Directory
- Client certificate authentication
- Simplify and optimize requests and responses with transformation policies
- Secure APIs with key, JSON Web Token (JWT) validation, and IP filtering
- Protect your APIs from overload and overuse with quotas and rate limits
- Use response caching for improved latency and scale
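To make the quota and rate-limit capability above concrete, here is a sketch of the classic token-bucket algorithm that gateways commonly use for throttling. Note that this is purely illustrative: in Azure API Management, rate limits are configured declaratively through policies, not coded like this.

```python
class TokenBucket:
    """Token-bucket rate limiter: a client may burst up to `capacity`
    requests, after which requests are admitted at `refill_per_second`."""

    def __init__(self, capacity, refill_per_second):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill = refill_per_second
        self.last = 0.0

    def allow(self, now):
        # Top the bucket up in proportion to elapsed time, then spend one token.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.refill)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=3, refill_per_second=1)
results = [bucket.allow(now=0.0) for _ in range(5)]
print(results)  # [True, True, True, False, False] -- burst of 3, then throttled
```

A rejected request would typically be answered with HTTP 429 and a Retry-After header.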
The recommended implementation for this layer is REST-based APIs that expose services, each performing a well-defined function (a bounded context) in the system. Each service needs its own data layer, which should not be shared with other services except through well-designed API calls, and services need to be independent of one another.
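The data-ownership rule above can be sketched in a few lines. In this hypothetical example (the service and method names are invented for illustration), the order service never reads the catalog's storage directly; it only goes through the catalog's API:

```python
class CatalogService:
    """Owns product data behind its own API."""
    def __init__(self):
        self._products = {"sku-1": 9.99}       # private data layer

    def get_price(self, sku):
        return self._products[sku]

class OrderService:
    """Owns order data; nothing else touches its storage directly."""
    def __init__(self, catalog):
        self._orders = {}          # private data layer for this bounded context
        self._catalog = catalog    # collaboration happens through the API only

    def place_order(self, order_id, sku):
        price = self._catalog.get_price(sku)   # well-defined API call, not a shared table
        self._orders[order_id] = {"sku": sku, "price": price}
        return self._orders[order_id]

catalog = CatalogService()
orders = OrderService(catalog)
print(orders.place_order("o-1", "sku-1"))  # {'sku': 'sku-1', 'price': 9.99}
```

In a real deployment each service would run in its own process and the call would be an HTTP/REST request, but the ownership boundary is the same.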
When dealing with a system that could encompass a large number of components running on a cluster of machines, management of such infrastructure can be a daunting task. There are a number of orchestrators that try to make this task more manageable. Both Service Fabric and Azure Container Service provide container orchestration capabilities. However, there are key differences between these two services:
The following diagram shows Microservices using Service Fabric as a deployment target:
Azure Container Service (ACS) supports three of the most popular orchestrators, namely Kubernetes, DC/OS and Docker Swarm. The following is a depiction of ACS as a deployment target for the system:
A number of Azure Services can be leveraged to handle any kind of background processing. The following is a list of these services with a brief description:
- Azure Batch: a platform service for running large-scale parallel and high-performance computing (HPC) applications efficiently in the cloud. Azure Batch schedules compute-intensive work to run on a managed collection of virtual machines, and can automatically scale compute resources to meet the needs of your jobs.
- Azure Functions: a serverless technology that lets you run small pieces of code in the cloud without worrying about a whole application or the infrastructure to run it.
- Logic Apps: provide a way to simplify and implement scalable integrations and workflows in the cloud, with a visual designer to model and automate your process as a series of steps known as a workflow.
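The kind of background work these services host is typically a small, stateless, message-shaped unit. The sketch below shows that shape; the message format and function name are invented for illustration, and in Azure Functions the platform, not your code, would own triggering, scaling and retries.

```python
import json

def process_order_message(message_body):
    """A function-shaped unit of background work: receive one queue message,
    do a small piece of processing, return the result."""
    order = json.loads(message_body)
    total = sum(item["qty"] * item["unit_price"] for item in order["items"])
    return {"order_id": order["order_id"], "total": round(total, 2)}

msg = json.dumps({
    "order_id": "o-42",
    "items": [{"qty": 2, "unit_price": 9.99}, {"qty": 1, "unit_price": 5.00}],
})
print(process_order_message(msg))  # {'order_id': 'o-42', 'total': 24.98}
```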
For data persistence, two technologies dominate the database landscape: NoSQL and relational databases. NoSQL databases have gained popularity because they are easier to scale, allow for faster development, and can store unstructured data.
Azure Cosmos DB is a globally distributed database service designed to enable you to elastically and independently scale throughput and storage across any number of geographical regions with a comprehensive SLA. You can develop document, key/value, or graph databases with Cosmos DB using a series of popular APIs and programming models.
For scenarios where data consistency and ACID (Atomicity, Consistency, Isolation, Durability) compliance are important, Azure SQL Database is a high-performance, reliable, and secure relational database-as-a-service that can be leveraged without needing to manage infrastructure.
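The document-versus-relational trade-off can be seen by representing the same order both ways. This toy example (invented data, no database involved) shows a document read as one unit versus normalized rows joined at query time:

```python
# One order, two persistence shapes.

# Document shape (NoSQL): read and written as a single unit, schema-flexible.
order_doc = {
    "id": "o-7",
    "customer": {"name": "Ada"},
    "items": [{"sku": "sku-1", "qty": 2}],
}

# Relational shape: normalized rows, joined at query time,
# with ACID guarantees across the tables.
orders_table = [("o-7", "c-1")]
customers_table = [("c-1", "Ada")]
order_items_table = [("o-7", "sku-1", 2)]

def customer_name_relational(order_id):
    # A hand-rolled "join": order row -> customer row.
    customer_id = next(c for (o, c) in orders_table if o == order_id)
    return next(name for (c, name) in customers_table if c == customer_id)

print(order_doc["customer"]["name"], customer_name_relational("o-7"))
```

The document shape favors whole-entity reads and flexible schemas; the relational shape favors cross-entity consistency and ad hoc queries.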
Azure offers many services that can be leveraged for reporting. Azure SQL Data Warehouse can be used to load and aggregate data from various data sources to perform analysis and reporting. It is a massively parallel processing (MPP) cloud-based, scale-out, relational database capable of processing massive volumes of data.
To help visualize BI reports, Power BI Embedded can be leveraged to integrate Power BI reports into your web or mobile applications. Power BI Embedded is an Azure service that enables app developers to surface Power BI data experiences within their applications.
Keep in mind that the approach described above is not the only one that fits the scenario at hand; there are other valid ways to achieve the same goal.
I hope this blog was helpful to those who are learning about Azure and to those of you who are considering moving some of your workloads to the cloud. Note that you can get started exploring Azure for free. Let me know your thoughts about what has been discussed in this blog, and please help me improve by leaving your feedback.