Hi everyone, welcome to the 3rd chapter in our Tencent Cloud Solutions Architect Professional Course, Cloud Native Application Design. At the end of this chapter, you'll be able to understand the key technologies of cloud native applications and design serverless architecture. In this chapter, we'll cover two sections: Overview of Cloud Native Applications, and Implementation of Serverless Architecture in Tencent Cloud. This video will cover the first section, Overview of Cloud Native Applications; the next video will cover the second section.

Let's get started with Section 1, Overview of Cloud Native Applications. In this video, we'll cover Application Architecture Development, and Cloud Native Architecture and its Key Technologies.

The three dimensions of application architecture development are system resources, such as servers and databases; application architecture, such as monolithic architecture; and the development model. Challenges across these dimensions include low code reusability and an inability to respond to rapidly changing market requirements. To overcome these challenges, we must learn how to best adapt to the cloud and take full advantage of it.

In application architecture evolution, system resources, application architecture, and lifecycle management have all transformed over time. System resources have evolved from physical servers to virtualized instances, cloud instances, containers, and serverless functions, each one abstraction layer above the previous. Application architecture has evolved from monolithic architecture to service-oriented architecture, and then to microservices architecture. Lifecycle management has gone from waterfall development to continuous integration and continuous deployment, and more recently to continuous Ops.

Okay, now let's look at Cloud Native Architecture and its Key Technologies. Cloud native is a collection of cloud computing technology systems and enterprise management tools. It includes not only the methodology of cloud native applications, but also the key technologies for their implementation. The concept of cloud native can be summarized as "applications suitable for the cloud" and "an architecture that makes the cloud easy to use". For infrastructure as a service, agile infrastructure is supported. For platform as a service and software as a service, microservice technology is offered. Cloud native also provides support for continuous delivery and DevOps for management.

The twelve factors of a cloud native application that should be considered are codebase, dependencies, configuration, backing services, build/release/run, processes, port binding, concurrency, disposability, dev/prod parity, logs, and admin processes.
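To make a couple of these factors concrete, here is a minimal sketch in Python, assuming nothing beyond a standard Python 3 runtime. It shows configuration read from the environment and port binding; the names APP_GREETING and PORT are illustrative placeholders, not part of any Tencent Cloud product.

    # Factor III (config): settings come from environment variables, not from
    # files baked into the build. Factor VII (port binding): the app exports
    # its own HTTP endpoint instead of depending on an external web server.
    import os
    from http.server import BaseHTTPRequestHandler, HTTPServer

    GREETING = os.environ.get("APP_GREETING", "hello")   # illustrative config value
    PORT = int(os.environ.get("PORT", "8080"))           # port injected by the platform

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(GREETING.encode())

    if __name__ == "__main__":
        # The process is stateless and disposable, so the platform can start,
        # stop, and scale out many identical copies (processes, concurrency,
        # disposability).
        HTTPServer(("0.0.0.0", PORT), Handler).serve_forever()

Because every environment-specific value arrives through the environment, the same build artifact can run unchanged in development, test, and production, which is what dev/prod parity asks for.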
Now let's go over some strengths of cloud native applications compared to traditional applications. For deployment predictability, cloud native applications are predictable, while traditional applications are unpredictable. For abstraction, cloud native applications have OS abstraction, while traditional applications have OS dependency. For elasticity, cloud native applications offer elastic scheduling, whereas traditional applications have resource redundancy and a lack of resource scalability. For the development model, cloud native applications use DevOps, while traditional applications use waterfall development with departmental isolation. For service architecture, cloud native applications use a decoupled microservices architecture, while traditional applications use a coupled monolithic architecture. Finally, for resilience, cloud native applications have automated Ops and fast recovery, while traditional applications have manual Ops and slow recovery.

Okay, next, we'll look at an overview of cloud native architecture technologies, including DevOps, continuous delivery, microservices, and containers.

Microservices architecture is a design that breaks down a large application into small, independent services and then connects them for higher cohesion and agility. An application is essentially divided into multiple services that are aggregated like a honeycomb. Container technologies make resource scheduling and microservices easier. A container is a lightweight virtualization technology that can provide multiple isolated runtime environments on a single server, isolating processes through namespaces. Each container has its own writable filesystem and resource quota.

DevOps is a collective term for a set of processes, methods, and systems used to promote communication, collaboration, and integration between the development (application or software engineering), technical operations, and quality assurance departments from start to finish. The development part consists of planning, coding, building, and testing, while the operations part consists of release, deployment, operations, and monitoring. The benefits of DevOps include improvements in deployment efficiency, lead time, deployment success rate, and productivity or profit targets. The deployment efficiency of code and changes can be over 30 times higher, while the lead time for code and changes can be over 200 times shorter. Additionally, with DevOps, the deployment success rate in production environments can be over 60 times higher, while productivity and profit targets can be more than doubled. Various tools used for each step of the DevOps process are shown in the following diagram for reference.

Continuous delivery is a process in which a minimum viable product becomes production software by gradually adding features. First, the requirements are evaluated, which influences the design, which then triggers the development cycle to work on specific features or requirements. Through testing, you may need to return to a previous step to modify and troubleshoot. Once the testing phase is complete, the agile loop is closed and you can move on to the continuous deployment process, which consists of automated Ops and collaboration. Continuous delivery is therefore a closed feedback loop that demonstrates the DevOps process.
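To connect the DevOps and continuous delivery ideas above, here is an illustrative sketch, not taken from the course material, of the pipeline stages expressed as a short Python script. The pytest and docker commands and the image name demo-service are placeholder assumptions; in a real project these stages would run inside a CI/CD service rather than a single local script.

    # A toy continuous delivery pipeline: test, build a container image, deploy.
    # Each stage must succeed before the next one runs; a failure feeds back
    # into the agile loop described above.
    import subprocess
    import sys

    def run(stage, cmd):
        print(f"[{stage}] {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            sys.exit(f"{stage} failed, returning to development")

    if __name__ == "__main__":
        run("test", ["pytest", "-q"])                                        # automated tests
        run("build", ["docker", "build", "-t", "demo-service:latest", "."])  # package as a container
        run("deploy", ["echo", "deploy demo-service:latest"])                # placeholder deploy step

In practice the deploy stage would push the image to a registry and roll it out to a container platform, and the whole script would be replaced by a pipeline definition in your CI/CD tool, but the ordering and the fail-fast feedback stay the same.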