A Beginner's Guide to AWS Services: ECS, Fargate, ECR, Elastic Beanstalk, and Batch

ECS (Elastic Container Service)

Amazon ECS is a service provided by AWS to run and manage Docker containers.

Example:

Imagine you have a website that becomes very popular. During peak times, you need more resources to handle all the visitors. With ECS, you can easily add more containers to manage the increased load. When the traffic decreases, you can reduce the number of containers to save resources and money.
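
To make this concrete, here is a minimal sketch using boto3 (the AWS SDK for Python) to change how many copies of a container an ECS service runs. The cluster and service names are made up for illustration:

    import boto3

    ecs = boto3.client("ecs", region_name="us-east-1")

    # Peak traffic: raise the number of running containers (tasks).
    ecs.update_service(
        cluster="website-cluster",   # hypothetical cluster name
        service="website-service",   # hypothetical service name
        desiredCount=10,
    )

    # Quiet hours: scale back down to save money.
    ecs.update_service(
        cluster="website-cluster",
        service="website-service",
        desiredCount=2,
    )

In practice you would usually let ECS Service Auto Scaling adjust the desired count for you based on metrics like CPU utilization, rather than calling this by hand.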

AWS Fargate

Amazon Fargate is a serverless compute engine for containers that works with ECS. It lets you run containers without having to manage the underlying servers. This means you do not have to worry about provisioning, configuring, or scaling virtual machines to run your containers.

Example:

Let's say you created an app where users can share pictures of their pets. Here’s how AWS Fargate would help:

  1. Build the App: You create the app and package it in a container.

  2. Task Definition: You define what resources (like CPU and memory) the container needs in the task definition (a code sketch of this step follows the list).

  3. Deploy with Fargate: You tell Fargate to run your app. Fargate finds the right amount of computing power and runs your app in the cloud.

  4. Scaling: If your app becomes popular and more users start sharing pictures, Fargate will automatically handle the increase in traffic by running more containers.
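
Here is a rough sketch of steps 2 and 3 with boto3. The names, image URI, role ARN, subnet, and security-group IDs are all hypothetical placeholders:

    import boto3

    ecs = boto3.client("ecs", region_name="us-east-1")

    # Step 2: a minimal Fargate task definition declaring the CPU and
    # memory the pet-picture container needs.
    ecs.register_task_definition(
        family="pet-pics",
        requiresCompatibilities=["FARGATE"],
        networkMode="awsvpc",  # required for Fargate tasks
        cpu="256",             # 0.25 vCPU
        memory="512",          # 512 MiB
        executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",  # placeholder
        containerDefinitions=[
            {
                "name": "pet-pics-app",
                "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/pet-pics:latest",
                "portMappings": [{"containerPort": 80}],
                "essential": True,
            }
        ],
    )

    # Step 3: ask Fargate to run the task; there are no EC2 instances
    # for you to provision or manage.
    ecs.run_task(
        cluster="pet-pics-cluster",
        launchType="FARGATE",
        taskDefinition="pet-pics",
        networkConfiguration={
            "awsvpcConfiguration": {
                "subnets": ["subnet-0123456789abcdef0"],
                "securityGroups": ["sg-0123456789abcdef0"],
                "assignPublicIp": "ENABLED",
            }
        },
    )

Step 4 (scaling) is typically handled by running these tasks as part of an ECS service with auto scaling enabled, rather than by calling run_task directly.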

AWS ECR (Elastic Container Registry)

Amazon ECR is a place to store your container images. Before you can run a container, you need a place to keep its image. ECR is like a library where you store all your container images (the packaged program).

Example:

Imagine you have built a small app that helps people learn math. You put everything the app needs into a container image. You then store this image in AWS ECR. Later, you or a friend can easily pull this image from ECR and run the app anywhere, ensuring it works exactly the same every time.
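
As a small sketch, creating the repository with boto3 might look like this (the repository name is hypothetical); the image itself is then tagged and pushed with the docker CLI:

    import boto3

    ecr = boto3.client("ecr", region_name="us-east-1")

    # Create a repository to hold the math-app image.
    response = ecr.create_repository(repositoryName="math-app")
    print(response["repository"]["repositoryUri"])

    # After logging in with `aws ecr get-login-password`, you would
    # `docker tag` your local image with this URI and `docker push` it.
    # Anyone with access can later `docker pull` the same image and get
    # an identical copy of the app.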

AWS Elastic Beanstalk

AWS Elastic Beanstalk is a service that allows you to quickly deploy and manage applications in the cloud without worrying about the infrastructure (the servers, network, etc.). It takes your code and automatically handles the deployment, from capacity provisioning, load balancing, and auto-scaling to application health monitoring.

Example:

Let's say you want to create a website where people can share and rate their favourite books. Here's how you would use AWS Elastic Beanstalk to make this happen:

  1. Choose Your Platform: You decide to use Python with the Django framework to build your website.

  2. Write Your Code: You write the code for your book-sharing website. You create pages for users to sign up, log in, add books, rate them, and see recommendations.

  3. Upload to Elastic Beanstalk: You package your code and upload it to Elastic Beanstalk using the AWS Management Console or a simple command-line tool (a code sketch of this step follows the list).

  4. Elastic Beanstalk Sets Up: Elastic Beanstalk sets up all the necessary servers, load balancing, and networking components required to run your website (and can provision a database if you ask for one).

  5. Monitor and Adjust: You can monitor the performance of your website through the AWS Management Console. If your site becomes popular, Elastic Beanstalk will automatically scale the resources to ensure it runs smoothly.

  6. Enjoy Your Site: Now, people can visit your website, share their favourite books, and rate them without you having to worry about managing the servers or dealing with traffic spikes.
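
For steps 3 and 4, a minimal boto3 sketch might look like the following. The application, bucket, and environment names are invented, and the solution stack string must match one of the platforms returned by list_available_solution_stacks():

    import boto3

    eb = boto3.client("elasticbeanstalk", region_name="us-east-1")

    # Create the application (hypothetical name).
    eb.create_application(ApplicationName="book-share")

    # Register a version from a zipped source bundle already uploaded to S3.
    eb.create_application_version(
        ApplicationName="book-share",
        VersionLabel="v1",
        SourceBundle={"S3Bucket": "my-deploy-bucket", "S3Key": "book-share-v1.zip"},
    )

    # Launch an environment on a managed Python platform; Elastic Beanstalk
    # provisions the servers, load balancer, and auto scaling for you.
    eb.create_environment(
        ApplicationName="book-share",
        EnvironmentName="book-share-prod",
        VersionLabel="v1",
        SolutionStackName="64bit Amazon Linux 2023 v4.1.0 running Python 3.11",  # example only
    )

In practice most people use the eb command-line tool (eb init, eb create, eb deploy), which wraps these same API calls.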

AWS Batch

AWS Batch is a service that helps you run a lot of jobs (tasks) in the cloud. It takes care of distributing and running those jobs, so you do not have to worry about managing servers.

Example:

Let's say you have a science project where you need to analyze data from 1,000 experiments. Each experiment's data needs to be processed and analyzed, but you don't have time to do it all manually. Here's how AWS Batch could help:

  1. Prepare Your Data: You upload all your experiment data to AWS.

  2. Define Your Tasks: You tell AWS Batch what needs to be done with each piece of data. For example, you might need to run a certain analysis on each experiment's data.

  3. Submit Your Jobs: You submit a job (a set of tasks) to AWS Batch, telling it to process all the data you uploaded (see the sketch after this list).

  4. AWS Batch Takes Over: AWS Batch finds the best computers to run your analysis tasks. It might use many different computers to get the job done faster. It takes care of all the technical details, like starting the tasks and making sure they finish.

  5. Get Results: Once everything is done, you can download your results from AWS and check your project's outcomes.
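
A minimal sketch of step 3 with boto3, using an array job so that one submission fans out into 1,000 child jobs, one per experiment. The queue and job definition names are hypothetical and must already exist:

    import boto3

    batch = boto3.client("batch", region_name="us-east-1")

    # Submit one array job with 1,000 children. Each child receives its
    # index in the AWS_BATCH_JOB_ARRAY_INDEX environment variable, which
    # the analysis container can use to pick which experiment's data to
    # process.
    response = batch.submit_job(
        jobName="analyze-experiments",
        jobQueue="science-queue",               # hypothetical queue
        jobDefinition="experiment-analysis:1",  # hypothetical definition
        arrayProperties={"size": 1000},
    )
    print("Submitted job:", response["jobId"])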
