
Amazon ECS supports a native integration with Amazon EBS volumes for data-intensive workloads


Today we are announcing that Amazon Elastic Container Service (Amazon ECS) supports an integration with Amazon Elastic Block Store (Amazon EBS), making it easier to run a wider range of data processing workloads. You can provision Amazon EBS storage for your ECS tasks running on AWS Fargate and Amazon Elastic Compute Cloud (Amazon EC2) without needing to manage storage or compute.

Many organizations choose to deploy their applications as containerized packages, and with the introduction of Amazon ECS integration with Amazon EBS, organizations can now run more types of workloads than before.

You can run data workloads requiring storage that supports high transaction volumes and throughput, such as extract, transform, and load (ETL) jobs for big data, which need to fetch existing data, perform processing, and store this processed data for downstream use. Because the storage lifecycle is fully managed by Amazon ECS, you don't need to build any additional scaffolding to manage infrastructure updates, and as a result, your data processing workloads are now more resilient while simultaneously requiring less effort to manage.

Now you can choose from a variety of storage options for your containerized applications running on Amazon ECS:

  • Your Fargate tasks get 20 GiB of ephemeral storage by default. For applications that need additional storage space to download large container images or for scratch work, you can configure up to 200 GiB of ephemeral storage for your Fargate tasks.
  • For applications that span many tasks that need concurrent access to a shared dataset, you can configure Amazon ECS to mount the Amazon Elastic File System (Amazon EFS) file system to your ECS tasks running on both EC2 and Fargate. Common examples of such workloads include web applications such as content management systems, internal DevOps tools, and machine learning (ML) frameworks. Amazon EFS is designed to be available across a Region and can be concurrently attached to many tasks.
  • For applications that need high-performance, low-cost storage that does not need to be shared across tasks, you can configure Amazon ECS to provision and attach Amazon EBS storage to your tasks running on both Amazon EC2 and Fargate. Amazon EBS is designed to provide block storage with low latency and high performance within an Availability Zone.

To learn more, see Using data volumes in Amazon ECS tasks and persistent storage best practices in the AWS documentation.

Getting started with EBS volume integration for your ECS tasks
You can configure the volume mount point for your container in the task definition and pass Amazon EBS storage requirements for your Amazon ECS task at runtime. For most use cases, you can get started by simply providing the size of the volume needed for the task. Optionally, you can configure all EBS volume attributes and the file system you want the volume formatted with.
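As a minimal sketch, a runtime volume configuration that provides only a size could look like the following. The field names follow the volumeConfigurations shape of the ECS RunTask API; the role ARN and volume size are hypothetical placeholders, and the volume name must match a volume in the task definition that has configuredAtRuntime set to true.

```json
[
    {
        "name": "new-ebs-volume",
        "managedEBSVolume": {
            "sizeInGiB": 50,
            "roleArn": "arn:aws:iam::111122223333:role/ecsInfrastructureRole"
        }
    }
]
```

Attributes you omit, such as volume type, IOPS, throughput, and file system, fall back to their defaults.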

1. Create a task definition
Go to the Amazon ECS console, navigate to Task definitions, and choose Create new task definition.

In the Storage section, choose Configure at deployment to set EBS volume as a new configuration type. You can provision and attach one volume per task for Linux file systems.

When you choose Configure at task definition creation, you can configure existing storage options such as bind mounts, Docker volumes, EFS volumes, Amazon FSx for Windows File Server volumes, or Fargate ephemeral storage.

Now you can select a container in the task definition, the source EBS volume, and provide a mount path where the volume will be mounted in the task.

You can also use the aws ecs register-task-definition --cli-input-json file://example.json command line to register a task definition to add an EBS volume. The following snippet is a sample, and task definitions are saved in JSON format.

{
    "family": "nginx",
    ...
    "containerDefinitions": [
        {
            ...
            "mountPoints": [
                {
                    "containerPath": "/foo",
                    "sourceVolume": "new-ebs-volume"
                }
            ],
            "name": "nginx",
            "image": "nginx"
        }
    ],
    "volumes": [
       {
           "name": "new-ebs-volume",
           "configuredAtRuntime": true
       }
    ]
}

2. Deploy and run your task with EBS volume
Now you can run a task by selecting your task in your ECS cluster. Go to your ECS cluster and choose Run new task. Note that you can select the compute options, the launch type, and your task definition.

Note: While this example goes through deploying a standalone task with an attached EBS volume, you can also configure a new or existing ECS service to use EBS volumes with the desired configuration.
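For the service case, a hedged sketch of a CreateService request body follows, assuming the service-level API accepts the same volumeConfigurations shape as RunTask; the cluster, service name, and role ARN are hypothetical placeholders.

```json
{
    "cluster": "my-cluster",
    "serviceName": "nginx-ebs",
    "taskDefinition": "nginx",
    "desiredCount": 1,
    "volumeConfigurations": [
        {
            "name": "new-ebs-volume",
            "managedEBSVolume": {
                "sizeInGiB": 50,
                "volumeType": "gp3",
                "roleArn": "arn:aws:iam::111122223333:role/ecsInfrastructureRole"
            }
        }
    ]
}
```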

You have a new Volume section where you can configure the additional storage. The volume name, type, and mount points are those that you defined in your task definition. Choose your EBS volume types, sizes (GiB), IOPS, and the desired throughput.

You cannot attach an existing EBS volume to an ECS task. But if you want to create a volume from an existing snapshot, you have the option to choose your snapshot ID. If you want to create a new volume, then you can leave this field empty. You can choose the file system type, either ext3 or ext4 file systems on Linux.
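A sketch of the snapshot path: instead of a size, the volume configuration can reference a snapshot to restore from and name the file system to format the volume with. The snapshot ID and role ARN below are made-up placeholders.

```json
{
    "name": "new-ebs-volume",
    "managedEBSVolume": {
        "snapshotId": "snap-0123456789abcdef0",
        "filesystemType": "ext4",
        "roleArn": "arn:aws:iam::111122223333:role/ecsInfrastructureRole"
    }
}
```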

By default, when a task is terminated, Amazon ECS deletes the attached volume. If you need the data in the EBS volume to be retained after the task exits, uncheck Delete on termination. Also, you need to create an AWS Identity and Access Management (IAM) role for volume management that contains the relevant permissions to allow Amazon ECS to make API calls on your behalf. For more information on this policy, see infrastructure role in the AWS documentation.
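In API terms, the retention behavior above maps to a termination policy on the managed volume. A sketch, assuming the deleteOnTermination flag from the ECS volumeConfigurations API, with a placeholder role ARN:

```json
{
    "name": "new-ebs-volume",
    "managedEBSVolume": {
        "sizeInGiB": 50,
        "terminationPolicy": {
            "deleteOnTermination": false
        },
        "roleArn": "arn:aws:iam::111122223333:role/ecsInfrastructureRole"
    }
}
```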

You can also configure encryption on your EBS volumes using either Amazon managed keys or customer managed keys. To learn more about the options, see Amazon EBS encryption in the AWS documentation.
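A hedged sketch of the encryption options: setting encrypted to true without naming a key uses the AWS managed key for EBS, while a customer managed key can be referenced through its ARN. The key and role ARNs below are placeholders.

```json
{
    "name": "new-ebs-volume",
    "managedEBSVolume": {
        "sizeInGiB": 50,
        "encrypted": true,
        "kmsKeyId": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab",
        "roleArn": "arn:aws:iam::111122223333:role/ecsInfrastructureRole"
    }
}
```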

After configuring all task settings, choose Create to start your task.

3. View your attached EBS volume
Once your task has started, you can see the volume information on the task definition details page. Choose a task and select the Volumes tab to find your created EBS volume details.

Your team can organize the development and operations of EBS volumes more efficiently. For example, application developers can configure the path where your application expects storage to be available in the task definition, and DevOps engineers can configure the actual EBS volume attributes at runtime when the application is deployed.

This allows DevOps engineers to deploy the same task definition to different environments with differing EBS volume configurations, for example, gp3 volumes in the development environments and io2 volumes in production.
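To illustrate, the same task definition could be paired with two different runtime volume configurations, one per environment. This is not an API request shape, just a side-by-side sketch; all sizes and IOPS values are illustrative placeholders.

```json
{
    "development": {
        "managedEBSVolume": { "sizeInGiB": 50, "volumeType": "gp3" }
    },
    "production": {
        "managedEBSVolume": { "sizeInGiB": 200, "volumeType": "io2", "iops": 10000 }
    }
}
```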

Now available
Amazon ECS integration with Amazon EBS is available in nine AWS Regions: US East (Ohio), US East (N. Virginia), US West (Oregon), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Europe (Frankfurt), Europe (Ireland), and Europe (Stockholm). You only pay for what you use, including EBS volumes and snapshots. To learn more, see the Amazon EBS pricing page and Amazon EBS volumes in ECS in the AWS documentation.

Give it a try now and send feedback to our public roadmap, AWS re:Post for Amazon ECS, or through your usual AWS Support contacts.

— Channy

P.S. Special thanks to Maish Saidel-Keesing, a senior enterprise developer advocate at AWS, for his contribution in writing this blog post.


