
    How to Upload Large Objects to S3 with AWS CLI Multipart Upload

    July 31, 2025

    Uploading large files to S3 with traditional single-request methods can be challenging. If you're transferring a 5GB database backup and a network interruption occurs, you're forced to restart the entire upload, wasting both bandwidth and time. And this approach only becomes more unreliable as file sizes grow.

    With a single PUT operation, you can upload an object of up to 5GB. For anything larger than that, Amazon S3's Multipart Upload feature is the better approach (AWS recommends it for objects over 100MB).

    Multipart upload handles large objects by segmenting them into smaller, independent chunks that are uploaded separately and reassembled into the complete object on S3.

    In this guide, you’ll learn how to implement multipart uploads using AWS CLI.

    Table of Contents

    • Prerequisites

    • How Multipart Uploads Work

    • Getting Started

      • Step 1: Download the AWS CLI

      • Step 2: Configure AWS IAM credentials

    • Step 1: Split Object

    • Step 2: Create an Amazon S3 bucket

    • Step 3: Initiate Multipart Upload

    • Step 4: Upload split files to S3 Bucket

    • Step 5: Create a JSON File to Compile ETag Values

    • Step 6: Complete Multipart Upload to S3

    • Conclusion

    Prerequisites

    To follow this guide, you should have:

    • An AWS account.

    • Knowledge of AWS and the S3 service.

    • The AWS CLI installed on your local machine.

    How Multipart Uploads Work

    In a multipart upload, large file transfers are segmented into smaller chunks that get uploaded separately to Amazon S3. After all segments complete their upload process, S3 reassembles them into the complete object.

    For example, a 160GB file broken into 1GB segments generates 160 individual upload operations to S3. Each segment receives a distinct identifier while preserving sequence information to guarantee proper file reconstruction.

    If a segment fails, only that segment needs to be retried, and uploads can be suspended and resumed. Here’s a diagram that shows what the multipart upload process looks like:

    [Diagram: the AWS multipart upload process]
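    In AWS CLI terms, the whole lifecycle maps onto three s3api calls, which the rest of this guide walks through step by step (the ... placeholders stand for the arguments covered later):

    # The three phases of a multipart upload, at a glance:
    aws s3api create-multipart-upload ...    # 1. initiate: returns an UploadId
    aws s3api upload-part ...                # 2. upload each chunk (run once per part)
    aws s3api complete-multipart-upload ...  # 3. finish: S3 reassembles the parts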

    Getting Started

    Before you get started with this guide, make sure that you have the AWS CLI installed on your machine. If you don’t already have that installed, follow the steps below.

    Step 1: Download the AWS CLI

    To download the CLI, visit the CLI download documentation. Then, download the CLI based on your operating system (Windows, Linux, macOS). Once the CLI is installed, the next step is to configure your AWS IAM credentials in your terminal.

    Step 2: Configure AWS IAM credentials

    To configure your AWS credentials, navigate to your terminal and run the command below:

    aws configure
    

    This command prompts you to enter credentials such as your AWS Access Key ID and AWS Secret Access Key. To obtain these credentials, create a new IAM user in your AWS account by following the steps below. (You can skip these steps if you already have an IAM user and security credentials.)

    1. Sign in to your AWS dashboard.

    2. Click on the search bar above your dashboard and search “IAM”.

    3. Click on IAM.

    4. In the left navigation pane, navigate to Access management > Users.

    5. Click Create user.

    6. During IAM user creation, attach a policy directly by selecting Attach policies directly in step 2: Set permissions.

    7. Give the user admin access by searching “admin” in the permission policies search bar and selecting AdministratorAccess.

    8. On the next page, click Create user.

    9. Click on the created user in the Users section and navigate to Security credentials.

    10. Scroll down and click Create access key.

    11. Select the Command Line Interface (CLI) use case.

    12. On the next page, click Create access key.

    You will now see your access keys. Please keep these safe and do not expose them publicly or share them with anyone.

    You can now copy these access keys into your terminal after running the aws configure command.

    You will be prompted to include the following details:

    • AWS Access Key ID: obtained from the IAM user credentials you created. See the steps above.

    • AWS Secret Access Key: obtained from the IAM user credentials you created. See the steps above.

    • Default region name: your default AWS region, for example, us-east-1.

    • Default output format: leave as None, or enter json.
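    For reference, a typical configuration session looks like the sketch below. The key values are the standard placeholders from the AWS documentation, not real credentials:

    $ aws configure
    AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
    AWS Secret Access Key [None]: wJalrXUtnFMEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
    Default region name [None]: us-east-1
    Default output format [None]: json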

    Now we’re done with the CLI configuration.

    To confirm that you’ve successfully installed the CLI, run the command below:

    aws --version
    

    You should see the CLI version in your terminal as shown below:

    [Screenshot: AWS CLI version output]

    Now you’re ready for the main multipart upload steps 🙂

    Step 1: Split Object

    The first step is to split the object you intend to upload. For this guide, we’ll be splitting a 188MB video file into smaller chunks.

    [Screenshot: the object’s size before splitting]

    Note that this process also works for much larger files.

    Next, locate the object you intend to upload on your system. Use the cd command in your terminal to navigate to the folder where the object is stored.

    Then run the split command below:

    split -b <SIZE>M <filename>
    

    Replace <SIZE> with your desired chunk size in megabytes (for example, 100, 150, or 200). Keep in mind that every part except the last must be at least 5MB, or S3 will reject the multipart upload when you try to complete it.

    For this use case, we’ll split our 188MB video file into 30MB chunks, specifying the size in bytes (31457280 bytes = 30MB). Here’s the command:

    
    split -b 31457280 videoplayback.mp4
    

    Next, run the ls -lh command in your terminal. You should see output like this:

    [Screenshot: the split files listed by ls -lh]

    Here, you can see that the 188MB file has been split into seven parts: six 30MB parts and a final 7.9MB part. When you go to the folder where the object is saved in your system files, you will see additional files with names that look like this:

    • xaa

    • xab

    • xac

    and so on. These files are the sequential parts of your object: xaa is the first part (and will be uploaded to S3 first), xab is the second, and so on. More on this later in the guide.
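    As an optional sanity check, you can confirm that the parts really add up to the original file by concatenating them and comparing checksums. A minimal sketch, assuming the default split output names (on macOS, use md5 instead of md5sum):

    # Rejoin the parts and compare checksums with the original file.
    cat xa? > rejoined.mp4
    md5sum rejoined.mp4 videoplayback.mp4    # the two hashes should match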

    Step 2: Create an Amazon S3 Bucket

    If you don’t already have an S3 bucket created, follow the steps in the AWS Get Started with Amazon S3 documentation to create one.

    Step 3: Initiate Multipart Upload

    The next step is to initiate a multipart upload. To do this, execute the command below:

    
    aws s3api create-multipart-upload --bucket DOC-EXAMPLE-BUCKET --key large_test_file
    

    In this command:

    • DOC-EXAMPLE-BUCKET is your S3 bucket name.

    • large_test_file is the key (object name) the file will have in S3, for example, videoplayback.mp4.

    You’ll get a JSON response in your terminal, providing you with the UploadId. The response looks like this:

    
    {
        "ServerSideEncryption": "AES345",
        "Bucket": "s3-multipart-uploads",
        "Key": "videoplayback.mp4",
        "UploadId": "************************************"
    }
    

    Keep the UploadId somewhere safe on your local machine, as you will need it for the later steps.
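    If you’d rather not copy the ID by hand, you can capture it straight into a shell variable with the CLI’s --query option. A small sketch, using the same placeholder bucket and key:

    # Capture the UploadId directly into a shell variable.
    UPLOAD_ID=$(aws s3api create-multipart-upload \
        --bucket DOC-EXAMPLE-BUCKET \
        --key videoplayback.mp4 \
        --query UploadId --output text)
    echo "$UPLOAD_ID"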

    Step 4: Upload Split Files to S3 Bucket

    Remember those extra files saved as xaa, xab, and so on? Well, now it’s time to upload them to your S3 bucket. To do that, execute the command below:

    aws s3api upload-part --bucket DOC-EXAMPLE-BUCKET --key large_test_file --part-number 1 --body large_test_file.001 --upload-id exampleTUVGeKAk3Ob7qMynRKqe3ROcavPRwg92eA6JPD4ybIGRxJx9R0VbgkrnOVphZFK59KCYJAO1PXlrBSW7vcH7ANHZwTTf0ovqe6XPYHwsSp7eTRnXB1qjx40Tk
    
    • DOC-EXAMPLE-BUCKET is your S3 bucket name.

    • large_test_file is the key of the object in S3, for example, videoplayback.mp4.

    • large_test_file.001 is the name of the file part, for example, xaa.

    • For --upload-id, replace the example ID with the UploadId you saved in the previous step.

    The command returns a response that contains an ETag value for the part of the file that you uploaded.

    
    {
        "ServerSideEncryption": "aws:kms",
        "ETag": ""7f9b8c3e2a1d5f4e8c9b2a6d4e8f1c3a"",
        "ChecksumCRC64NVME": "mK9xQpD2WnE="
    }
    

    Copy the ETag value and save it somewhere on your local machine, as you’ll need it later as a reference.

    Continue uploading the remaining file parts by repeating the command above, incrementing both the part number and file name for each subsequent upload. For example: xaa becomes xab, and --part-number 1 becomes --part-number 2, and so forth.
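    If you have many parts, repeating the command by hand gets tedious. Here’s a minimal loop sketch, assuming the default split file names (xaa, xab, and so on) and the UPLOAD_ID variable from the earlier sketch (or paste your ID directly):

    # Upload each split file as the next part, in order.
    PART=1
    for f in xa?; do
        aws s3api upload-part \
            --bucket DOC-EXAMPLE-BUCKET \
            --key videoplayback.mp4 \
            --part-number "$PART" \
            --body "$f" \
            --upload-id "$UPLOAD_ID"
        PART=$((PART + 1))
    done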

    Note that how long this takes depends on the size of the object and your network speed.

    To confirm that all the file parts have been uploaded successfully, run the command below:

    aws s3api list-parts --bucket s3-multipart-uploads --key videoplayback.mp4 --upload-id p0NU3agC3C2tOi4oBmT8lHLebUYqYXmWhEYYt8gc8jXlCStEZYe1_kSx1GjON2ExY_0T.4N4E6pjzPlNcji7VDT6UomtNYUhFkyzpQ7IFKrtA5Dov8YdC20c7UE20Qf0
    

    Replace the example upload ID with your actual upload ID.

    You should get a JSON response like this:

    
    {
        "Parts": [
            {
                "PartNumber": 1,
                "LastModified": "2025-07-27T14:22:18+00:00",
                "ETag": ""f7b9c8e4d3a2f6e8c9b5a4d7e6f8c2b1"",
                "Size": 26214400
            },
            {
                "PartNumber": 2,
                "LastModified": "2025-07-27T14:25:42+00:00",
                "ETag": ""a8e5d2c7f9b4e6a3c8d5f2e9b7c4a6d3"",
                "Size": 26214400
            },
            {
                "PartNumber": 3,
                "LastModified": "2025-07-27T14:28:15+00:00",
                "ETag": ""c4f8e2b6d9a3c7e5f8b2d6a9c3e7f4b8"",
                "Size": 26214400
            },
            {
                "PartNumber": 4,
                "LastModified": "2025-07-27T14:31:03+00:00",
                "ETag": ""e9c3f7a5d8b4e6c9f2a7d4b8c6e3f9a2"",
                "Size": 26214400
            },
            {
                "PartNumber": 5,
                "LastModified": "2025-07-27T14:33:47+00:00",
                "ETag": ""b6d4a8c7f5e9b3d6a2c8f4e7b9c5d8a6"",
                "Size": 26214400
            },
            {
                "PartNumber": 6,
                "LastModified": "2025-07-27T14:36:29+00:00",
                "ETag": ""d7e3c9f6a4b8d2e5c7f9a3b6d4e8c2f5"",
                "Size": 26214400
            },
            {
                "PartNumber": 7,
                "LastModified": "2025-07-27T14:38:52+00:00",
                "ETag": ""f2a6d8c4e7b3f6a9c2d5e8b4c7f3a6d9"",
                "Size": 15728640
            }
        ]
    }
    

    This is how you verify that all parts have been uploaded.

    Step 5: Create a JSON File to Compile ETag Values

    The file we’re about to create tells S3 which ETag belongs to which part number, so that it can assemble the parts in the correct order. Gather the ETag values from each uploaded file part and organize them into a JSON structure.

    Sample JSON format:

    
    {
        "Parts": [{
            "ETag": "example8be9a0268ebfb8b115d4c1fd3",
            "PartNumber":1
        },
    
        ....
    
        {
            "ETag": "example246e31ab807da6f62802c1ae8",
            "PartNumber":4
        }]
    }
    

    Save the created JSON file in the same folder as your object and name it multipart.json. You can use any IDE of your choice to create and save this document.
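    Alternatively, rather than assembling the file by hand, you can generate it from the list-parts output with a JMESPath --query. A sketch, assuming your default output format is json and reusing the placeholder names from earlier:

    # Build multipart.json from the parts S3 has already received.
    aws s3api list-parts \
        --bucket DOC-EXAMPLE-BUCKET \
        --key videoplayback.mp4 \
        --upload-id "$UPLOAD_ID" \
        --query '{Parts: Parts[].{ETag: ETag, PartNumber: PartNumber}}' \
        > multipart.json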

    Step 6: Complete Multipart Upload to S3

    To complete the multipart upload, run the command below:

    aws s3api complete-multipart-upload --multipart-upload file://multipart.json --bucket DOC-EXAMPLE-BUCKET --key large_test_file --upload-id exampleTUVGeKAk3Ob7qMynRKqe3ROcavPRwg92eA6JPD4ybIGRxJx9R0VbgkrnOVphZFK59KCYJAO1PXlrBSW7vcH7ANHZwTTf0ovqe6XPYHwsSp7eTRnXB1qjx40Tk
    

    Here, file://multipart.json points to the JSON file you created in the previous step. As before, replace the example upload ID with your saved UploadId.

    You should get an output like this:

    
    {
        "ServerSideEncryption": "AES256",
        "Location": "https://s3-multipart-uploads.s3.eu-west-1.amazonaws.com/videoplayback.mp4",
        "Bucket": "s3-multipart-uploads",
        "Key": "videoplayback.mp4",
        "ETag": ""78298db673a369adf33dd8054bb6bab7-7"",
        "ChecksumCRC64NVME": "d1UPkm73mAE=",
        "ChecksumType": "FULL_OBJECT"
    }
    

    Now, when you go to your S3 bucket and hit refresh, you should see the uploaded object.

    [Screenshot: the object successfully uploaded to S3 via multipart upload]

    Here, you can see the complete file, file name, type, and size.
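    You can also verify from the terminal without downloading the object. A quick check with head-object, using the same placeholder names:

    # Confirm the assembled object's size and metadata.
    aws s3api head-object --bucket DOC-EXAMPLE-BUCKET --key videoplayback.mp4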

    Conclusion

    Multipart uploads transform large file transfers to Amazon S3 from fragile, all-or-nothing operations into robust, resumable processes. By segmenting files into manageable chunks, you gain retry capabilities, better performance, and the ability to handle objects exceeding S3’s 5GB single-upload limit.

    This approach is essential for production environments dealing with database backups, video files, or any large assets. With the AWS CLI techniques covered in this guide, you’re now equipped to handle S3 transfers confidently, regardless of file size or network conditions.

    Check out this documentation from the AWS Knowledge Center to learn more about multipart uploads using the AWS CLI.

    Source: freeCodeCamp Programming Tutorials: Python, JavaScript, Git & More 
