In v3 you can use the Upload class from @aws-sdk/lib-storage to perform multipart uploads. Unfortunately, there seems to be no mention of it on the docs site for @aws-sdk/client-s3.

It's mentioned in the upgrade guide here: https://github.com/aws/aws-sdk-js-v3/blob/main/UPGRADING.md#s3-multipart-upload

Here's a corrected version of the example provided in https://github.com/aws/aws-sdk-js-v3/tree/main/lib/lib-storage:

  import { Upload } from "@aws-sdk/lib-storage";
  import { S3Client } from "@aws-sdk/client-s3";

  // Bucket, Key, and Body are assumed to be defined in the surrounding scope
  const target = { Bucket, Key, Body };
  try {
    const parallelUploads3 = new Upload({
      client: new S3Client({}),
      tags: [], // optional tags, e.g. [{ Key: "project", Value: "demo" }]
      queueSize: 4, // optional concurrency configuration
      leavePartsOnError: false, // optional manually handle dropped parts
      params: target,
    });

    parallelUploads3.on("httpUploadProgress", (progress) => {
      console.log(progress);
    });

    await parallelUploads3.done();
  } catch (e) {
    console.log(e);
  }

At the time of writing, the following Body types are supported:

  • string
  • Uint8Array
  • Buffer
  • Blob (hence also File)
  • Node Readable
  • ReadableStream

(according to https://github.com/aws/aws-sdk-js-v3/blob/main/lib/lib-storage/src/chunker.ts)
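Schematically, the dispatch in that file boils down to a chain of typeof/instanceof checks. Here's an illustrative sketch of that idea (not the SDK's actual source; the function name is made up):

```javascript
import { Readable } from "node:stream";

// Illustrative sketch of the kind of typeof/instanceof dispatch the
// chunker performs on Body (not the SDK's actual code):
function classifyBody(body) {
  if (typeof body === "string") return "string";
  if (body instanceof Uint8Array) return "bytes"; // Buffer is a Uint8Array subclass
  if (typeof Blob !== "undefined" && body instanceof Blob) return "blob";
  if (body instanceof Readable) return "node-readable";
  if (typeof ReadableStream !== "undefined" && body instanceof ReadableStream) {
    return "web-readablestream";
  }
  throw new Error("Unsupported Body type");
}

console.log(classifyBody("hello"));              // "string"
console.log(classifyBody(Buffer.from("hi")));    // "bytes"
console.log(classifyBody(Readable.from(["x"]))); // "node-readable"
```

Because the checks are instanceof-based, a Buffer from a polyfill or a stream from another realm can fail classification even though it behaves like the real thing.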

However, if the Body object comes from a polyfill or a separate realm and therefore isn't strictly an instanceof one of these types, you will get an error. You can work around such a case by cloning the Uint8Array/Buffer, or by piping the stream through a PassThrough. For example, if you are using archiver to upload a .zip or .tar archive, you can't pass the archiver stream directly, because (at the time of writing) it's a userland Readable implementation; instead, pass Body: archive.pipe(new PassThrough()).
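The PassThrough hop can be sketched like this, using a plain Readable.from stream to stand in for the third-party stream (the archiver case is only referenced in the comments):

```javascript
import { PassThrough, Readable } from "node:stream";

// A userland stream may duck-type as a Readable without being an
// instanceof the core class. Piping it through a core PassThrough
// produces a stream the SDK's instanceof checks accept.
const toRealmSafeBody = (userlandStream) => userlandStream.pipe(new PassThrough());

// Stand-in for a third-party stream; in the archiver case this would be
// Body: archive.pipe(new PassThrough())
const source = Readable.from(["hello ", "world"]);
const body = toRealmSafeBody(source);
console.log(body instanceof Readable); // true
```

stream.pipe() returns its destination, so the expression evaluates to the core PassThrough, which is an instanceof Readable from the current realm.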

Answer from Andy on Stack Overflow (top answer, 1 of 3, score 86)

Answer 2 of 3 (score 10)

I came across the same error you faced. It seems to be a known issue that they haven't yet documented accurately:

The error is indeed caused by stream length remaining unknown. We need to improve the error message and the documentation

To fix this issue, you just need to specify the ContentLength property for PutObjectCommand.

Here is the updated snippet:

const { S3 } = require('@aws-sdk/client-s3');

const s3 = new S3({
  credentials: {
    accessKeyId: S3_API_KEY,
    secretAccessKey: S3_API_SECRET,
  },
  region: S3_REGION,
  signatureVersion: 'v4',
});

const uploadToFirstS3 = (passThroughStream) => (new Promise((resolve, reject) => {
  const uploadParams = {
    Bucket: S3_BUCKET_NAME,
    Key:'some-key',
    Body: passThroughStream,
    ContentLength: passThroughStream.readableLength, // include this new field!!
  };
  s3.putObject(uploadParams, (err) => {
    if (err) return reject(err); // don't fall through to resolve on error
    resolve(true);
  });
}));

Hope it helps!
