Uploading large files in web applications can be challenging: servers often cap request sizes, and a single network interruption can force the whole upload to restart. A multipart upload approach helps manage these challenges effectively. In this blog, we’ll explore how to implement multipart uploads in a React application, providing a step-by-step guide along with best practices.

What is Multipart Upload?

Multipart upload is a technique where a large file is divided into smaller, manageable parts, which are uploaded individually. This method offers several advantages:

  • Resilience: If a part fails to upload, only that part needs to be retried.
  • Efficiency: Smaller parts can be uploaded in parallel, reducing the overall upload time.
  • User Experience: Progress can be shown for each part, giving users feedback during uploads.
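The slicing itself is straightforward in the browser, since File inherits the Blob API. As a minimal sketch (the 5 MB chunk size is an arbitrary choice, and sliceFile is just an illustrative helper):

// Split a File from an <input type="file"> into Blob chunks
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB, adjust to taste

function sliceFile(file) {
  const chunks = [];
  for (let start = 0; start < file.size; start += CHUNK_SIZE) {
    // slice() clamps the end index to the file size automatically
    chunks.push(file.slice(start, start + CHUNK_SIZE));
  }
  return chunks;
}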

Step-by-Step Implementation

1. Set Up Your React Project

Start by creating a new React project if you haven’t already:

npx create-react-app multipart-upload
cd multipart-upload
npm start

2. Install Axios

We’ll use Axios for making HTTP requests. Install it via npm:

npm install axios

3. Create the File Upload Component

Create a new component, FileUpload.js, to handle file selection and upload.

import React, { useState } from 'react';
import axios from 'axios';

const FileUpload = () => {
  const [file, setFile] = useState(null);
  const [uploadProgress, setUploadProgress] = useState(0);

  const handleFileChange = (event) => {
    setFile(event.target.files[0]);
  };

  const uploadFile = async () => {
    if (!file) return; // nothing selected yet

    const chunkSize = 1024 * 1024; // 1 MB chunks
    const totalChunks = Math.ceil(file.size / chunkSize);

    for (let i = 0; i < totalChunks; i++) {
      const start = i * chunkSize;
      const end = Math.min(start + chunkSize, file.size);
      const chunk = file.slice(start, end);

      const formData = new FormData();
      formData.append('file', chunk);
      formData.append('chunkIndex', i);
      formData.append('totalChunks', totalChunks);
      formData.append('fileName', file.name);

      await axios.post('/upload', formData, {
        onUploadProgress: (progressEvent) => {
          // Overall progress: completed chunks plus the fraction of the current chunk
          const percentage = Math.round(
            (i / totalChunks) * 100 +
            (progressEvent.loaded / chunk.size) * (100 / totalChunks)
          );
          setUploadProgress(percentage);
        }
      });
    }

    alert('Upload complete!');
  };

  return (
    <div>
      <input type="file" onChange={handleFileChange} />
      <button onClick={uploadFile}>Upload</button>
      <progress value={uploadProgress} max="100" />
    </div>
  );
};

export default FileUpload;
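To try the component out, render it from App.js (a standard create-react-app layout is assumed here):

import React from 'react';
import FileUpload from './FileUpload';

function App() {
  return (
    <div>
      <h1>Multipart Upload Demo</h1>
      <FileUpload />
    </div>
  );
}

export default App;

Note that the component posts to a relative /upload path. During development you can forward that to the backend by adding "proxy": "http://localhost:5000" to package.json, which create-react-app supports out of the box.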

4. Set Up the Backend

You’ll need a server to handle the file uploads. Here’s a simple Node.js/Express example that uses multer to receive the uploaded chunks (install both with npm install express multer).

const express = require('express');
const multer = require('multer');
const fs = require('fs');
const path = require('path');

const app = express();
const PORT = process.env.PORT || 5000;

// Make sure the staging directory exists; multer won't create it
// when the destination is given as a function
fs.mkdirSync(path.join(__dirname, 'uploads'), { recursive: true });

// Configure disk storage for multer; incoming chunks land in uploads/ first
const storage = multer.diskStorage({
  destination: (req, file, cb) => {
    cb(null, 'uploads/');
  },
  filename: (req, file, cb) => {
    cb(null, file.originalname);
  }
});

const upload = multer({ storage: storage });

app.post('/upload', upload.single('file'), (req, res) => {
  const { chunkIndex, totalChunks, fileName } = req.body;

  // Create a directory for the file if it doesn't exist
  const dir = path.join(__dirname, 'uploads', fileName);
  if (!fs.existsSync(dir)) {
    fs.mkdirSync(dir, { recursive: true });
  }

  // Move the uploaded chunk into that directory
  fs.rename(req.file.path, path.join(dir, `chunk-${chunkIndex}`), (err) => {
    if (err) return res.status(500).send(err);

    // Check if all chunks are uploaded; req.body values arrive as strings
    if (parseInt(chunkIndex, 10) === parseInt(totalChunks, 10) - 1) {
      // Implement logic to merge chunks if needed (see step 5)
    }

    res.sendStatus(200);
  });
});

app.listen(PORT, () => {
  console.log(`Server is running on http://localhost:${PORT}`);
});
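Before wiring up the frontend, you can sanity-check the endpoint with curl. Assuming the server is running on port 5000 and a local file named part0.bin exists (both names are illustrative), a single-chunk upload looks like:

curl -F "file=@part0.bin" \
  -F "chunkIndex=0" \
  -F "totalChunks=1" \
  -F "fileName=demo.bin" \
  http://localhost:5000/upload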

5. Merging Chunks (Optional)

If you want to combine the uploaded chunks into a single file after all uploads are complete, you can implement the merging logic on the server.

if (parseInt(chunkIndex, 10) === parseInt(totalChunks, 10) - 1) {
  const filePath = path.join(__dirname, 'uploads', fileName, fileName);
  const writeStream = fs.createWriteStream(filePath);

  for (let i = 0; i < totalChunks; i++) {
    const chunkPath = path.join(dir, `chunk-${i}`);
    const data = fs.readFileSync(chunkPath);
    writeStream.write(data);
    fs.unlinkSync(chunkPath); // Optionally delete the chunk once written
  }

  writeStream.end();
}
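Note that readFileSync loads each chunk fully into memory and blocks the event loop while merging. For very large files, a streaming variant along these lines avoids that (mergeChunks is a hypothetical helper, reusing the fs and path requires from the server file above):

async function mergeChunks(dir, totalChunks, outPath) {
  const writeStream = fs.createWriteStream(outPath);
  for (let i = 0; i < totalChunks; i++) {
    const chunkPath = path.join(dir, `chunk-${i}`);
    await new Promise((resolve, reject) => {
      const readStream = fs.createReadStream(chunkPath);
      readStream.on('error', reject);
      readStream.on('end', resolve);
      // end: false keeps the destination stream open between chunks
      readStream.pipe(writeStream, { end: false });
    });
    fs.unlinkSync(chunkPath);
  }
  writeStream.end();
}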

Best Practices

  1. Chunk Size: Adjust the chunk size based on your needs. Smaller chunks may improve reliability but can increase overhead.
  2. Error Handling: Implement error handling for failed uploads and retry failed chunks individually (see the sketch after this list).
  3. Progress Indicators: Provide users with clear progress indicators for each chunk and the overall upload.
  4. Security: Ensure that your backend validates and sanitizes file uploads to prevent malicious files.
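For point 2, a small retry wrapper around each chunk request is often enough. A minimal sketch with exponential backoff (withRetries is an illustrative helper, not part of the component above):

async function withRetries(fn, maxRetries = 3) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === maxRetries) throw err;
      // Back off 1s, 2s, 4s, ... before the next attempt
      await new Promise((resolve) => setTimeout(resolve, 1000 * 2 ** (attempt - 1)));
    }
  }
}

// Inside the upload loop:
// await withRetries(() => axios.post('/upload', formData));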