Say Goodbye to File Upload Confusion! A Clear Path with Express.js, MongoDB, and AWS S3
Introduction
Uploading files is a common feature in modern software applications, enabling users to share and store assets such as documents, images, and more. In this comprehensive guide, I’ll show you how to implement file uploads in a scalable and efficient manner using Express.js, MongoDB, and AWS S3.
To ensure clean and maintainable code, we'll use the MVC (Model-View-Controller) design pattern, breaking the implementation into three key parts: controllers for handling logic, models for structuring data, and routes for mapping HTTP requests to appropriate handlers.
By the end of this guide, you’ll have a fully functional file upload system ready to integrate into your software application. If you're new to AWS S3 or need a refresher, check out the AWS S3 Documentation for more details on setting up buckets and managing access.
Prerequisites
Before starting, ensure you have the following:
Node.js and npm installed.
An AWS account with an S3 bucket created.
MongoDB set up (either locally or on a cloud provider like MongoDB Atlas).
Basic knowledge of JavaScript and Node.js.
Step 1: Set Up the Project
1. Initialize a Node.js project:
mkdir file-upload-demo
cd file-upload-demo
npm init -y
2. Install dependencies:
npm install express multer multer-s3@2 aws-sdk mongoose dotenv
(multer-s3 is pinned to v2 here because v3 targets the modular AWS SDK v3, while this guide uses the v2 aws-sdk package.)
- express: Web framework for Node.js.
- multer: Middleware for handling multipart/form-data.
- multer-s3: Integrates Multer with AWS S3.
- aws-sdk: AWS SDK for Node.js to interact with S3.
- mongoose: MongoDB ODM (Object Data Modeling) library.
- dotenv: For environment variable management.
3. Project structure:
file-upload-demo/
├── index.js
├── config/
│   └── awsConfig.js
├── controllers/
│   └── fileController.js
├── middleware/
│   └── fileMiddleware.js
├── models/
│   └── File.js
├── routes/
│   └── fileRoutes.js
├── .env
└── package.json
Step 2: Configure AWS S3
1. Set up AWS credentials:
- Create an IAM user with the AmazonS3FullAccess policy.
- Generate an Access Key ID and Secret Access Key for that user.
2. Create a config file (config/awsConfig.js):
const AWS = require('aws-sdk');
AWS.config.update({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: process.env.AWS_REGION
});
const s3 = new AWS.S3();
module.exports = s3;
3. Add the AWS credentials (and your MongoDB connection string, used in Step 3) to .env:
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_REGION=your-region
AWS_BUCKET_NAME=your-bucket-name
MONGO_URI=your-mongodb-connection-string
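Because both the AWS config and the upload middleware read these variables at startup, it can help to fail fast when one is missing. Here is a minimal sketch (the requireEnv helper is my own, not part of dotenv):

```javascript
// Hypothetical helper: fail fast at startup if any required
// environment variable is missing or empty.
function requireEnv(names, env = process.env) {
  const missing = names.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
}

// Usage in index.js, after dotenv.config():
// requireEnv(['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY',
//             'AWS_REGION', 'AWS_BUCKET_NAME', 'MONGO_URI']);
```

Calling this once at startup turns a confusing S3 "Access Denied" at upload time into a clear error the moment the server boots.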
Step 3: Set Up MongoDB
1. Create a Mongoose model (models/File.js):
const mongoose = require('mongoose');
const fileSchema = new mongoose.Schema({
  name: String,
  url: String,
  key: String,
  createdAt: { type: Date, default: Date.now }
});
module.exports = mongoose.model('File', fileSchema);
2. Connect to MongoDB in index.js:
const mongoose = require('mongoose');
// useNewUrlParser and useUnifiedTopology are the defaults since Mongoose 6
// and no longer need to be passed.
mongoose.connect(process.env.MONGO_URI)
  .then(() => console.log('MongoDB connected'))
  .catch(err => console.error(err));
Step 4: File Upload Logic
Controller
Create a controller for file uploads (controllers/fileController.js):
const File = require('../models/File');
const s3 = require('../config/awsConfig');
const uploadFile = async (req, res) => {
  try {
    // multer-s3 attaches the public URL (location) and S3 object key (key) to req.file
    const { originalname, location, key } = req.file;
    const file = new File({
      name: originalname,
      url: location,
      key
    });
    await file.save();
    res.status(200).json({ message: 'File uploaded successfully', file });
  } catch (error) {
    res.status(500).json({ error: 'Error uploading file' });
  }
};
module.exports = { uploadFile };
Middleware
Set up the Multer middleware (middleware/fileMiddleware.js):
const multer = require('multer');
const multerS3 = require('multer-s3');
const s3 = require('../config/awsConfig');

const upload = multer({
  storage: multerS3({
    s3,
    bucket: process.env.AWS_BUCKET_NAME,
    acl: 'public-read',
    metadata: (req, file, cb) => {
      cb(null, { fieldName: file.fieldname });
    },
    key: (req, file, cb) => {
      cb(null, `uploads/${Date.now()}_${file.originalname}`);
    }
  })
});

module.exports = upload;
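The key callback in this middleware decides where each object lands in the bucket. Pulled out as a plain function (the makeObjectKey name is hypothetical, purely for illustration), the naming scheme is easy to see:

```javascript
// Sketch of the key-naming scheme used in the multer-s3 config:
// objects are grouped under an uploads/ prefix, and a timestamp is
// prepended so two files with the same original name never collide.
function makeObjectKey(originalname, now = Date.now()) {
  return `uploads/${now}_${originalname}`;
}

// makeObjectKey('report.pdf', 1700000000000)
//   -> 'uploads/1700000000000_report.pdf'
```

Prefixing keys this way also makes it easy to list or expire everything under uploads/ later with S3 prefix-based operations.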
Create the upload route (routes/fileRoutes.js):
const express = require('express');
const { uploadFile } = require('../controllers/fileController');
const upload = require('../middleware/fileMiddleware');

const router = express.Router();

router.post('/upload', upload.single('file'), uploadFile);

module.exports = router;
Integrate the routes in index.js. Note that dotenv.config() must run before routes/fileRoutes is required, because the AWS config and the Multer middleware read process.env when they are first loaded:
// Load environment variables first, before any module that reads process.env
require('dotenv').config();

const express = require('express');
const mongoose = require('mongoose');
const fileRoutes = require('./routes/fileRoutes');

mongoose.connect(process.env.MONGO_URI)
  .then(() => console.log('MongoDB connected'))
  .catch(err => console.error(err));

const app = express();
app.use('/api/files', fileRoutes);

app.listen(3000, () => console.log('Server running on port 3000'));
Step 5: Retrieve and Delete Files
Controller
Add functions to fetch and delete files (controllers/fileController.js):
const getFiles = async (req, res) => {
  try {
    const files = await File.find();
    res.status(200).json(files);
  } catch (error) {
    res.status(500).json({ error: 'Error fetching files' });
  }
};

const deleteFile = async (req, res) => {
  try {
    const file = await File.findById(req.params.id);
    if (!file) return res.status(404).json({ error: 'File not found' });

    // Delete the object from S3 first, then remove the database record
    await s3.deleteObject({
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: file.key
    }).promise();

    // Document#remove() was removed in Mongoose 7; use deleteOne() instead
    await file.deleteOne();

    res.status(200).json({ message: 'File deleted successfully' });
  } catch (error) {
    res.status(500).json({ error: 'Error deleting file' });
  }
};

module.exports = { uploadFile, getFiles, deleteFile };
Routes
Add routes for fetching and deleting files (routes/fileRoutes.js), and update the controller import to pull in the new handlers:
const { uploadFile, getFiles, deleteFile } = require('../controllers/fileController');

router.get('/', getFiles);
router.delete('/:id', deleteFile);
Testing the API
Use a tool like Postman or cURL to test the endpoints:
- Upload a file: POST /api/files/upload with a multipart/form-data body containing a file.
- Retrieve files: GET /api/files.
- Delete a file: DELETE /api/files/:id.
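For quick command-line checks, the three endpoints can be exercised with cURL like this (assuming the server from Step 4 is running locally on port 3000; ./photo.jpg and <file-id> are placeholders to replace with your own values):

```shell
# Upload a file (multipart/form-data; the field name must be "file",
# matching upload.single('file') in the route)
curl -X POST http://localhost:3000/api/files/upload \
  -F "file=@./photo.jpg"

# List all uploaded files
curl http://localhost:3000/api/files

# Delete a file by its MongoDB _id
curl -X DELETE http://localhost:3000/api/files/<file-id>
```

The upload response includes the saved document, whose _id field is what the delete endpoint expects.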
Conclusion
By following this guide, you’ve set up a robust system for uploading, storing, and managing files using Express.js, MongoDB, and AWS S3. This organized approach ensures your application remains scalable and maintainable. Now you’re ready to integrate these capabilities into your projects and enhance user experiences with reliable file management features.