Reducing Server Storage by 50% with Smart Image Resizing
Our S3 storage costs were spiraling out of control. Users were uploading 8MB phone photos when we only needed to display 200KB thumbnails. Storage grew from 500GB to 2.5TB in 6 months, costing $64/month and accelerating.
We built an automated image resizing service that:
- Reduced storage by 50% (2.5TB → 1.2TB)
- Cut bandwidth costs by 70%
- Improved page load times by 65%
- Saved $2,400 annually
Here’s the complete implementation.
The Problem
User behavior:
- Uploading iPhone photos (4032×3024, 3-8MB each)
- Using images for avatars, product thumbnails, gallery images
- No understanding of web-optimized images
Impact:
- Massive S3 storage costs
- Slow page loads (downloading 5MB when 200KB needed)
- Poor mobile experience
- CDN bandwidth charges
The Solution Architecture
[User Upload]
↓
[API Gateway] → Validate & accept upload
↓
[S3: originals/] → Store original (immutable)
↓
[S3 Event Trigger]
↓
[Lambda: Image Processor]
- Detect dimensions
- Generate variants
- Optimize formats
↓
[S3: optimized/] → Store processed images
- thumbnails (150x150)
- medium (800x600)
- large (1920x1080)
- WebP and JPEG versions
↓
[CloudFront CDN] → Serve to users
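The diagram leaves out how the S3 event actually reaches the Lambda. Below is a minimal infrastructure sketch using AWS CDK v2 in TypeScript; the construct names, the src/image-processor.ts entry path, and the single-bucket layout are assumptions for illustration, not a description of our exact stack. The important detail is scoping the notification to the originals/ prefix so that writes to optimized/ don't re-trigger the processor.
// infrastructure.ts — illustrative CDK v2 wiring for the S3 → Lambda trigger
import { Stack, StackProps, Duration } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as s3n from 'aws-cdk-lib/aws-s3-notifications';
import { NodejsFunction } from 'aws-cdk-lib/aws-lambda-nodejs';

export class ImagePipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // One bucket holding both the originals/ and optimized/ prefixes
    const bucket = new s3.Bucket(this, 'ImageBucket');

    // The processor Lambda shown in the next section; sharp benefits from extra memory
    // (in practice sharp's native binaries also need to be bundled, e.g. via a layer)
    const processor = new NodejsFunction(this, 'ImageProcessor', {
      entry: 'src/image-processor.ts',
      memorySize: 1024,
      timeout: Duration.seconds(30),
    });

    bucket.grantReadWrite(processor);

    // Only objects under originals/ trigger processing, so optimized/ writes don't recurse
    bucket.addEventNotification(
      s3.EventType.OBJECT_CREATED,
      new s3n.LambdaDestination(processor),
      { prefix: 'originals/' }
    );
  }
}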
Implementation
Lambda Image Processor
import sharp from 'sharp';
import { S3Event } from 'aws-lambda';
import { S3 } from 'aws-sdk';
const s3 = new S3();
interface ImageVariant {
suffix: string;
width: number;
height?: number;
quality: number;
}
const VARIANTS: ImageVariant[] = [
{ suffix: 'thumbnail', width: 150, height: 150, quality: 85 },
{ suffix: 'small', width: 400, quality: 85 },
{ suffix: 'medium', width: 800, quality: 82 },
{ suffix: 'large', width: 1920, quality: 80 },
];
export async function handler(event: S3Event) {
const bucket = event.Records[0].s3.bucket.name;
  // S3 event keys arrive URL-encoded, so decode them before using them against the API
  const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
// Download original image
const { Body } = await s3.getObject({ Bucket: bucket, Key: key }).promise();
const imageBuffer = Body as Buffer;
// Get image metadata
const metadata = await sharp(imageBuffer).metadata();
console.log(`Processing ${key}: ${metadata.width}x${metadata.height}, ${metadata.format}`);
// Generate all variants
await Promise.all([
...VARIANTS.map(variant => generateVariant(imageBuffer, variant, bucket, key)),
generateWebPVersions(imageBuffer, bucket, key),
]);
  // The original stays in originals/ (see Key Learnings: storage is cheap, re-processing is expensive);
  // delete it here instead if your retention policy calls for it
console.log(`Processed ${key} successfully`);
}
async function generateVariant(
imageBuffer: Buffer,
variant: ImageVariant,
bucket: string,
originalKey: string
): Promise<void> {
const processor = sharp(imageBuffer);
// Resize
if (variant.height) {
// Cover mode for thumbnails
processor.resize(variant.width, variant.height, {
fit: 'cover',
position: 'center',
});
} else {
// Maintain aspect ratio
processor.resize(variant.width, undefined, {
fit: 'inside',
withoutEnlargement: true,
});
}
// Optimize
const optimized = await processor
.jpeg({ quality: variant.quality, progressive: true })
.toBuffer();
  // Build the output key, normalizing the extension to .jpg since the output is always JPEG
  const outputKey = originalKey
    .replace('originals/', `optimized/${variant.suffix}/`)
    .replace(/\.(jpg|jpeg|png)$/i, '.jpg');
// Upload to S3
await s3.putObject({
Bucket: bucket,
Key: outputKey,
Body: optimized,
ContentType: 'image/jpeg',
CacheControl: 'public, max-age=31536000', // Cache 1 year
}).promise();
const savingsPercent = ((1 - optimized.length / imageBuffer.length) * 100).toFixed(1);
console.log(` ${variant.suffix}: ${(optimized.length / 1024).toFixed(0)}KB (${savingsPercent}% smaller)`);
}
async function generateWebPVersions(
imageBuffer: Buffer,
bucket: string,
originalKey: string
): Promise<void> {
await Promise.all(
VARIANTS.map(async variant => {
const processor = sharp(imageBuffer);
      if (variant.height) {
        processor.resize(variant.width, variant.height, { fit: 'cover', position: 'center' });
      } else {
        processor.resize(variant.width, undefined, { fit: 'inside', withoutEnlargement: true });
      }
const webp = await processor
        .webp({ quality: variant.quality - 5 }) // WebP holds up at slightly lower quality with the same perceived quality
.toBuffer();
const outputKey = originalKey.replace(
'originals/',
`optimized/${variant.suffix}/`
).replace(/\.(jpg|jpeg|png)$/i, '.webp');
await s3.putObject({
Bucket: bucket,
Key: outputKey,
Body: webp,
ContentType: 'image/webp',
CacheControl: 'public, max-age=31536000',
}).promise();
})
);
}
Client-Side: Responsive Images
<template>
<picture>
<!-- WebP for modern browsers -->
<source
type="image/webp"
:srcset="`
${getImageUrl('thumbnail', 'webp')} 150w,
${getImageUrl('small', 'webp')} 400w,
${getImageUrl('medium', 'webp')} 800w,
${getImageUrl('large', 'webp')} 1920w
`"
sizes="(max-width: 640px) 100vw, (max-width: 1024px) 50vw, 800px"
/>
<!-- JPEG fallback -->
<source
type="image/jpeg"
:srcset="`
${getImageUrl('thumbnail', 'jpg')} 150w,
${getImageUrl('small', 'jpg')} 400w,
${getImageUrl('medium', 'jpg')} 800w,
${getImageUrl('large', 'jpg')} 1920w
`"
sizes="(max-width: 640px) 100vw, (max-width: 1024px) 50vw, 800px"
/>
<!-- Default -->
<img
:src="getImageUrl('medium', 'jpg')"
:alt="alt"
loading="lazy"
width="800"
height="600"
/>
</picture>
</template>
<script>
export default {
props: ['imageId', 'alt'],
methods: {
getImageUrl(size, format) {
return `https://cdn.example.com/optimized/${size}/${this.imageId}.${format}`;
}
}
}
</script>
Cost Optimization: Intelligent Lifecycle
Not all images need all variants. Implement smart generation:
interface ImageContext {
type: 'avatar' | 'product' | 'gallery' | 'background';
userId: string;
}
function determineVariants(context: ImageContext): ImageVariant[] {
switch (context.type) {
case 'avatar':
// Only need small sizes
return [
{ suffix: 'thumbnail', width: 150, height: 150, quality: 85 },
{ suffix: 'small', width: 400, height: 400, quality: 85 },
];
case 'product':
// Need medium and large for product pages
return [
{ suffix: 'thumbnail', width: 150, height: 150, quality: 85 },
{ suffix: 'medium', width: 800, quality: 82 },
{ suffix: 'large', width: 1920, quality: 80 },
];
case 'gallery':
// Need all variants
return VARIANTS;
case 'background':
// Only large, high quality
return [
{ suffix: 'large', width: 1920, quality: 90 },
];
}
}
Results
Storage Savings
Before:
- Total: 2.5TB
- Average file size: 4.2MB
- Cost: $64/month
After:
- Originals: 500GB (retained for potential future use)
- Optimized: 700GB (all variants combined)
- Total: 1.2TB
- Average file size (served): 180KB
- Cost: $31/month
Savings: $396/year ($33/month, and growing with scale)
Performance Impact
- Page load time: -65% (serving 180KB instead of 4.2MB)
- Mobile data usage: -70%
- CDN bandwidth costs: -68%
- User satisfaction: +23% (faster loading)
Additional Benefits
- SEO improvement: Faster load times improved rankings
- Mobile experience: Dramatically better on slow connections
- Developer experience: Automatic optimization, no manual work
Key Learnings
- Sharp is excellent: Fast, reliable, feature-rich image processing
- Lambda works well: Serverless scales with uploads automatically
- WebP adoption is high: 95%+ of browsers support it now
- Responsive images are essential: Serve appropriate size for device
- Keep originals: Storage is cheap, re-processing is expensive
Cost Breakdown
Monthly costs:
- S3 storage: $31 (down from $64)
- Lambda invocations: ~$2 (~200K invocations)
- CloudFront: ~$15 (down from $48)
- Total: ~$48 (down from $112)
Annual savings: $768
What’s Next
Future enhancements:
- On-demand resizing: Generate variants only when requested
- AI cropping: Smart crop to focal points (faces, objects)
- Format detection: Serve AVIF to supporting browsers (a sketch follows this list)
- Lazy generation: Generate less-used sizes on first request
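For the format-detection item, sharp can already emit AVIF, so the processor could grow a third output format alongside JPEG and WebP. The helper below is a hypothetical sketch, not part of the current pipeline:
// Possible extension of the Lambda: an AVIF variant generated next to the JPEG and WebP ones
import sharp from 'sharp';

async function generateAvifVariant(
  imageBuffer: Buffer,
  width: number,
  quality: number
): Promise<Buffer> {
  return sharp(imageBuffer)
    .resize(width, undefined, { fit: 'inside', withoutEnlargement: true })
    // AVIF generally matches JPEG quality at noticeably lower quality settings
    .avif({ quality })
    .toBuffer();
}
On the client, this would only require one more <source type="image/avif"> entry above the WebP source in the <picture> element.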
Optimizing image delivery in your application? Let’s discuss strategies for cost and performance. Connect on LinkedIn.