💡 Key Takeaways

  • Astro Assets (`src/assets`) is the ideal, but introducing it to a large existing site incurs high migration costs.
  • To tackle growing build times and repository bloat (over 160MB), a pragmatic CLI approach was chosen instead.
  • An optimization script built on Sharp and fast-glob cut total size by 60% (about 96MB saved) while keeping every filename intact.
  • Accumulating improvements you can make right now beats waiting for a perfect solution.

The 100MB Wall

Every developer hits that wall at some point. It’s the moment the public/ folder turns into technical debt.

Vercel deployments slow down, git clone takes time, and Lighthouse scores turn red with “LCP (Largest Contentful Paint)” warnings. This blog, “GADGET LAB,” was no exception: over 255 images, 160MB in total. Many were uncompressed PNGs pasted in a rush of writing, or huge JPEGs straight from the camera.

“Something must be done. But there’s no time to rewrite all the Markdown articles.”

That was the starting point.

Two Paths: Ideal and Reality

In an Astro project, there are two main approaches to image optimization.

Path A: Astro Assets (src/assets) —— The Utopia

Since Astro 5.0, this has been the undisputed gold standard.

  • Automatic Format Conversion: Generates AVIF and WebP automatically according to the browser.
  • Responsive Support: Generates srcset automatically and delivers smaller images to smartphones.
  • Layout Shift Prevention: Detects image dimensions automatically and prevents CLS.

However, there is one big “catch” here: rewriting the Markdown.

// Before (public)
![image](/images/standing-desk-2026.jpg)

// After (src/assets)
import { Image } from "astro:assets";
import myKeyboard from "../../assets/standing-desk-2026.jpg";

<Image src={myKeyboard} alt="My Keyboard" />

With over 250 images and hundreds of articles, the cost of rewriting all paths and moving files was too high at this stage.

Path B: “Optimized Public” —— The Realistic Solution

I chose the pragmatic path of “keeping the public folder but optimizing only the contents.”

  • Zero Code Changes: Not a single line of existing articles (MDX) needs to be touched.
  • Immediacy: The whole site gets lighter just by running a script once.
  • Cons: You give up the advanced responsive features of Astro Assets (automatic srcset).

I decided to choose “Good (CLI compression)” and move forward, rather than letting “Perfect (Astro Assets)” become the enemy of getting anything done.
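To make sure images added later get compressed as well, one option is to wire the script into the build. The `package.json` sketch below is my own suggestion, not part of the original setup; it assumes `tsx` is installed as the TypeScript runner (e.g. `npm install --save-dev sharp fast-glob tsx`; any runner works):

```json
{
  "scripts": {
    "optimize:images": "tsx scripts/optimize_images.ts",
    "prebuild": "npm run optimize:images",
    "build": "astro build"
  }
}
```

npm runs `prebuild` automatically before `build`, so every `npm run build` compresses images first.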

Implementation: Batch Compression with Sharp

In the past, imagemin was the mainstream choice, but its maintenance has stagnated in recent years. This time, I wrote a custom script combining Sharp, the fastest image processing library for Node.js, with fast-glob, a high-speed file search library.

Full Script

This is the implemented scripts/optimize_images.ts.

import fs from "fs";
import path from "path";
import sharp from "sharp";
import fg from "fast-glob";

const PUBLIC_DIR = path.join(process.cwd(), "public/images");
const QUALITY = 80; // The line where degradation is visually imperceptible

async function optimizeImages() {
  // Blazing-fast scan with fast-glob
  const files = await fg(["**/*.{jpg,jpeg,png}"], {
    cwd: PUBLIC_DIR,
    absolute: true,
  });

  for (const file of files) {
    const originalSize = fs.statSync(file).size;
    const ext = path.extname(file).toLowerCase();

    let pipeline = sharp(file);

    // Compress in place without changing extensions
    if (ext === ".jpg" || ext === ".jpeg") {
      pipeline = pipeline.jpeg({ quality: QUALITY, mozjpeg: true });
    } else if (ext === ".png") {
      pipeline = pipeline.png({
        quality: QUALITY,
        compressionLevel: 9,
        palette: true,
      });
    }

    const buffer = await pipeline.toBuffer();

    // Overwrite only if the result is smaller (prevents double compression)
    if (buffer.length < originalSize) {
      fs.writeFileSync(file, buffer);
      console.log(`✅ Optimized: ${path.basename(file)}`);
    }
  }
}

optimizeImages();
💡
Point

mozjpeg: true (JPEG) and palette: true (PNG) are the important parts. Enabling these drastically reduces file size while maintaining image quality.

Result: The Impact of 96MB

The moment I ran the script, logs streamed rapidly through the terminal, and the processing completed in just 5 seconds.

| Item | Before | After | Reduction Rate |
| --- | --- | --- | --- |
| Total Size | 160.5 MB | 64.3 MB | **-60%** |
| Number of Files | 255 | 255 | No change |
| Build Time | Slow | Improved | - |
| Code Changes | - | None | Zero |
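As a sanity check on the table above, the reduction rate is simply `(before - after) / before`. A minimal sketch (the helper name is my own, not part of the script):

```typescript
const MB = 1024 * 1024;

// Percentage reduction, rounded to the nearest whole percent
function reductionRate(beforeBytes: number, afterBytes: number): number {
  return Math.round(((beforeBytes - afterBytes) / beforeBytes) * 100);
}

// 160.5 MB -> 64.3 MB saves about 96 MB, i.e. a 60% reduction
console.log(reductionRate(160.5 * MB, 64.3 * MB));
```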

The visual difference is almost imperceptible to the human eye. Meanwhile, dramatic improvements in Lighthouse scores and Vercel bandwidth costs can be expected.

Conclusion: Let’s “Stop the Bleeding” First

As engineers, we tend to fixate on the “latest best practices” (Astro Assets, in this case). But if that migration cost becomes a barrier and the improvement keeps getting postponed, it’s putting the cart before the horse.

First, “stop the bleeding” by optimizing the current public folder. Complete migration can be thought about slowly after that.

If your project also has a bloated public folder, by all means, try this script. In 5 seconds, your world should change.