
How to Compress Images Without Losing Quality (2026 Guide)

Technical guide to compressing JPG, PNG and WebP up to 80% with no visible loss. Algorithms, formats, common mistakes, and private tools that don't upload your photos to the cloud.

DuneTools · 11 min read

Images typically represent 70–80% of a web page’s weight. If your site takes more than 3 seconds to load, the problem is almost certainly uncompressed images. For modern SEO (Core Web Vitals), compressing images is not optional: it’s the difference between ranking and not ranking.

This guide explains how to compress images without losing visible quality, which format to choose for each case, why most “free online compressor” tools degrade quality unnecessarily, and how to do it without your photos travelling to anyone else’s server.

How image compression actually works

There are two fundamentally different types of compression:

Lossless compression

The image is reorganised internally to take up less space, but every pixel is preserved exactly. When decompressed, the result is bit-for-bit identical to the original. Used by PNG and lossless WebP.

The savings are limited (typically 20–50%) because there’s a mathematical floor: you can only “rearrange” the data so much. But there’s zero quality loss, so you can repeat it indefinitely without degradation.
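The lossless principle is easy to see with Python’s zlib, the same DEFLATE family of compression that PNG uses internally (a stand-alone illustration, not an image codec):

```python
import zlib

# Raw "pixel" data: a repetitive byte pattern, like flat areas in a PNG.
original = bytes(range(256)) * 200  # 51,200 bytes

compressed = zlib.compress(original, level=9)
restored = zlib.decompress(compressed)

# Lossless: the round trip is bit-for-bit identical to the input...
assert restored == original
# ...and repetitive data shrinks substantially.
print(f"{len(original)} -> {len(compressed)} bytes")
```

Because the round trip is exact, you can repeat compress/decompress forever without degradation, exactly as the text above describes.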

Lossy compression

The algorithm deliberately discards information that the human eye barely notices: subtle gradients in similar colours, fine details in shadows, tiny variations in flat areas. Used by JPG, WebP-lossy, and AVIF.

Modern lossy savings are dramatic: 60–90% reduction with no visible loss. The catch is that each new compression discards more information, so you can’t recompress repeatedly without visible degradation.
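By contrast, a lossy round trip is not reversible. A minimal sketch with Pillow (assumed here purely for illustration; any JPEG encoder behaves the same way) shows that decoding a JPEG never returns the exact original pixels:

```python
import io
import random
from PIL import Image

# Build a noisy 128x128 RGB image; noise is the hardest content to compress.
random.seed(0)
img = Image.new("RGB", (128, 128))
img.putdata([(random.randrange(256),) * 3 for _ in range(128 * 128)])

buf = io.BytesIO()
img.save(buf, format="JPEG", quality=75)
decoded = Image.open(io.BytesIO(buf.getvalue())).convert("RGB")

# Lossy: the decoded pixels differ from the originals.
diff = sum(abs(a - b)
           for p, q in zip(img.getdata(), decoded.getdata())
           for a, b in zip(p, q))
assert diff > 0
print(f"total pixel difference after one JPEG round trip: {diff}")
```

Each additional save-reload cycle accumulates more of these differences, which is why the original should always be kept.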

The 75% rule for JPG

The most useful piece of practical advice on image compression: JPG at 75–85% quality is indistinguishable from the original to the human eye, while reducing the weight by 60–80%.

This is the “sweet spot” used by Facebook, Google, Apple, and any company that ships images to billions of users daily. Below 70% you start to see banding artefacts in gradients and “blocking” in flat areas; above 90% you add weight with no visible benefit.

Practical recommendation: 80% is the safe default. If the photo will be printed at high resolution, use 90%. If it’s for a thumbnail in a feed, you can go down to 70%.
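The size side of the trade-off is easy to verify yourself. A minimal Pillow sketch (Pillow and the synthetic gradient are assumptions; real photos behave similarly):

```python
import io
from PIL import Image

# A smooth 800x600 gradient: photo-like content, prone to banding.
img = Image.new("RGB", (800, 600))
img.putdata([(x * 255 // 800, y * 255 // 600, 128)
             for y in range(600) for x in range(800)])

def jpeg_bytes(quality):
    """Encode the test image as JPEG and return the file size in bytes."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    return len(buf.getvalue())

sizes = {q: jpeg_bytes(q) for q in (95, 85, 80, 70)}
for q, n in sizes.items():
    print(f"quality {q}: {n} bytes")

# Higher quality never yields a smaller file for the same image.
assert sizes[95] >= sizes[80]
```

Run this on one of your own photos and compare the outputs at 100% zoom: the visible difference between 95 and 80 is usually nil, while the byte difference is large.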

Modern algorithms that change the game

“Compress image” tools that still rely on the same baseline JPG encoder from 25 years ago are technologically obsolete. Three modern encoder families produce significantly better results:

MozJPEG (Mozilla)

A drop-in replacement for the standard JPG encoder that produces 20–30% smaller files at the same visible quality, while staying backwards compatible with every browser and device that reads JPG. Used by TinyPNG, Squoosh, Cloudflare Image Resizing, and by DuneTools’ Compress Image since 2026.

Oxipng / pngquant

Rather than just rearranging the PNG, these tools analyse content and reduce the colour palette intelligently. Up to 80% savings on PNGs with limited palettes (logos, screenshots, illustrations) without losing visual quality. Photo-style PNGs see less benefit.
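The palette-reduction idea behind pngquant can be sketched with Pillow’s `quantize` (an illustration of the concept, not pngquant’s actual algorithm):

```python
import io
from PIL import Image

# A "logo-like" image: four flat colour blocks on white.
img = Image.new("RGB", (400, 400), (255, 255, 255))
for i, colour in enumerate([(200, 30, 30), (30, 120, 200), (30, 160, 60)]):
    img.paste(colour, (i * 100, 0, i * 100 + 100, 400))

def png_bytes(im):
    """Encode as PNG and return the file size in bytes."""
    buf = io.BytesIO()
    im.save(buf, format="PNG", optimize=True)
    return len(buf.getvalue())

truecolour = png_bytes(img)
# Reduce 24-bit colour to an indexed palette, the core pngquant trick.
quantized = img.quantize(colors=16)
assert quantized.mode == "P"
print(f"24-bit PNG: {truecolour} bytes, 16-colour PNG: {png_bytes(quantized)} bytes")
```

On flat-colour content the indexed version stores one palette index per pixel instead of three colour bytes, which is where the large savings on logos and screenshots come from.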

libwebp / libavif (WASM)

WebAssembly versions of Google’s reference encoders. They allow precise control over quality, tiling, and chroma subsampling. WebP at quality 80 typically saves 25–30% vs an equivalent JPG.
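Pillow wraps libwebp, so the JPG-vs-WebP comparison can be sanity-checked locally (a rough sketch; exact sizes and the savings percentage depend on the image):

```python
import io
from PIL import Image

# A smooth gradient as a photo-like test subject.
img = Image.new("RGB", (800, 600))
img.putdata([(x * 255 // 800, y * 255 // 600, 96)
             for y in range(600) for x in range(800)])

jpg, webp = io.BytesIO(), io.BytesIO()
img.save(jpg, format="JPEG", quality=80)
img.save(webp, format="WEBP", quality=80)  # lossy WebP via libwebp

print(f"JPEG q80: {len(jpg.getvalue())} bytes, "
      f"WebP q80: {len(webp.getvalue())} bytes")
```

Note that quality 80 in JPEG and quality 80 in WebP are not the same perceptual scale; for serious comparisons, match visible quality first and then compare sizes.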

DuneTools integrates these algorithms directly via WebAssembly: they all run in your browser, with no upload and no server in the equation.

Resize: the big lever everyone ignores

The biggest compression lever is not any algorithm: it’s resizing. A 4032×3024 px iPhone photo (typical for a modern phone) is 12 megapixels. For Instagram (1080×1080), Twitter (1200×675), or a normal webpage (max 1920 px wide), you only need 2–4 megapixels.

Resizing from 4032 px to 1920 px reduces the area by 77%, and the file weight tracks the area. A 4 MB photo becomes 900 KB just by resizing, before applying any compression.

Practical rule: before compressing for the web, resize the image to its maximum real display size. If it’s never going to be shown bigger than 1920 px on screen, don’t ship 4032 px. The user won’t notice and your page will load 4× faster.
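The arithmetic above can be checked with a minimal Pillow sketch (Pillow is an assumption for illustration; the solid-colour image stands in for a real photo):

```python
from PIL import Image

# A stand-in for a 12-megapixel phone photo.
photo = Image.new("RGB", (4032, 3024), (90, 140, 200))

before = photo.width * photo.height
photo.thumbnail((1920, 1920), Image.LANCZOS)  # fit within 1920 px, keep aspect
after = photo.width * photo.height

assert photo.size == (1920, 1440)
print(f"area reduced by {100 * (1 - after / before):.0f}%")  # ~77%
```

`thumbnail` only ever shrinks, which is exactly the behaviour you want here: it never upscales an image that is already small enough.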

The DuneTools “Web” preset does exactly this: 1920 px maximum side, JPG at 75%. For most photos you get an 85–90% reduction with no visible loss.

When to use each format

Format   Best for                                         Maximum compression   Transparency
JPG      Real photos, faces, landscapes                   70–80% (mozjpeg)      No
PNG      Logos, screenshots, line drawings                30–50% (oxipng)       Yes
WebP     Almost everything (modern alternative to both)   80–90%                Yes
AVIF     Maximum reduction (the future)                   90–95%                Yes

Quick decision tree:

  1. Does it need transparency? → PNG or WebP-lossless.
  2. Is it a photo? → WebP (or JPG if maximum compatibility).
  3. Modern site (post-2022)? → WebP for everything.
  4. Need bleeding-edge performance? → AVIF + WebP fallback.
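The decision tree can be expressed as a small helper (a simplified heuristic of the list above, not a standard; the function name is hypothetical):

```python
def pick_format(needs_transparency: bool, is_photo: bool,
                modern_site: bool = True, bleeding_edge: bool = False) -> str:
    """Pick an output format following the decision tree (simplified sketch)."""
    if bleeding_edge:
        return "AVIF (with WebP fallback)"
    if needs_transparency:
        # WebP handles transparency; PNG is the maximum-compatibility fallback.
        return "WebP" if modern_site else "PNG"
    if is_photo:
        return "WebP" if modern_site else "JPG"
    # Graphics, logos, screenshots.
    return "WebP" if modern_site else "PNG"

assert pick_format(needs_transparency=False, is_photo=True, modern_site=False) == "JPG"
assert pick_format(needs_transparency=True, is_photo=False) == "WebP"
print(pick_format(needs_transparency=False, is_photo=True))  # "WebP"
```

The pattern in the code makes the article’s point explicit: on a modern site, WebP is the answer in almost every branch.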

Privacy: the silent problem with online tools

When you upload a photo to “comprimir-imagen.com” or “freecompressor.org”, this happens:

  1. Your photo travels through the internet to their server.
  2. The server keeps a copy for an unspecified time (check their terms: most don’t even claim “we delete after 24 h”, and even when they do, there’s no audit).
  3. The server compresses and returns the result.
  4. Your photo is now on third-party hardware.

For a meme, that’s fine. For a personal photo, an unreleased product, professional client work, a medical document, or NDA content, it’s an irreversible leak.

The alternative is local processing with WebAssembly (WASM), a web standard that runs native compression code inside your browser. DuneTools Compress Image uses mozjpeg, oxipng, and libwebp compiled to WASM: the photo never travels anywhere, no copies are kept, and closing the tab erases everything.

Practical rule: if the website shows an “Uploading…” progress bar, your photo is leaving your machine. If it processes instantly after dropping the file, it’s local.

Common mistakes (and how to avoid them)

Mistake 1: Re-compressing an already compressed file. A JPG saved at 85% then re-saved at 85% is not the same as the original at 85%. Each round adds artefacts. Solution: always keep the original PNG/RAW and compress only from it.

Mistake 2: Bumping quality “just in case”. A JPG at 95% typically weighs about twice as much as one at 85%, with no visible difference. Solution: 80% is the safe default; only go up if you’ll print at high resolution.

Mistake 3: PNG for photos. PNG is lossless: it doesn’t discard data. For a photo with millions of unique colours, it’s typically 3–5× heavier than an equivalent JPG, with no visible quality benefit. Solution: PNG only for logos, screenshots, and graphics with flat areas.

Mistake 4: Ignoring resize. Compressing a 4032 px photo to 80% quality leaves you with a heavy 4032 px photo. Solution: resize first to actual display size, then compress.

Mistake 5: Trusting the first website you find. Many free tools watermark, slow you down, or are paid-service frontends. Solution: verify it processes locally (look for “WebAssembly” or “in-browser”) and compare the result against a control.

Real-world workflow

The typical professional workflow for compressing images for the web:

  1. Start from the original (RAW, full PNG, max-quality JPG).
  2. Resize to maximum real display size (1920 px for hero, 800 px for thumbnails).
  3. Choose format: photos → WebP/JPG, graphics → WebP/PNG, transparent → WebP/PNG.
  4. Compress with mozjpeg/libwebp at quality 75–85.
  5. Verify visually at 100% zoom, especially in flat areas (sky, faces).
  6. Publish, save the original on your local backup.
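Steps 2–4 of the workflow can be sketched as a single function (an illustrative Pillow sketch under assumed defaults, not DuneTools’ actual WASM pipeline):

```python
import io
from PIL import Image

def web_preset(img: Image.Image, max_side: int = 1920,
               quality: int = 75) -> bytes:
    """Resize-then-compress, mirroring workflow steps 2-4 (a sketch)."""
    out = img.convert("RGB")                       # JPEG has no alpha channel
    out.thumbnail((max_side, max_side), Image.LANCZOS)
    buf = io.BytesIO()
    out.save(buf, format="JPEG", quality=quality, optimize=True)
    return buf.getvalue()

# A stand-in for an original 12-megapixel photo.
original = Image.new("RGB", (4032, 3024), (120, 160, 90))
result = web_preset(original)
print(f"{len(result)} bytes at {Image.open(io.BytesIO(result)).size}")
```

The order matters: resizing first means the encoder compresses 2.8 megapixels instead of 12, which is both faster and smaller.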

DuneTools Compress Image automates steps 2–4 with the “Web” preset (1920 px + 75% + auto-format) and keeps everything 100% local on your device.

Executive summary

For 95% of cases in 2026:

  1. JPG quality 80 (or WebP) is the best size/quality ratio.
  2. Resize before compressing; that’s where the real reduction is.
  3. PNG only for graphics with flat areas or transparency.
  4. Don’t re-compress already compressed files.
  5. Use tools that process locally if the photo is sensitive.

Compressing images well isn’t an art; it’s mechanics. Following these five rules separates a fast site that ranks on Google from a heavy site that loses visitors and revenue.