Is Docker Eating Up Your Storage?
Fx64b
2025-07-16
So there I was, working on my video-archiver project on my Manjaro Linux laptop, when suddenly my system started throwing "disk space full" errors.
Weird. I've got 1TB of storage, and I definitely haven't downloaded that many Linux ISOs... right?
A quick `df -h` revealed the brutal truth: 2GB free out of 1TB, with only 85GB showing as "used" in my file manager. Something was definitely off.
The Storage Mystery
My first thought was partition issues. I run a dual-boot setup, so maybe something went wrong with the disk layout? After a few hours of running diagnostic commands and diving into partition tables, I found... nothing unusual.
That's when it hit me: I'd been working with Docker quite a bit lately. Could it be...?
Turns out, it absolutely was. Docker had accumulated a whopping ~250GB of build cache.
How I Ended Up Here
The problem was my video-archiver project. In my enthusiasm to add Docker support, I implemented what I can only describe as "quick and dirty" containerization:
- Unoptimized Docker images
- No `.dockerignore` file
- `node_modules` being copied into images (and cached)
- Downloaded videos and SQLite databases in the same directory getting copied into images
- Multiple rebuilds creating layers upon layers of cached data
Basically, I did everything you're not supposed to do with Docker, and my disk space paid the price.
Diagnosing Docker Storage Issues
If you suspect Docker is eating your storage, here's how to investigate:
Check Overall Docker Disk Usage
```bash
docker system df
```
This shows you a breakdown of space used by:
- Images
- Containers
- Build cache
- Volumes
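On my machine the output made the culprit obvious at a glance. The column layout below is what `docker system df` prints; the numbers are illustrative, reconstructed from my situation rather than a saved transcript:

```
TYPE            TOTAL     ACTIVE    SIZE      RECLAIMABLE
Images          14        3         9.8GB     7.1GB (72%)
Containers      6         1         150MB     140MB (93%)
Local Volumes   5         2         3.2GB     1.4GB (43%)
Build Cache     412       0         249.6GB   249.6GB
```

The RECLAIMABLE column is the one to watch: that's space Docker can free without touching anything in use.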
Detailed Build Cache Analysis
```bash
docker buildx du
```
This command shows detailed information about your build cache, including which cache entries are taking up the most space.
Monitor Images
```bash
docker images --format "table {{.Repository}}\t{{.Tag}}\t{{.Size}}" | head -10
```
Lists your images with their sizes so you can spot what's consuming space.
The Diagnostic Script
I created this handy script to check Docker storage usage:
```bash
#!/bin/bash
echo "Docker Resource Monitor"
echo "=========================="
echo "System Usage:"
docker system df
echo ""
echo "Build Cache:"
docker buildx du 2>/dev/null || echo "BuildKit not available"
echo ""
echo "Image Stats:"
docker images --format "table {{.Repository}}\t{{.Tag}}\t{{.Size}}" | head -10
```
Manual Cleanup Solutions
Once you've confirmed Docker is the problem, here's how to fix it:
The Nuclear Option
```bash
docker system prune -a --volumes
```
Warning: This removes everything - unused images, containers, networks, volumes, and build cache. Only use this if you're okay with rebuilding everything.
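Before pulling that trigger, it's worth previewing what you'd actually lose. The verbose flag itemizes every image, container, volume, and build-cache record with its individual size (the `command -v` guard just makes the snippet a harmless no-op on a machine without Docker):

```shell
# Per-object breakdown of everything `docker system prune -a --volumes`
# could delete: each image, container, volume, and build-cache record.
if command -v docker >/dev/null 2>&1; then
  docker system df -v || true
fi
```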
Surgical Cleanup
I prefer a more controlled approach:
```bash
#!/bin/bash
echo "Starting Docker cleanup..."
echo "Current Docker disk usage:"
docker system df

echo "Removing dangling images..."
docker image prune -f

echo "Cleaning build cache..."
docker builder prune -f

echo "Removing unused containers..."
docker container prune -f

echo "Cleanup complete!"
echo "New Docker disk usage:"
docker system df
```
Selective Build Cache Cleanup
For a less aggressive approach, you can remove build cache older than a certain time:
```bash
docker builder prune -f --filter 'until=48h'
```
This removes build cache older than 48 hours, which is what I added to my run script as a quick fix.
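Age isn't the only criterion, either. `docker builder prune` also accepts a size cap, which tells BuildKit to keep a bounded amount of cache rather than deleting purely by date (the 10GB figure here is an arbitrary example, not a recommendation; the guard makes the snippet a no-op where Docker isn't installed):

```shell
# Keep build cache, but never let it exceed roughly 10GB in total
if command -v docker >/dev/null 2>&1; then
  docker builder prune -f --keep-storage 10GB || true
fi
```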
Automated Solutions
Cron Job for Regular Cleanup
Add this to your crontab to run cleanup weekly:
```bash
# Clean Docker build cache weekly
0 2 * * 0 docker builder prune -f --filter 'until=72h'
```
Docker Compose with Cleanup
For development environments, you could include cleanup in the workflow:
```yaml
version: '3.8'
services:
  app:
    build: .
    # ... other config

  # Cleanup service - needs the Docker CLI and access to the host's
  # Docker socket in order to prune the build cache
  cleanup:
    image: docker:cli
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    command: docker builder prune -f --filter 'until=24h'
    profiles:
      - cleanup
```
Run cleanup with: `docker-compose --profile cleanup up cleanup`
Prevention is Better Than Cure
Here's how to avoid this mess in the first place:
1. Use .dockerignore
Create a `.dockerignore` file to exclude unnecessary files:
```
node_modules
npm-debug.log
.git
.gitignore
README.md
.env
.nyc_output
coverage
.parcel-cache
dist
*.db
*.sqlite
downloads/
```
2. Multi-stage Builds
Use multi-stage builds to reduce final image size:
```dockerfile
# Build stage
FROM node:18-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production

# Production stage
FROM node:18-alpine AS production
WORKDIR /app
COPY --from=builder /app/node_modules ./node_modules
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```
3. Configure Docker Daemon
Set build cache limits in the Docker daemon configuration (typically `/etc/docker/daemon.json`):
```json
{
  "builder": {
    "gc": {
      "enabled": true,
      "defaultKeepStorage": "20GB",
      "policy": [
        {
          "keepStorage": "10GB",
          "filter": [
            "type==source.local",
            "type==exec.cachemount",
            "type==source.git.checkout"
          ]
        }
      ]
    }
  }
}
```
4. Monitor Regularly
Set up monitoring to catch storage issues early:
```bash
# Add to your shell profile (e.g., ~/.bashrc or ~/.zshrc)
alias docker-usage="docker system df && echo '' && docker buildx du 2>/dev/null || echo 'BuildKit not available'"
```
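For something more proactive than an alias, here's a small sketch that warns when the build cache crosses a threshold. It leans on `docker system df --format`, which can emit one type/size pair per line; the 50GB threshold and the `to_bytes` helper are my own choices, and the snippet is guarded so it degrades to nothing where Docker isn't available:

```shell
#!/bin/bash
# Warn when Docker's build cache grows past a threshold (50GB is arbitrary).

# Convert Docker's human-readable sizes ("249.6GB", "85MB", "512B") to bytes.
to_bytes() {
  awk -v v="$1" 'BEGIN {
    if      (v ~ /TB$/) printf "%.0f", substr(v, 1, length(v) - 2) * 1e12
    else if (v ~ /GB$/) printf "%.0f", substr(v, 1, length(v) - 2) * 1e9
    else if (v ~ /MB$/) printf "%.0f", substr(v, 1, length(v) - 2) * 1e6
    else if (v ~ /kB$/) printf "%.0f", substr(v, 1, length(v) - 2) * 1e3
    else if (v ~ /B$/)  printf "%.0f", substr(v, 1, length(v) - 1)
    else                print 0
  }'
}

THRESHOLD=$((50 * 1000 * 1000 * 1000))  # 50GB, in Docker's decimal units

if command -v docker >/dev/null 2>&1; then
  # One "<type>\t<size>" line per storage type; pick out the Build Cache row.
  cache=$(docker system df --format '{{.Type}}\t{{.Size}}' 2>/dev/null \
          | awk -F'\t' '$1 == "Build Cache" {print $2}')
  if [ -n "$cache" ] && [ "$(to_bytes "$cache")" -gt "$THRESHOLD" ]; then
    echo "Build cache is at $cache - consider 'docker builder prune'"
  fi
fi
```

Drop that into cron alongside the weekly prune and you'll hear about a runaway cache before `df -h` does.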
Lessons Learned
- Docker's build cache is powerful but can be storage-hungry
- Always use `.dockerignore` in projects
- Regular cleanup prevents emergency situations
- Multi-stage builds are your friend
- Monitor your Docker usage
The 250GB build cache was definitely a learning experience. Now I'm more careful about what gets copied into my Docker images, and I have automated cleanup processes in place.
Quick Reference
Diagnostic Commands:
- `docker system df` - Overall usage
- `docker buildx du` - Build cache details
- `docker images --format "table {{.Repository}}\t{{.Tag}}\t{{.Size}}"` - Image sizes

Cleanup Commands:
- `docker system prune -a --volumes` - Nuclear option
- `docker builder prune -f` - Clean build cache
- `docker image prune -f` - Remove dangling images
- `docker container prune -f` - Remove stopped containers

Prevention:
- Use `.dockerignore`
- Implement multi-stage builds
- Set up automated cleanup
- Monitor regularly
P.S. - Yes, I probably should have noticed the 250GB of missing space sooner. In my defense, I was very focused on getting the video-archiver features working. Sometimes you get so deep into building something that you forget to check if you're accidentally filling up your hard drive in the process.