Directory Enumeration

A Debian-first guide to identifying automated path discovery, reviewing exposed directories and files, and reducing unnecessary web exposure.

What this is

Directory enumeration is the process of discovering paths a web server does not advertise: hidden directories, files, backups, and application endpoints. It is commonly done with automated tools that try large wordlists against a target and record anything that responds.

Attackers use directory enumeration to locate admin pages, exposed backups, configuration files, uploads, APIs, and forgotten content.

What it looks like

Detect

Review recent web service logs. The log patterns in this section assume a plain, space-delimited access-log format; Caddy v2 writes structured JSON access logs by default, so adapt the grep/awk patterns (or parse the JSON with jq) to match your configuration.

sudo journalctl -u caddy --since "1 hour ago"

Look for repeated 404 responses

sudo grep ' 404 ' /var/log/caddy/*.log 2>/dev/null | tail -n 100

Count source IPs generating many requests

sudo awk '{print $1}' /var/log/caddy/*.log 2>/dev/null | sort | uniq -c | sort -nr | head
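To make the pattern concrete, here is a hedged sketch against a fabricated sample in common log format (the file, IPs, and paths are invented; field positions differ if your logs are JSON or another format):

```shell
# Hypothetical sample log; real entries come from /var/log/caddy/*.log
cat <<'EOF' > /tmp/sample_access.log
203.0.113.5 - - [01/Jan/2025:10:00:00 +0000] "GET /admin HTTP/1.1" 404 153
203.0.113.5 - - [01/Jan/2025:10:00:01 +0000] "GET /backup.zip HTTP/1.1" 404 153
203.0.113.5 - - [01/Jan/2025:10:00:02 +0000] "GET /old HTTP/1.1" 404 153
198.51.100.9 - - [01/Jan/2025:10:00:03 +0000] "GET /index.html HTTP/1.1" 200 1042
EOF

# Count 404s per source IP: in this format the IP is field 1, the status field 9
awk '$9 == 404 {print $1}' /tmp/sample_access.log | sort | uniq -c | sort -nr
```

A single address producing many 404s in a short window is the classic enumeration signature.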

Search for common enumeration targets

sudo grep -Ei "/admin|/backup|/uploads|/old|/test|/dev|/private|/config|\.zip|\.bak|\.old" /var/log/caddy/*.log 2>/dev/null | tail -n 100

Review what actually exists under the web root

sudo find /var/www -maxdepth 4 -type f | less

Contain

Block a clearly abusive source IP

sudo ufw deny from <IP_ADDRESS>
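If review surfaces several abusive addresses, a small hedged loop can apply the same rule to each. The list file is a stand-in, and the echo only previews the commands; remove it to actually apply the rules:

```shell
# Hypothetical list of abusive IPs collected during log review (one per line)
printf '203.0.113.5\n198.51.100.9\n' > /tmp/abusive_ips.txt

# Preview the firewall rules; drop the leading 'echo' to apply them for real
while read -r ip; do
  echo "sudo ufw deny from $ip"
done < /tmp/abusive_ips.txt
```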

Find content that should not be public (review the results, then remove or relocate deliberately)

sudo find /var/www -type f \( -name "*.bak" -o -name "*.old" -o -name "*.zip" \)
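Rather than deleting matches outright, one cautious sketch moves them into a quarantine directory so they can still be inspected or restored. A throwaway demo directory stands in for /var/www here:

```shell
# Hypothetical demo web root standing in for /var/www
mkdir -p /tmp/demo-www/site
touch /tmp/demo-www/site/index.html /tmp/demo-www/site/db.bak /tmp/demo-www/site/old-release.zip

# Quarantine rather than delete: matched files stay available for review
mkdir -p /tmp/webroot-quarantine
find /tmp/demo-www -type f \( -name "*.bak" -o -name "*.old" -o -name "*.zip" \) \
  -exec mv {} /tmp/webroot-quarantine/ \;
```

Live content (index.html) stays in place; only the backup artifacts are moved out of the served tree.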

Restrict sensitive paths in web server configuration

# Protect internal paths, backups, and administrative locations that should not be public
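As a sketch, assuming a Caddy v2 Caddyfile (the site address, root, and path list are placeholders to adapt), a named matcher can refuse enumeration-friendly paths before the file server handles them:

```
example.com {
	root * /var/www/site
	file_server

	# Refuse internal paths and leftover artifacts outright
	@blocked path /admin* /backup* /private* *.bak *.old *.zip
	respond @blocked 403
}
```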

Recover

Review discovered files and directories manually

sudo find /var/www -maxdepth 4 \( -type f -o -type d \) | less

Check for exposed backups or archives

sudo find /var/www -type f \( -name "*.zip" -o -name "*.tar" -o -name "*.bak" -o -name "*.old" \)

Review whether enumeration led to successful access

sudo grep -Ei "admin|login|upload|api" /var/log/caddy/*.log 2>/dev/null | tail -n 100
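The key question is whether any sensitive path returned a success status. A hedged sketch against a fabricated sample (again assuming space-delimited logs, with the request path in field 7 and the status in field 9):

```shell
# Hypothetical sample log; real data comes from /var/log/caddy/*.log
cat <<'EOF' > /tmp/sample_access2.log
203.0.113.5 - - [01/Jan/2025:10:00:00 +0000] "GET /admin HTTP/1.1" 404 153
203.0.113.5 - - [01/Jan/2025:10:00:05 +0000] "GET /uploads/shell.php HTTP/1.1" 200 512
198.51.100.9 - - [01/Jan/2025:10:00:06 +0000] "GET /index.html HTTP/1.1" 200 1042
EOF

# Sensitive paths answered with 2xx deserve immediate manual review
awk '$7 ~ /admin|login|upload|api|backup/ && $9 ~ /^2/ {print $1, $7, $9}' /tmp/sample_access2.log
```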

Recovery means removing what should not be public and confirming that directory discovery did not lead to real access or further abuse.

Prevent

Option 1 — Remove old content and backups

sudo find /var/www -type f | less
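Sorting by modification time helps surface stale content worth pruning. A sketch using GNU find's -printf (available on Debian), run against a demo directory standing in for /var/www:

```shell
# Hypothetical demo web root with one stale file and one current one
mkdir -p /tmp/demo-prune
touch -d '2020-01-01' /tmp/demo-prune/legacy.html
touch /tmp/demo-prune/current.html

# List files oldest-first; long-untouched entries are removal candidates
find /tmp/demo-prune -type f -printf '%T@ %p\n' | sort -n | awk '{print $2}'
```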

Option 2 — Restrict access to sensitive paths

# Use your web server configuration to block internal-only paths
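One hedged option is to require credentials instead of blocking outright. Assuming Caddy v2 (releases from 2.8 spell the directive basic_auth; older ones use basicauth), with a placeholder hash generated by `caddy hash-password`:

```
example.com {
	root * /var/www/site
	file_server

	# Require credentials for the internal-only area
	basic_auth /admin/* {
		admin <HASHED_PASSWORD>
	}
}
```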

Option 3 — Separate public content from internal tools

sudo ss -tulnp

Internal tools and admin panels should bind to 127.0.0.1 or a private interface; anything listening on 0.0.0.0 is reachable for enumeration from the outside.

Option 4 — Review logs regularly for unusual path discovery

sudo journalctl -u caddy --since "24 hours ago"

Option 5 — Keep applications and plugins updated

sudo apt update
sudo apt upgrade

Consider the unattended-upgrades package so security updates are applied automatically.

Optional Tools & Hosting

Manual content review should come first. Future recommendations may include low-cost options that make log review or isolation easier, but the primary recommendation is to keep public content minimal and intentional.

Notes

Environment Note

All commands shown are based on Debian-based systems unless otherwise noted.