A Debian-first guide to identifying automated path discovery, reviewing exposed directories and files, and reducing unnecessary web exposure.
Directory enumeration is the process of discovering hidden paths, directories, files, backups, and application endpoints on a web server. It is commonly done with automated tools that try large wordlists against a target to find anything useful.
Attackers use directory enumeration to locate admin pages, exposed backups, configuration files, uploads, APIs, and forgotten content.
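To make the technique concrete, here is a minimal sketch of what enumeration tools do under the hood: probe a list of candidate paths and record the HTTP status of each. `TARGET` and the word list are placeholders; only run this against servers you own.

```shell
# Probe a small wordlist of paths and print the HTTP status for each.
# Real tools do the same thing with wordlists of tens of thousands of entries.
TARGET="https://example.com"   # placeholder: your own server
for path in admin backup uploads old test config; do
  code=$(curl -s -o /dev/null -w '%{http_code}' "$TARGET/$path")
  echo "$code $TARGET/$path"
done
```

Any status other than 404 (a 200, 301, or 403) tells the prober that something exists at that path, which is why reviewing your own 404 patterns is so informative.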
# Review recent web server activity in the journal
sudo journalctl -u caddy --since "1 hour ago"

# Surface recent 404 responses; a burst of 404s is a common sign of enumeration
sudo grep ' 404 ' /var/log/caddy/*.log 2>/dev/null | tail -n 100

# Count requests per client IP to spot unusually active sources
sudo awk '{print $1}' /var/log/caddy/*.log 2>/dev/null | sort | uniq -c | sort -nr | head

# Check for probes against common sensitive paths and backup extensions
sudo grep -Ei "/admin|/backup|/uploads|/old|/test|/dev|/private|/config|\.zip|\.bak|\.old" /var/log/caddy/*.log 2>/dev/null | tail -n 100
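The checks above can be combined into a single pipeline that surfaces the classic enumeration signature: one address requesting many distinct paths that do not exist. This sketch assumes your access logs are in Common Log Format (client IP in field 1, path in field 7, status in field 9); Caddy logs JSON by default, so adjust the fields if you have not configured a common-log output.

```shell
# Count distinct 404'd paths per client IP. A single IP with many unique
# missing paths is very likely running a wordlist against the server.
sudo awk '$9 == 404 {print $1, $7}' /var/log/caddy/*.log 2>/dev/null \
  | sort -u \
  | awk '{seen[$1]++} END {for (ip in seen) print seen[ip], ip}' \
  | sort -nr | head
```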
# Review what is actually present in the webroot
sudo find /var/www -maxdepth 4 -type f | less
# Block an abusive source address (replace <IP_ADDRESS> with the offending IP)
sudo ufw deny from <IP_ADDRESS>
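When several addresses are involved, it can help to keep them in a list and generate the block rules from it. A sketch, assuming a hypothetical `blocked_ips.txt` file you maintain yourself, with one address per line:

```shell
# Print (rather than run) one ufw deny rule per listed address, so the
# commands can be reviewed before applying them (e.g. by piping to sh).
# blocked_ips.txt is a hypothetical file, not something ufw provides.
awk 'NF {print "sudo ufw deny from " $1}' blocked_ips.txt
```

Generating the commands first, instead of running ufw inside a loop, gives you a chance to spot a typo before it becomes a firewall rule.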
# Locate backup copies that should not be web-accessible
sudo find /var/www -type f \( -name "*.bak" -o -name "*.old" -o -name "*.zip" \)
# Protect internal paths, backups, and administrative locations that should not be public
sudo find /var/www -maxdepth 4 \( -type f -o -type d \) | less
# Find archives and backup files left in the webroot
sudo find /var/www -type f \( -name "*.zip" -o -name "*.tar" -o -name "*.bak" -o -name "*.old" \)
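Once you have reviewed the find output, the same match can move those files out of the webroot instead of leaving them served. A sketch; `/srv/quarantine` is a hypothetical destination directory, and files with identical names will overwrite each other there, so check the listing first.

```shell
# Move stray archives and backups out of the webroot into a private
# directory that the web server does not serve.
sudo mkdir -p /srv/quarantine
sudo find /var/www -type f \
  \( -name "*.zip" -o -name "*.tar" -o -name "*.bak" -o -name "*.old" \) \
  -exec mv -t /srv/quarantine {} +
```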
# Check whether sensitive endpoints were actually requested
sudo grep -Ei "admin|login|upload|api" /var/log/caddy/*.log 2>/dev/null | tail -n 100
Recovery means removing what should not be public and confirming that directory discovery did not lead to real access or further abuse.
# Re-inventory the webroot after cleanup
sudo find /var/www -type f | less
# Use your web server configuration to block internal-only paths
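For Caddy, that blocking can live in the Caddyfile itself. A minimal sketch, assuming Caddy v2; `example.com`, the path list, and the matcher name `@internal` are placeholders to adapt to your site:

```caddyfile
example.com {
    # Named matcher for paths that should never be publicly reachable
    @internal path /admin* /config* /private* /backup*
    respond @internal 403

    root * /var/www/html
    file_server
}
```

A flat 403 leaks less than serving the content, though it still confirms the path exists; restrict it further (for example with basic auth or client-IP matching) if the paths must stay usable internally.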
# Confirm which services are listening and on which ports
sudo ss -tulnp

# Review the last day of activity for signs of continued probing
sudo journalctl -u caddy --since "24 hours ago"
# Keep the system patched
sudo apt update
sudo apt upgrade
Manual content review should come first. Low-cost tooling that simplifies log review or host isolation can help later, but the primary recommendation is to keep public content minimal and intentional.
All commands shown are based on Debian-based systems unless otherwise noted.