I run the dokk webserver locally, and from it I generate a static copy of the website that I then upload to the live server. My goal is to have the site serve only static files.
The way I generate the static website is by “mirroring” it with wget:
wget --mirror --page-requisites --adjust-extension --execute robots=off localhost:8080

and then I upload all the .html pages. My only issue with wget is that it downloads one URL at a time, so the job takes between 1 and 2 hours to complete. I looked at aria2 for parallel downloads, but it only takes a list of URLs as input instead of following links. What tools exist for mirroring a website in parallel?