Downloading a large site with wget

I was once trying to mirror a whole site. The transfer broke occasionally, and I needed to restart wget and fetch the rest of the files.
So here is what worked for me.

wget --convert-links --random-wait -N -r -p -E -e robots=off -U mozilla

Another option is this more comprehensive one, which runs in the background (-b), quietly (-q), and resumes partial downloads (-c):

wget --convert-links -b -q -c -N -r -p -E -e robots=off -U mozilla
What is important is the -N flag (timestamping): wget checks whether the remote file is newer than the copy already downloaded. If the local file is missing, or its timestamp or size differs from the remote one, it is downloaded again; otherwise it is skipped, so a restarted run only fetches what is still needed.
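Since the transfer kept breaking, restarting wget by hand gets tedious. A minimal sketch of a retry wrapper: re-run the mirror command until it exits cleanly, relying on -N/-c to skip what was already fetched. The function name and the example.com URL are my own placeholders, not wget features.

```shell
#!/bin/sh
# mirror_until_done: re-run the given mirror command until it exits 0.
# An interrupted run picks up where it left off, because -N (and -c in
# the background variant) makes wget skip files already downloaded.
mirror_until_done() {
    until "$@"; do
        echo "transfer interrupted; retrying in 10 seconds" >&2
        sleep 10
    done
    echo "mirror complete"
}

# Real usage (URL is a placeholder for the site you are mirroring):
#   mirror_until_done wget --convert-links --random-wait -N -r -p -E \
#       -e robots=off -U mozilla https://example.com/
# Offline demo, with `true` standing in for the wget command:
mirror_until_done true
```

The wrapper takes the command as arguments rather than hard-coding it, so either of the two wget lines above can be dropped in unchanged.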
