I am trying to mirror a Blogger site so that I can have an exact copy of it on my filesystem to view. I have tried issuing the following command on Linux:
wget -r -k -x -e robots=off --wait 1 http://your.site.here.blogspot.com/
I have even tried using the -D flag with a comma-separated list of domains to follow (though I would prefer wget to simply follow any domain without my having to list them all). I have also tried changing the .com part of the URL to my country's top-level domain (.it); without that change, for some reason I don't understand and would like to, wget retrieves only index.html and no other page. Perhaps someone here can explain why.
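For example, something along these lines (the exact domain list here is only illustrative, not what I literally typed):
wget -r -k -x -D blogspot.com,bp.blogspot.com -e robots=off --wait 1 http://your.site.here.blogspot.com/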
So, even when I run
wget -r -k -x -e robots=off --wait 1 http://your.site.here.blogspot.it/
several HTML files and the favicon.ico are downloaded, but none of the .png images from Blogger are. Why is this, and how can I get wget to work properly? I've read the wget man page but had no luck.
Thanks.
Are the .png images actually hosted on http://your.site.here.blogspot.it/? Images uploaded to the Blogger service seem to be served from <number>.bp.blogspot.com instead, which would explain why wget won't fetch them. – jayhendren Oct 4 '13 at 0:38
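A rough sketch based on that observation (the bp.blogspot.com domain is taken from the comment above and is untested here): tell wget to span hosts and whitelist the image domain, e.g.
wget -r -k -x -p -H -D your.site.here.blogspot.it,bp.blogspot.com -e robots=off --wait 1 http://your.site.here.blogspot.it/
Here -H (--span-hosts) lets the recursive crawl leave the starting host, -D restricts it to the listed domains, and -p (--page-requisites) asks for inline images as well.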