Revision as of 10:15, 10 June 2024 by Kevidryon2 (talk | contribs) (Added -L option to curl to allow a redirect)
Status Online!
Archiving status On hiatus
Archiving type DPoS, ArchiveBot
Project source angelfire-grab
Project tracker angelfire
IRC channel #angelonfire (on hackint)
Data[how to use] archiveteam_angelfire

Angelfire is a web hosting service that has been running since 1996, containing big chunks of early WWW history (which people love to mock).

It is not expected that the Angelfire archive can ever be truly complete, as Angelfire, like other free hosts such as Homestead, has or had a policy of deleting "inactive" accounts. As there is no known mirror of many of these former accounts and associated web pages, there may be no way to recover such deleted websites.

Angelfire underwent some changes in 2010, apparently not disruptive but requiring users to pay for some options like the old Web Shell tool; we do not know whether this caused some older websites to become inaccessible to their owners, and whether that could cause inactivity and hence deletion. The site's Alexa rank seems to be in constant decline, from better than 2000th place in early 2012 to worse than 3400th in early 2014.

It's not clear how bad Lycos is. A quick search for Lycos shutdowns only turns up their (independently operated) Lycos Europe liquidation, which gave users less than a month to save their emails before deletion. Lycos Tripod, on the other hand, which in 2003 was Europe's largest homepage-building community (with a special Google alliance), found a last-minute buyer for its European wing but then suddenly went down in July 2013 (it was around 60,000th in Alexa rank in 2012 and fell well below 100,000th in early 2013).


All usernames/user info for scraping individual users' sitemaps can be found here:

ArchiveBot gave it a try.

Schbirid has some ugly Bash scripts (ask before you use them; they are probably out of date):

Discovery & Downloading

First grab all the sitemap indexes:

curl -L | grep -Eo 'http.*gz' > sitemap-index-urls

Use that to grab all the sitemaps:

wget -i sitemap-index-urls

Inside, you will see the users' sitemap URLs.


Extract the user sitemap URLs:

zgrep -hEo 'http:.*xml' sitemap-index-*.xml.gz > sitemap-urls
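The next step greps each user's local sitemap.xml, so the user sitemaps must be fetched first. A sketch of that fetch, assuming sitemap-urls contains lines like http://www.angelfire.com/ab/someuser/sitemap.xml and using a per-user directory layout (both are assumptions, not part of the original scripts):

```shell
# Derive the username from a sitemap URL: the last path component
# before sitemap.xml (crude, but matches the assumed URL shape above).
user_from_url() {
  basename "$(dirname "$1")"
}

# Fetch each user's sitemap into <user>/sitemap.xml.
if [ -f sitemap-urls ]; then
  while read -r url; do
    user=$(user_from_url "$url")
    mkdir -p "$user"
    wget -q -O "$user/sitemap.xml" "$url"
  done < sitemap-urls
fi
```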

Extract the webpage URLs:

grep -Eo '<loc>.*</loc>' "${user}/sitemap.xml" | sed 's#<loc>##' | sed 's#</loc>##' > "${user}.urls"

Grab them with options like: -m --no-parent --no-cookies -e robots=off --page-requisites.

As of 2015-05-08, there were 3,895,290 users.

You will want --no-cookies because Angelfire wants to set them everywhere.

Reject ads and trackers with --reject-regex='(\/adm\/ad\/|\/doc\/images\/track\/ot_noscript\.gif)'
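Putting the options above together, a per-user grab might look like this (a sketch: the username and output directory are assumptions for illustration, not the original invocation):

```shell
# Sketch: mirror one user's pages with the options discussed above.
user="someuser"   # hypothetical username
if [ -f "${user}.urls" ]; then
  wget -m --no-parent --no-cookies -e robots=off --page-requisites \
       --reject-regex='(\/adm\/ad\/|\/doc\/images\/track\/ot_noscript\.gif)' \
       -i "${user}.urls" -P "angelfire/${user}"
fi
```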

Some images are hosted on:

Guestbooks were killed in 2012, e.g.

Some users have blogs with infinite calendars, like this in the sitemap: . Wget will run forever on those, so better to skip them for now.
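One way to skip them is to filter the URL lists before grabbing. The pattern below is hypothetical (the real example link is lost from this page), so adjust it to whatever the calendar URLs actually look like:

```shell
# Sketch: drop blog-calendar URLs from a user's URL list before grabbing.
# CALENDAR_RE is a hypothetical pattern, not taken from a real sitemap.
CALENDAR_RE='/blog/.*calendar'
if [ -f "${user}.urls" ]; then
  grep -Ev "$CALENDAR_RE" "${user}.urls" > "${user}.nocal.urls"
fi
```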

Many users have no URLs in their sitemaps. Not sure what to do with those.
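At minimum, those empty sitemaps can be detected and set aside. A sketch, assuming the per-user directory layout used earlier (someuser/sitemap.xml, which is itself an assumption):

```shell
# Sketch: print usernames whose downloaded sitemap has no <loc> entries.
empty_sitemap_users() {
  for f in */sitemap.xml; do
    [ -e "$f" ] || continue          # glob matched nothing
    grep -q '<loc>' "$f" || dirname "$f"
  done
}
empty_sitemap_users
```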

External links