{{Infobox project
| title = WikiTeam XML
| image = Wikiteam.jpg
| description = WikiTeam, we preserve wikis
| project_status = {{specialcase}}
| archiving_status = {{inprogress}} (manual)
| source = [https://github.com/WikiTeam/wikiteam WikiTeam GitHub]
| irc = wikiteam
}}


'''WikiTeam''' software is a set of tools for archiving wikis. They work on [[MediaWiki]] wikis, but we want to expand to other wiki engines. As of 2019, WikiTeam has preserved more than 250,000 wikis.

You can check [https://archive.org/details/wikiteam our collection] at [[Internet Archive]], the [https://github.com/WikiTeam/wikiteam source code] on [[GitHub]] and some [https://wikiapiary.com/wiki/Websites/WikiTeam lists of wikis by status] on [[WikiApiary]]. There's also a [https://wikiapiary.com/wiki/Category:Website_not_archived list] of not yet archived wikis on WikiApiary.
There are two completely separate projects under the umbrella of '''WikiTeam''':
* The archival of the wikis in the form of XML dumps. This is what most of this page is about.
* The archival of external links found in wikis to WARCs. See the [[#Links warrior project|Links warrior project]] section.
 
The archival of the wikis themselves to WARCs is also desirable but has not been attempted yet.
 
== Current status ==
 
The total number of MediaWiki wikis is unknown, but some estimates exist.
 
According to [[WikiApiary]], which is the most up-to-date database, there are 21,139 independent wikis (1,718 of them semantic) and 4,819 in wikifarms as of 2018-08-02.<ref>[https://wikiapiary.com/wiki/Websites Websites] - WikiApiary</ref> However, it doesn't include the 400,000+ [[Wikia]] wikis, and coverage of independent wikis could certainly be improved.
 
According to Pavlo's list, generated in December 2008, there were about 20,000 wikis.<ref>[http://cs.brown.edu/~pavlo/mediawiki/ Pavlo's list of wikis] ([http://www.cs.brown.edu/~pavlo/mediawiki/mediawikis.csv mediawiki.csv]) ([https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/mediawikis_pavlo.csv backup])</ref> This list was imported into WikiApiary.
 
According to [[WikiIndex]], there are 20,698 wikis.<ref>[http://wikiindex.org/Special:Statistics WikiIndex Statistics]</ref> The URLs in this project were added to WikiApiary in the past too.
 
A number of [[#Wikifarms|wikifarms]] have vanished and about 180 are still online.<ref>[https://wikiapiary.com/wiki/Farm:Farms Wikifarms]</ref><ref>[https://en.wikipedia.org/wiki/Comparison_of_wiki_hosting_services Comparison of wiki hosting services]</ref><ref>[http://wikiindex.org/Category:WikiFarm Category:WikiFarm]</ref>


Most wikis are small, containing about 100 pages or less, but there are some very large wikis:<ref>[http://meta.wikimedia.org/wiki/List_of_largest_wikis List of largest wikis]</ref><ref>[http://s23.org/wikistats/largest_html.php?th=15000&lines=500 List of largest wikis in the world]</ref>
* By '''number of pages''': Wikimedia Commons (77 million), Wikidata (72 million), English Wikipedia (49 million), DailyWeeKee (35 million), WikiBusiness (22 million).
* By '''number of files''': Wikimedia Commons (57 million), English Wikipedia (800,000).


The oldest dumps are probably some 2001 dumps of Wikipedia when it used UseModWiki.<ref>[https://dumps.wikimedia.org/archive/ Wikimedia Downloads Historical Archives]</ref><ref>[http://dumps.wikimedia.org/nostalgiawiki Dump] of [http://nostalgia.wikipedia.org/ Nostalgia], an ancient version of Wikipedia from 2001</ref>

As of 2019, our collection at Internet Archive holds dumps for 250,000 wikis (including independent wikis, wikifarm wikis, some packages of wikis, and Wiki[pm]edia).<ref>[https://archive.org/details/wikiteam WikiTeam collection] at Internet Archive</ref>


{{-}}
== Wikifarms ==
There are also wikifarms with hundreds of wikis. Here we only create pages for those we have some special information about that we don't want to lose (like archiving history and tips). For a full list, please use the WikiApiary [https://wikiapiary.com/wiki/Farm:Main_Page wikifarms main page].

Before backing up a wikifarm, try to update the list of wikis for it. There are [https://github.com/WikiTeam/wikiteam/tree/master/listsofwikis/mediawiki Python scripts to generate those lists] for many wikifarms.
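Those list scripts differ per wikifarm, but a quick sanity check on any list is always the same: see whether each URL still answers a MediaWiki <tt>api.php</tt> query. A minimal shell sketch of that idea (not one of the repository scripts; <tt>wikis.txt</tt> is a hypothetical one-URL-per-line list):
<pre>
# Probe each wiki's api.php with a siteinfo query; a live MediaWiki answers with a "query" object.
# wikis.txt is a hypothetical list, one base URL per line (e.g. https://example.org/w).
while read -r wiki; do
  if curl -s -m 15 "$wiki/api.php?action=query&meta=siteinfo&format=json" | grep -q '"query"'; then
    echo "ALIVE $wiki"
  else
    echo "DEAD  $wiki"
  fi
done < wikis.txt
</pre>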


{| class="wikitable sortable plainlinks" style="text-align: center;"
! width=140px | Wikifarm !! width=80px | Wikis !! Status !! width=80px | Dumps !! Comments
|-
| [[Battlestar Wiki]] ([http://battlestarwiki.org site]) || data-sort-value=4 | 4<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/battlestarwiki.org battlestarwiki.org - list of wikis]</ref> || {{green|Online}} || data-sort-value=3 | 3<ref>[https://archive.org/search.php?query=identifier%3Awiki%2Abattlestarwikiorg%2A battlestarwikiorg - dumps]</ref> || Last dumped Mar/Apr 2022; [https://fr.battlestarwiki.ddns.net/ fr.battlestarwiki.ddns.net] serves wrong XML when dumping
|-
| [[BluWiki]] ([http://wayback.archive.org/web/20090301060338/http://bluwiki.com/go/Main_Page site]) || data-sort-value=0 | Unknown || {{red|Offline}} || data-sort-value=24 | 24<ref>[https://archive.org/search.php?query=bluwiki%20subject%3Awikiteam bluwiki - dumps]</ref> ||
|-
| [[Communpedia]] ([https://wikiapiary.com/wiki/Communpedia_%28ru%29 site]) || data-sort-value=5 | 5 || {{orange|Unstable}} || data-sort-value=4 | 4<ref>[https://archive.org/search.php?query=subject%3A%22Comunpedia%22%20OR%20subject%3A%22Communpedia%22%20OR%20subject%3A%22kommynistru%22 communpedia - dumps]</ref> ||
|-
| [[EditThis]] ([http://editthis.info site]) || data-sort-value=1350 | 1,350<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/editthis.info editthis.info - list of wikis]</ref> || {{orange|Unstable}} || data-sort-value=1297 | 1,297<ref>[https://archive.org/search.php?query=editthisinfo%20subject%3Awikiteam editthis.info - dumps]</ref> || Most dumps were done in 2014. This wikifarm is not well covered in WikiApiary.<ref>[https://wikiapiary.com/wiki/Farm:EditThis Farm:EditThis]</ref>
|-
| [[elwiki.com]] ([https://web.archive.org/web/20070917110429/http://www.elwiki.com/ site]) || data-sort-value=0 | Unknown<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/elwiki.com elwiki.com - list of wikis]</ref> || {{red|Offline}} || data-sort-value=0 | None<ref>[https://archive.org/search.php?query=elwiki%20subject%3Awikiteam elwiki.com - dumps]</ref> || Last seen online in 2008.<ref>[https://web.archive.org/web/20080221125135/http://www.elwiki.com/ We're sorry about the downtime we've been having lately]</ref> There are no dumps; presumably lost. Perhaps [https://web.archive.org/web/form-submit.jsp?type=prefixquery&url=http://elwiki.com/ some pages] are in the Wayback Machine.
|-
| [[Fandom]] ([http://www.fandom.com site]) || data-sort-value=261146 | 261,146<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/fandom.com fandom.com - list of wikis]</ref> || {{green|Online}} || data-sort-value=300000 | 300,000+ || [http://community.fandom.com/wiki/Help:Database_download Help:Database download], [https://github.com/Wikia/app/tree/dev/extensions/wikia/WikiFactory/Dumps their dumping code]
|-
| [[Miraheze]] ([https://meta.miraheze.org site]) || data-sort-value=6438 | 6,438<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/miraheze.org miraheze.org - list of wikis]</ref> || {{green|Online}} || data-sort-value=2200 | ~2,200<ref>[https://archive.org/search.php?query=miraheze%20subject%3Awikiteam miraheze - dumps]</ref> || Non-profit. Dumps were made in September 2016; more dumps were uploaded in 2019.
|-
| [[Neoseeker.com]] ([https://neowiki.neoseeker.com site]) || data-sort-value=183 | 183<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/neoseeker.com neoseeker.com - list of wikis]</ref> || {{green|Online}} || data-sort-value=181 | 181<ref>[https://archive.org/search.php?query=identifier%3Awiki%2Aneoseekercom%2A neoseeker.com - dumps]</ref> || Last dumped in Apr 2022; [https://sandbox.neoseeker.com/ sandbox.neoseeker.com] is a broken wiki and [https://pokemon.neoseeker.com/ pokemon.neoseeker.com] was skipped due to an image download issue.
|-
| [[Orain]] ([https://meta.orain.org site]) || data-sort-value=425 | 425<ref>[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/listsofwikis/mediawiki/orain.org orain.org - list of wikis]</ref> || {{red|Offline}} || data-sort-value=380 | 380<ref>[https://archive.org/search.php?query=orain%20subject%3Awikiteam orain - dumps]</ref><ref>[https://archive.org/details/wikifarm-orain.org-20130824 Orain wikifarm dump (August 2013)]</ref> || Last seen online in September 2015. Dumps were made in August 2013, January 2014 and August 2015.
|-
| [[Referata]] ([http://www.referata.com site]) || data-sort-value=156 | 156<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/referata.com referata.com - list of wikis]</ref> || {{orange|Unstable}} || data-sort-value=80 | ~80<ref>[https://archive.org/search.php?query=referata%20subject%3Awikiteam referata.com - dumps]</ref><ref>[https://archive.org/details/referata.com-20111204 Referata wikifarm dump 20111204]</ref><ref>[https://archive.org/details/wikifarm-referata.com-20130824 Referata wikifarm dump (August 2013)]</ref> || Check why there are dozens of wikis without dumps.
|-
| [[ScribbleWiki]] ([http://scribblewiki.com site]) || data-sort-value=119 | 119<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/scribblewiki.com scribblewiki.com - list of wikis]</ref> || {{red|Offline}} || data-sort-value=0 | None<ref>[https://archive.org/search.php?query=scribblewiki%20subject%3Awikiteam scribblewiki.com - dumps]</ref> || Last seen online in 2008.<ref>[https://web.archive.org/web/20080404093502/http://scribblewiki.com/main.php What is ScribbleWiki?]</ref> There are no dumps; presumably lost. Perhaps [https://web.archive.org/web/form-submit.jsp?type=prefixquery&url=http://scribblewiki.com/ some pages] are in the Wayback Machine.
|-
| [[ShoutWiki]] ([http://www.shoutwiki.com site]) || data-sort-value=2173 | 2,173<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/shoutwiki.com shoutwiki.com - list of wikis]</ref> || {{green|Online}} || data-sort-value=1300 | ~1,300<ref>[https://archive.org/search.php?query=shoutwiki%20subject%3Awikiteam shoutwiki.com - dumps]</ref><ref>[http://www.archive.org/details/shoutwiki.com ShoutWiki wikifarm dump]</ref> || Check why there are dozens of wikis without dumps.
|-
| [[Sourceforge]] || data-sort-value=0 | Unknown || {{green|Online}} || data-sort-value=315 | 315<ref>[https://archive.org/search.php?query=sourceforge%20subject%3Awikiteam sourceforge - dumps]</ref> ||
|-
| [[TropicalWikis]] ([http://tropicalwikis.com site]) || data-sort-value=187 | 187<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/tropicalwikis.com tropicalwikis.com - list of wikis]</ref> || {{red|Offline}} || data-sort-value=152 | 152<ref>[https://archive.org/search.php?query=tropicalwikis%20subject%3Awikiteam tropicalwikis.com - dumps]</ref> || Killed off in November 2013. Allegedly pending a move to [[Orain]] (which later went offline too). Data from February 2013 and earlier saved.
|-
| [[Wik.is]] ([http://wik.is site]) || data-sort-value=0 | Unknown || {{red|Offline}} || data-sort-value=0 | Unknown || Non-MediaWiki.
|-
| [[Wiki-Site]] ([http://www.wiki-site.com site]) || data-sort-value=2659 | 2,659<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/wiki-site.com wiki-site.com - list of wikis]</ref> || {{green|Online}} || data-sort-value=367 | 367 || No uploaded dumps yet.
|-
| [[WikiHub]] ([http://wikihub.ssu.lt site]) || data-sort-value=0 | Unknown || {{red|Offline}} || data-sort-value=7 | 7<ref>[https://archive.org/details/wikifarm-wikihub.ssu.lt-20131110 wikihub - dumps]</ref> ||
|-
| [[Wiki.Wiki]] ([https://wiki.wiki site]) || data-sort-value=100 | 100<ref>[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/listsofwikis/mediawiki/wiki.wiki wiki.wiki - list of wikis]</ref> || {{green|Online}} || data-sort-value=0 | Unknown ||
|-
| [[Wikkii]] ([https://web.archive.org/web/20140621054654/http://wikkii.com/wiki/Free_Wiki_Hosting site]) || data-sort-value=3267 | 3,267 || {{red|Offline}} || data-sort-value=1300 | 1,300<ref>[https://archive.org/search.php?query=wikkii%20subject%3Awikiteam wikkii.com - dumps]</ref> ||
|-
| [[YourWiki.net]] ([https://web.archive.org/web/20100124003107/http://www.yourwiki.net/wiki/YourWiki site]) || data-sort-value=0 | Unknown || {{red|Offline}} || data-sort-value=0 | Unknown ||
|}
== Wikis to archive ==
Please [https://wikiapiary.com/wiki/Special:FormEdit/Website add a wiki to WikiApiary] if you want someone to archive it sooner or later; or tell us on IRC ({{IRC|wikiteam}}) if it's particularly urgent. Remember that there are thousands of wikis we don't even know about yet.
[https://github.com/WikiTeam/wikiteam/wiki/Tutorial You can help] by downloading wikis yourself. If you don't know where to start, pick a [https://wikiapiary.com/wiki/Category:Website_not_archived wiki that has not been archived yet] from the lists on WikiApiary. Also, you can edit those pages to link existing dumps; you'll help others focus their work.
Examples of huge wikis:
* '''[[Wikipedia]]''' - arguably the largest and one of the oldest wikis on the planet. It offers public backups (also for sister projects): https://dumps.wikimedia.org
** They have some mirrors but not many.
** The transfer of the dumps to the Internet Archive is automated and is currently managed by [[User:Hydriz|Hydriz]].
* '''[[Wikimedia Commons]]''' - a wiki of media files available for free usage. It offers public backups: https://dumps.wikimedia.org
** But there is no image dump available, only the image descriptions
** So we made one: http://archive.org/details/wikimediacommons
* '''[[Wikia]]''' - a website that allows the creation and hosting of wikis. Doesn't make regular backups.
We're trying to decide which [https://groups.google.com/forum/#!topic/wikiteam-discuss/TxzfrkN4ohA other wiki engines] to work on: suggestions needed!
== Tools and source code ==
=== Official WikiTeam tools ===
* [https://github.com/WikiTeam/wikiteam WikiTeam in GitHub]
* '''[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/dumpgenerator.py dumpgenerator.py] to download MediaWiki wikis:''' <tt>python dumpgenerator.py --api=https://www.archiveteam.org/api.php --xml --images</tt> (more example invocations below)
* [https://raw.githubusercontent.com/WikiTeam/wikiteam/master/wikipediadownloader.py wikipediadownloader.py] to download Wikipedia dumps from download.wikimedia.org: <tt>python wikipediadownloader.py</tt>
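Some typical dumpgenerator.py invocations, as a rough sketch (the flags shown come from the tool's documentation and from the tips below; double-check against <tt>python dumpgenerator.py --help</tt>):
<pre>
# Full history plus images through the API (the usual case)
python dumpgenerator.py --api=https://www.archiveteam.org/api.php --xml --images

# Fetch revisions one by one instead of big Special:Export blobs (uses far less memory; see Tips)
python dumpgenerator.py --api=https://www.archiveteam.org/api.php --xml --images --xmlrevisions

# Resume an interrupted dump in its existing directory (hypothetical directory name)
python dumpgenerator.py --api=https://www.archiveteam.org/api.php --xml --images --resume --path=archiveteamorg-20220417-wikidump
</pre>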
=== Other ===
* [https://web.archive.org/web/20150403081903/http://dl.dropbox.com/u/63233/Wikitravel/Source%20Code%20and%20tools/Source%20Code%20and%20tools.7z Scripts of a guy who saved Wikitravel]
* [http://www.communitywiki.org/en/BackupThisWiki OddMuseWiki backup]
* UseModWiki: use wget/curl and [http://www.usemod.com/cgi-bin/wiki.pl?WikiPatches/RawMode raw mode]; the URL scheme may differ, like [http://meatballwiki.org/wiki/action=browse&id=TheTippingPoint&raw=1 this]. A minimal sketch follows this list.
** Some wikis: [[UseMod:SiteList]]
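For the UseModWiki case above, a minimal sketch using the raw-mode URL linked there (page name and output file are just examples; some installs use a different URL scheme):
<pre>
# Fetch the raw wikitext of a single UseModWiki page; loop over a page list for a full grab.
wget -O TheTippingPoint.txt "http://meatballwiki.org/wiki/action=browse&id=TheTippingPoint&raw=1"
</pre>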
== Wiki dumps ==
Most of our dumps are in the [http://www.archive.org/details/wikiteam wikiteam collection at the Internet Archive]. If you want an item to land there, just upload it to the "opensource" collection and remember to add the "WikiTeam" keyword; it will be moved at some point. Once you've uploaded enough wikis, you'll probably be made a collection admin to save others the effort of moving your items.
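uploader.py (see the tips below) handles this automatically, but for a hand-made upload the <tt>ia</tt> command-line tool from the internetarchive Python package can set the collection and keyword in one go. A sketch with a hypothetical identifier and file name:
<pre>
# Upload into the "opensource" collection with the WikiTeam subject keyword,
# so admins can later move the item into the wikiteam collection.
ia upload wiki-examplewikiorg examplewikiorg-20220417-wikidump.7z \
  --metadata="collection:opensource" \
  --metadata="subject:wikiteam"
</pre>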
For a manually curated list, [https://github.com/WikiTeam/wikiteam/wiki/Available-Backups visit the download section] on GitHub.
There is another collection of MediaWiki dumps, hosted [http://mirrors.sdboyd56.com/WikiTeam/index.html here] on [http://www.archiveteam.org/index.php?title=User:Sdboyd Scott's] website.
=== Tips ===
Some tips follow. Don't issue commands you don't understand, especially batch commands which use loops or find and xargs, unless you're ready to lose all the data you got.
 
When downloading Wikipedia/Wikimedia Commons dumps, pages-meta-history.xml.7z and pages-meta-history.xml.bz2 contain the same data, but the 7z is usually smaller (better compression ratio), so use 7z.
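For example (hypothetical wiki name and dump date; check the [https://dumps.wikimedia.org/backup-index.html dump index] for real paths, and note that the biggest wikis split the history across many files):
<pre>
wget "https://dumps.wikimedia.org/examplewiki/20220401/examplewiki-20220401-pages-meta-history.xml.7z"
</pre>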
 
To download a mass of wikis with N parallel threads, just <code>split</code> your full <code>$list</code> into N chunks, then start N instances of <code>launcher.py</code> ([https://github.com/WikiTeam/wikiteam/wiki/Tutorial#Download_a_list_of_wikis tutorial]), one for each chunk; a worked sketch follows this list.
* If you want to upload dumps as they're ready and clean up your storage: at the same time, in a separate window or screen, run a loop of the kind <code>while true; do ./uploader.py $list --prune-directories --prune-wikidump; sleep 12h; done;</code> (the <code>sleep</code> ensures each run has something to do).
* If you're using --xmlrevisions, dumpgenerator.py will use much less memory because it won't get giant blobs of XML from Special:Export when a big page has a thousand revisions or more. You can then afford to run 100 instances of launcher.py/dumpgenerator.py with just 2 cores and 8 GiB of RAM. Watch your ulimit for the number of open files and for individual and total memory: 7z may consume up to 5 GiB of RAM for the biggest dumps (over 10 GiB). CPU usage tends to be lower at the beginning (launcher.py is not yet launching any 7z task because few dumps have completed) and the disk is usually hit harder at the beginning of a resume (launcher.py needs to scan the directories multiple times and dumpgenerator.py needs to read the lists of titles, XML and image directories). Before increasing concurrency, make sure you have enough resources for those stressful times, not just for the easy ride at the beginning of the dump.
* If you want to go advanced and run really ''many'' instances, use <code>tmux</code>[http://blog.hawkhost.com/2010/07/02/tmux-%E2%80%93-the-terminal-multiplexer-part-2/]! Use [https://serverfault.com/a/814089/203035 tmux new-window] to launch several instances in the same session. Every now and then, attach to the tmux session and look (<code>ctrl-b f</code>) for windows stuck on "is wrong", "is slow" or "......" loops, or which are inactive[http://unix.stackexchange.com/questions/78093/how-can-i-make-tmux-monitor-a-window-for-inactivity]. Even with a couple of cores you can run a hundred instances; just make sure to have enough disk space for the occasional huge ones (tens of GB).
* If you get close to 1,000 instances of launcher.py, it may be too much for tmux to handle. You're probably not actually going to look at the output of hundreds of windows anyway, so just run everything in the background with xargs, monitor the crashes and then check the directories manually.<pre>split -a 4 -d -l 10 wikistoarchive.txt wt_ ; ls -1 wt_* | xargs -n1 -I§ -P300 sh -c "python launcher.py § 2>&1 > /dev/null ; "</pre>
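A worked sketch of the basic split-and-launch workflow described above, assuming the repository is checked out in the current directory (list file name and chunk count are arbitrary):
<pre>
# Split the master list into 4 chunks (without breaking lines) and run one launcher.py per chunk.
split -n l/4 -d list_of_wikis.txt chunk_
for f in chunk_*; do
  python launcher.py "$f" > "$f.log" 2>&1 &
done

# Meanwhile, upload finished dumps and free disk space every 12 hours (see the uploader tip above).
while true; do
  python uploader.py list_of_wikis.txt --prune-directories --prune-wikidump
  sleep 12h
done
</pre>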
 
If you have many wikidump directories, some of the following commands may be useful. Sometimes a dump is complete but the 7z is missing or broken (e.g. for lack of memory), or you're running low on disk space and can't wait for uploader.py to verify the uploads one by one. A hint that a dump is complete is the presence of siteinfo.json: it means dumpgenerator thought the XML was done, though an image download may still be running (a snippet for spotting such directories follows this list).
* Check errors in 7z files. It's better to avoid running uploader.py on many archives if you're not sure they're fine (for instance if you've not monitored crashes of dumpgenerator.py/launcher.py). It's much harder for other people to download the 7z files from archive.org and check them after they've been uploaded, and the presence of an archive may discourage someone else from making a new one even if the archive is not actually usable. <pre>find -maxdepth 1 -type f -name "*7z" | xargs -n1 -P4 -I§ sh -c "7z l § 2>&1 | grep ^ERROR "</pre>
* Delete directories corresponding to a 7z file.<pre>find -maxdepth 1 -name "*wikidump.7z" | cut -d/ -f2 | sed 's,.7z,,g' | xargs -P8 rm -rf</pre>
* If launcher.py has failed to create 7z files due to running low on resources, you may make them manually with a loop and a lower compression level.<pre>find -maxdepth 1 -name siteinfo.json | cut -d/ -f2 | sed 's,wikidump,,g' | xargs -n1 -P6 -I§ sh -c "cd §wikidump/ ; 7za a -ms=off -mx=3 ../§history.xml.7z §history.xml §titles.txt errors.log index.html config.txt siteinfo.json Special:Version.html ; cp ../§history.xml.7z ../§wikidump.7z ; 7za a -mx=1 ../§wikidump.7z images/ §images.txt ; "</pre>
* Find the biggest ongoing wikidump directories: when you don't have something as nice as [https://dev.yorhel.nl/ncdu ncdu], something simple may suffice, like  <code>du -shc * | grep  G</code> or <code>find -maxdepth 2 -type f -name "*xml" -size +1G</code>.
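As mentioned above, a small snippet for spotting dump directories that look finished (siteinfo.json present) but whose 7z has not been created yet; names follow the same <tt>§wikidump</tt> / <tt>§history.xml.7z</tt> convention as the commands above:
<pre>
# List directories that contain siteinfo.json but have no matching history.xml.7z yet.
for d in *wikidump; do
  [ -f "$d/siteinfo.json" ] && [ ! -f "${d%wikidump}history.xml.7z" ] && echo "$d"
done
</pre>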
 
=== BitTorrent downloads ===
You can download and seed the torrents from the archive.org collection. Every item has a "Torrent" link.
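Each item's torrent lives at a predictable URL, so any BitTorrent client will do; for example, with a hypothetical identifier and aria2:
<pre>
# Download (and seed) one item via its archive.org torrent.
aria2c "https://archive.org/download/wiki-examplewikiorg/wiki-examplewikiorg_archive.torrent"
</pre>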
 
=== Old mirrors ===
<span class="plainlinks">
# [https://sourceforge.net/projects/wikiteam/files/ Sourceforge] (mirrored to another 26 mirrors)
# [http://www.archive.org/details/WikiTeamMirror Internet Archive] ([http://ia700705.us.archive.org/16/items/WikiTeamMirror/ direct link] to directory)
</span>
 
=== Recursive ===
 
We also have dumps for our coordination wikis:
* [[ArchiveTeam wiki]] ([https://archive.org/details/wiki-archiveteamorg 2014-03-26])
* [[WikiApiary]] ([https://archive.org/details/wiki-wikiapiarycom_w 2015-03-25])
 
== Restoring wikis ==
 
Anyone can restore a wiki using its XML dump and images.
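A minimal sketch with MediaWiki's own maintenance scripts, assuming a working MediaWiki install of a compatible version and hypothetical dump file names:
<pre>
# Unpack the dump, then import pages and images with the maintenance scripts.
7z x examplewikiorg-20220417-wikidump.7z
php maintenance/importDump.php examplewikiorg-20220417-history.xml
php maintenance/importImages.php ./images
php maintenance/rebuildrecentchanges.php
</pre>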
 
Wikis.cc is [https://www.wikis.cc/wiki/Wikis_recuperados restoring some sites].


== Links warrior project ==
{{Infobox project
| title = WikiTeam links
| image = Wikiteam.jpg
| description = We preserve external links used in wikis
| project_status = {{specialcase}}
| archiving_status = {{inprogress}} (dormant since 2017)
| source = [https://github.com/Archiveteam/wikis-grab wikis-grab]
| tracker = [https://tracker.archiveteam.org/wikis/ wikis]
| irc = wikiteam
}}

There is a (currently dormant) warrior project to archive external links used in wikis. The target format for this archival is [[WARC]]. The data from this project is uploaded to [https://archive.org/details/archiveteam_wiki this collection] on the Internet Archive.


== References ==
<references/>


== External links ==
* [https://github.com/WikiTeam/wikiteam WikiTeam] on GitHub
* [http://wikiindex.org WikiIndex] - an index of wikis
* [http://s23.org/wikistats/ S23 wikistats] - stats for over 40,000 wikis
* [http://en.wikipedia.org/wiki/Comparison_of_wiki_farms Comparison of wikifarms]
* [http://en.wikipedia.org/wiki/User:Emijrp/Wikipedia_Archive Wikipedia Archive]


{{Navigation box}}
{{wikis}}


[[Category:Archive Team]]
[[Category:Wikis| ]]
