WikiTeam

WikiTeam XML
WikiTeam, we preserve wikis
Status: Special case
Archiving status: In progress... (manual)
Archiving type: Unknown
Project source: WikiTeam GitHub
IRC channel: #wikiteam (on hackint)

The WikiTeam software is a set of tools for archiving wikis. The tools work on MediaWiki wikis, but we want to expand to other wiki engines. As of 2019, WikiTeam has preserved more than 250,000 wikis.
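For a single MediaWiki wiki, the main tool is dumpgenerator.py. A minimal sketch of an invocation, following the usage documented in the GitHub README (the wiki URL is a placeholder):

    # Download the full page history (XML) and all images of one wiki;
    # point the tool at the wiki's index.php or api.php URL.
    python dumpgenerator.py http://wiki.example.com --xml --images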

You can check our collection at Internet Archive, the source code on GitHub and some lists of wikis by status on WikiApiary. There's also a list of not yet archived wikis on WikiApiary.

There are two completely separate projects under the umbrella of WikiTeam:

  • The archival of the wikis in the form of XML dumps. This is what most of this page is about.
  • The archival of external links found in wikis to WARCs. See the Links warrior project section.

The archival of the wikis themselves to WARCs is also desirable but has not been attempted yet.

Current status

The total number of MediaWiki wikis is unknown, but some estimates exist.

According to WikiApiary, the most up-to-date database, there are 21,139 independent wikis (1,718 of them semantic) and 4,819 in wikifarms as of 2018-08-02.[1] However, it doesn't include the 400,000+ Wikia wikis, and coverage of independent wikis is certainly incomplete.

According to Pavlo's list, generated in December 2008, there are about 20,000 wikis.[2] This list was imported into WikiApiary.

According to WikiIndex, there are 20,698 wikis.[3] The URLs from this project were also added to WikiApiary in the past.

A number of wikifarms have vanished and about 180 are still online.[4][5][6]

Most wikis are small, containing about 100 pages or fewer, but there are some very large wikis:[7][8]

  • By number of pages: Wikimedia Commons (77 million), Wikidata (72 million), English Wikipedia (49 million), DailyWeeKee (35 million), WikiBusiness (22 million).
  • By number of files: Wikimedia Commons (57 million), English Wikipedia (800,000).

The oldest dumps are probably some 2001 dumps of Wikipedia when it used UseModWiki.[9][10]

As of 2019, our collection at the Internet Archive holds dumps for 250,000 wikis (including independent wikis, wikifarm wikis, some packages of wikis, and Wiki[pm]edia).[11]

Wikifarms

There are also wikifarms hosting hundreds of wikis. Here we only create pages for those farms we have special information about that we don't want to lose (such as archiving history and tips). For a full list, see the WikiApiary wikifarms main page.

Before backing up a wikifarm, try to update the list of wikis for it. There are Python scripts to generate those lists for many wikifarms.
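A minimal sketch of that workflow, assuming one of the generator scripts from listsofwikis/mediawiki/ in the repository (the script name here is hypothetical; check the repository for the real ones):

    # Regenerate the list of wikis for a farm, then deduplicate it
    # before feeding it to launcher.py.
    python editthis-spider.py > editthis.info
    sort -u editthis.info -o editthis.info   # remove duplicate URLs in place
    wc -l editthis.info                      # sanity-check the wiki count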

Wikis to archive

Please add a wiki to WikiApiary if you want someone to archive it sooner or later, or tell us on the #wikiteam IRC channel (on hackint) if it's particularly urgent. Remember that there are thousands of wikis we don't even know about yet.

You can help by downloading wikis yourself. If you don't know where to start, pick a wiki that has not been archived yet from the lists on WikiApiary. You can also edit those pages to link existing dumps! You'll help others focus their work.

Examples of huge wikis:

  • Wikipedia - arguably the largest and one of the oldest wikis on the planet. It offers public backups (also for its sister projects) at https://dumps.wikimedia.org (see the download sketch after this list).
    • They have some mirrors but not many.
    • The transfer of the dumps to the Internet Archive is automated and is currently managed by Hydriz.
  • Wikia - a website that allows the creation and hosting of wikis. Doesn't make regular backups.
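As an illustration of the Wikipedia dumps mentioned above, the latest English Wikipedia articles dump can be fetched directly; the "latest" URL pattern is stable, but check https://dumps.wikimedia.org for current file names:

    # -c resumes interrupted downloads, useful for multi-GB files.
    wget -c https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2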

We're trying to decide which other wiki engines to work on: suggestions needed!

Tools and source code

Official WikiTeam tools

Other

Wiki dumps

Most of our dumps are in the wikiteam collection at the Internet Archive. If you want an item to land there, upload it to the "opensource" collection with the "WikiTeam" keyword, and it will be moved at some point. Once you've uploaded enough wikis, you'll probably be made a collection admin to save others the effort of moving your items.
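One way to do that upload is the ia command-line tool from the internetarchive Python package; a sketch, with placeholder identifier and file names:

    # Upload a dump to the "opensource" collection, tagged with the
    # wikiteam keyword so it can be moved into the collection later.
    ia upload wiki-examplecom-20200220 examplecom-20200220-history.xml.7z \
        --metadata="collection:opensource" --metadata="subject:wikiteam"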

For a manually curated list, visit the download section on GitHub.

There is another collection of MediaWiki dumps hosted on Scott's website.

Tips

Some tips:

  • When downloading Wikipedia/Wikimedia Commons dumps, pages-meta-history.xml.7z and pages-meta-history.xml.bz2 contain the same data, but the 7z file is usually smaller (better compression ratio), so use 7z.
  • To download a mass of wikis with N parallel threads, just split your full $list into N chunks, then start N instances of launcher.py (tutorial), one for each list (see the sketch after these tips).
    • If you want to upload dumps as they're ready and clean up your storage, run a loop like while true; do ./uploader.py $list --prune-directories --prune-wikidump; sleep 12h; done; in a separate window or screen at the same time (the sleep ensures each run has something new to do).
    • If you want to run really many instances, use tmux[1]! Use tmux new-window to launch several instances in the same session. Every now and then, attach to the tmux session and look (ctrl-b f) for windows stuck in "is wrong", "is slow" or "......" loops, or which are inactive.[2] Even with a couple of cores you can run a hundred instances; just make sure you have enough disk space for the occasional huge ones (tens of GB).
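A sketch of that split-and-launch workflow with GNU coreutils; the list and chunk file names are placeholders:

    # Split the list into 8 line-based chunks (-n l/8 never splits a
    # line) and start one launcher.py per chunk in the background.
    split -n l/8 -d alive-wikis.txt chunk_
    for f in chunk_*; do
        python launcher.py "$f" &
    done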

BitTorrent downloads

You can download and seed the torrents from the archive.org collection. Every item has a "Torrent" link.
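Since every item's torrent lives at a predictable URL, such downloads can also be scripted; a sketch with aria2c, where <identifier> is a placeholder for the item name:

    # aria2c fetches the .torrent file, then downloads the item's
    # contents over BitTorrent and seeds while running.
    aria2c "https://archive.org/download/<identifier>/<identifier>_archive.torrent"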

Old mirrors

  1. Sourceforge (also mirrored to 26 other mirrors)
  2. Internet Archive (direct link to directory)

Recursive

We also have dumps for our coordination wikis.

Restoring wikis

Anyone can restore a wiki using its XML dump and images.
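A sketch of such a restore into a fresh MediaWiki installation, using MediaWiki's standard maintenance scripts (the dump file and image directory names are placeholders):

    # Import the page history, then the media files, then rebuild
    # the recent-changes tables so the restored wiki is consistent.
    php maintenance/importDump.php dump.xml
    php maintenance/importImages.php images/
    php maintenance/rebuildrecentchanges.php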

Wikis.cc is restoring some sites.

Links warrior project

WikiTeam links
We preserve external links used in wikis
Status: Special case
Archiving status: In progress... (dormant since 2017)
Archiving type: Unknown
Project source: wikis-grab
Project tracker: wikis
IRC channel: #wikiteam (on hackint)

There is a (currently dormant) warrior project to archive external links used in wikis. The target format for this archival is WARC. The data from this project is uploaded to this collection on the Internet Archive.

References

External links

Knowledge and Wikis
Software

DokuWiki · MediaWiki · MoinMoin · Oddmuse · PukiWiki · UseModWiki · YukiWiki

Wikifarms

atwiki · Battlestar Wiki · BluWiki · Communpedia · EditThis · elwiki.com · Fandom · Miraheze · Neoseeker.com · Orain · Referata · ScribbleWiki · Seesaa · ShoutWiki · SourceForge · TropicalWikis · Wik.is · Wiki.Wiki · Wiki-Site · Wikidot · WikiHub · Wikispaces · WikiForge · WikiTide · Wikkii · YourWiki.net

Wikimedia

Wikipedia · Wikimedia Commons · Wikibooks · Wikidata · Wikinews · Wikiquote · Wikisource · Wikispecies · Wiktionary · Wikiversity · Wikivoyage · Wikimedia Incubator · Meta-Wiki

Other

Anarchopedia · Citizendium · Conservapedia · Creation Wiki · EcuRed · Enciclopedia Libre Universal en Español · GNUPedia · Moegirlpedia · Nico Nico Pedia · Nupedia · OmegaWiki · OpenStreetMap · Pixiv Encyclopedia

Indexes and stats

WikiApiary · WikiIndex · Wikistats