Revision as of 18:50, 21 July 2016
WikiTeam, we preserve wikis

WikiTeam | |
---|---|
URL | https://github.com/WikiTeam/wikiteam |
Status | Online! (at least some of them) |
Archiving status | In progress... |
Archiving type | Unknown |
Project tracker | Manual for now; check not-archived wikis on WikiApiary |
IRC channel | #wikiteam (on hackint) |
WikiTeam software is a set of tools for archiving wikis. They work on MediaWiki wikis, but we want to expand to other wiki engines. As of January 2016, WikiTeam has preserved more than 27,000 stand-alone wikis.
You can check our collection at the Internet Archive, the source code on GitHub and some lists of wikis by status on WikiApiary.
Current status
The total number of MediaWiki wikis is unknown, but some estimates exist.
According to WikiApiary, which is the most up-to-date database, there are 21,369 independent wikis (1,508 of them semantic) and 4,554 in wikifarms.[1] However, it does not include Wikia's 400,000+ wikis, and its coverage of independent wikis can certainly be improved.
According to Pavlo's list generated in December 2008, there are 20,000 wikis.[2] This list was imported into WikiApiary.
According to WikiIndex, there are 20,698 wikis.[3] The URLs in this project were added to WikiApiary in the past too.
A number of wikifarms have vanished and about 150 are still online.[4][5][6]
Most wikis are small, containing about 100 pages or less, but there are some very large wikis:[7][8]
- By number of pages: Wikimedia Commons (40 million), English Wikipedia (37 million), DailyWeeKee (35 million), WikiBusiness (22 million) and Wikidata (19 million).
- By number of files: Wikimedia Commons (28 million), English Wikipedia (800,000).
The oldest dumps are probably some 2001 dumps of Wikipedia when it used UseModWiki.[9][10]
As of November 2015, our collection at Internet Archive holds dumps for 27,420 wikis (including independent, wikifarm wikis, some packages of wikis and Wiki[pm]edia).[11]
Wikifarms
There are also wikifarms hosting hundreds of wikis each. Here we only create pages for the farms about which we have special information we don't want to lose (such as archiving history and tips). For a full list, please use the WikiApiary wikifarms main page.
Before backing up a wikifarm, try to update the list of wikis for it. There are Python scripts to generate those lists for many wikifarms.
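The real list-generation scripts live under listsofwikis/ in the repository; as a rough illustration of what such a generator does, here is a minimal sketch that scrapes wiki URLs from a hypothetical farm directory page (the HTML, domain, and regex are placeholders, not any real farm's layout):

```python
import re

# Hypothetical directory-page HTML; the real scripts fetch pages like
# this from each wikifarm and paginate through its index.
html = '''
<li><a href="http://alpha.example-farm.com/">Alpha Wiki</a></li>
<li><a href="http://beta.example-farm.com/">Beta Wiki</a></li>
<li><a href="http://alpha.example-farm.com/">Alpha Wiki (duplicate link)</a></li>
'''

# Deduplicate and sort, one URL per line, like the committed lists.
wikis = sorted(set(re.findall(r'href="(http://[^"]+\.example-farm\.com/)"', html)))
for url in wikis:
    print(url)
```

The resulting one-URL-per-line file is the format the downloading tools expect.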
Wikifarm | Wikis | Status | Dumps | Comments |
---|---|---|---|---|
Battlestar Wiki (site) | 8 | Online | ? | |
BluWiki (site) | ? | Offline | ~20[12] | |
EditThis (site) | 1,350[13] | Unstable | 1307+ (IA: 1,297[14]) | Most dumps were done in 2014. This wikifarm is not well covered in WikiApiary.[15] |
elwiki.com (site) | Unknown[16] | Offline | None[17] | Last seen online in 2008.[18] There are no dumps; the content is presumably lost. Perhaps some pages are in the Wayback Machine. |
Miraheze (site) | ? | Online | ? | |
Neoseeker.com (site) | 229[19] | Online | 159[20] | Check why dozens of wikis have no dump. |
Orain (site) | 425[21] | Offline | ~380[22][23] | Last seen online in September 2015. Dumps were made in August 2013, January 2014 and August 2015. |
Referata (site) | 156[24] | Online | ~80[25][26][27] | Check why dozens of wikis have no dump. |
ScribbleWiki (site) | 119[28] | Offline | None[29] | Last seen online in 2008.[30] There are no dumps; the content is presumably lost. Perhaps some pages are in the Wayback Machine. |
ShoutWiki (site) | 1,879[31] | Online | ~1,300[32][33] | Check why dozens of wikis have no dump. |
Sourceforge | ? | Online | 315[34] | |
TropicalWikis (site) | 187[35] | Offline | 152[36] | Killed off in November 2013. Allegedly pending a move to Orain (which later went offline too). Data from February 2013 and earlier was saved. |
Wik.is (site) | ? | Offline | ? | Non-MediaWiki. |
Wiki-Site (site) | 5,839[37] | Online | 367 | Dumps not yet uploaded to the Internet Archive. |
Wikia (site) | 400,000[38] | Online | ~34,000[39] | See Help:Database download and their dumping code. |
WikiHub (site) | ? | Offline | 7[40] | |
Wiki.Wiki (site) | 100[41] | Online | ? | |
Wikkii (site) | 3,267 | Offline | 1,300[42] | |
YourWiki.net (site) | ? | Offline | ? |
Wikis to archive
Please add a wiki to WikiApiary if you want someone to archive it sooner or later; or tell us on the #wikiteam channel if it's particularly urgent. Remember that there are thousands of wikis we don't even know about yet.
You can help by downloading wikis yourself. If you don't know where to start, pick a wiki which has not been archived yet from the lists on WikiApiary. You can also edit those pages to link existing dumps! You'll help others focus their work.
Examples of huge wikis:
- Wikipedia - arguably the largest and one of the oldest wikis on the planet. It offers public backups (also for sister projects): http://dumps.wikimedia.org
- They have some mirrors but not many.
- The transfer of the dumps to the Internet Archive is automated and is currently managed by Hydriz.
- Wikimedia Commons - a wiki of media files available for free usage. It offers public backups: http://dumps.wikimedia.org
- But there is no image dump available, only the image descriptions
- So we made one ourselves! http://archive.org/details/wikimediacommons
- Wikia - a website that allows the creation and hosting of wikis. Doesn't make regular backups.
We're trying to decide which other wiki engines to work on: suggestions needed!
Tools and source code
Official WikiTeam tools
- WikiTeam in GitHub
- `dumpgenerator.py` to download MediaWiki wikis: `python dumpgenerator.py --api=http://archiveteam.org/api.php --xml --images`
- `wikipediadownloader.py` to download Wikipedia dumps from download.wikimedia.org: `python wikipediadownloader.py`
Other
- Scripts of a guy who saved Wikitravel
- OddMuseWiki backup
- UseModWiki: use wget/curl and raw mode (might have a different URL scheme, like this)
- Some wikis: UseMod:SiteList
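For UseModWiki, the raw mode mentioned above is usually reached through the wiki script's query parameters. A small sketch of building such URLs for wget/curl; the base URL is a placeholder, and the `action=browse`/`raw=1` parameter names are an assumption based on a default UseModWiki install, so check each target site's actual scheme first:

```python
from urllib.parse import urlencode

# Placeholder base URL; individual sites often rename wiki.pl or
# restructure their paths, as noted above.
base = "http://www.example.com/cgi-bin/wiki.pl"
pages = ["FrontPage", "SiteList"]

# action=browse with raw=1 returns the page source instead of rendered HTML
# on a default UseModWiki install (assumption; verify per site).
urls = [f"{base}?{urlencode({'action': 'browse', 'id': p, 'raw': 1})}" for p in pages]
for u in urls:
    print(u)
```

Each URL can then be fetched with wget or curl in a loop, one page per request.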
Wiki dumps
Most of our dumps are in the wikiteam collection at the Internet Archive. If you want an item to land there, just upload it to the "opensource" collection and remember the "WikiTeam" keyword; it will be moved at some point. When you've uploaded enough wikis, you'll probably be made a collection admin to save others the effort of moving your stuff.
For a manually curated list, visit the download section on GitHub.
There is another collection of MediaWiki dumps, located here on Scott's website.
Tips
Some tips:
- When downloading Wikipedia/Wikimedia Commons dumps, pages-meta-history.xml.7z and pages-meta-history.xml.bz2 contain the same data, but the 7z file is usually smaller (better compression ratio), so use 7z.
- To download a mass of wikis with N parallel threads, just `split` your full `$list` into N chunks, then start N instances of `launcher.py` (tutorial), one for each chunk.
  - If you want to upload dumps as they're ready and clean up your storage, run a loop of the kind `while true; do ./uploader.py $list --prune-directories --prune-wikidump; sleep 12h; done;` at the same time, in a separate window or screen (the `sleep` ensures each run has something to do).
  - If you want to go advanced and run really many instances, use `tmux`[1]! Every now and then, attach to the tmux session and look (`ctrl-b f`) for windows stuck in "is wrong", "is slow" or "......" loops, or which are inactive[2]. Even with a couple of cores you can run a hundred instances; just make sure to have enough disk space for the occasional huge ones (tens of GB).
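The 7z advantage comes from its LZMA compressor. A rough, self-contained illustration using Python's standard library, where `lzma` stands in for 7z's default codec; the XML below is synthetic, so the ratios are only indicative of the general effect:

```python
import bz2
import lzma

# Synthetic stand-in for a pages-meta-history XML dump (assumption:
# real dumps are far larger, but similarly repetitive markup).
data = "".join(
    f"<page><title>Page {i}</title><text>revision text {i}</text></page>"
    for i in range(5000)
).encode()

compressed_bz2 = bz2.compress(data)
compressed_lzma = lzma.compress(data)  # 7z archives use the same LZMA family

# Both shrink the XML dramatically; LZMA typically edges out bzip2
# on this kind of markup-heavy input.
print(len(data), len(compressed_bz2), len(compressed_lzma))
```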
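The splitting step above can be sketched like this; the list contents and chunk size are placeholders, and the `launcher.py` invocations are shown only as comments since they depend on your checkout:

```shell
# Build a toy list of wikis (placeholder URLs).
printf '%s\n' http://wiki1.example/api.php http://wiki2.example/api.php \
              http://wiki3.example/api.php http://wiki4.example/api.php > list.txt

# Split into chunks of 2 lines each: chunk.aa, chunk.ab.
split -l 2 list.txt chunk.

# Then run one launcher instance per chunk, e.g. in separate tmux windows
# (assumed paths, not runnable here):
#   python launcher.py chunk.aa
#   python launcher.py chunk.ab

wc -l < chunk.aa
```

With a real list you would pick the chunk size as roughly total lines divided by the number of instances you plan to run.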
BitTorrent downloads
You can download and seed the torrents from the archive.org collection. Every item has a "Torrent" link.
Old mirrors
- Sourceforge (also mirrored to another 26 mirrors)
- Internet Archive (direct link to directory)
Recursive
We also have dumps for our coordination wikis:
- ArchiveTeam wiki (2014-03-26)
- WikiApiary (2015-03-25)
References
- ↑ Websites - WikiApiary
- ↑ Pavlo's list of wikis (mediawiki.csv) (backup)
- ↑ WikiIndex Statistics
- ↑ Wikifarms
- ↑ Comparison of wiki hosting services
- ↑ Category:WikiFarm
- ↑ List of largest wikis
- ↑ List of largest wikis in the world
- ↑ Wikimedia Downloads Historical Archives
- ↑ Dump of Nostalgia, an ancient version of Wikipedia from 2001
- ↑ WikiTeam collection at Internet Archive
- ↑ bluwiki - dumps
- ↑ editthis.info - list of wikis
- ↑ editthis.info - dumps
- ↑ Farm:EditThis
- ↑ elwiki.com - list of wikis
- ↑ elwiki.com - dumps
- ↑ We're sorry about the downtime we've been having lately
- ↑ neoseeker.com - list of wikis
- ↑ neoseeker.com - dumps
- ↑ orain.com - list of wikis
- ↑ orain - dumps
- ↑ Orain wikifarm dump (August 2013)
- ↑ referata.com - list of wikis
- ↑ referata.com - dumps
- ↑ Referata wikifarm dump 20111204
- ↑ Referata wikifarm dump (August 2013)
- ↑ scribblewiki.com - list of wikis
- ↑ scribblewiki.com - dumps
- ↑ What is ScribbleWiki?
- ↑ shoutwiki.com - list of wikis
- ↑ shoutwiki.com - dumps
- ↑ ShoutWiki wikifarm dump
- ↑ sourceforge - dumps
- ↑ tropicalwikis.com - list of wikis
- ↑ tropicalwikis.com - dumps
- ↑ wiki-site.com - list of wikis
- ↑ wikia.com - list of wikis
- ↑ Wikia wikis data dumps
- ↑ wikihub - dumps
- ↑ wiki.wiki - list of wikis
- ↑ wikkii.com - dumps
External links
- http://wikiindex.org - A lot of wikis to save
- http://wiki1001.com/ offline?
- http://s23.org/wikistats/
- http://en.wikipedia.org/wiki/Comparison_of_wiki_farms
- http://en.wikipedia.org/wiki/User:Emijrp/Wikipedia_Archive
- http://blog.shoutwiki.com/
- http://wikiheaven.blogspot.com/