WikiTeam

''We save wikis, from Wikipedia to the tiniest wikis. 15000+ wikis saved to date.''

{{Infobox project
| image = Wikiteam.jpg
| description = WikiTeam, a set of tools for wiki preservation and a repository of wikis
| URL = https://github.com/WikiTeam/wikiteam
| project_status = {{online}} (at least some of them)
| tracker = manual for now, check [https://wikiapiary.com/wiki/Category:Website_not_archived not archived wikis on wikiapiary]
| archiving_status = {{inprogress}} ([https://github.com/WikiTeam/wikiteam/wiki/Tutorial you can help])
| irc = wikiteam
}}

Welcome to WikiTeam. A wiki is a website that allows the creation and editing of any number of interlinked web pages, generally used to store information on a specific subject or subjects. Editing is done with an ordinary web browser, using a simplified markup language (wikitext, for example) or a WYSIWYG (what-you-see-is-what-you-get) text editor.

Most wikis don't offer public backups. How bad!

== Wikis to archive ==
Please [https://wikiapiary.com/wiki/Special:FormEdit/Website add a wiki to wikiapiary] if you want someone to archive it sooner or later; or tell us on the #wikiteam channel if it's particularly urgent. Remember that there are thousands of wikis we don't even know about yet.

[https://github.com/WikiTeam/wikiteam/wiki/Tutorial You can help] by downloading wikis yourself. If you don't know where to start, pick a [https://wikiapiary.com/wiki/Category:Website_not_archived wiki which has not been archived yet] from the lists on wikiapiary. If you can't, edit those pages to link existing dumps! You'll help others focus their work.

Examples of huge wikis:
* Wikipedia - arguably the largest and one of the oldest wikis on the planet. It offers public backups (also for sister projects): http://dumps.wikimedia.org (see the download sketch at the end of this section)
** They have some mirrors, but not many.
** Every now and then we upload a copy to archive.org, but this is not automated. You can do it in our stead. ;)
* Wikimedia Commons - a wiki of media files available for free use. It offers public backups: http://dumps.wikimedia.org
* Wikia - a website that allows the creation and hosting of wikis. It doesn't make regular backups.

There are also several wikifarms with hundreds of wikis. On this wiki we only create pages for those we have some special information about that we don't want to lose (like archiving history and tips). For a full list, please use wikiapiary: see the wikifarms main page.

We're trying to decide which other wiki engines to work on: suggestions needed!

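Fetching one of those public backups is a plain HTTP download, along these lines (a sketch only: the wiki name and dump date below are placeholders, so browse http://dumps.wikimedia.org for real ones):

<pre>
# Download the full-history dump of a small Wikimedia wiki
# (substitute a real wiki name and dump date from the dumps.wikimedia.org index)
wiki=nostalgiawiki
date=20141201
wget "http://dumps.wikimedia.org/$wiki/$date/$wiki-$date-pages-meta-history.xml.7z"
</pre>
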
== Tools and source code ==
=== Official WikiTeam tools ===
* [https://github.com/WikiTeam/wikiteam WikiTeam on GitHub]
* '''[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/dumpgenerator.py dumpgenerator.py] to download MediaWiki wikis:''' <tt>python dumpgenerator.py --api=http://archiveteam.org/api.php --xml --images</tt>
* [https://raw.githubusercontent.com/WikiTeam/wikiteam/master/wikipediadownloader.py wikipediadownloader.py] to download Wikipedia dumps from download.wikimedia.org: <tt>python wikipediadownloader.py</tt>
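
A typical <code>dumpgenerator.py</code> session looks something like this (a sketch: the <code>--resume</code> and <code>--path</code> options and the dump directory name are assumptions, so check <code>python dumpgenerator.py --help</code> for what your copy actually supports):

<pre>
# Full XML history plus all images from a MediaWiki wiki
python dumpgenerator.py --api=http://archiveteam.org/api.php --xml --images

# If the run is interrupted, resume into the existing dump directory
# (the directory name here is hypothetical; the script prints the one it creates)
python dumpgenerator.py --api=http://archiveteam.org/api.php --xml --images \
  --resume --path=archiveteamorg-20141217-wikidump
</pre>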

=== Other ===

== Wiki dumps ==
Most of our dumps are in the [http://www.archive.org/details/wikiteam wikiteam collection at the Internet Archive]. If you want an item to land there, just upload it to the "opensource" collection and remember the "WikiTeam" keyword; it will be moved at some point. When you've uploaded enough wikis, you'll probably be made a collection admin to save others the effort of moving your stuff.
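
One way to do that upload, assuming you use the Internet Archive's <code>internetarchive</code> command-line client (any upload method works; the item identifier and file name below are hypothetical):

<pre>
# Upload a dump into the "opensource" collection, tagged with the WikiTeam keyword
ia upload examplewiki-20141217-wikidump \
  examplewiki-20141217-history.xml.7z \
  --metadata="collection:opensource" \
  --metadata="subject:wikiteam"
</pre>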


For a manually curated list, [https://github.com/WikiTeam/wikiteam/wiki/Available-Backups visit the download section] on GitHub.


There is another site of MediaWiki dumps located [http://mirrors.sdboyd56.com/WikiTeam/index.html here] on [http://www.archiveteam.org/index.php?title=User:Sdboyd Scott's] website.

== Tips ==
Some tips:
* When downloading Wikipedia/Wikimedia Commons dumps, pages-meta-history.xml.7z and pages-meta-history.xml.bz2 contain the same data, but the 7z file is usually smaller (better compression ratio), so use 7z.
* To download a mass of wikis with N parallel threads, just <code>split</code> your full <code>$list</code> into N chunks, then start N instances of <code>launcher.py</code> ([https://github.com/WikiTeam/wikiteam/wiki/Tutorial#Download_a_list_of_wikis tutorial]), one for each list; see the sketch after this list.
** If you want to upload dumps as they're ready and clean up your storage: at the same time, in a separate window or screen, run a loop like <code>while true; do ./uploader.py $list --prune-directories --prune-wikidump; sleep 12h; done;</code> (the <code>sleep</code> ensures each run has something to do).
** If you want to go advanced and run really ''many'' instances, use <code>tmux</code>[http://blog.hawkhost.com/2010/07/02/tmux-%E2%80%93-the-terminal-multiplexer-part-2/]! Every now and then, attach to the tmux session and look (<code>ctrl-b f</code>) for windows stuck on "is wrong", "is slow" or "......" loops, or which are inactive[http://unix.stackexchange.com/questions/78093/how-can-i-make-tmux-monitor-a-window-for-inactivity]. Even with a couple of cores you can run a hundred instances; just make sure to have enough disk space for the occasional huge ones (tens of GB).
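
Putting the splitting, launching and uploading together, a minimal sketch (assuming GNU <code>split</code>, an existing tmux session, and a <code>list.txt</code> of wiki URLs; the chunk count of 4 is arbitrary):

<pre>
# Split the full list into 4 chunks (chunk_aa, chunk_ab, ...) without breaking lines
split --number=l/4 list.txt chunk_

# Start one launcher.py instance per chunk, each in its own tmux window
for list in chunk_*; do
  tmux new-window -n "$list" "python launcher.py $list"
done

# Meanwhile, in another window, upload finished dumps and prune them every 12 hours
while true; do ./uploader.py list.txt --prune-directories --prune-wikidump; sleep 12h; done
</pre>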

== BitTorrent downloads ==
You can download and seed the torrents from the [http://www.archive.org/details/wikiteam archive.org collection].

== Old mirrors ==
# Sourceforge (also mirrored to another 26 mirrors)
# Internet Archive (direct link to directory)

== See also ==

== External links ==
* [http://s23.org/wikistats/largest_html.php?th=15000&lines=500 List of largest wikis in the world]
* Dump of [http://nostalgia.wikipedia.org/ nostalgia], an ancient version of Wikipedia from 2001: [http://dumps.wikimedia.org/nostalgiawiki dump]


{{Navigation box}}

[[Category:Archive Team]]
