Software

From Archiveteam

Revision as of 11:46, 28 July 2025

WARC Tools

The WARC Ecosystem has information on tools to create, read and process WARC files.
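A WARC file is simply a sequence of records, each with a `WARC/1.0` version line, named headers, and a payload terminated by two CRLFs. As a rough illustration of the format (real archives should be produced with a proper tool such as wget's WARC output or the warcio library, not hand-rolled code like this), a minimal "resource" record can be assembled with the Python standard library alone:

```python
from datetime import datetime, timezone
import uuid

def make_warc_resource_record(uri: str, payload: bytes) -> bytes:
    """Build a minimal, illustrative WARC/1.0 'resource' record."""
    headers = [
        ("WARC-Type", "resource"),
        ("WARC-Target-URI", uri),
        ("WARC-Date", datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")),
        ("WARC-Record-ID", f"<urn:uuid:{uuid.uuid4()}>"),
        ("Content-Type", "text/html"),
        ("Content-Length", str(len(payload))),
    ]
    head = "WARC/1.0\r\n" + "".join(f"{k}: {v}\r\n" for k, v in headers)
    # A blank line separates headers from the payload; each record
    # ends with two CRLFs.
    return head.encode("utf-8") + b"\r\n" + payload + b"\r\n\r\n"

record = make_warc_resource_record("http://example.com/", b"<html>hi</html>")
```

Concatenating such records (usually gzip-compressed per record) is what produces the `.warc` / `.warc.gz` files the tools above create and consume.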

General Tools

  • GNU WGET
    • Backing up a WordPress site: wget --no-parent --no-clobber --html-extension --recursive --convert-links --page-requisites --user=<username> --password=<password> <path>
  • cURL
  • HTTrack - HTTrack options
  • Pavuk -- a bit flaky, but very flexible
  • belweder - tries to be a maintained alternative to HTTrack. Pre-release quality. Uses a custom format, WARC export planned.
  • Warrick - Tool to recover lost websites using various online archives and caches.
  • Beautiful Soup - Python library for web scraping
  • Scrapy - Fast python library for web scraping
  • snscrape - Tool to scrape social networking services.
  • Splinter - Web app acceptance testing library for Python -- could be used along with a scraping lib to extract data from hard-to-reach places
  • WiLiSe WikiLink Search - Python script to get links to specific pages of a site by searching a wiki (MediaWiki-type) that has api.php accessible or the LinkSearch extension enabled (the project is still very immature; at the moment the code is only available in this SVN repository).
  • Mobile Phone Applications -- some notes on preserving old versions of mobile apps
  • SiteSucker - available on Apple's app stores
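The core step the scraping libraries above (Beautiful Soup, Scrapy) automate is pulling links out of fetched HTML so a crawl can continue. As a minimal sketch of that idea using only Python's standard-library html.parser (not a substitute for those tools, which handle malformed markup, encodings, and crawling far better):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href value from <a> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

extractor = LinkExtractor()
extractor.feed('<a href="/page1">one</a> <a href="http://example.com">two</a>')
print(extractor.links)  # ['/page1', 'http://example.com']
```

A real crawler would resolve relative links against the page URL (urllib.parse.urljoin), deduplicate, and respect robots.txt before fetching further pages.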

Hosted tools

  • Pinboard is a convenient social bookmarking service that will archive copies of all your bookmarks for online viewing. The catch is that it costs $22/year ($39/year if you want the archival feature), and you can only download archives of your 25 most recent bookmarks in a particular category. This may pose problems if you ever need to get your data out in a hurry.
  • freeyourstuff.cc -- Extensible open-source Chrome plugin allowing users to export their own content (reviews, posts, etc.). Exports to JSON format; optionally publishes to freeyourstuff.cc and mirrors under the Creative Commons CC0 license. Supports Yelp, IMDb, TripAdvisor, Amazon, GoodReads, and Quora as of July 2019.

Site-Specific

Format Specific

Proposed

Web scraping
