Distributed recursive crawls

Status: Special case
Archiving status: On hiatus
Archiving type: DPoS
Project source: grab-grab
Project tracker: grab
IRC channel: #Y (on hackint)
Data: archiveteam_grab (Internet Archive collection)

This is a project to recursively crawl large websites whose structure cannot easily be split into the discrete work items we usually hand out on DPoS projects. It is somewhat comparable to ArchiveBot in that crawls are started manually for specific sites of interest.
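To illustrate the distinction, the sketch below shows what a recursive crawl looks like in principle: starting from a single seed URL, links are followed breadth-first within the same host, so the set of pages to fetch only emerges as the crawl runs and cannot be pre-split into independent work items. This is a minimal illustration in Python only; it is not the grab-grab code, and the function and parameter names (crawl, max_pages, and so on) are invented for the example.

# Minimal sketch of a same-host recursive crawl (illustrative only,
# not the grab-grab implementation).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=100):
    host = urlparse(start_url).netloc
    queue = deque([start_url])
    seen = {start_url}
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to fetch
        # A real archiving crawl would write the response out here
        # (e.g. to WARC); this sketch only follows the links.
        parser = LinkExtractor()
        parser.feed(body)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Stay on the same host and avoid refetching known URLs.
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)


if __name__ == "__main__":
    crawl("https://example.com/")

Because the frontier of URLs is discovered during the crawl itself, the whole site effectively becomes one long-running job rather than a pool of pre-generated items, which is why such sites are handled here instead of as a normal tracker-driven DPoS project.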