Distributed recursive crawls
|Distributed recursive crawls|
|Archiving status | On hiatus|
|IRC channel | #Y (on hackint)|
This is a project to recursively crawl large websites that lack a clear structure and therefore cannot easily be split into work items the way we usually do on DPoS projects. It is comparable to ArchiveBot in that crawls are started manually for specific sites of interest.
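The core idea of a recursive crawl, as opposed to item-based distribution, can be sketched as a simple frontier traversal: start from a seed page and follow every newly discovered link exactly once. The sketch below is illustrative only and not this project's actual tooling; `recursive_crawl`, `get_links`, and the toy link graph are hypothetical names standing in for a real fetcher.

```python
from collections import deque

def recursive_crawl(start, get_links):
    """Breadth-first recursive crawl: visit each discovered page once.

    `get_links(page)` stands in for fetching a page and extracting
    its outgoing links; a real crawler would do HTTP requests here.
    """
    seen = {start}           # pages already queued or visited
    frontier = deque([start])
    order = []               # visit order, for inspection
    while frontier:
        page = frontier.popleft()
        order.append(page)
        for link in get_links(page):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return order

# Toy link graph standing in for a site with no predictable URL
# structure that could be pre-split into work items.
site = {
    "/":  ["/a", "/b"],
    "/a": ["/b", "/c"],
    "/b": ["/"],
    "/c": [],
}
print(recursive_crawl("/", lambda p: site.get(p, [])))
# → ['/', '/a', '/b', '/c']
```

Because the full set of pages is only known after the crawl finishes, this kind of traversal cannot be partitioned into independent work items up front, which is exactly why such sites fall outside the usual DPoS workflow.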