Puu.sh

From Archiveteam
Revision as of 20:21, 20 September 2013 by Dnova (talk | contribs)
puu.sh
[Puu.sh logo; Puush homepage screenshot]
URL: http://puush.me/
Status: Special Case
Archiving status: Saved! ~5.8 TB of data (still in progress)
Archiving type: Unknown
Project source: https://github.com/ArchiveTeam/puush-grab
Project tracker: http://b07s57le.corenetworks.net:8031/puush/
IRC channel: #pushharder (on hackint)

puu.sh is a file sharing service that was created in 2010.

Image expiry

Early on June 7th, 2013, the following email was sent out to users:

Hey guys,

We're making some important changes to puush and want to inform you of how it will affect our service.

When we first conceived puush in 2010, we wanted to create a straightforward way to help us quickly share what was on our screens. Soon after, we extended puush to allow us to throw small files around too. Since then, we’ve seen a massive uptake and tremendous support from our users. The problem is that a tremendous majority of puushes aren’t being accessed again after 24 hours - in fact, only 10% of puushes are accessed after a month.

puush to us is a quick way to share things. puush is not a data warehouse.

We do not wish to become a file locker, file storage or backup service. There are plenty of other solutions out there that do a much better job of this (e.g. Dropbox), so what we want to do is this:

  • Remove the 200mb storage limit for free users
  • Stop offering permanent storage, and files will expire after not being accessed for:
    • Free users: 1 month
    • Pro users: up to 6 months
  • Offer an optional Dropbox “sync” for pro users (i.e. automatically save a copy to dropbox)

How this will affect you after the 1st of August 2013:

  • You will no longer have an account storage limit. Feel free to puush as much as you want!
  • We are going to start expiring files. At this point, any files which haven't been recently viewed by anyone will be automatically deleted after 1 month, or up to 6 months for pro users.
  • If you wish to grab a copy of your files before this begins, you can download an archive from your My Account page (Account -> Settings -> Pools -> Export).

As an example, if you have puush'd images which are being used on a forum, as long as that thread is visited at least once a month (or up to 6 months as a pro user) your files will *always be accessible*.

This notice is also visible on the puu.sh site, where it was announced even earlier.

How to Help

If you are comfortable running scripts manually (i.e., outside the Warrior), go to the GitHub repo for information on how to run the scripts.

Where can I find a file?

If you know the item ID, go to the Wayback Machine and enter the URL as http://puu.sh/XXXXX without any filename extension. The Wayback Machine treats the URL as case-insensitive, so you may need to look through the captures to work out which one you are looking for.

If the Puush is private, it is unlikely to be archived, as we do not guess the access code (the string of characters after the item ID). You can, however, use wildcards as a way of browsing the Wayback Machine. Here's an example.
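As a sketch of how such lookups can be automated, the Wayback Machine's public CDX API supports both exact and prefix (wildcard) queries. The helper below only builds the query URL; the function name and parameter choices are illustrative, not part of the original project tooling:

```python
# Build Wayback Machine CDX API query URLs for puu.sh captures.
# The CDX endpoint and its url/output/matchType parameters are the
# public Wayback API; the helper itself is a hypothetical sketch.
from urllib.parse import urlencode

CDX_ENDPOINT = "http://web.archive.org/cdx/search/cdx"

def cdx_query(item_id, wildcard=False):
    """Return a CDX API URL listing captures for a puu.sh item.

    With wildcard=True the query matches every capture whose URL
    starts with the given prefix (useful for browsing around an ID
    when the exact capitalization or access code is unknown).
    """
    url = "puu.sh/" + item_id + ("*" if wildcard else "")
    params = {"url": url, "output": "json"}
    if wildcard:
        params["matchType"] = "prefix"
    return CDX_ENDPOINT + "?" + urlencode(params)

# Exact item lookup:
#   cdx_query("4sQ00")
# Browse everything under an ID prefix:
#   cdx_query("4sQ0", wildcard=True)
```

Fetching the returned URL (e.g. with urllib) yields a JSON list of captures, one row per snapshot.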

Archives

Archives are uploaded to the Archive Team Puush collection. These are the original WARC files. They are 10 GB in size instead of the typical 50 GB because the project is staged on cloud hosting with limited disk space.

Tracker information

  • The tracker and rsync target is being run by User:Chfoo.
  • On 2013-08-22, Redis was unable to background save due to failed fork().
  • On 2013-08-27, an attempt was made to clear out the tracker log. Redis crashed.

Logs

Ranges

Date Loaded | Start (Base 10)  | End (Base 10)    | Alphabet | Notes
2013-08-06  | 0 (0)            | 3UXX3 (51607749) | Legacy   | At most 10 URLs per item
2013-08-27  | 10 (62)          | 3UXX3 (51607749) | Legacy   | At most 13 URLs per item (unlucky 13)
2013-09-08  | 3UXX4 (51607750) | 49999 (61285459) | Legacy   | At most 13 URLs per item
2013-09-13  | 4999a (61285460) | 4mPOO (64547754) | Puush    | At most 13 URLs per item
2013-09-15  | 4mPOP (64547755) | 4rrrr (65645689) | Puush    | At most 13 URLs per item
2013-09-16  | 4rrrs (65645690) | 4sQ00 (65978416) | Puush    | At most 13 URLs per item
            | 4sQ01 (65978417) |                  | Puush    | At most 13 URLs per item. Auto-queues using a script that checks Twitter.
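The base-10 values in the table can be reproduced by treating the shortcodes as base-62 numbers. Judging from the table, the Legacy alphabet orders characters as digits, then uppercase, then lowercase, while the Puush alphabet orders digits, then lowercase, then uppercase; this ordering is inferred from the table values, not documented by puu.sh:

```python
# Decode puu.sh shortcodes as base-62 integers.
# Alphabet orderings are inferred from the ranges table:
#   Legacy: 0-9, A-Z, a-z    Puush: 0-9, a-z, A-Z
import string

LEGACY = string.digits + string.ascii_uppercase + string.ascii_lowercase
PUUSH = string.digits + string.ascii_lowercase + string.ascii_uppercase

def decode(shortcode, alphabet=PUUSH):
    """Convert a shortcode to its base-10 item number."""
    value = 0
    for char in shortcode:
        value = value * 62 + alphabet.index(char)
    return value

# decode("3UXX3", LEGACY)  ->  51607749
# decode("4sQ00")          ->  65978416
```

Note that both alphabets place digits first, so a code like "40000" decodes to the same value (59,105,344) under either one.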

Ideas

  • Keep accessing each and every file - likely unsustainable in the long run, especially if expiry times are shortened
  • Grab everything - the site appears to use incremental image IDs

Shortcode Stats

Number of shortcodes:	 526
Number of string lengths:	 3
Length 	 Count 	 Percent
3 	 5 	   0.951%
4 	 125 	  23.764%
5 	 396 	  75.285%
Number of unique characters:	 62
Number of characters used:	 2495
Character 	 Count 	 Percent
0 	 24 	   0.962%
1 	 155 	   6.212%
2 	 234 	   9.379%
3 	 121 	   4.850%
4 	 24 	   0.962%
5 	 45 	   1.804%
6 	 26 	   1.042%
7 	 37 	   1.483%
8 	 25 	   1.002%
9 	 34 	   1.363%
A 	 46 	   1.844%
B 	 37 	   1.483%
C 	 46 	   1.844%
D 	 38 	   1.523%
E 	 36 	   1.443%
F 	 42 	   1.683%
G 	 33 	   1.323%
H 	 31 	   1.242%
I 	 37 	   1.483%
J 	 32 	   1.283%
K 	 38 	   1.523%
L 	 35 	   1.403%
M 	 28 	   1.122%
N 	 39 	   1.563%
O 	 31 	   1.242%
P 	 44 	   1.764%
Q 	 28 	   1.122%
R 	 36 	   1.443%
S 	 31 	   1.242%
T 	 26 	   1.042%
U 	 29 	   1.162%
V 	 32 	   1.283%
W 	 45 	   1.804%
X 	 30 	   1.202%
Y 	 29 	   1.162%
Z 	 30 	   1.202%
a 	 34 	   1.363%
b 	 39 	   1.563%
c 	 32 	   1.283%
d 	 46 	   1.844%
e 	 27 	   1.082%
f 	 30 	   1.202%
g 	 39 	   1.563%
h 	 38 	   1.523%
i 	 30 	   1.202%
j 	 34 	   1.363%
k 	 24 	   0.962%
l 	 29 	   1.162%
m 	 40 	   1.603%
n 	 40 	   1.603%
o 	 38 	   1.523%
p 	 25 	   1.002%
q 	 26 	   1.042%
r 	 34 	   1.363%
s 	 23 	   0.922%
t 	 45 	   1.804%
u 	 36 	   1.443%
v 	 27 	   1.082%
w 	 32 	   1.283%
x 	 45 	   1.804%
y 	 26 	   1.042%
z 	 22 	   0.882%
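Statistics like the tables above can be regenerated from any sample of shortcodes with a few lines of Python. This is a sketch; the function name is an assumption, and the real sample of 526 shortcodes is not reproduced here:

```python
# Recompute shortcode statistics (length distribution and character
# distribution) from a list of shortcodes, as in the tables above.
from collections import Counter

def shortcode_stats(shortcodes):
    """Return two {value: (count, percent)} mappings: one keyed by
    string length, one keyed by character."""
    lengths = Counter(len(code) for code in shortcodes)
    chars = Counter(char for code in shortcodes for char in code)
    total_chars = sum(chars.values())
    length_dist = {n: (c, 100.0 * c / len(shortcodes))
                   for n, c in lengths.items()}
    char_dist = {ch: (c, 100.0 * c / total_chars)
                 for ch, c in chars.items()}
    return length_dist, char_dist
```

With the real sample, the length-3 row works out to 5 / 526 = 0.951%, matching the table.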

How many items are there?

<chfoo> [...] using the decentralized script i wrote, i've grabbed [randomly] 3824 items (totalling 785M) out of 6409 requests (a 60% hit rate at a max id of "40000" or 59,105,344). so, in theory, there's 35,463,206 items based on this sample and max id.
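The quoted estimate follows from scaling the sampled hit rate up to the full ID space. A worked version of the arithmetic (using the rounded 60% figure, as the quote does):

```python
# Reproduce chfoo's back-of-the-envelope estimate: the shortcode
# "40000" is 4 * 62**4 = 59,105,344 in base 10, and roughly 60% of
# randomly sampled IDs below it resolved to an item.
max_id = 4 * 62**4            # shortcode "40000" as a base-62 number
hits, requests = 3824, 6409   # random-sample results from the quote

hit_rate = hits / requests        # ~0.597, rounded to 60% in the quote
estimate = max_id * 60 // 100     # integer arithmetic with the rounded rate

# estimate -> 35463206, matching the quoted figure
```

Using the unrounded hit rate instead gives roughly 35.3 million items, so the quoted 35,463,206 is a slight overestimate of its own sample.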