For those who are into data hoarding, what are people using to archive whatever Friendfeed posts are still accessible?
vicster., Bruce Lewis, Iván Abrego, Anne Bouey, Stephen Mack, Steve C, Team Marina, junrxu, D., SAM, RudĩϐЯaЯïan, Amir, Aloof Schipperke, Big Joe Silenced, Jennifer Dittrich, Maitani, Jenny H., Steven Perez, John (bird whisperer), Hookuh Tinypants, and Pete's Got To Go liked this
I've seen more mentions of this: https://github.com/claudio...
- rønin
Although the non-US crowd has been bandying this about as well: http://www.softpedia.com/get...
- rønin
I'm systematically copying and pasting all my posts into a Word document and deleting as I go. Not the popular route, but the one of my choosing.
- Jenny H.
Copy, paste, delete, drink, repeat... :-)
- Todd Hoff
ALL. DAY. LONG. I finished one account, but I'm only mid-2011 on the other. Super tedious, but worth it to me.
- Jenny H.
Ack! No, no deleting! How am I supposed to find my comments that I want to download if you're deleting them all? (she asked, selfishly and ack-fully)
- bentley
I know, and I feel bad about that, but I've tried the scripts before and they didn't work, and I'm not risking losing the documentation of my life since 2008. Selfish, perhaps, but I'm out of time and options.
- Jenny H.
Copy/paste I understand, but why delete?
- bentley
Because you can only go back 640 posts and I have many thousands of them.
- Jenny H.
I think Benjamin Golub pushed a change so that you can go back farther than 640 posts now. So far I've gotten to 1140 posts.
- Victor Ganata
I guess I'll try re-running the script, but I assumed that message meant going back would be more robust, not that they raised the cap on how far back you could go.
- Andrew C (✔)
If you manually type in friendfeed.com/$USERNAME?start=1350, it won't work; it just gives you the same page as friendfeed.com/$USERNAME?start=660. But there's now an additional ncursor=$LONG_HEXADECIMAL_STRING parameter, an opaque token I have no idea how to parse, that lets you go back much farther.
- Victor Ganata
OIC. Yeah, so a robust script would have to manually grab and parse the "older" link rather than just calculate it.
- Andrew C (✔)
I can't get past start=630 and I KNOW I have posts beyond January of this year. :(
- Zulema ❧ spicy cocoa tart
Yeah you need the ncursor=xxx key too.
- Victor Ganata
Ben's change definitely lets you go beyond start=2010 now. But as Victor mentioned, it only works with the corresponding ncursor. Seems like the only way to get that is to physically click through to each page.
- chrisofspades
I guess I'll just have to save the raw HTML
- Victor Ganata
Could someone who knows Ben ask if the ncursor thing could be used in the FF API?
- Andrew C (✔)
And if server load is an issue, which it might be, Twitter's export runs on the server. You make a request, it churns away for a bit, and then when it's done it emails you (I think?) with a link for an archive you can download.
- Andrew C (✔)
https://github.com/zverok/clio goes beyond the 640-post boundary (scroll to the end of README.md for instructions in English).
- courant
Trying the Ruby version now, courant. Thanks for sharing!
- Zulema ❧ spicy cocoa tart
The scripts I've run do seem to get past the 640-post limit, but they end up going into infinite loops after the 10,000th post.
- Victor Ganata
Victor, the Ruby version got as far back as 2013 (with only one month from 2012). That's a lot because, as we all know, I post A LOT.
- Zulema ❧ spicy cocoa tart
Right, so, does anyone know anyone at FF/FB to ask to adjust the API to go past 10K posts?
- Andrew C (✔)
http://www.standunited.org/petitio... ply sign to save
- || بهاره ||