External Comment Integration Rewrite
July 11th, 2012
comments, tech |
I've rewritten this site's external comment integration. The new version is better in several ways:

- Comments load in parallel: one site being slow won't delay the others.
- They're cached: I make at most one request per page per 5-minute period, so a popular post doesn't hammer the origin sites.
- It can pull in comments from reddit and lesswrong.
When I first added comments it was just for
Facebook, and I couldn't run any server-side code. So I wrote a
simple stateless wsgi app that ran on dotCloud which would fetch
comments and return javascript which would document.write
them into the post. This would delay rendering of anything below
until the request had gone through dotCloud to Facebook, back to
dotCloud, and then back to the browser, but there wasn't much of
anything below the comments so it didn't much matter.
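Concretely, the page pulled the comments in with a blocking script tag, and the payload that came back was javascript along these lines (a reconstruction; the markup and the author shown are illustrative, not the actual code):

```js
// Reconstruction of the kind of payload the old dotCloud app returned;
// the markup and names here are made up. Because document.write runs
// while the page is still parsing, everything below the script tag
// waits for the round trip through dotCloud to Facebook and back.
document.write('<div class="comments">');
document.write('<p><a href="http://www.facebook.com/alice">Alice</a>' +
               ': example comment text</p>');
document.write('</div>');
```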
A little later I added Google Plus comments, and I didn't rewrite the wsgi app to make requests in parallel, so it didn't even start getting the Google Plus comments until it finished with Facebook, and it didn't display any comments until it heard back from both. Worse, if anything went wrong with either, comments didn't display at all. It also didn't do any caching, so when I had a popular article my little dotCloud wsgi app would get swamped and no one would see comments.
I've been doing some crossposting to lesswrong lately, and wanted comments to show up the same way as they do with Google Plus and Facebook. I decided to fix the code at the same time. The general idea now is that for each site we do the following (sketched in javascript below the list):
- Make a div.
- Inject some javascript that:
  - Makes an XmlHttpRequest to /wsgi/json-comments/<service>/<token> (example: /wsgi/json-comments/gp/XhFcxeshNje).
  - Gets back data in json as: [[author, author_link, anchor, comment], [author, author_link, anchor, comment], ...]
  - Writes out the comments into the div.
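Putting those steps together, the injected javascript looks roughly like this (a sketch: the function names, the container id, and the rendered markup are mine; only the URL scheme and the json format come from the description above):

```js
// Sketch of the per-service loader. The 'comments' container id and
// all names here are my own inventions, not the site's actual code.
function loadComments(service, token) {
  var div = document.createElement('div');
  document.getElementById('comments').appendChild(div);

  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/wsgi/json-comments/' + service + '/' + token);
  xhr.onload = function() {
    // Response format: [[author, author_link, anchor, comment], ...]
    JSON.parse(xhr.responseText).forEach(function(c) {
      var p = document.createElement('p');
      var a = document.createElement('a');
      a.href = c[1];          // author_link
      a.textContent = c[0];   // author
      p.appendChild(a);
      p.appendChild(document.createTextNode(': ' + c[3]));  // comment
      p.id = c[2];            // anchor, presumably a permalink target
      div.appendChild(p);
    });
  };
  xhr.send();
}

// One call per service; the XmlHttpRequests run in parallel and each
// div fills in as its response arrives, so one slow or failing
// service no longer delays or hides the others.
loadComments('fb', 'some-facebook-token');  // token is made up
loadComments('gp', 'XhFcxeshNje');          // example token from the post
```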
The wsgi code handling requests to /wsgi/json-comments/*
is mostly the same as
before, except that now a single request is only ever for one service, it returns simple
json instead of javascript to write out the comments, and it does caching.
Setting up caching was pretty simple: memcache does almost everything for me. I had to:
```sh
# On Ubuntu at least this also starts memcached running and sets it
# to run on boot.
sudo apt-get install memcached python-memcache
```

And then add a bit of code to my wsgi app:
```python
import json
import memcache
import time

# connect to memcached
mc = memcache.Client(['127.0.0.1:11211'], debug=0)

# example key: "gp/XhFcxeshNje"
key = "%s/%s" % (service, token)
t_and_comments = mc.get(key)
if t_and_comments:
    t, comments = json.loads(t_and_comments)
else:
    t, comments = 0, []

# cache for 5min
if time.time() - t > 5*60:
    comments = generate_comments(service, token)
    mc.set(key, json.dumps([time.time(), comments]))
    # ideally we could serve stale data while kicking off a background
    # thread, but we're not that sophisticated.

return comments
```
Update 2012-07-11: I just modified this to be smarter. It now does two requests: first to /wsgi/json-comments-cached and, once that completes, to /wsgi/json-comments. The first request is "give me whatever comments you have cached, and don't waste time on any external services!" while the second is "if you have fresh comments cached give me those, otherwise I'll wait for you to go get them."
Update 2013-03-15: I noticed that comments on notes were no longer coming in through the Facebook comment API, so I scraped them and added a mode where my comment code reads from a file. I haven't used notes since 8/2011 and they're not getting comments anymore, so archiving them seems like a fine compromise.
Update 2013-09-18: The server-side code is on github.