Using Buffer to schedule “Tweets”

If you’re a user of social media, you may have heard of a tool called Buffer. In case you haven’t, it’s essentially a sharing and scheduling tool which “buffers” outbound tweets and then sends them later. I use it for Twitter, but it can also be connected to Google+, Facebook, LinkedIn and so on (if you have a premium account). For me, the free version works just fine.

Buffer differs from other Twitter scheduling apps because you don’t have to schedule each tweet manually: you set a timetable for when you want tweets to go out, and Buffer automatically posts at the predetermined times.
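Conceptually, the model is just a queue of tweets plus a fixed timetable of posting slots. Here’s a minimal Python sketch of that idea (my own illustration, not Buffer’s actual code; the slot times and tweet text are made up):

```python
from collections import deque
from datetime import time

# A fixed daily timetable of posting slots (example times, not Buffer's defaults)
schedule = [time(9, 0), time(12, 30), time(17, 45)]

# The "buffer" itself: a first-in, first-out queue of pending tweets
buffer_queue = deque(["First tweet", "Second tweet", "Third tweet"])

def post_due_tweet(now):
    """If 'now' matches a scheduled slot, send the next queued tweet (if any)."""
    if now in schedule and buffer_queue:
        return buffer_queue.popleft()  # in real life this goes to the Twitter API
    return None

print(post_due_tweet(time(9, 0)))   # -> First tweet
print(post_due_tweet(time(10, 0)))  # -> None (not a scheduled slot)
```

The point is that you never pick a date and time per tweet; you just keep the queue topped up and the timetable drains it.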

You can fill your buffer in a variety of ways:

  • Emailing tweets to a private email address
  • Via the web interface at
  • From your smartphone
  • Through Firefox, Chrome or Safari browser plugins (not IE … yet).

This works well, and I’ve noticed more activity and engagement from my followers since I started using Buffer. Unfortunately, I couldn’t think of a way to populate multiple Twitter accounts without paying for the premium version… until now.


Buffer screenshot

As you can see from the screenshot above, I have access to three Twitter accounts: my own account @richardbishop, my company account @TrustIV and the @VivitWorldwide account. By logging into Twitter in three browsers, I can stay connected to each of them and use the different plugins to populate my Buffer.

In the screenshot above I used Chrome, Chrome Canary and Firefox, with the Buffer plugin installed in each. This allows me to paste tweets between the browsers if I want the same message in multiple accounts, or post to individual Twitter feeds if I prefer. Each of my accounts has a different schedule, and it is possible to share my non-personal Buffer account with the other users of the @TrustIV and @VivitWorldwide Twitter accounts.

Are You a Human? – CAPTCHA replacement

Update – 16th March 2020 – Sadly, this plugin and the “Are You a Human” website are no longer available.

I’ve noticed a lot of automated registrations on my blog recently (despite using Akismet to deter spam). Deleting these fake user accounts every few days has become a pain in the neck. This reminded me of a plugin that I read about last year which claims to be more effective than CAPTCHAs and can help deter automated user registrations on your site.

The Are You A Human plug-in helps to distinguish humans from bots in a more enjoyable way than forcing users to read images of corrupted words. The games are simple, mildly amusing and, according to the Are You a Human website, generally faster than a text CAPTCHA.

Are You A Human plug-ins are available for WordPress, Drupal, Joomla, Node.js, phpBB, PHP, .NET, Java and others, so the chances are it could be used on your site if you have this problem.

I installed it on my site in about five minutes and it seems to work really well. Click on the image below for a demo of this in action. Alternatively, try to post a comment below and use it “for real”.

Are You a Human

Tate website fails under Kraftwerk ticket demand



Who’d have thought it?
I didn’t realise that Kraftwerk was so popular.

This morning, Twitter and Google Plus are full of people ranting about poor website performance.

Today’s vitriol is reserved for the Tate Modern in London which is selling tickets for the Kraftwerk 2013 shows.



Here are a few snippets of customer feedback from social media this morning.

Twitter comments - Tate

Disgruntled customers are never a “good news” story. The IT people will have their necks on the block, the PR team will be working overtime, and I really feel sorry for the people who have to man the phones and take calls from customers who’ve been waiting for hours to get through.

A brief look at the site shows a few things that they could do to relieve their problems, but I suspect that they simply haven’t seen demand like this before. This, of course, makes it hard to plan; but some simple performance testing prior to launch could have identified these problems and prevented such a PR disaster.

What could they have done?

Used a CDN
They appear to host their own images. A CDN could help to take load away from their web servers at peak times.

Served scaled images and used lossless image compression
They send large images but resize them in HTML. Many images could be compressed without reducing quality. Large images consume bandwidth and reduce the number of simultaneous users that a site can support.
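To put rough numbers on that last point, here’s a back-of-the-envelope calculation in Python. The uplink speed and image sizes are illustrative assumptions, not measurements from the Tate’s site:

```python
# Hypothetical figures: a 100 Mbit/s uplink, a 2 MB unscaled image,
# and a 200 KB version after server-side resizing and compression.
uplink_bytes_per_sec = 100_000_000 / 8      # 12.5 MB/s

original_image = 2_000_000                  # bytes
optimised_image = 200_000                   # bytes

print(uplink_bytes_per_sec / original_image)    # 6.25 full-size images per second
print(uplink_bytes_per_sec / optimised_image)   # 62.5 optimised images per second
```

On those (made-up) numbers, serving properly scaled and compressed images lets the same connection support roughly ten times as many simultaneous downloads.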

Used a more scalable application architecture
They seem to be hosted by a provider that offers various hosting options, including cloud as well as conventional hosting plans. Even with an on-demand architecture, an application can still fail if it isn’t designed and built to scale up to meet demand.