Thoughts on URL shorteners


This week there was lots of buzz around bit.ly, a URL shortener company from Betaworks that raised 2 million dollars. Betaworks is an incubator started by John Borthwick, who I had the privilege of working with at AOL. bit.ly is pretty sweet. Check out the things it can do for you:

  • It uses a cookie to remember the last 15 links you’ve shortened and displays that history on the home page when you visit.
  • It allows you to set up a custom URL ending for your link.
  • It automatically creates 3 thumbnails for every page you save a link to.
  • It saves a cached copy forever of every page you shorten a link to, on Amazon’s S3 storage (processing is done on EC2 as well, so uptime looks good).
  • It tracks click-through numbers and referrers so you can see what kind of traffic your shortcut got and from where.
  • There’s a simple API for adding functionality to any other web app.
  • It uses Reuters’ Calais to determine the general category and specific subjects of all the pages its users create shortcuts to.
  • All the data, including traffic data and thumbnails, is easily accessible via XML and JSON feeds.
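Under the hood, the shortening itself is a simple trick: map a database row id to a short base-62 string and back. Here’s a minimal Python sketch of the idea (the alphabet and function names are my own for illustration, not bit.ly’s actual scheme):

```python
# Map numeric ids to short codes and back using base-62 encoding.
# This is the core mechanism behind most URL shorteners.
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encode(n: int) -> str:
    """Turn a numeric database id into a short code."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, r = divmod(n, 62)
        digits.append(ALPHABET[r])
    return "".join(reversed(digits))

def decode(code: str) -> int:
    """Invert encode(): turn a short code back into the numeric id."""
    n = 0
    for ch in code:
        n = n * 62 + ALPHABET.index(ch)
    return n
```

With six characters of base-62 you get roughly 56 billion distinct codes, which is why these links can stay so short.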

That’s pretty slick indeed.  I think it’s interesting that investors view a service that helps developers and others garner more value from the web as a legitimate business. Personally, I’m not sure where the business is in there.

An interesting post I read related to this is Delicious founder Joshua Schachter’s blog post about URL shorteners.  As he states, there are 3 parties involved in shortening: (1) the site the link refers to, (2) the site/service – the transit – containing the shortened URL, and (3) the user clicking on the shortened URL.  In his view, all three are harmed by these services.  As he states:

The transit’s main problem with these systems is that a link that used to be transparent is now opaque and requires a lookup operation. From my past experience with Delicious, I know that a huge proportion of shortened links are just a disguise for spam, so examining the expanded URL is a necessary step. The transit has to hit every shortened link to get at the underlying link and hope that it doesn’t get throttled. It also has to log and store every redirect it ever sees.
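That expand-and-log step is easy to picture in code. Here’s a minimal Python sketch of what a transit site might do: resolve each shortened URL once by reading the redirect target, and cache the result so it never has to hit the shortener twice. The resolver is injectable so it can be stubbed out; the names and caching policy are my own assumptions, not how Delicious actually works.

```python
# Sketch of the lookup-and-cache step a "transit" site would need for
# every shortened link it ingests. expand() resolves a short URL to its
# underlying destination and remembers the answer forever.
import urllib.request
import urllib.error

_cache: dict[str, str] = {}  # short URL -> expanded URL

def _http_location(short_url: str) -> str:
    """Issue one request and return the redirect target without following it."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # refuse to follow; we only want the Location header
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(short_url, timeout=10)
        return resp.geturl()  # no redirect happened; URL was not shortened
    except urllib.error.HTTPError as err:
        # The unfollowed 301/302 surfaces as an HTTPError carrying Location.
        return err.headers.get("Location", short_url)

def expand(short_url: str, fetch_location=_http_location) -> str:
    """Expand a shortened URL once, then serve repeats from the cache."""
    if short_url not in _cache:
        _cache[short_url] = fetch_location(short_url)
    return _cache[short_url]
```

The cache is exactly the burden Schachter describes: the transit ends up storing a copy of every redirect it has ever seen, and still has to hope the shortener doesn’t throttle the initial lookups.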

The site where the link points to has milder problems. It’s possible that the redirection step steals search juice. It certainly makes it harder to track down links to the published site if the publisher ever needs to reach their authors. And the publisher may lose information about the source of its traffic.

But the biggest burden falls on the clicker, the person who follows the links. The extra layer of indirection slows down browsing with additional DNS lookups and server hits. A new and potentially unreliable middleman now sits between the link and its destination. And the long-term archivability of the hyperlink now depends on the health of a third party. The shortener may decide a link is a Terms Of Service violation and delete it. If the shortener accidentally erases a database, forgets to renew its domain, or just disappears, the link will break. If a top-level domain changes its policy on commercial use, the link will break. If the shortener gets hacked, every link becomes a potential phishing attack.

Those all sound hairy, although it seems that bit.ly has taken care of some of the problems of the site disappearing by caching the page.  Even still, are the additional metrics provided by bit.ly worth the loss of SEO juice?  It will be interesting to see how services like this begin to change the linking landscape and whether their services of providing accurate gauges of what’s “hot” are useful.

