With the increasing popularity of social media channels such as Twitter, URL shorteners have become commonplace - and this is hardly surprising. Twitter, by its very nature, aims to be a quick, dynamic, continual stream of information and breaking news. Spend too long away from your live stream and you're pretty much guaranteed to have missed something important! When information is delivered in this format, time can't be spent writing up the details (which is undoubtedly why Twitter imposes a 140-character limit on its posts).
In this game, you find something out, you condense it into as few words as possible, and you get it out into the big, wide world. As with any reporting of 'the news', a post becomes more credible if it can reference a source. However, web addresses, which frequently serve as those sources, can easily exceed 140 characters on their own. This would be a problem were it not for services such as Bit.ly and tinyURL. By condensing URLs, they give every web page the potential to be cited as a reference. Unfortunately, there are a number of problems with this system, ranging from broken links (should the service experience some downtime) to security issues (users cannot see the destination URL, or the type of site a link references, until they visit it). Whilst work has been carried out to combat such problems, there is a larger underlying issue, one with the potential to cause Twitter (and 'the web' in general) no end of problems in the future.
This issue comes in the form of the theoretical limits that are unavoidable with URL shorteners. A shortened URL is constructed from a short code (standing in for the actual URL) that is created by the provider (e.g., Bit.ly). Each provider has its own preference as to which characters may appear in these codes and, indeed, how many characters a code consists of. Not long ago, tinyURL used up to 6 characters per code, drawn only from numbers and lowercase letters. This meant the service was capable of shortening just over 2 billion URLs (36^6 = 2,176,782,336 combinations). Whilst this number is reasonably large, it is not infinite, meaning the service would eventually need to introduce more characters into its alphabet or, alternatively, increase the length of its shortened URLs - which it has since done (tinyURL now uses 7-character codes).
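To make the arithmetic concrete, here is a minimal sketch of the kind of scheme described above - not any provider's actual implementation, just an assumed 36-character alphabet (digits plus lowercase letters) mapping an internal ID to a short code, which also shows where the "just over 2 billion" figure comes from:

```python
# Illustrative sketch only - not Bit.ly's or tinyURL's real scheme.
# A 36-character alphabet (digits + lowercase), as tinyURL's old
# 6-character codes used, encodes an internal numeric ID as a short code.

ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyz"

def encode(n: int) -> str:
    """Convert a non-negative integer ID into a base-36 short code."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, rem = divmod(n, len(ALPHABET))
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))

# Total distinct 6-character codes available:
capacity = len(ALPHABET) ** 6
print(capacity)   # 2176782336 - "just over 2 billion"
print(encode(36)) # "10" - IDs roll over like an odometer
```

Once the counter passes that capacity, the only options are exactly those in the text: a bigger alphabet or a longer code (a 7th character multiplies capacity by 36).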
Whichever way you look at the problem, the future for URL shorteners is unclear - and so, by extension, is the future for Twitter. When the 7-character limit is reached, should tinyURL move to an 8-character code? It might be some distance away, but at some stage, shortened URLs have the potential to be as long as, if not longer than, the actual URLs they reference! When it gets to that stage, what should the likes of tinyURL and Bit.ly do? Perhaps move to occupy a further domain and effectively start the process again? Well, they could, but in my opinion this leads to an Internet polluted by pages and services that simply do not enhance or add any real value to the web. We can't just have an endless number of URL shorteners cropping up.
We need to move away from this process. It may be a little late now, but would it not have been beneficial for Twitter to include a URL field in its interface, into which a direct web address could be placed - on top of the 140 permitted characters? I'd suggest it would have been worth developing the service along these lines, as it would certainly have avoided this problem, as well as helping with the integration of Tweets into search engine ranking algorithms and all things SEO!