I've never understood why URL shortening has not become part of the browser... surely it's easy enough to shorten a URL by compressing it, so the browser could see abc.ly/q2FFqta8ep and decompress it without a single HTTP request.
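A minimal sketch of that idea in Python, using zlib plus URL-safe base64 (the function names and the example URL are mine, not an existing scheme). One honest caveat baked into the comments: general-purpose compressors carry header overhead, so very short URLs may not actually shrink.

```python
import base64
import zlib

def shorten(url: str) -> str:
    """Compress a URL and encode it in a URL-safe alphabet.
    Note: zlib adds a few bytes of header, so short URLs may
    come out the same size or even longer."""
    return base64.urlsafe_b64encode(zlib.compress(url.encode())).decode()

def expand(token: str) -> str:
    """Reverse the process entirely client-side: no HTTP request needed."""
    return zlib.decompress(base64.urlsafe_b64decode(token)).decode()

url = "http://example.com/articles/2010/04/some-very-long-article-title"
token = shorten(url)
print(len(url), len(token))
assert expand(token) == url   # round-trips with no server involved
```

The appeal is exactly that `expand` is pure computation: the browser never hands your click to a third party.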
Still, the world loves ease of use, and hence we get the inevitable abuse of URL shortening to send people to spam/scam/phishing sites.
So why not have a public URL database, where people enter the shortened URL and the database either gives them the full URL or tells them it is spam/scam etc.? Crowd-source it: let people use browser extensions to report bad links.
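A toy sketch of the lookup-plus-reporting side, assuming a crowd-sourced flagging rule (the threshold of 5 reports and all names here are invented for illustration):

```python
from collections import Counter

REPORT_THRESHOLD = 5          # hypothetical: reports needed before a link is flagged
reports: Counter = Counter()  # short URL -> number of bad-link reports

def report_bad_link(short_url: str) -> None:
    """Called by a browser extension when a user marks a link as dodgy."""
    reports[short_url] += 1

def lookup(short_url: str, db: dict) -> str:
    """Return the real URL, 'flagged' for crowd-reported links,
    or 'unknown' for links not in the database."""
    if reports[short_url] >= REPORT_THRESHOLD:
        return "flagged"
    return db.get(short_url, "unknown")

db = {"abc.ly/q2FFqta8ep": "http://example.com/real-destination"}
for _ in range(5):
    report_bad_link("abc.ly/evil1234")

print(lookup("abc.ly/q2FFqta8ep", db))  # the real URL
print(lookup("abc.ly/evil1234", db))    # flagged by the crowd
```

A real service would obviously need abuse protection on the reports themselves, but the data model is this simple at heart.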
Importantly, to avoid it being just yet another redirector, the service would need to provide public database dumps so that others could run competing services, verify the data, and so on. Vitally, ISPs could provide access, or even use the list to warn users about the content. Think of it a bit like freedb or IMDb: public-service databases.
It would allow the browser to look up every link on a page to find out where it ultimately points, and take the URL shortening services out of the loop entirely. It could even communicate via the minimal overhead of HTTP status codes: 404 if the link is not known, 307 to provide the correct redirect (skipping advert pages and further layers of redirection), 303 to indicate a dodgy site and supply the URL of a relevant page (e.g. the Dodgy Scammy Warning page).
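The status-code protocol above can be sketched as a single mapping from lookup outcome to an HTTP response. This is an illustrative sketch, not a spec; `WARNING_PAGE` and the outcome strings are placeholders I've made up.

```python
from typing import Optional, Tuple

# Hypothetical address for the "Dodgy Scammy Warning" page.
WARNING_PAGE = "http://example.org/dodgy-scam-warning"

def respond(result: str, target: Optional[str] = None) -> Tuple[int, Optional[str]]:
    """Map a short-URL lookup outcome to (status code, Location header)."""
    if result == "unknown":
        return 404, None             # link not in the database
    if result == "ok":
        return 307, target           # redirect straight to the final URL
    if result == "dodgy":
        return 303, WARNING_PAGE     # see-other: send the user to the warning page
    raise ValueError(f"unexpected lookup result: {result}")

print(respond("ok", "http://example.com/article"))
print(respond("dodgy"))
```

307 fits the "ok" case because it preserves the original request method, while 303 is defined precisely as "see this other resource instead", which is exactly what pointing at a warning page means.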
Come on people, one link gets read thousands of times, so if even a small proportion of readers are willing to mark dodgy links, that could save a lot of people a lot of headaches. And we could always use it to tag links too - long live the semantic web.