Yesterday was the OAuthalypse--the day when Twitter stopped accepting HTTP Basic authorization on its API. I had a few apps break--like almost everything I've done with Twitter. To get them working again I'll have to spend some time moving each one over to OAuth. For some that won't be hard--they're already using a library that supports OAuth. For others it will be more work. All of them are single-user apps (like the UtahPolitics retweeter) and so will use the OAuth single token pattern.
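For a single-user app, the single token pattern means the app is pre-issued an access token and secret, so the only remaining work is signing each request. Here's a minimal sketch of the HMAC-SHA1 signing step from OAuth 1.0a (the scheme Twitter's API uses); the parameter values are illustrative placeholders, not real credentials:

```python
import base64
import hashlib
import hmac
import urllib.parse


def oauth_signature(method, url, params, consumer_secret, token_secret):
    """Compute an OAuth 1.0a HMAC-SHA1 signature for a request."""
    enc = lambda s: urllib.parse.quote(str(s), safe="~")  # percent-encode per OAuth rules
    # Normalize parameters: sorted, encoded key=value pairs joined with &
    norm = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    # Signature base string: METHOD & encoded-URL & encoded-params
    base = "&".join([method.upper(), enc(url), enc(norm)])
    # Signing key: consumer secret & token secret
    key = f"{enc(consumer_secret)}&{enc(token_secret)}"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


# Illustrative parameters -- in a real request you'd also include a fresh
# oauth_nonce and oauth_timestamp, then send the result in the
# Authorization header.
params = {
    "oauth_consumer_key": "my-consumer-key",      # placeholder
    "oauth_token": "my-single-user-token",        # the pre-issued token
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
    "oauth_nonce": "abc123",
    "oauth_timestamp": "1277942400",
    "screen_name": "windley",
}
sig = oauth_signature(
    "GET",
    "http://api.twitter.com/1/statuses/friends_timeline.json",
    params,
    "consumer-secret",
    "token-secret",
)
```

Because the token is issued once for the app's own account, there's no redirect dance and no storing of the user's password--the app just signs with secrets it already holds.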
The reason for moving to OAuth is so that apps won't need to ask users for their Twitter password or store it anymore. Twitter had a bad experience with this, and that led to the decision to go nuclear on usernames and passwords in their API. This is a clear win for delegated authorization protocols like OAuth and the more capable ones that are sure to follow. What's more, it trains users to use a delegated authorization scheme. I love it.
But what's curious about the move is that in every case (except the retweeter) my apps are not updating information. These are read-only apps that simply read the friend timeline for a particular user. I can't figure out why any authorization is needed at all. Since who I follow is public information, it would be simple enough to reconstruct my friend timeline from available information. My theory is that Twitter uses authentication on read-only data as a substitute for a poorly designed API. That is, they use the authentication as a substitute for merely allowing me to specify whose timeline I want to see.
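The reconstruction argument is straightforward: if you can fetch each friend's public timeline without credentials, merging those per-user streams by timestamp gives you back the friend timeline. A sketch of the merge step (the fetching is omitted; the tuples here stand in for tweets):

```python
import heapq


def merge_timelines(timelines):
    """Merge several per-user timelines (each newest-first) into one
    reverse-chronological stream, as a friend timeline would present them.

    Each timeline is a list of (posted_at, text) tuples.
    """
    return list(heapq.merge(*timelines, key=lambda t: t[0], reverse=True))


# Illustrative data: public timelines of two accounts I follow
alice = [(3, "alice: lunch"), (1, "alice: morning")]
bob = [(2, "bob: hello")]
timeline = merge_timelines([alice, bob])
# timeline is now newest-first across both users
```

No authentication appears anywhere in this reconstruction--which is the point: requiring OAuth to read data that is already public adds work for thousands of apps without protecting anything.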
This is classic REST stuff and it seems that Twitter got it wrong. Thousands of apps are failing today because Twitter requires them to authorize when they don't really need to. Am I wrong?