Steve Gillmor has a great article on how RSS aggregators could do a better job of surfacing items of interest. He gives three specific examples of how they might do their job smarter:

To begin with, we need to harness the information we already possess about who and what we read. Rather than relying on content creators to signal already consumed material, let's let the RSS aggregator (offline or online) filter out the links, but not the supporting commentary, to already consumed posts. Instrumenting the browser to record what is read, in what order, and for how long is trivial, says Adam Bosworth, in the context of his Alchemy caching architecture.
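That first idea — instrument the browser to record what is read, in what order, and for how long, then use the record to suppress links to already-consumed posts while preserving commentary — could be sketched roughly as follows. All names here (`ReadTracker`, `filter_links`, the 5-second "consumed" threshold) are hypothetical; the article doesn't specify an implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple

@dataclass
class ReadTracker:
    """Hypothetical browser-side record of what was read, in what
    order, and for how long (per Bosworth's point that this is trivial)."""
    order: List[str] = field(default_factory=list)        # reading order
    seconds: Dict[str, float] = field(default_factory=dict)  # dwell time per URL

    def record(self, url: str, duration: float) -> None:
        if url not in self.seconds:
            self.order.append(url)
        self.seconds[url] = self.seconds.get(url, 0.0) + duration

    def consumed(self, min_seconds: float = 5.0) -> Set[str]:
        """Treat a post as 'consumed' once dwell time passes a threshold."""
        return {u for u, s in self.seconds.items() if s >= min_seconds}

def filter_links(item_links: List[str], commentary: str,
                 tracker: ReadTracker) -> Tuple[List[str], str]:
    """Drop links to already-consumed posts, but keep the supporting
    commentary intact, as the quote suggests."""
    seen = tracker.consumed()
    return [u for u in item_links if u not in seen], commentary
```

The key design point from the quote is that the aggregator, not the content creator, does the de-duplication, and only the links are suppressed.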

Next, let's incent that cache, mirrored on both server and client, to save posts that appear of interest or import not just to me but to my peers on the network, as represented by the RSS feeds that I and they are subscribed to. If Jon Udell, Dave Winer, Doc Searls and 70% of their subscribers find the RSS BitTorrent thread compelling, then please send a message to my cache engine not to throw that post away, no matter whether I have ever heard of the poster, the horse they rode in on, or the idea he or she is promoting.
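The retention rule in that paragraph — keep a post if enough of my peers' feeds reference it, whether or not I know the author — reduces to a simple threshold check. A minimal sketch, with the 70% figure taken from the quote and everything else (function name, set-of-URLs feed representation) assumed:

```python
from typing import Iterable, Set

def should_retain(post_url: str, peer_feeds: Iterable[Set[str]],
                  threshold: float = 0.7) -> bool:
    """Keep a post in the cache if at least `threshold` of peer feeds
    reference it, regardless of whether I've heard of the poster."""
    feeds = list(peer_feeds)
    if not feeds:
        return False
    hits = sum(1 for feed in feeds if post_url in feed)
    return hits / len(feeds) >= threshold
```

The cache engine would run this check before evicting, so peer attention overrides a purely local least-recently-read policy.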

Next, compare all the posts and posters and produce a weighted priority list that takes into account variables such as author, subject, updates, Technorati cosmos tracking, the amount of time I have before the next meeting on my calendar, and so on, producing a post rank based not just on my attention but the attention dynamics of those I choose to do my filtering with and for me.
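The third idea is a weighted ranking over attention signals, trimmed to fit the time before the next calendar event. A minimal sketch, assuming a flat dict of normalized feature scores per post; the feature names, weights, and two-minutes-per-post estimate are all illustrative, not from the article:

```python
from typing import Dict, List

def post_rank(posts: List[Dict], weights: Dict[str, float]) -> List[Dict]:
    """Rank posts by a weighted sum of attention features
    (author, subject, updates, Technorati cosmos, peer attention...)."""
    def score(p: Dict) -> float:
        return sum(w * p.get(feature, 0.0) for feature, w in weights.items())
    return sorted(posts, key=score, reverse=True)

def reading_list(posts: List[Dict], weights: Dict[str, float],
                 minutes_free: int, minutes_per_post: int = 2) -> List[Dict]:
    """Cut the ranked list to what fits before the next meeting."""
    return post_rank(posts, weights)[: max(0, minutes_free // minutes_per_post)]
```

The point of the quote is that the weights reflect not just my own attention but the attention dynamics of the people I filter with, so `peer_attention`-style features would dominate purely local ones.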
From Steve Gillmor's Blogosphere - Wednesday, June 23, 2004 Entries
Referenced Thu Jun 24 2004 14:52:38 GMT-0600


Last modified: Thu Oct 10 12:47:21 2019.