The Role of Trusted Human Editors In Filtering The Web

When you place a big bet on a new model, it’s always nice to hear that smart people are thinking about the big trends that underlie that model. So it was great to hear Robert Scoble, Paul Graham, and Larry Kramer thinking about human-driven information filtering on the Web — and particularly the role of TRUSTED humans.

Scoble has a video series on how TechMeme, Mahalo, and Facebook will beat Google. Here’s a brief rundown of some interesting points he makes:

  • Humans can judge what’s missing from an aggregation of information on a topic
  • The key to effective human filtering is leveraging a “fabric of trusted individuals”: “people who are trusted and credible”
  • By connecting these trusted people through a social network, you can leverage that resulting social graph to validate trust and create network effects

(Part II is where Scoble makes his main argument.)

Rand Fishkin at SEOMoz takes Scoble to task for some sloppy thinking and other errors related to Google and search, but I look to Scoble not as an architect of new models but as a lightning rod for big trends. And the idea that humans — in particular trusted humans — will play a more overt role in the future of information filtering on the Web is definitely in the air.

It was interesting to see Scoble draw numbers on the board to represent Mahalo’s guides, then instinctively name trusted information filters like Guy Kawasaki, himself, Mike Arrington, and Steve Rubel as examples (none of whom are actually Mahalo guides). That instinct is exactly the point, and it’s why TechMeme works so well: it uses top bloggers as a proxy for trust.

By using journalists and serious bloggers as a proxy for trust, Publish2 aims to solve the scalability problem Scoble raises in Part III of his video: creating a scalable mechanism for identifying the RIGHT people, i.e., people who are trusted and who are GOOD at filtering the Web. We’re going to seed Publish2 with trusted, skilled human editors and then let THEM decide who else to trust.

This gets to a very interesting feature that Paul Graham introduced to Hacker News (previously Startup News), a Digg/Reddit-like submit and vote site:

Of course, it’s easy to have a good site when you start out with a core group of smart users. How do you keep it good as more people find out about it? We think we have an answer to that. We’re going to have a group of human editors who train the system in what counts as a good story. Each user’s voting power will then be scaled based on whether they vote for good stories or bad ones. This should protect us against the arrival of users who vote up dumb stories. The worse stuff a user upvotes, the less effect their future votes will have. And vice versa: someone who consistently recommends interesting stories will be rewarded with a louder voice.

Imagine that. Some people vote for “dumb” stories. Other people have better judgment about what is a good story — so Paul is putting human editors in place to make sure that only the humans who are actually GOOD at judging news can influence the system. I can see this working in a niche topic area like Hacker News, but to scale across every topic, you need a proxy for news judgment — which is what led us to journalists.
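Graham’s mechanism, scaling each user’s voting power by their track record against editor judgments, can be sketched in a few lines of Python. Everything here (the class name, the 1.1 and 0.8 multipliers, the caps) is an illustrative assumption, not Hacker News’s actual implementation:

```python
# A minimal sketch of reputation-weighted voting, per the Hacker News
# quote above. All names and constants are invented for illustration.

class Voter:
    def __init__(self):
        self.weight = 1.0  # every new user starts with a neutral voice

def record_editor_label(voter, voted_good):
    """An editor labels a story the voter upvoted; adjust the voter's weight."""
    if voted_good:
        voter.weight = min(voter.weight * 1.1, 5.0)   # louder voice, capped
    else:
        voter.weight = max(voter.weight * 0.8, 0.1)   # quieter voice, floored

def story_score(voters):
    """A story's score is the sum of its upvoters' current weights."""
    return sum(v.weight for v in voters)
```

The key design choice is that editors never score stories directly at scale; they train the weights, and the weights then do the filtering, which is what makes the approach scalable beyond the editors themselves.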

The potential advantage of human judgment over computer algorithms seems to be on everyone’s mind these days. Kara Swisher interviews Larry Kramer, and in the last 20 seconds she asks him what’s overhyped, and Kramer answers:

Search is going to wear out over time. Web users want guides. The efficiency of search, which can get sort of better, still lacks the human touch. The human touch will add a lot of value.

We couldn’t agree more (the part about the “human touch” adding a lot of value). Algorithms are fast and can cover a lot more ground than an individual human, but they lack a fundamental human gift — judgment.

Of course, human judgment is what powers Google PageRank, via human link patterns, but the judgment is implicit, rather than explicit, as it is in the model that Digg, Del.icio.us Popular, and Reddit pioneered. When you network a large group of trusted, skilled humans — and the network effect kicks in — suddenly you can cover a lot of ground.
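That “implicit judgment” works because PageRank treats each link as a vote. As a rough illustration only, here is a toy power-iteration PageRank over a made-up link graph; the 0.85 damping factor follows the original paper, but everything else is simplified and not Google’s actual system:

```python
# Toy PageRank: links as implicit human votes. Simple power iteration
# over a small, invented link graph (damping factor 0.85).

def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)   # a page splits its vote across its links
                for q in outs:
                    new[q] += damping * share
            else:
                # dangling page: spread its rank evenly over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank
```

Pages that many others link to accumulate rank, so the aggregate linking behavior of millions of humans becomes the ranking signal — without any of them explicitly casting a vote.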

The potential for a network of trusted humans to play a big role in the future of the Web is everywhere you look, and it’s why we’re so excited about Publish2.

Please sign up for the beta if you’re interested in being part of Publish2 — it’s just around the corner.

7 thoughts on “The Role of Trusted Human Editors In Filtering The Web”

  1. Huh?

    “News is in its infancy on the Web.”

    Newspapers were “a delivery system to an eco-structure.”

    “Click per acquisition” and “click per thousand models”

    What the heck is he talking about?

    Venture capitalist in what? He never mentioned a company. He never answered the question!

  2. This is a subject we certainly believe in. Given the context of Africa it is important that we work to bring out the stories, news and information that give a more balanced view of the continent. A simple search on the web usually brings mostly negative news.

    Let us know what you think http://www.africanews.com.

  3. This is a great post, and the whole concept of human-driven information filtering will surely be a big part of our life on the web.

    Notably, as computing becomes ubiquitous, the mass of people filtering the information on the web will scale just as the amount of information does.
