The Editor As Curator Of ALL The News On The Web

Jeff Jarvis challenges news organizations to define the role of the editor in the 21st century, i.e., Editor 2.0. Jeff connects a number of dots that involve a significant, even radical, shift in the traditional editorial role, such as new search/tag editor positions. But one of the most radical shifts taking place is that editors are now being asked to curate OTHER news organizations’ content in addition to their own.

In the age of limited, monopoly distribution, editors were able to focus exclusively on the product of their own newsrooms, because in most cases that was the only content their readers could get. Now that the web and search have made ALL content from EVERY source easily accessible, many media brands are realizing they can’t just be in the business of creating their own content — they need to bring their readers the ENTIRE universe of content on the web.

A number of traditional media brands have already started curating content from other news organizations — these efforts typically employ a traditional, command-and-control, single-editor model, but they nonetheless represent a sea change in the disposition of news organizations towards content produced inside their walls vs. content produced outside their walls. In a networked media world, no content brand can do it all by itself — news consumers, empowered by search and news aggregators, know this, and that’s what’s driving news organizations to take this radical step.

For example, BusinessWeek.com has launched a new feature called Executive Summary, which aggregates the top business stories of the day, mostly from sources OTHER THAN BusinessWeek.

Here’s how BusinessWeek describes the Executive Summary:

The Executive Summary is BusinessWeek’s daily roundup of the most important business news from around the Web. Edited by Chi-Chu Tschang, the early edition is posted every weekday by 6 a.m. Eastern Standard Time. An afternoon edition, edited by Harry Maurer, is available by 3 p.m. EST.

Another excellent example is the Anchorage Daily News’ Alaska Newsreader:

ADN editors find the news from all over Alaska every morning so you don’t have to.

[Screenshot: the Alaska Newsreader]

Time.com has for some time been publishing a blog called The Ag (short for “the Aggregator”), which takes a similar approach.

BusinessWeek, the Anchorage Daily News, Time and many other news organizations have wisely realized that if they want to remain a principal daily destination for their readers, they need to do more than publish their own original content — there are too many other high-quality content sources on the web, too many for news consumers to get their arms around. That’s why there’s a huge value creation opportunity for editors to curate ALL the news, not just what their own news organization can produce.

That said, what these first brave forays into news aggregation miss is the opportunity to harness the power of the web, to extend the editorial reach and enhance the editorial intelligence by taking a networked approach rather than a traditional siloed approach. A single editor can only read a limited number of sources, and can only post the aggregation once a day.

But imagine, instead, many editors and journalists collaborating to find and select the most important stories of the day. Imagine how many more sources they could cover. Imagine the news aggregation updated dynamically throughout the day.

Imagine “the editor” as a powerful networked intelligence, bringing you ALL the news on the web.
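To make that concrete, here is a minimal sketch of what one piece of such a networked aggregator might look like. It is purely illustrative (the feed URLs are made up), and it assumes each participating editor or newsroom publishes its picks as an ordinary RSS/Atom feed; the Python feedparser library merges everyone’s selections into a single list, newest first, each time the script runs.

```python
import time
import feedparser  # pip install feedparser

# Hypothetical feeds of "editor's picks" -- one per participating editor/newsroom.
EDITOR_FEEDS = [
    "http://example.com/business-editor/picks.rss",
    "http://example.com/alaska-editor/picks.rss",
    "http://example.com/tech-editor/picks.rss",
]

def gather_picks(feed_urls):
    """Pull every editor's current selections and merge them, newest first."""
    picks = []
    for url in feed_urls:
        feed = feedparser.parse(url)
        source = feed.feed.get("title", url)
        for entry in feed.entries:
            picks.append({
                "headline": entry.get("title", "(untitled)"),
                "link": entry.get("link", ""),
                "picked_by": source,
                # published_parsed is a time.struct_time when the feed provides a date
                "published": entry.get("published_parsed"),
            })
    # Sort so the freshest selections float to the top of the aggregation.
    picks.sort(key=lambda p: p["published"] or time.gmtime(0), reverse=True)
    return picks

if __name__ == "__main__":
    # Re-running this (say, every 15 minutes from a scheduler) keeps the page
    # updated throughout the day instead of once each morning.
    for pick in gather_picks(EDITOR_FEEDS)[:20]:
        print(f'{pick["headline"]} - via {pick["picked_by"]}\n  {pick["link"]}')
```

Run on a schedule, the merged list stays current all day, and adding another editor to the network is just one more feed URL.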

10 thoughts on “The Editor As Curator Of ALL The News On The Web”

  1. Scott–Excellent post. But I can’t help pointing out that while so many of us are sitting here typing about the future of this profession, there are vast wildfires burning in SoCal and vast amounts of excellent UGC being made in real time . . . and no one is aggregating it. (See my blog entry on this.)

    I believe we need a single big, branded, civic-minded site that aggregates the best UGC at a moment like this. Unless one exists and I’m missing it.

    Anybody?

  2. Directing attention with skill and care so as to allow people to use their time more effectively (by getting them what they want more quickly) is, and will continue to be, a service in massive demand. (Popurls.com pleasantly presents the work of many of these services.)

    Your proposal sounds like Digg driven by journalists and editors. Is it possible sub-communities would form (or already have?) within service sites such as Digg (for example, journalists.digg.com)?

    It seems “information about information” services will spread as quickly as information does on the Web. Some will be machine-driven, others human-driven.

    With the proliferation, we’ll start to need “information about information about information.”

    However, at some point, doesn’t it become impossible to keep up, unless your audience or topic is so small (comprehensive precision works well on the Web) that it remains manageable?

    The other strategy to remain relevant in an environment of exponentially expanding information, information about information, and information about information about information, is to, as you say, build networks of intelligence to sort information (and information…). But that requires everyone to agree to participate and continue to participate in your intelligence network. To do that, the network will have to demonstrate consistent and remarkable utility.

    The problem is that as a network grows, it is threatened with becoming less useful. It pulls in more information, and at some point, there is too much information to be useful, and to retain its utility, there must come an “information about information” service, which will be followed by “an information about information about information” service…

    Anyway, the point is that controlling attention in an information environment that is constantly changing and rapidly growing will have to be a constantly evolving business/endeavor. Your attention-directing organization will need to split, adapt and specialize in perpetuity as the information perpetually grows. As a whole, this is how the Web is working. Search engines seem to retain a simple (but limited) way to explore such a complex landscape made up of a steadily increasing number of networks.

    Anyway, I’m looking forward to hearing more about Publish2.

  3. On our site, all editorial staff contribute to a section called ‘Editor’s picks’, where we select news stories from a huge range of providers.

    While we are limited by staff numbers and time in doing this, updating this section throughout the day has become part of our routine.

    Not only does manually selecting these stories keep me updated on breaking news throughout the day, it also gives our audience the impression that we are trying to stay at the front of developments in our sector (new media).

    Given our small editorial team, however, this process is hugely time-consuming, and I wonder whether there would be an automated way to overcome this problem.

    This in turn would bring its own problems like how much editorial control over this section we should relinquish to an automatic process.
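On that last question about automating an “Editor’s picks” section without giving up editorial control: one middle path is to let software gather and pre-filter candidates from the feeds the newsroom already follows, while a human still makes every final call. A rough sketch along those lines (the feed URLs and keywords below are invented for illustration), again using Python’s feedparser library:

```python
import feedparser  # pip install feedparser

# Illustrative sources and beat keywords -- a real list would be the
# providers this newsroom already scans by hand every day.
SOURCE_FEEDS = [
    "http://example.org/new-media-weblog/feed",
    "http://example.org/industry-news/rss",
]
KEYWORDS = ("new media", "online journalism", "aggregation")

def suggest_picks(feed_urls, keywords):
    """Machine step: collect stories whose headlines match the beat keywords."""
    suggestions = []
    for url in feed_urls:
        for entry in feedparser.parse(url).entries:
            title = entry.get("title", "")
            if any(k.lower() in title.lower() for k in keywords):
                suggestions.append((title, entry.get("link", "")))
    return suggestions

def editor_review(suggestions):
    """Human step: nothing is published until an editor says yes."""
    approved = []
    for title, link in suggestions:
        if input(f"Add to Editor's picks? [y/N] {title} ").strip().lower() == "y":
            approved.append((title, link))
    return approved

if __name__ == "__main__":
    for title, link in editor_review(suggest_picks(SOURCE_FEEDS, KEYWORDS)):
        print(f"PUBLISH: {title} -> {link}")
```

The machine does the tedious scanning; the editor keeps the judgment call, which is exactly the control the commenter is worried about relinquishing.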
