Making Emergent

The story behind the real-time news rumour tracker

You may have read some of the Twitter chatter or news coverage of Craig Silverman’s rumour tracker, Emergent. We’d like to share some details of how Emergent came to be, and how we approached its design.

The rising importance of news verification

Craig has been examining news rumours for many years, and runs the popular blog Regret the Error, which reports on media errors and corrections. He’s known for a while that verification of news would become increasingly important.

Craig worked with software developer Adam Hooper to develop a database for capturing news articles about rumours. The database enabled Craig and a research assistant to quickly capture articles about a given claim, and to have the system check to see if the articles were updated over time. The system also captured share counts for the articles to gauge their level of popularity.
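The capture system described above can be sketched as a simple data model: a claim, the articles covering it, and periodic snapshots of each article’s text and share count. This is a hypothetical illustration only; the class and field names here are assumptions, not Emergent’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Snapshot:
    """One periodic check of an article: its text and share count at a point in time."""
    captured_at: datetime
    body_text: str
    share_count: int

@dataclass
class Article:
    """A news article reporting on a claim, re-checked over time for updates."""
    url: str
    headline: str
    snapshots: list = field(default_factory=list)

    def was_updated(self) -> bool:
        # The article changed if any two snapshots differ in body text.
        return len({s.body_text for s in self.snapshots}) > 1

    def latest_shares(self) -> int:
        # Share count from the most recent snapshot, used to gauge popularity.
        return self.snapshots[-1].share_count if self.snapshots else 0

@dataclass
class Claim:
    """A rumour being tracked, with the articles that cover it."""
    statement: str
    articles: list = field(default_factory=list)

# Hypothetical usage: one claim, one article, two snapshots over two days.
claim = Claim("Example claim under investigation")
article = Article("https://example.com/story", "Example headline")
article.snapshots.append(Snapshot(datetime(2014, 9, 1), "first version", 100))
article.snapshots.append(Snapshot(datetime(2014, 9, 2), "updated version", 450))
claim.articles.append(article)
```

Re-checking snapshots over time is what lets the system detect both silent edits to a story and the growth of its share count.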


“Before, you had very packaged news that came out at certain times a day and was very reliable in terms of what it would give to you. Today with networks and the fact that it’s very decentralized we have huge amounts of information of all different levels of verification and truthiness.”
Craig Silverman, Emergent

Craig’s initial goal was to gather data for a research project he is running as a fellow with the Tow Center for Digital Journalism at Columbia University. But he also wanted to create and launch a public-facing version of this system.

Offered a lightning talk at the 2014 Online News Association (ONA) conference, Craig saw an opportunity to share his vision with the journalism community and gather valuable feedback. With the conference as a deadline, Normative partnered with Craig’s team to put together a proof of concept prototype that he could use to communicate what Emergent is all about, and test some of the basic hypotheses with the audience. Would people at the conference be interested enough to warrant further effort on the product?

We had two huge challenges ahead of us:

How could we take the complexity of Craig’s deep analysis of news stories and make it easy to understand? And what could we build in three weeks?

With a short window of time to create the first prototype, we had to be laser focused on the most critical aspects of the product, and on helping Craig tell a great story at the ONA conference.

There were many features we decided to leave out of this first prototype in order to make something that told a clear story about Emergent.

We started down two separate but related streams — data analysis and prototyping, and interface design.

Stream 1: Collecting and graphing the data

Andrew Hull, our Technology Director, worked with his team to pull data from the existing database, collecting it and graphing it in different ways. He found that there were a lot of questions around the meaning of this data, questions we couldn’t answer without looking at live visualizations.

The screenshot below is from an early attempt to show trends by incremental shares. It turned out that the data was too spiky for this visualization to be useful.

An early attempt to show trends by incremental shares
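A rough illustration of why the incremental view was spiky: taking differences between successive cumulative share counts amplifies bursts, so a single viral interval dwarfs everything else on the chart. The numbers below are invented for the example.

```python
def incremental_shares(cumulative):
    """Difference between successive cumulative share-count snapshots."""
    return [b - a for a, b in zip(cumulative, cumulative[1:])]

# Hypothetical cumulative share counts for one article, sampled hourly.
cumulative = [120, 150, 165, 2900, 2940, 2955]
print(incremental_shares(cumulative))  # [30, 15, 2735, 40, 15]
```

One 2,735-share interval sitting next to intervals of 15 to 40 shares is exactly the kind of spike that made this visualization hard to read.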


The image below is our breakdown by sharing platform. This proved to be surprisingly uninteresting when broken down with real data, as Facebook almost always dominated the number of shares. Although aspects of this remained in the launched prototype, it was given much less prominence.

Graphs that showed a breakdown of stories by number of shares were dominated by Facebook


We also tried visualizing data to show the cumulative shares over time. This was more interesting. Below is a rough prototype of what became the final visualization.

A rough prototype of the final visualization
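The cumulative view is, in effect, the running total of those same per-interval counts, which guarantees a smooth, monotonically increasing curve regardless of how bursty the underlying sharing is. Again, the numbers are invented for illustration.

```python
from itertools import accumulate

def cumulative_shares(increments):
    """Running total of per-interval share counts: always non-decreasing."""
    return list(accumulate(increments))

# The same hypothetical bursty intervals as before, now accumulated.
increments = [30, 15, 2735, 40, 15]
print(cumulative_shares(increments))  # [30, 45, 2780, 2820, 2835]
```

Because the curve can only go up, a viral burst reads as a steep rise rather than an isolated spike, which is likely why this framing proved more legible.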


Stream 2: Working on user flows and visual relationships in data

While Andrew was modeling data, Matt Nish-Lapidus, our Design Director, was working with his team on user flows, graphing, and building visual relationships between different aspects of the data.


When the design team first started thinking about the visualizations, we sketched area graphs and stacked charts that told the story well. But when we plugged in real data, they just didn’t work. For example, what we thought was going to be a really nicely balanced graph ended up being 90% one value, which left the other 10% practically invisible.


“Having these constant feedback loops every day or two helped us cut off avenues of research before we got too far down a path that wasn’t working. This was especially important as the data itself was changing so much from one day to the next, depending on what was happening in the news.”
Andrew Hull, Normative

Together with Craig and Adam, we determined that the main focus for this iteration should be the claim itself: the main summary view you see below.

The main summary view for an individual story


We had to consider how we could highlight the right types of information at the right levels and draw the right relationships between them.

We played with a number of different ideas. The initial concept was that the data itself would be the interface, so we avoided adding interface elements like buttons and controls; we wanted people to be able to directly manipulate the information. We took this concept pretty far, but eventually ran into some big issues: having to duplicate information on the page, and long views where you would lose the controls as you scrolled.

An early mockup of the claim summary view


While these were all things we could address with time, our time constraints meant that some of those initial aspirations were shelved in favour of things like the buttons pictured below.

This type of button-based filter control wasn’t part of the initial concept, but did the job for the first iteration


A common thread throughout the first version of Emergent was maintaining a balance between using all the data at hand, simplicity, and storytelling. We all saw so many opportunities for exciting visualizations and data exploration, but ultimately we had to save those for version 2.

Through three complete iterations over our three week timeframe we managed to create an initial version of the software that enabled us to answer our questions about the project.

We were excited to launch Emergent on time, and the project has received a great response from both the journalism community and the general public: coverage from media outlets, feedback from the ONA conference, and continued use of the site. We’re really excited to take what we’ve learned and get to work on the next version of Emergent, which is starting right now.


Emergent is a real-time news rumour tracker, part of a research project with the Tow Center for Digital Journalism at Columbia University that focuses on how unverified information and rumour are reported in the media. Read more about the research here.

Normative is a software design firm. We help organizations design for the networked world. Get the monthly Normative Dispatch for the best from our blog, studio news and articles about Design for the Network.
