Is It Time For Computers To Have Their Own .Data Domains?

Erick Schonfeld

Erick Schonfeld is a technology journalist and the executive producer of DEMO. He is also a partner at bMuse, a product incubator in New York City. Schonfeld is the former Editor in Chief of TechCrunch. At TechCrunch, he oversaw the editorial content of the site, helped to program the Disrupt conferences and CrunchUps, produced TCTV shows, and wrote daily...

Tuesday, January 10th, 2012

The web, as we all know, was built for humans. It is a nice graphical interface to the internet, which has been around much longer. But as the web has grown from a pleasant way to display information into the largest computing infrastructure on the planet, we need to make it friendlier for computers once again. Computers don’t want to look at pretty web pages. They want data.

Of course, there is a whole mish-mash of APIs and other ways computers speak to one another across the internet. But none of it is standardized, and it is a mess. Computer scientist and Wolfram Alpha founder Stephen Wolfram thinks there is a better way for computers to speak to each other. He suggests that it is time for a new .data top-level domain.

The familiar top-level domains are .com, .org, .gov, and so on. But the number of top-level domains is about to be expanded greatly. Some people think that adding more top-level domains is a waste (hi, Esther!). But Wolfram’s suggestion is worth considering. He lays out his thinking in this blog post:

My concept for the .data domain is to use it to create the “data web”—in a sense a parallel construct to the ordinary web, but oriented toward structured data intended for computational use. The notion is that alongside a website like wolfram.com, there’d be wolfram.data.

If a human went to wolfram.data, there’d be a structured summary of what data the organization behind it wanted to expose. And if a computational system went there, it’d find just what it needs to ingest the data, and begin computing with it.

Sure, there are already many ways to extract data from sites today, but “it’s mostly from a complicated patchwork of data files and feeds and database dumps,” he concludes. What he wants instead is a standard way for computers to retrieve structured data from any site.

We humans would keep going to the .com domains in our browsers, while computers would visit the .data domains directly. Today, popular services have to keep their websites current and also keep their APIs current, which can take almost as much work. And every site’s APIs are slightly different, and they are not all as readily exposed as HTML web pages are to human eyes. A web of .data sites would be built for computers. It’s time they had information standards too.
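Nothing like this exists yet, but to make the idea concrete, here is a rough sketch in Python of how a machine might ingest the kind of structured index a .data domain could serve. The JSON layout, field names, and URLs below are all invented for illustration; they are not part of any standard or of Wolfram’s proposal.

```python
import json

# Hypothetical example of what an index served at a ".data" domain
# might look like. The schema and URLs are assumptions, not a spec.
SAMPLE_DATA_RESPONSE = """
{
  "site": "wolfram.data",
  "datasets": [
    {
      "name": "elements",
      "description": "Physical properties of the chemical elements",
      "format": "json",
      "url": "https://wolfram.data/elements.json"
    },
    {
      "name": "countries",
      "description": "Geographic and demographic data by country",
      "format": "csv",
      "url": "https://wolfram.data/countries.csv"
    }
  ]
}
"""

def list_datasets(raw):
    """Parse a .data-style index and return (name, format) pairs."""
    index = json.loads(raw)
    return [(d["name"], d["format"]) for d in index["datasets"]]

print(list_datasets(SAMPLE_DATA_RESPONSE))
# → [('elements', 'json'), ('countries', 'csv')]
```

The point is that a crawler or computational system could hit any site’s .data address, read one predictable index like this, and know immediately what data is available and how to fetch it, instead of reverse-engineering a patchwork of feeds and dumps.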

