Commerce Department Announces New Open Government Initiatives

FOR IMMEDIATE RELEASE

Wednesday, December 9, 2009

CONTACT: Office of Public Affairs

202-482-4883

The U.S. Commerce Department today announced several initiatives that reflect President Obama’s commitment to increasing transparency and accountability in Washington and ensuring greater access and information for the American people.

“President Obama took office with a call for unprecedented openness in government, and we are heeding that call,” U.S. Commerce Secretary Gary Locke said. “Americans have a right to understand the decisions made by their government, and today’s announcement will shed important light on them.”

Yesterday, the White House announced the beginning of an ongoing commitment across the administration to create a culture of openness in government. Among the new initiatives announced by the White House yesterday were two from the Commerce Department.

The Department’s National Institute of Standards and Technology committed to making information available on Data.gov in an RSS feed related to:

    1. Small Business Innovation Research opportunities

    2. Publicly funded technologies available for license

Along with five other federal agencies, NIST is working to increase access to this information, empowering innovators to find what they need and receive real-time updates that can fuel entrepreneurial momentum, create new jobs, and strengthen economic growth.
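
As an illustration of how the public could consume such a feed, the sketch below polls an RSS 2.0 feed with Python’s standard library and prints each announced item. The feed URL is a hypothetical placeholder; the actual address would be listed on Data.gov.

    # Minimal sketch of polling an RSS feed of NIST funding and licensing
    # announcements. The feed URL is hypothetical.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://www.data.gov/nist/sbir-opportunities.rss"  # hypothetical

    def fetch_items(url):
        """Download an RSS 2.0 feed and return (title, link, pubDate) tuples."""
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        return [
            (item.findtext("title", default=""),
             item.findtext("link", default=""),
             item.findtext("pubDate", default=""))
            for item in root.findall("./channel/item")
        ]

    if __name__ == "__main__":
        for title, link, pub_date in fetch_items(FEED_URL):
            print(f"{pub_date}  {title}\n  {link}")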

Additionally, the Commerce Department’s Patent and Trademark Office committed to making all patents, published patent applications, and related materials more easily searchable by the public online. The ability to sift through over 7 million patents on useful inventions, design patents, and plant patents will enable entrepreneurs to find patents on which to found new businesses and discover improvements leading to original patentable products and services.

With intellectual property-based businesses estimated to contribute nearly 40 percent of the growth achieved by all U.S. private industry, greater patent transparency is likely to have a significant impact on jobs and the economy. As a step toward improved access to historical and current U.S. published patent data, the USPTO will begin posting this data online for free download through a third-party provider in the first quarter of calendar year 2010.

Details about the new initiatives announced by the Commerce Department today can be found below.

National Institute of Standards and Technology (NIST)
Project: Improving Dissemination of Basic Research Results via Web and Social Media

What’s New: Tagging content to ease search; simplifying public feedback process

As part of an effort to improve broad dissemination of its research results, NIST will implement a new website design based on a content management system. The system includes access to an improved database of research papers authored or co-authored by NIST researchers. Content posted on the new website will be “tagged” by topic area, and the public will be able to subscribe to receive new information on specific topics of interest, such as nanotechnology or energy-related research. The new website will also allow the public to comment or ask questions about posted research articles and to easily share content from the site on their own websites. NIST has also recently created YouTube, Facebook, and Twitter sites. To ensure that as many people as possible benefit from NIST’s work, news of major research results posted on the new NIST website will be routinely announced through these additional social media channels.

Project: Improving Access to the Digital Data Repository of NIST Collections, including publications, artifacts, and photographs relating to measurement science

What’s New: Using the Open Archives Initiative Protocol for Metadata Harvesting to allow automatic harvesting by major search engines and research repositories

Currently, information regarding NIST publications is available online via the NIST Research Library’s online catalog, including links to the full text of many publications. Information about some of the objects in the NIST Museum is available through the NIST Virtual Museum (NVM). The online catalog and the NVM are available to the public. In FY10, NIST will implement a digital library repository. This repository will conform to current and emerging library and publishing metadata standards to enhance discoverability and harvesting by other scholarly and research repositories. The repository will contain the full text of NIST technical publications, including the Journal of Research, as well as images and information about NIST’s historical scientific objects. The metadata will conform to the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH), the accepted standard within scholarly and scientific communities for making the contents of information collections available to researchers. File formats will be consistent with GPO, Library of Congress, and National Archives preservation formats. The repository will permit the digital forms of NIST technical publications and other content to be easily searchable by the public through major Internet search engines such as Google, Google Books, Google Scholar, WorldCat, and Yahoo. This will significantly improve the publication and distribution of NIST research.
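
The OAI-PMH interface described above would let any harvester pull the repository’s records with simple HTTP requests. The following sketch, written against a hypothetical base URL, walks the ListRecords responses and prints Dublin Core titles; only the verb, metadataPrefix, and resumptionToken parameters come from the OAI-PMH specification itself.

    # Minimal sketch of harvesting Dublin Core records over OAI-PMH.
    # The base URL is a hypothetical placeholder for the planned NIST repository.
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    BASE_URL = "https://repository.nist.gov/oai"  # hypothetical endpoint
    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    def harvest_titles(base_url):
        """Yield record titles, following resumption tokens until the list ends."""
        params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
        while True:
            url = base_url + "?" + urllib.parse.urlencode(params)
            with urllib.request.urlopen(url) as resp:
                root = ET.fromstring(resp.read())
            for record in root.iter(OAI + "record"):
                title = record.find(".//" + DC + "title")
                if title is not None and title.text:
                    yield title.text
            token = root.find(".//" + OAI + "resumptionToken")
            if token is None or not (token.text or "").strip():
                break
            params = {"verb": "ListRecords", "resumptionToken": token.text.strip()}

    if __name__ == "__main__":
        for t in harvest_titles(BASE_URL):
            print(t)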

National Oceanic and Atmospheric Administration (NOAA)
Project: Modernizing the NOAA Climate Database

What’s New: Digitizing weather station data collected in the 18th and 19th centuries.

The Climate Data Modernization Program (CDMP) supports the NOAA mission to collect, integrate, assimilate, and effectively manage Earth observations on a global scale, ranging from atmospheric, weather, and climate observations to oceanic, coastal, and marine life observations. Many of these data were originally recorded on paper, film, and other fragile media. Prior to CDMP, not only were these valuable data sources mostly unavailable to the scientific community, but the storage technology for the archive was also obsolete. Today, CDMP has greatly improved the preservation of and access to NOAA’s holdings by migrating many of these resources to new digital media. CDMP has placed online over 53 million weather and environmental images that are now available to researchers around the world via the Internet. The amount of data online has grown from 1.75 TB in 2001 to over 11 TB in 2009. Hourly weather records keyed through CDMP continue to be integrated into NOAA’s digital database holdings, extending the period of record for many stations back into the 1890s. Additional daily data records keyed through the CDMP will extend this record back to the 18th century for several weather stations.

Project: Severe Weather Data Inventory (SWDI)

What’s New: Simplified access to current and past information on severe weather incidents

The Severe Weather Data Inventory (SWDI) at NOAA’s National Climatic Data Center (NCDC) provides users access to archives of several datasets critical to the detection and evaluation of severe weather. These datasets include:

  • NEXRAD Level-III point features describing general storm structure, hail, mesocyclone and tornado signatures
  • National Weather Service Local Storm Reports collected from storm spotters
  • National Weather Service Warnings
  • Lightning strikes from Vaisala's National Lightning Detection Network (NLDN)

SWDI archives these datasets in a spatial database that allows for convenient searching. These data are accessible via the NCDC website, FTP, or automated web services. The results of interactive Google Maps-based web page queries may be saved in a variety of formats, including plain text, XML, Google Earth’s KMZ, and Shapefile. Summary statistics, such as daily counts, allow efficient discovery of severe weather events. For more information, please refer to http://www.ncdc.noaa.gov/swdi.
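
As a small example of working with an SWDI export, the sketch below tallies daily event counts from a downloaded plain-text (CSV-style) file, the kind of summary statistic mentioned above. The file name and column name are hypothetical; an actual export from the NCDC site defines its own header.

    # Minimal sketch: count severe-weather records per day in a CSV-style
    # SWDI export. The file name and the ZTIME column are assumptions.
    import csv
    from collections import Counter

    def daily_counts(path):
        """Return a Counter mapping YYYYMMDD strings to record counts."""
        counts = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # ZTIME assumed to look like "20091209 153000" (UTC date, time).
                day = row["ZTIME"].split()[0]
                counts[day] += 1
        return counts

    if __name__ == "__main__":
        for day, n in sorted(daily_counts("swdi_hail_export.csv").items()):
            print(day, n)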

Project: Ocean Surface Current Simulator

What’s New: Upgrading the ability to visualize changes in ocean surface currents

The Ocean Surface Current Simulator (OSCURS) numerical model is a research tool that allows oceanographers and fisheries scientists to perform retrospective analyses of daily ocean surface currents anywhere in a 90-km, ocean-wide grid from Baja California to China and from 10°N to the Bering Strait. The model is used to measure the movement of surface currents over time, as well as the movement of what is in or on the water. Ocean surface currents affect organisms suspended in the water column, such as fish eggs, small larvae, and plankton, and may affect their survival by determining their location after a few months of drift. Even swimming or migrating fish or mammals may have their destinations significantly offset by currents or annual variability in currents. OSCURS has also gained visibility as a debris tracker, used to analyze accidental but fortuitous at-sea events beyond the scale of normal oceanographic science. Investigations of events such as spills of cargo containers loaded with plastic bathtub toys have been used to fine-tune the OSCURS model.

The model has been served for many years by a Live Access Server (LAS) at NOAA and has been used heavily; however, the old LAS requires an outdated browser (Netscape) and only allows the user to visualize and download one OSCURS run at a time. Data-serving technology has greatly improved, and NOAA is developing a new interface to serve the OSCURS model (http://las.pfeg.noaa.gov/oscurs) that uses Google Maps as the visualization tool and the latest AJAX technology to substantially improve the user experience. Users will be able to visualize many runs at a time and possibly view other relevant environmental data using the same interface. This project should be ready for the public by the end of the calendar year.

Project: San Francisco Exploratorium

What’s New: Near real-time ability to visualize weather and water conditions in San Francisco Bay

NOAA Fisheries is developing a new way to visualize regional data in San Francisco Bay (http://las.pfeg.noaa.gov/SFBay). Data are available from shore stations, buoys, high-frequency radar, and satellites, but they are scattered among many web pages and stored in many formats, making it difficult for regional and public interests in the San Francisco Bay area to visualize them and use them to assess real-time conditions. As a demonstration tool to support NOAA’s new partnership with the renowned science museum, the Exploratorium, and in collaboration with CeNCOOS and other regional data providers, NOAA is developing a web page that makes it easy to visualize near-real-time data in San Francisco Bay. The interface will use Google Maps and the latest AJAX technology to combine and compare data from diverse sources. Users will be able to visualize water temperature, salinity, and other station-based measurements along with overlays of satellite measurements of sea surface temperature and radar measurements of currents. Users will also be able to compare time series of measurements from various stations and sources. In addition, model data and animations will be added as they become available. Development of this project will be ongoing, with new data added as they become available, but a public version will be ready by the end of the year.

Project: U.S. Drought Portal—addition of soil moisture observation data

What’s New: Making soil moisture observation data public for the first time

Timely recognition of drought risks depends on our ability to monitor and forecast the diverse physical indicators of climatological drought, as well as relevant economic, social, and environmental impacts. A 2004 report from the Western Governors’ Association made clear that recent and ongoing droughts expose the critical need for a coordinated, integrated drought monitoring, forecasting, and early warning information system. To fill this need, Congress passed the National Integrated Drought Information System Act of 2006 (Public Law 109-430), establishing the National Integrated Drought Information System (NIDIS). The first component of NIDIS is the U.S. Drought Portal (www.drought.gov), part of an interactive system designed to:

  • Provide early warning about emerging and anticipated droughts
  • Assimilate and quality control data about droughts and models
  • Provide information about risk and impact of droughts to different agencies and stakeholders
  • Provide information about past droughts for comparison and to understand current conditions
  • Explain how to plan for and manage the impacts of droughts
  • Provide a forum for different stakeholders to discuss drought-related issues

The next major addition to the Drought Portal will be soil moisture observation data from the U.S. Climate Reference Network, which are not currently available to the public. The U.S. Drought Portal will add these soil moisture data operationally by December 31, 2009.

Project: Historical Climate Reanalysis Project

What’s New: Re-launching and expanding access to data sets describing past weather

The 20th Century Reanalysis project is using a 3-D, globally complete climate model, together with available weather observations, to produce output fields of weather variables four times daily, starting in 1871 and ending close to the present. Using what are often, especially in earlier years, sparse sets of observations, the project is able to "reconstruct" past weather and fill in missing values over the rest of the globe. These data will be available via a number of Web-based, interactive plotting pages as well as via file download. In addition to generating plots, users will be able to perform basic analyses of the data, download subsets, and obtain the data in Google Earth format, allowing the general public to visualize them easily in the Google Earth application. Currently, the data are available at NOAA’s Earth System Research Laboratory/Physical Sciences Division, but only in GRIB format, a format that is extremely hard to read and is not available for online plotting and analysis. The complete dataset is well over 4 terabytes; examining even parts of it can require enormous storage and computing resources. By enabling the public to work with the data and data products online, NOAA will allow users to examine past weather and climate events in detail in a way that was never before possible. Version 1 of the project is available today at www.esrl.noaa.gov/psd/data/20thC_Rean/; however, it spans only the years 1908-1958 and does not include the interactive plotting tools described above. Version 2 is currently under development. NOAA expects data for 1891 to the present to be available online in FY2010 Q3, with online plotting and analysis tools included.
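
To illustrate why plain GRIB files are inconvenient compared with the planned online tools, the sketch below reads one field from a downloaded file with the third-party pygrib library. The file name and variable name are hypothetical placeholders, not actual 20th Century Reanalysis products.

    # Minimal sketch of reading a reanalysis field from a GRIB file with pygrib
    # (pip install pygrib). File and variable names are assumptions.
    import pygrib

    GRIB_FILE = "20cr_prmsl_1891.grb"           # hypothetical download
    VARIABLE = "Pressure reduced to MSL"        # assumed GRIB parameter name

    grbs = pygrib.open(GRIB_FILE)
    for grb in grbs.select(name=VARIABLE)[:4]:  # first four sub-daily analyses
        lats, lons = grb.latlons()              # 2-D latitude/longitude grids
        data = grb.values                       # 2-D field on the model grid
        print(grb.validDate, data.shape, data.mean())
    grbs.close()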

National Technical Information Service (NTIS)
Project: Making 5 Years of Bibliographic Data Searchable

What’s New: Making 180,000 records describing federal reports available in XML

NTIS is making the latest five years of the NTIS Bibliographic File searchable via Data.gov. The file contains over 180,000 bibliographic records that link to a Web store of federally funded technical reports from a broad spectrum of federal agencies. NTIS is making the Bibliographic File available to Data.gov in a compiled XML format, which will for the first time open the NTIS technical reports collection to full Web exposure and extraction. NTIS will measure the effect of increased exposure via Data.gov by comparing future ordering information to existing baseline data. The increased exposure of this scientific and technical content will be a significant step forward in opening public access to a valuable collection heretofore available mainly through libraries and commercial vendors.
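
As a sketch of what that XML exposure could enable, the example below builds a simple keyword index over the compiled file using Python’s standard library. The file name and element names (record, accession_number, title) are hypothetical; the actual NTIS schema defines its own tags.

    # Minimal sketch: index NTIS bibliographic records by title keyword.
    # Element and file names are assumptions, not the published NTIS schema.
    import xml.etree.ElementTree as ET
    from collections import defaultdict

    def build_index(path):
        """Map lower-cased title words to the accession numbers containing them."""
        index = defaultdict(set)
        for record in ET.parse(path).getroot().iter("record"):
            accession = record.findtext("accession_number", default="")
            title = record.findtext("title", default="")
            for word in title.lower().split():
                index[word].add(accession)
        return index

    if __name__ == "__main__":
        index = build_index("ntis_bibliographic_2005_2009.xml")
        print(sorted(index.get("broadband", set())))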

United States Patent and Trademark Office (USPTO)
Project: Enhancement to Patent Maintenance Fee Events Data (Machine-Readable)

What’s New: Making fee data available in machine-readable form for the first time

In FY2010 Quarter 1, the USPTO plans to make available to the public a new machine-readable online product: Patent Maintenance Fee Events. Patent maintenance fee information has previously been available only via interactive patent application retrieval from the USPTO Public PAIR system. These data have been frequently requested by USPTO data customers and will be the first machine-readable, raw data from the USPTO Public PAIR system.

Project: Expansion of Patent Bibliographic Data

What’s New: Expanding the online availability of information on past patent grants and applications.

In FY2010 Quarter 1, the USPTO plans to make available more Patent Bibliographic Data – Grants (09/1996 – 12/2008) and Patent Bibliographic Data – Applications (03/15/2001 – 12/2008). These data will expand the current USPTO dataset offerings on Data.gov.

Project: Enhancement of Existing USPTO Data Capabilities Available to the Public

What’s New: Upgrading existing mechanisms for training users on intellectual property rights (IPR)

USPTO is developing an outsourcing model for public e-learning opportunities to educate and train the public globally on intellectual property, patents, and trademarks. USPTO is also identifying better search tools and re-architecting its application management systems to improve applicants’ electronic business experience with 24/7 capability. In addition to the USPTO data sets already available on Data.gov, USPTO is working with the public to identify mechanisms to quickly expand public access to more USPTO data.

National Telecommunications and Information Administration (NTIA)

NTIA is embarking on a series of data collection and dissemination initiatives to provide a more detailed, quantitative understanding of broadband Internet access and use in the United States. This information can inform efforts to increase broadband access and adoption, supporting economic growth. Initiatives will include data collected through NTIA’s broadband mapping program and a new broadband-related survey.

Project: Mapping

What’s New: National, interactive map showing broadband availability and speeds

NTIA’s State Broadband Data and Development Grant Program, funded by the American Recovery and Reinvestment Act (ARRA), provides grants for broadband data collection and planning. Data will be displayed in NTIA’s national broadband map, which will be made publicly available no later than February 2011. The map will display the geographic areas where broadband service is available; the technology used to provide the service; the speeds of the service; and broadband availability at public schools, libraries, hospitals, colleges, universities, and public buildings. The national map will be interactive, searchable by address, and show the broadband providers offering service in the corresponding census block or street segment. Data collection for the map will be conducted on a semi-annual basis between 2009 and 2011, with data to be presented in a clear, accessible, and open format to the public, government, and research community. This new initiative will provide broadband information at an unprecedented level of comprehensiveness and granularity.

Project: Broadband Survey

What’s New: Resuming use of the Census Bureau’s periodic Current Population Survey to study Internet usage

Working with the Census Bureau, NTIA launched a 75,000-household Internet-use survey as part of the October 2009 Current Population Survey. This effort will examine why people do not use high-speed Internet service and explore differences in Internet usage patterns around the country and across socio-economic groups. NTIA intends to release the data in open, web-based formats and to make the survey instruments and associated reports widely available to the extent possible.