February 2009
MADIS Seminar
Patty
Thursday, 12 February 2009, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
Presentation:
PDF/PPT
Webcast
December 2008
A prototype Earth-gauging system integrating weather and health data to manage meningitis
Raj Pandya, UCAR
Monday, 8 December 2008, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
The overarching goal of this seminar is to describe efforts to save lives and enhance livelihoods in Ghana through integration of health and environmental data, and by using that data in service of health-related decision-making. Specifically, we aim to build and implement a prototype decision-support system that integrates two- to 14-day weather forecasts and epidemiological data to provide actionable information that can be used to contain the spread of meningitis epidemics. By applying a preliminary economic evaluation of this decision support system, we will be able to assess the potential benefit of using environmental data to improve public health outcomes, help prioritize continuing investment in meningitis management in Ghana and throughout the Meningitis Belt, and determine the appropriateness of extending the prototype to other diseases, nations, and continents.
This effort is a small piece of an overall Google.org effort to develop an Earth-gauging System that will integrate environmental, health and development data into products that stakeholders and researchers can use to monitor variables, analyze trends and identify relationships among different variables. The Earth-gauging System will support the prediction of emerging threats, and provide the basis for a robust early-warning system that will improve health, food security, and development and conservation outcomes.
October 2008
Impact of the 2008 tropical cyclone season on the Baja California Peninsula, Mexico
Luis M Farfan, CICESE
La Paz, Baja California Sur, Mexico
Friday, 31 October 2008, 12:30 PM
Unidata Conference Room, FL4, 1318
Abstract
The current season has almost ended, with 16 named systems, of which three made landfall over the southern half of the peninsula. The landfall cyclones included Tropical Storm Julio (August 24), Tropical Storm Lowell (September 10), and Hurricane Norbert (October 10). Beginning this summer, CICESE has been receiving satellite imagery from Unidata, which has been used to document the development of the above cases. In addition, GEMPAK is used to make graphical displays on a real-time basis.
The presentation includes meteorological aspects of the structure and movement, track predictions, and the impact of the storms on the population in the area, with emphasis on Hurricane Norbert. This season was remarkable in that it was the first time since satellite-based observations began that three tropical cyclones made landfall over the southern peninsula.
Link to July 2008 Unidata CommunitE-letter Site Highlight article by Dr. Farfan.
25 September 2008
Special Announcement: A Joint NCAR-UOP Seminar presented by Stick Ware, 3:30 p.m. - NCAR/FL2, Rm 1022. The title of the seminar: Continuous Temperature, Humidity and Liquid Water Profiling - PDF Presentation
July 2008
(Rescheduled from 24 June 2008)
An update on the Scientific Data Type Layer of Unidata's Common Data Model
John Caron, Unidata Program Center
Tuesday, July 8, 2008, 1:30 pm
Unidata Conference Room, FL4, 1318
Abstract
This talk will give an update on the CDM's abstractions for Scientific Data Types (now called "feature types", to emphasize similarities to the Open Geospatial Consortium (OGC) feature and coverage abstract models). Special attention will be given to the one-dimensional point feature types.
May 2008
Securing the Legacy of the International Polar Year
Mark Parsons, National Snow and Ice Data Center
Tuesday, May 6, 2008, 1:30 pm.
Unidata Conference Room, FL4, 1318
Abstract
We are in the midst of one of the most exciting international and interdisciplinary science projects that many of us will encounter in our professional careers - the International Polar Year. Scientists in the natural, social, and health sciences are collaborating on some 228 endorsed projects in both the Arctic and Antarctic during a two-year period (March 2007-March 2009) of intense field observations. These science projects address crucial issues at a critical time in the evolution of the Earth system. A common thread in all projects is how we manage the data for collaboration now, during this IPY, and in the future as new science topics and issues emerge.
The International Polar Year Data and Information Service (IPYDIS) is a global partnership of data centers, archives, and networks working to ensure proper stewardship of IPY and related data. The National Snow and Ice Data Center acts as a coordination office for the IPYDIS to ensure the long-term preservation of broad, interdisciplinary, and non-expert access to IPY data. The IPYDIS tracks the data flow for IPY and helps researchers and data users identify data access mechanisms, archives, and services. The IPYDIS also provides information and assistance to data managers on compliance with standards, development of a union catalog of IPY metadata, and other data management requirements for IPY. It provides a general communication forum for all matters related to accessing, managing, and preserving IPY and related data. The IPYDIS is guided by the IPY Data Policy and Management Subcommittee, which develops the overall IPY data strategy and policies. The IPYDIS also supports and participates in the Electronic Geophysical Year (eGY), which promotes a modern e-Science approach to issues of data stewardship: open access to data, data preservation, data discovery, data rescue, capacity building, and outreach.
This presentation reviews current activities and future challenges for the IPYDIS in creating a sustained polar data system. Ultimately, we seek to create a data preservation and access "utility": a core infrastructure of science that is simple, predictable, reliable, extensible, accessible, and durable. But just as with existing utilities, such as water, electricity, and communications, the basic simplicity on the surface belies deep complexity, structure, planning, and professionalism. Creating that level of infrastructure requires great collaboration around standards, maintenance, and professional development and certification. We must bridge cultural barriers between scientific disciplines, between data managers and researchers, and between libraries and data centers.
Presentation
Webcast
April 2008
MicroWave Weather Cam
Randolph Ware, Chief Scientist, Radiometrics Corporation, and Visiting Scientist, NCAR/MMM
Thursday, April 24, 2008, 12:00 PM, UOP Brown Bag
Unidata Conference Room, FL4, 1318
Abstract
A hyperspectral microwave camera reveals otherwise invisible air temperature, humidity and liquid structure, during all weather conditions. Combined with Internet browser-based control and display software, this technology is a powerful new tool for research and education.
Presentation
Webcast
PDF - The presentation has a call-out icon on the upper left of each slide (the mouse pointer activates it)
URL: http://r1.rmetrics.com - usr: unidata pwd: radiometer
April 2008
DeSouza Community Award Presentation: Use of Unidata Software, Tools and Data
Mark Laufersweiler, University of Oklahoma
Thursday, April 10, 2008, 1:30 pm.
Unidata Conference Room, FL4, 1318
October 2007
Opportunities and Challenges for Meteorology in Africa with a focus on West Africa: A Perspective from current ongoing UCAR projects
Roelof Bruintjes, RAL
Monday, October 15, 2007, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
UCAR and the universities embarked on an Africa Initiative about a year ago. The primary purpose was to focus some of the work conducted at UCAR and NCAR in Africa in a coordinated manner and to involve the university community and others in the U.S., Europe and Africa in this effort. UCAR currently has several ongoing projects in Africa, most of them in West Africa. Several pilot projects were launched in the past year. The seminar will provide an overview of these projects and the plans to enhance them in the coming year. Several other U.S. institutions (US-AID, the U.S. State Department and the National Academies of Sciences) have also recently developed, or are in the process of developing, new initiatives in Africa.
The seminar will also provide a brief introduction to these initiatives and our potential interaction and participation in these initiatives.
Presentation
Webcast
PPT/PDF
September 2007
OGC Interoperability Day at Unidata: Standards-based Web Services Interfaces to Existing Atmospheric/Oceanographic Data Systems.
Wednesday, September 19, beginning at 8:00 AM.
Center Green
Webcasts:
(Morning sessions)
(Afternoon sessions)
Joint Unidata NCAR/EOL Seminar
Writing NetCDF Files: Best Practices
John Caron and Russ Rew, Unidata Program Center
Thursday, June 28, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
Data formats, data models, and data conventions play different roles in capturing the meaning in data and enhancing interoperability. The Unidata Common Data Model, the netCDF-4 format, and developments with the CF data conventions offer opportunities for better data representations, but compatibility with existing programs and services must be a consideration during a transition to use of the new capabilities. We list potential benefits in using what the new data model offers and describe advantages of using the new model with the "classic" format. In addition, we provide some early recommendations for data providers and software developers and speculate on changes to best practices.
We will also look at some examples of existing conventions for observational data to understand their strengths and limitations.
June 2007
Data and Services provided by EUMETSAT, the European Organisation for Exploitation of Meteorological Satellites
Volker Gaertner, EUMETSAT
Tuesday, June 5, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
The seminar talk will provide an overview of the tasks, duties and services of the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). As an operational agency, EUMETSAT provides data for weather forecasting, climate monitoring and environmental applications. EUMETSAT maintains and develops space-based satellite systems such as Meteosat and Metop.
May 2007
Joint NCAR/EOL-Unidata Seminar
Steve Williams, Chris Webster, Dennis Flanigan, NCAR EOL
Tuesday, May 29, 2007, 9:30-11:30
Unidata Conference Room, FL4 1318
March 2007
The Comprehensive Large Array Stewardship System (CLASS)--Living with a multi-petabyte resource
Eric A. Kihn, NOAA
Monday, March 19, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
The CLASS project derives in part from an effort by NOAA to centralize its numerous data systems for environmental data access. The goal of this effort was to eliminate the various "stove-pipe" systems and produce a unified "enterprise" data access system for the major NOAA data holdings.
The CLASS system is:
- The IT component for archive of certain NOAA data.
- A secure storage system for other NOAA data.
- A portal for some NOAA and non-NOAA data.
- A component of the infrastructure necessary for science data stewardship activities.
- Interoperable with other systems relating to the NOAA mission.
A new node of the CLASS system will open this year in Boulder at the National Geophysical Data Center and will eventually be populated by many petabytes of data, including GOES, POES, NPOESS, NEXRAD and many others. This talk will present the impacts and opportunities such an archive affords the environmental data user. It will cover API development, metadata, and high-volume data access, as well as a discussion of the future of CLASS within NOAA.
November 2006
AND Archives: Freeing Ourselves from the "Tyranny of the OR"
Ted Habermann, NOAA National Data Centers
Wednesday, November 29, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
Jim Collins described the Tyranny of the OR in his now-classic book Built to Last: the rational view that cannot easily accept paradox, that cannot live with two seemingly contradictory forces or ideas at the same time. This concept may have some relevance in the data management community. In many cases, data users have been divided into two groups: science users OR GIS users, and systems are built that serve one group OR the other. We might call these OR Archives. Collins suggests that highly visionary companies do not oppress themselves with the Tyranny of the OR. In the data management world, these might be termed AND Archives. They build systems that support science users AND GIS users. The Common Data Model and THREDDS Data Server work done recently at Unidata has demonstrated the possibility that AND can be integrated into the foundation of our data systems. I will explore this idea using some recent work at NGDC with Level 2 Sea Surface Temperature observations from NESDIS. It turns out that these data blur some traditional distinctions between GIS and satellite data. I will also discuss some recent work on rich inventories and diffusion of standards in mature organizations.
August 2006
Initial Results and Operational Plans for the six-satellite Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC)
Chris Rocken and Doug Hunt, UOP, COSMIC Program
Tuesday, August 29, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
Formosat-3 / COSMIC (F3/C) is a joint Taiwan-US radio occultation (RO) satellite mission for weather forecasting, climate monitoring, space weather monitoring, and geodetic research. NSF, NOAA, NASA, USAF and ONR jointly support the US component of the mission. The F3/C mission was successfully launched into a circular ~500 km low-Earth orbit from Vandenberg Air Force Base, California, on 15 April 2006. Six identical micro-satellites, each carrying an advanced GPS radio occultation receiver, a tiny ionospheric photometer (TIP) and a Coherent Electromagnetic Radio Tomography (CERTO) beacon for ionospheric tomography/scintillation, were deployed successfully. All satellites and payloads appear to be working well. Operational data processing and distribution of the data to users started in July 2006. Presently (mid-August 2006) the mission is generating about 1000 profiles in the neutral atmosphere and up to 2500 profiles in the ionosphere. Ultimately the mission will generate over 2500 high-resolution daily profiles of atmospheric bending, refractivity, pressure, temperature and humidity with high vertical resolution and high long-term stability in all weather. The mission will also provide a large ionospheric data set including line-of-sight satellite-to-satellite and satellite-to-ground TEC, ionospheric profiles, scintillation (S4 parameter) observations, and TIP radiometric observations. This presentation will provide a description of the F3/C mission and data products, with emphasis on data distribution and a demonstration of tools for data analysis.
July 2006
Interoperability between Earth Sciences and GIS models: an holistic approach
Stefano Nativi, Italian National Research Council and the University of Florence
Thursday, July 27, 2006, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
As observational and model output datasets in the Earth Sciences (e.g. oceanography and atmospheric science) increase in resolution, there is an increasing demand for information systems that interoperate between the Land Management (mainly using GIS) and Earth Sciences realms. However, differences in the way the two communities think about their data and services can cause difficulties. In the present Web era, these different conceptual approaches produce diverse content models, generating disciplinary Markup Languages, and diverse service protocols and interfaces. As the technology of web services accessible by computer programs evolves, the challenge for those studying the Earth from an interdisciplinary perspective is to develop interoperable data models that can span the specific models employed in individual disciplines. Moreover, these interoperable models have to be integrated with the semi-structured framework of the Web itself.
International initiatives (e.g. ISO TC 211 and the Open Geospatial Consortium OpenGIS) have released geo-information standard models conceived to support general interoperability. These efforts led to the definition of "more general" models for Geospatial Information. This approach is gaining value in several international initiatives, such as: GEOSS, National Spatial Data Infrastructure initiatives (e.g. ESDI and FGDC/NSDI), NSDL, GMES (Global Monitoring of Environment and Security) initiative, etc. There exists the need to shift from a "traditional" cartographic to a more general informatics viewpoint.
An holistic approach to mediating and harmonizing the different data and metadata models of the two communities (i.e. Land Management and Earth Sciences) is discussed. An implemented solution is presented; based on it, a valuable framework is introduced and some experiments are reported.
Several important and open issues will be introduced, such as: discovery services harmonization, scientific ML harmonization/mediation, spatial data infrastructure implementing rules, security and trustability services.
May 2006
A Virtual Operations Center (VOC) for Field Experiments in the Atmospheric Sciences
Mike Daniels, Earth Observing Laboratory
Tuesday, May 30, 2006, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
Field experiments in the atmospheric sciences have grown in complexity, in part due to enhanced capabilities of our observing platforms and models as well as more readily available access to a whole array of new operational networks (e.g. satellites, mesonets, WSR-88D radars, etc.). During a field campaign, these data are sent to an on-site Operations Center and used in the real-time direction of observing platforms and in subsequent analysis to plan future missions. Advances in communications and information technology have given us an opportunity to significantly improve our ability to meet the science-driven requirements of these centers, now and in the future. The Virtual Operations Center (VOC) proposal, recently submitted to the NSF, seeks to formally address the challenges of building these centers in a more cohesive way using technologies that are being developed across UCAR. The VOC will address new capabilities in the areas of visualization, forecast model assimilation, real-time data management and network-based collaborative tools, which will lead to more efficient use of our observing platforms and improved field project datasets. It will also expand the field participation of interested researchers, staff, students and others at remote sites.
February 2006
The Earth System Modeling Framework and Earth System Curator: Software Components as Building Blocks of Models and Community
Cecelia DeLuca, UCAR/Scientific Computing Division
Monday, February 27, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
The Earth System Modeling Framework (ESMF) is an established multi-agency effort to develop infrastructure for building and coupling Earth system models. The newly-funded Earth System Curator is a related database and toolkit that will store information about model configurations, prepare models for execution, and run them locally or in a distributed fashion. The key concept that underlies both projects is that of software components. These components may be representations of physics domains, such as atmospheres or oceans; processes within particular domains such as atmospheric radiation; or physics or computational functions, such as data assimilation or coupling. ESMF provides interfaces, an architecture, and tools for structuring components hierarchically to form complex, coupled modeling applications. The Earth System Curator will enable modelers to archive and manipulate components. Together these projects encourage a new paradigm for modeling: one in which the community can draw from a federation of many interoperable components in order to assemble and run applications. Groups that are using ESMF include WRF, CCSM, NOAA NCEP and GFDL, NASA climate programs, the Army, Air Force, and Navy, and universities including MIT and UCLA.
January 2006
GRID-BGC: A Grid-Enabled Terrestrial Carbon Cycle Modeling System
Matthew Woitaszek and Jason Cope (University of Colorado)
Monday, January 23, 1:30 PM
Unidata Conference Room, FL4, 1318
Abstract
Grid-BGC is a Grid-enabled terrestrial biogeochemical cycle simulator collaboratively developed by the National Center for Atmospheric Research (NCAR) and the University of Colorado (CU) with funding from NASA. The primary objective of the project is to utilize Globus Grid technology to integrate inexpensive commodity cluster computational resources at CU with the mass storage system at NCAR, while hiding the logistics of data transfer and job submission from the scientists. We describe a typical process for simulating the terrestrial carbon cycle, present our solution architecture and software design, and describe our implementation experiences with Grid technology on our systems. By design, the Grid-BGC software framework is extensible in that it can utilize other Grid-accessible computational resources and be readily applied to other climate simulation problems with similar workflows. Overall, this project demonstrates an end-to-end system that leverages Grid technologies to harness distributed resources across organizational boundaries to achieve a cost-effective solution to a compute-intensive problem.
December 2005
The LEAD Effort at Unidata
Tom Baltzer, Brian Kelly, Doug Lindholm, Anne Wilson (Unidata)
Wednesday, December 14, 1:30 PM
Unidata Conference Room, FL4 1318
Abstract
The LEAD project (http://lead.ou.edu/) aims to build a cyberinfrastructure for mesoscale research and education. Unidata is one of the ten institutions involved in this effort. In addition to providing experience with atmospheric data and deployment of robust, reliable software, Unidata currently contributes to this effort in three main areas: implementation of steered forecasts, the building of a large testbed for computing and storage, and the THREDDS Data Repository, a THREDDS-compatible data storage framework that provides high-level data storage abstractions and also supports data discovery through browsing and querying of metadata. This talk will describe these efforts, including their benefits to both LEAD and the Unidata community.
November 2005
The Virtual Observatory
Peter Fox, High Altitude Observatory
Monday, November 28, 1:30 PM
Unidata Conference Room, FL4 1318
Abstract:
Since being named around 2000, Virtual Observatories (VOs) have started to change the way some fields of science think about their data collection, storage and distribution. Unfortunately, the concept of VOs far preceded any instantiations. As a result, there has been a great deal of uncertainty about what a VO is and is not, and how VOs do or do not differ from generic distributed data systems. Are they systems or frameworks? Do they contain data, metadata, etc.?
This talk will present the current concepts and examples of instances of VOs in a few different disciplines, and discuss the nature and diversity of these VOs. We then turn to the emerging need for interdisciplinary VOs and how that presents a major challenge to present VOs and their current technologies, and especially interoperability.
This discussion will lead into use-case driven methodologies, the importance of semantics, data integration, ontologies, and rapid prototyping. We will use some examples from several discipline specific VOs, and two new projects: the Virtual Solar-Terrestrial Observatory (VSTO) and Semantically Enabled Scientific Data Integration (SESDI).
September 2005
The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Ground Segments
Dr. Scott Turek, Raytheon Intelligence and Information Systems, Aurora, CO 80011
Tuesday, September 27, 1:30 PM
Unidata Conference Room, FL4 1318
PPT / PDF Presentation
Overview movie
Webcast
Abstract
The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system: the National Polar-orbiting Operational Environmental Satellite System (NPOESS). Northrop Grumman is the prime contractor for NPOESS; Raytheon is a primary teammate with overall responsibility for the design and development of the NPOESS Ground Segments. NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. Demanding NPOESS mission data latency and availability requirements drive the selection of a ground architecture that implements a globally distributed network of ground stations, connected via terrestrial fiber. This architecture is part of the Raytheon-developed Command, Control and Communications Segment (C3S). The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), also developed by Raytheon. The IDPS processes NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. The IDPS will process environmental data products beginning with the NPOESS Preparatory Project (NPP) and continuing through the lifetime of the NPOESS system. The IDPS must process a data volume significantly greater than that of the current POES and DMSP systems, within significantly reduced processing times. This paper will describe the architecture approach to hardware and software that is necessary to meet these challenging NPOESS C3S and IDPS design requirements.
August 2005
National Lambda Rail (NLR) and Potential Merger with Internet2/Abilene
Marla Meehl, Scientific Computing Division
Wednesday, August 24, 1:30 PM
Unidata Conference Room, FL4 1318
PDF Presentation
Webcast
720x480 at 225kbps, Requires RealPlayer 8 or better
Abstract:
This seminar will provide an overview of NLR and its capabilities and the discussions in progress for a potential merger with I2/Abilene.
June 2005
LDM 6.4: Combining Flexibility with Power
Steve Emmerson, Unidata Program Center
Monday, June 27, 1:30 PM
Unidata Conference Room, FL4 1318
Abstract:
Version 6.4 of the Local Data Manager (LDM) has several new features that make it the most flexible and powerful LDM to date. New features include:
- AUTOSHIFTING: A downstream LDM will now automatically switch the data-product request-mode between PRIMARY and ALTERNATE in order to maximize throughput while minimizing bandwidth utilization.
- UPSTREAM FILTERING: A pattern can now be specified for an upstream LDM to use in filtering the data-products that a downstream LDM will receive. This enables separation of data feeds into multiple logical ones, depending on who is requesting the data.
- PORT INDEPENDENCE: The well-known LDM port, 388, can now be overridden at both build-time and run-time, making it unnecessary to install the LDM as root and allowing the easy creation of private LDM networks as well as LDM "bridges".
The seminar will examine these features in detail. It will also present results from a stress-test of a high-availability LDM cluster whose rate of outgoing data was not limited by the LDM and whose peak rate was 8.4 terabytes per day.
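For a sense of scale, the quoted peak rate converts to a sustained network bandwidth. This back-of-the-envelope calculation is not from the seminar; it assumes decimal terabytes (1 TB = 10**12 bytes) and a perfectly even flow over the day:

```python
# Convert the cluster's quoted peak rate (8.4 terabytes per day)
# into an average sustained bandwidth in gigabits per second.
TB_PER_DAY = 8.4
bytes_per_day = TB_PER_DAY * 10**12
bits_per_second = bytes_per_day * 8 / 86_400   # 86,400 seconds per day

print(f"{bits_per_second / 10**9:.2f} Gbit/s")  # about 0.78 Gbit/s sustained
```

In other words, the cluster was moving data at nearly a gigabit per second around the clock, which is why the test is described as not being limited by the LDM itself.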
May 2005
National Mosaic and Quantitative Precipitation Estimation Project (NMQ)
Ken Howard, NOAA/NSSL
Tuesday, May 17, 1:30 PM
Unidata Conference Room, FL4 1318
April 2005
Plans for AWIPS Next Generation
Darien Davis, NOAA/FSL
Tuesday, April 12, 1:30 PM
Unidata Conference Room, FL4 1318
- Archived Webcast (Note: this video is 720x480@30fps, requires RealPlayer 9 and runs at a bandwidth of 250kbps.)
- PowerPoint Presentation/PDF
Abstract:
The current software and hardware supporting the National Weather Service (NWS) forecast functions are almost ten years old. The current warning system IT was developed as a prototype at the Forecast Systems Laboratory (FSL) and has not had an infrastructure upgrade since. Although the quantity of data has grown enormously during that period, the system has adapted to the needs of the forecaster. However, the time has come to look at new technology to handle the diverse datasets available in the near and distant future.
Activities for technology infusion are underway. In addition to NOAA-wide and NWS-wide activities to address infrastructure changes, FSL is involved with minor prototyping activities to give guidance toward new data retrieval methods. Adding capabilities to simplify the retrieval and display of data is critical for future NWS forecaster needs.
In addition, a visualization programmer interface is instrumental for a collaborative development environment. To leverage the software development talents available with personnel at the NWS field offices and research from other NOAA laboratories, an interface is necessary.
This talk will address the activities at NOAA, NWS, and FSL that are underway investigating new technologies for the forecast office and NOAA in general.
March 2005
The NMQ/Jade Project
Steven Vasiloff, NOAA/NSSL
Monday, March 28, 1:30 PM
Unidata Conference Room, FL4 1318
- Archived webcast (Note: this video is 720x480@30fps, requires RealPlayer 9 and runs at a bandwidth of 250kbps.)
- PowerPoint Presentation / PDF
Abstract:
While fresh water resource accounting is critical to the social and economic welfare of the United States, there currently exists no seamless and systematic high-resolution multisensor-based monitoring of precipitation for water resource management. Further, an open, end-to-end infrastructure for research, development, and comparison assessment of new multiple sensor quantitative precipitation estimation (QPE) and short-term quantitative precipitation forecast (QPF) applications necessary for hydrometeorology does not exist at this time. The National Mosaic and QPE (NMQ) project is a joint initiative between the NOAA/National Severe Storms Laboratory and the NOAA/National Weather Service/Office of Hydrologic Development to address the pressing needs for high-resolution multisensor QPE for all seasons, regions, and terrains in support of comprehensive hydrometeorological and hydrologic data assimilation and distributed hydrologic modeling.
The objectives of the NMQ project are as follows:
- Create a framework for community-wide research and development of hydrometeorological applications for monitoring and prediction of freshwater resources in the United States across a wide range of time and space scales;
- Through a national hydrometeorological testbed, facilitate community-wide collaborative research and development and research-to-operations of new applications, techniques and approaches to QPE and short-range precipitation forecasting; and
- Create a scientifically sound real-time system to develop and test methodologies and techniques for physically realistic high-resolution rendering of hydrometeorological and meteorological processes.
This seminar will describe the NMQ computing and algorithm infrastructure that will form the basis for QPE. In addition, the initial formulation of JADE will be described.
February 2005
The Developmental Testbed Center
Bob Gall, NCAR, and Steve Koch, NOAA/FSL
Monday, February 7, 1:30
FL4 1318
Abstract: The Developmental Testbed Center (DTC) is a facility where the NWP (Numerical Weather Prediction) research and operational communities interact to accelerate testing and evaluation of new models and techniques for research applications and operational implementation, without interfering with current operations. Currently, the transfer of new NWP science and technology from research into operations is inefficient. This is primarily due to all research being conducted at operational centers and/or their associated research organizations, which does not take advantage of the considerable talent elsewhere in the research community. There are few opportunities in the NWP research community to collaborate in an operations-like environment, and there is nowhere that these communities can join to perform extensive, rigorous model testing without disrupting operations. We will discuss the mission and activities at the DTC, with an emphasis on the current DTC Winter Forecast Experiment (DWFE). The research and operational communities have truly come together to carry out the DWFE, which involves running a 5-km version of the Weather Research and Forecast (WRF) model over the CONUS in real time. We will discuss the purpose of this experiment, the role that FX-Net and AWIPS (as well as web sites) are playing in the dissemination of model forecast fields and products, the model verification and archival activities, and examples demonstrating the kind of detailed mesoscale phenomena that are being forecast and seen by the National Weather Service.
December 2004
NetCDF-Java version 2.2 and the Common Data Model
John Caron, Unidata
Friday, December 10, 1:30
FL2 1001
Abstract: The latest version of the netCDF-Java library implements a "Common Data Model" (CDM), a merger of the netCDF, OPeNDAP, and HDF5 data models. It is a prototype for the netCDF-4 project, which provides a C-language API for the data access layer of the CDM on top of the HDF5 file format. It is also a 100% Java framework for translating other file formats into netCDF, where the actual writing of the netCDF file is optional.
This technical talk will explain the Common Data Model and give an overview of the netCDF-Java version 2.2 architecture, including the use of NcML and THREDDS to annotate and modify datasets.
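As a sketch of the kind of annotation NcML makes possible (the file location, variable names, and attribute values below are hypothetical, not from the talk), a small NcML document can add metadata to, or rename variables in, an existing file without rewriting it:

```xml
<!-- Hypothetical NcML wrapper around an existing netCDF file -->
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"
        location="example.nc">
  <!-- add a units attribute to an existing variable -->
  <variable name="temperature">
    <attribute name="units" value="K"/>
  </variable>
  <!-- expose the variable "pr" under a friendlier name -->
  <variable name="precip" orgName="pr"/>
</netcdf>
```

Because the modifications live in the NcML wrapper rather than in the file itself, the underlying data never need to be rewritten.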
November 2004
Unidata.binaries.data: Have Some Data With Your Coffee?
Anne Wilson, Unidata
Thursday, November 4, 1:30pm
Unidata Conference Room
Abstract: Have you ever read NetNews? Have you heard of newsgroups like sci.geo.meteorology, comp.lang.java or rec.games.backgammon? If so, then you're already somewhat familiar with this technology that we are testing to relay atmospheric data in near real time.
NetNews, also known as Usenet, has been a forum for computer-based information exchange since 1979. In spite of newer web-based technologies, it remains a highly popular medium. Although it is impossible to accurately measure current usage, it is estimated that 600GB per day are exchanged among one to ten million servers worldwide, serving roughly 100,000 newsgroups to around 25 million users.
Here at Unidata the NLDM project has been testing data relay using INN, a popular, freely available, open-source news server package. In addition to efficient real-time delivery and local data management, INN offers features beyond our current LDM technology. These features will be presented in this talk, along with a scenario or two illustrating their possible use.
October 2004
GEON Cyberinfrastructure Developments in the Earth Sciences
Chuck Meertens, UNAVCO
Friday, October 1, 1:30pm
Unidata Conference Room
Abstract: The Earth and Computer Science communities are engaged in joint efforts to create new information technologies to address complex geological and geophysical problems, such as those that will be encountered in the new EarthScope project.
One such collaborative effort in Cyberinfrastructure, called GEON, is an NSF Large Scale ITR that involves the development of a distributed, services-based system that enables geoscientists to publish, share, integrate, analyze, and visualize their data.
The GEON Grid technology-based system enables the development and sharing of databases, ontologies, tools, workflows, applications, and models. Here in Boulder, scientists and students at UNAVCO, DLESE, and the University of Colorado are participating in GEON efforts to study the Rocky Mountains and working on IT components. UNAVCO is modifying a version of Unidata's powerful IDV software to tailor it to Earth-science-specific visualization needs. An example will be shown in which three-dimensional mantle seismic tomography and geodynamic models were converted to netCDF, placed on UNAVCO's OPeNDAP server, and viewed with the IDV.
September 2004
Exploring the Community Data Portal
Luca Cinquini, SCD
Monday, September 13, 1:30pm
FL2, Room 1001
Abstract: The NCAR Community Data Portal (CDP) is part of the NCAR Cyberinfrastructure Strategic Initiative and is aimed at developing a central institutional gateway to the large and diversified data holdings of UCAR, NCAR, and UOP. The ultimate goal is to provide a state-of-the-art data portal with a broad spectrum of functionality, ranging from data search and discovery to catalog and metadata browsing, and from high-performance, reliable data download to analysis and visualization. This talk will describe the various technologies and standards that are integrated within the portal architecture, and will demonstrate the currently available functionality.
August 2004
GIS Initiative: Developing an Atmospheric Data Model for GIS
Olga Wilhelmi and Jennifer Boehnert
August 30, 2004 at 1:30pm
UCAR Foothills Lab 2, room 1001
Abstract: Traditionally, geographic information science and technology have evolved from the requirements of a GIS community that until recently did not include atmospheric scientists. In recent years, progress has been made in extending the capabilities of general-purpose GIS to the needs of the hydrologic and oceanographic communities through the development of domain-specific data models. Atmospheric science is one of the fields for which state-of-the-art, general-purpose GIS cannot fully accommodate the data and process models of the phenomena under study. Development of a community-based atmospheric data model for ArcGIS has recently begun, with the goal of creating a means for effectively organizing atmospheric data and extending ESRI GIS capabilities for representing, querying, and analyzing time-dependent, raster, and volumetric data in a geographic context. This presentation will give an overview of the GIS Initiative's ongoing work and discuss the Initiative's recent involvement in developing the atmospheric community data model. We will present the first steps of the conceptual design of the data model and discuss the complexities of representing multidimensional, dynamic atmospheric phenomena.
May 2004
A technical overview of the Abstract Data Distribution Environment (ADDE)
Don Murray
Abstract: ADDE is a client/server protocol developed as part of the McIDAS system. The Unidata community maintains a network of cooperating ADDE servers supplying real-time and archived satellite, radar, grid, and point data. McIDAS and IDV users can connect to these and other servers to display and analyze data. A Java ADDE client system was developed as part of the VisAD package, allowing other non-McIDAS Java-based programs to access these data. Don's talk will cover the protocol itself, the different types of data available through ADDE, the Java interface, and future directions. He will also provide a brief comparison with the OPeNDAP (DODS) client/server protocol.
April 2004
MeteoForum: An international network of meteorological training centers for the 21st century
Tom Yoksas and Tim Spangler
Abstract: MeteoForum is a pilot project being developed jointly by the COMET and Unidata programs of UCAR. The concept involves the creation and enhancement of an international network in which WMO Regional Meteorological Training Centers (RMTCs) work collaboratively with universities to enhance their roles in regional training and education through information technologies and multilingual collections of online resources. MeteoForum seeks to build a stronger sense of community among the RMTCs through internet-based interactions and through the sharing of educational concepts, educational materials, and real-time hydrometeorological data with one another and with affiliated universities.
January 2004
OPeNDAP and THREDDS: Access and discovery of distributed scientific data
Yuan Ho and Ethan Davis
Abstract: OPeNDAP (Open source Project for a Network Data Access Protocol) provides a protocol and data model through which users can access scientific datasets, in various data formats, from distributed locations. In the main HTTP-based implementation, the user requests data from OPeNDAP servers via URLs, and the data are returned to the client over HTTP. The client application must then decode the returned data before they can be used. So that existing applications can quickly work with OPeNDAP, various existing data-access APIs have been retrofitted to encapsulate the client-side OPeNDAP functionality.
The OPeNDAP protocol does not include a dataset cataloging or discovery mechanism. To this end, the THREDDS (Thematic Real-time Environmental Distributed Data Services) project is developing a framework for cataloging and describing datasets. On this framework, various discovery services are being built.
This presentation will review the fundamental concepts behind the OPeNDAP and THREDDS frameworks. From there, we will discuss how they fit together and future directions.
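As a minimal sketch of the URL-based access pattern described above (the server and dataset path here are hypothetical), a DAP client derives its requests from a single dataset URL by appending a suffix for each response type, optionally with a constraint expression that subsets variables by index:

```python
def opendap_urls(base_url, constraint=None):
    """Build the standard OPeNDAP request URLs for a dataset.

    base_url   -- the dataset URL as published by the server
    constraint -- optional constraint expression, e.g. "u[0:10][0:5]",
                  selecting variables and index ranges on the server side
    """
    suffix = "?" + constraint if constraint else ""
    return {
        "structure":  base_url + ".dds",            # dataset descriptor (types, shapes)
        "attributes": base_url + ".das",            # attribute metadata
        "data":       base_url + ".dods" + suffix,  # encoded data response
    }

# Hypothetical dataset on a THREDDS/OPeNDAP server, subset by index:
urls = opendap_urls("http://server.example.edu/thredds/dodsC/model/run1",
                    constraint="temperature[0:23][0:179][0:359]")
```

Because subsetting happens in the constraint expression, only the requested slab crosses the network, which is what makes remote access to large datasets practical.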
November 2003
Linked Environments for Atmospheric Discovery (LEAD)
Mohan Ramamurthy
Abstract: Linked Environments for Atmospheric Discovery (LEAD) is a large, multi-institutional Information Technology Research effort funded by the National Science Foundation. In addition to the Unidata Program Center, the following institutions are involved in LEAD: University of Oklahoma, University of Illinois at Urbana-Champaign, Indiana University, University of Alabama in Huntsville, Howard University, Millersville University, and Colorado State University.
The goal of LEAD is to create an integrated and scalable framework for identifying, accessing, preparing, assimilating, predicting, managing, analyzing, mining, and visualizing a broad array of meteorological data and model output, independent of format and physical location. To that end, LEAD will create a series of interconnected, heterogeneous Grid environments to provide a complete framework for mesoscale research, including a set of integrated Grid and Web Services.
September 2003
Atmospheric science and GIS interoperability issues: some data model and computational interface aspects
Stefano Nativi, University of Florence (visiting scientist)
Abstract: This talk discusses issues of interoperability between atmospheric science and GIS services and data, considering both data model and computational interface aspects. A WCS interface implementation for THREDDS and NcML-G (the GIS extension of NcML) are presented.
August 2003
An overview of RAP's 4DWX (4-D Weather) program, with emphasis on globally-relocatable coupled-model applications, GIS, PDAs, and real-time products, in support of Homeland Security
Scott Swerdlin
Abstract: RAP's 4DWX program, sponsored by a variety of DOD organizations, is in its 8th year of existence. The emphasis has been on providing operational, high-resolution mesoscale forecasts and storm nowcasts in support of seven Army test ranges and the Defense Threat Reduction Agency, the latter mostly for anti-terrorism applications. Recently, more emphasis has been placed on disseminating platform-independent visual information, including to lightweight devices such as PDAs. Following a program overview, several capabilities will be demonstrated.
July 2003
Differences between LDM5 and LDM6
Steve Emmerson
Abstract:
June 2003
Recent Works on THREDDS and NetCDF
John Caron
Abstract:
May 2003
Wavelet Transform-Based Data Compression and Its Application to Multi-Dimensional Gridded Data Sets
Ning Wang and Renate Brummer, FSL
Abstract: Wavelet transform-based data compression is one of the most effective data compression techniques to emerge and evolve in the 1990s. The technique takes advantage of wavelet basis functions that are localized in both time and frequency, and combines them with a well-designed quantization scheme to achieve superior compression performance.
In this presentation, we will first provide an introduction to the wavelet data compression technique and the three major steps of the compression procedure: transform, quantization, and entropy encoding. We will discuss the selection of a wavelet mother function, the pros and cons of different quantization techniques, and their implications for practical implementations. Our compression technique will be compared with widely used ‘standard’ image compression methods, including JPEG. We will present experimental results based on visible satellite images, utilizing our modified zero-tree-like quantization.
We will also discuss the possibilities of applying our technique to multi-dimensional model grids. A maximum allowable error is imposed for each model parameter at all grid points, tunable to user-specified accuracy requirements. We will discuss the problems and difficulties that arise from this error-control requirement. Several different compression schemes will be described in our presentation, followed by experimental results based on the Eta-12 numerical weather prediction model. We will conclude with directions for future development of this technique.
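As a toy illustration of the transform-and-quantize pipeline sketched above (using a single-level Haar transform and uniform scalar quantization rather than the authors' wavelet and zero-tree scheme, and omitting the entropy-encoding stage), note how the quantization step bounds the per-point reconstruction error, echoing the maximum-allowable-error control described in the abstract:

```python
def haar_step(signal):
    """One level of the Haar transform: pairwise averages and details."""
    avgs = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    dets = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avgs, dets

def inverse_haar_step(avgs, dets):
    """Exact inverse: each pair is (average + detail, average - detail)."""
    out = []
    for a, d in zip(avgs, dets):
        out.extend([a + d, a - d])
    return out

def quantize(coeffs, step):
    """Uniform scalar quantization: each coefficient to the nearest multiple of step."""
    return [round(c / step) for c in coeffs]

def dequantize(indices, step):
    return [i * step for i in indices]

# Round trip on a small 1-D "grid". With quantization step q, each
# reconstructed sample combines two coefficients, each off by at most
# q/2, so the pointwise error is bounded by q.
data = [10.0, 12.3, 11.1, 9.4, 20.2, 22.8, 21.5, 19.9]
q = 0.5
avgs, dets = haar_step(data)
recon = inverse_haar_step(dequantize(quantize(avgs, q), q),
                          dequantize(quantize(dets, q), q))
max_err = max(abs(x - y) for x, y in zip(data, recon))
```

Shrinking `q` tightens the error bound at the cost of larger quantized coefficients (and hence less compression after entropy coding), which is the trade-off a user-specified accuracy requirement tunes.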
April 2003
A brief tour through COMET with a look to the future
Tim Spangler
Abstract: Tim will provide a tour through some of what COMET has produced lately, offer a glimpse of the COMET-Unidata MeteoForum project, and talk about some new initiatives that could involve both programs.
March 2003
IDV, THREDDS, VGEE
Don Murray, Ben Domenico, Rajul Pandya
Abstract: Have you heard these acronyms around UCAR? Join us at a Unidata-sponsored seminar to learn more about them. Discussion and demonstrations of each will be provided. Don Murray will present the Java-based Integrated Data Viewer (IDV), an analysis and visualization tool now offered by Unidata. Ben Domenico will cover the Thematic Real-time Environmental Distributed Data Services (THREDDS), which provides coherent access to large collections of real-time and archived datasets from a variety of environmental data sources at a number of distributed server sites. Raj Pandya will provide details and a demonstration of the Visual Geophysical Exploration Environment (VGEE), an inquiry-based geoscience learning environment.