Computer & Information Science & Engineering (CISE)
Event
The Virtualized Reality System: 4D Digitization of a Time-Varying Real Event and Its Application

April 17, 2001 2:00 PM  to 
April 17, 2001 3:00 PM
NSF, Room 110, Arlington, VA

Lecturer: Takeo Kanade

Can we digitize a three-dimensional, time-varying scene from the world into a computer as a 3D event, just as real-time CT digitizes body volume? Since the mid-1990s, Carnegie Mellon University's Virtualized Reality (TM) project has been developing computer vision technologies for this purpose with the 3D Room - a fully digital room that can capture events occurring in it with many (currently 50) surrounding video cameras, including pan/tilt heads.

With this facility, we digitize the event occurring in the room and generate a complete three-dimensional, time-varying, volumetric/surface representation of it. We can then not only render images from any viewpoint or angle - even those at which there were no cameras (the concept of "Let's watch the NBA on the court") - but also conceive of a whole new notion of "event archiving and manipulation." I will discuss the theory, our facility, the computation, and results.
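The idea of rendering from viewpoints where no physical camera existed can be sketched with a standard pinhole-camera projection: once a 3D surface representation has been recovered from the real cameras, any hypothetical virtual camera can re-project those points to synthesize a new view. This is a minimal illustrative sketch, not the Virtualized Reality system's actual pipeline; the intrinsics and pose values below are assumptions chosen for the example.

```python
import numpy as np

def project(points_3d, K, R, t):
    """Project Nx3 world points into pixel coordinates of a virtual camera.

    K: 3x3 intrinsics, R: 3x3 rotation, t: 3-vector translation.
    All parameters are hypothetical, for illustration only.
    """
    cam = (R @ points_3d.T).T + t      # world -> camera coordinates
    uv = (K @ cam.T).T                 # apply camera intrinsics
    return uv[:, :2] / uv[:, 2:3]      # perspective divide

# A point one meter straight ahead of an identity-pose camera should
# land at the principal point (320, 240) of this assumed camera.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
pix = project(np.array([[0.0, 0.0, 1.0]]), K, np.eye(3), np.zeros(3))
```

In the full system, dense surface points recovered by multi-camera stereo would be projected this way for every frame of the time-varying event.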

We recently worked with CBS Sports to develop a multi-robot camera system that was used to broadcast Super Bowl XXXV on January 28, 2001. The system, which CBS calls "EyeVision", produced surrounding views of several interesting and controversial plays during the game. I will present our contribution to and experience with that real-time system.

This event is part of the CISE Distinguished Lecture Series.

Meeting Type
Lecture

Contacts
Michael J. Pazzani, mpazzani@nsf.gov

NSF Related Organizations
Directorate for Computer & Information Science & Engineering

Last Updated: July 27, 2005