Defense Advanced Research Projects Agency
Tagged Content List

Information Processing

Computational tools and techniques for manipulating and analyzing data

Showing 7 results for Processing + Networking
I2O explores game-changing technologies in the fields of information science and software to anticipate and create rapid shifts in the complex national security landscape. Conflict can occur in traditional domains such as land, sea, air, and space, and in emerging domains such as cyber and other types of irregular warfare. I2O’s research portfolio is focused on anticipating new modes of warfare in these emerging areas and developing the concepts and tools necessary to provide decisive advantage for the U.S. and its allies.
05/18/2015
Modern society depends on information, and information depends on information systems. Timely, insightful, reliable, and relevant information drives success. This is not lost on military leaders, who well appreciate the critical importance of information for national security. As Sir Francis Bacon wrote in 1597, “Knowledge is power.”
01/01/1962

DARPA’s Information Processing Techniques Office (IPTO) was born in 1962 and for nearly 50 years was responsible for DARPA’s information technology programs. IPTO did not itself perform research, but rather invested in breakthrough technologies and seminal research projects that led to significant developments in computer hardware and software.

June 30, 2016
DARPA Conference Center
DARPA’s Information Innovation Office (I2O) is hosting a Proposers Day to provide information to potential proposers on the objectives of the upcoming Dispersed Computing program. The program will seek to develop scalable, robust decision systems that enable secure, collective tasking of computing assets in a mission-aware fashion by users with competing demands and across large numbers of heterogeneous computing platforms. These systems must be able to operate in environments where network connectivity is highly variable and degraded.
In the current art, users with significant computing requirements have typically depended on access to large, highly shared data centers to which they backhaul their data (e.g., images, video, or network log files) for processing. However, in many operational scenarios, the cost and latency of this backhaul can be problematic, especially when network throughput is severely limited or when the user application requires a near real-time response. In such cases, users’ ability to leverage computing power that is available “locally” (in the sense of latency, available throughput, or similar measures that are relevant to the user or mission) could substantially improve application performance while reducing mission risk.
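As a rough illustration of the backhaul-versus-local tradeoff described above, the following Python sketch estimates end-to-end completion time (data transfer plus processing) for a payload on each candidate compute node and picks the fastest. The node names, throughput figures, and processing rates are invented for illustration only; they are assumptions, not values drawn from the Dispersed Computing program.

```python
# Sketch of the backhaul-vs-local tradeoff: given a payload and a set of
# candidate compute nodes, estimate completion time (transfer + processing)
# and pick the fastest node. All figures below are hypothetical.
from dataclasses import dataclass

@dataclass
class ComputeNode:
    name: str
    throughput_mbps: float    # usable network throughput to the node, megabits/s
    compute_mbytes_s: float   # processing rate once data arrives, megabytes/s

def completion_time_s(payload_mbytes: float, node: ComputeNode) -> float:
    """Seconds to move the payload to the node and process it there."""
    transfer_s = payload_mbytes * 8 / node.throughput_mbps  # MB -> megabits
    process_s = payload_mbytes / node.compute_mbytes_s
    return transfer_s + process_s

nodes = [
    # Distant data center: abundant compute, reached over a degraded link.
    ComputeNode("data_center", throughput_mbps=2.0, compute_mbytes_s=500.0),
    # Nearby node: modest compute, but "local" in the latency/throughput sense.
    ComputeNode("local_node", throughput_mbps=100.0, compute_mbytes_s=20.0),
]

payload_mbytes = 300.0  # e.g., a batch of imagery awaiting analysis
for n in nodes:
    print(f"{n.name}: {completion_time_s(payload_mbytes, n):.0f} s")
best = min(nodes, key=lambda n: completion_time_s(payload_mbytes, n))
print(f"best choice: {best.name}")
```

Under these assumed numbers, the nearby node finishes in well under a minute while backhauling to the data center takes roughly twenty minutes, which is the kind of gap the program description points to when network throughput is severely limited or a near real-time response is required.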