Previous Fora / 2003

Speakers

Professor Peter A. Freeman

Assistant Director
National Science Foundation
USA


Dr. Peter A. Freeman received his Ph.D. in computer science from Carnegie Mellon University (1970), his M.A. in mathematics and psychology from the University of Texas at Austin (1965), and his B.S. in physics from Rice University (1963).

For almost twenty years Dr. Freeman served on the faculty of the Department of Information and Computer Science at the University of California, Irvine. In 1990 he moved to the Georgia Institute of Technology, where he was a professor and the founding dean of the College of Computing. Dr. Freeman has served on a variety of national and regional committees, including as Chair of the Government Affairs Committee. He is also a Fellow of the Institute of Electrical and Electronics Engineers, the American Association for the Advancement of Science, and the Association for Computing Machinery. Currently, Dr. Freeman is assistant director of the National Science Foundation for Computer & Information Science & Engineering (CISE).

Dr. Freeman's research has focused on creating reusable software for use in formal transformation systems. His earliest work (1961-63) involved developing advanced scientific applications in the days before operating systems and other support software existed. This led him to design one of the first interactive time-sharing operating systems (1964) and later to lead one of the first efforts to apply artificial intelligence to the software design process (1965-75). These projects culminated in the publication of his first book, Software System Principles.

Dr. Freeman has found that the reuse of software parts enables a user to design a completely new system with less effort. For example, when reusable parts are kept in libraries, the libraries contain information not only about the parts themselves but also about the interconnections between them. Key terms in software reuse are abstraction (identifying and classifying parts), selection (finding the parts), specialization (customizing a part for a specific instance), and integration (combining a part with other parts in a specific instance). Parts can either be plugged in unchanged (black-box library systems) or be modified for individual purposes (white-box library systems). In the second case, the user must solve not only the problems of classification and searchability but also those of structural specification and flexibility specification. One solution Dr. Freeman proposes for the challenges of the white-box approach is to use languages instead of libraries: a language tells the programmer not only what a software part does, but also how it works.
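The black-box versus white-box distinction can be sketched in code. In this hypothetical example (the names are illustrative, not drawn from Dr. Freeman's work), a black-box part is reused unchanged through its public interface, while a white-box part is opened up and specialized for a new purpose:

```python
# Illustrative sketch of black-box vs. white-box reuse.

class Sorter:
    """A reusable 'part': sorts records by a key."""
    def key(self, record):
        return record  # default: sort by the record itself

    def sort(self, records):
        return sorted(records, key=self.key)

# Black-box reuse: plug the part in as-is, through its interface.
plain = Sorter().sort([3, 1, 2])

# White-box reuse: modify the part's internals (here, its key function)
# to customize it for a specific instance -- Freeman's "specialization".
class ByLength(Sorter):
    def key(self, record):
        return len(record)

by_len = ByLength().sort(["ccc", "a", "bb"])

print(plain)   # [1, 2, 3]
print(by_len)  # ['a', 'bb', 'ccc']
```

Black-box reuse needs only the part's catalogued interface; white-box reuse also requires knowing how the part works internally, which is why classification and searchability alone are not enough for white-box libraries.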

Since 1987, Dr. Freeman has divided his time between the University of California and the National Science Foundation, focusing his attention on national policy and local action intended to advance the field of computing. He has thus remained actively engaged in teaching and advanced-technology research, as well as in public policy and academic administration.

As assistant director for NSF's CISE Directorate, Dr. Freeman has encouraged both the stabilization of a funding base for cyberinfrastructure initiatives and interdisciplinary research. In February 2002, the NSF released a report on cyberinfrastructure, which describes how information and communication technologies, combined with personnel and integrating components, can provide a long-term platform to empower modern scientific research. Key elements of future cyberinfrastructure include bandwidth, middleware, storage capacity, collaboration tools, training efforts, and broad support for grid computing systems.

Grid computing clusters a wide variety of geographically distributed resources, such as supercomputers, storage systems, data sources, and special devices and services, so that they can be used as a single unified resource. Grid technologies make this possible by providing the protocols, services, and software development kits needed to enable flexible, controlled resource sharing on a large scale. In summer 2002 NSF began to install the hardware for the TeraGrid, a transcontinental supercomputer intended to do for computing power what the Internet does for documents. This virtual computer will process problems at a rate of up to 13.6 trillion floating-point operations per second (13.6 teraflops), eight times faster than the most powerful academic supercomputer available today.
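The core idea of presenting scattered machines as one unified resource can be illustrated in miniature. This toy scheduler (all names and the scheduling policy are invented for illustration; real grid middleware is far more involved) lets a user submit a job without naming any machine, and the "grid" picks a site with enough free capacity:

```python
# Toy illustration of grid-style resource sharing: several distributed
# "sites" are pooled and presented to the user as one unified resource.

sites = {
    "supercomputer-A": {"free_cpus": 512},
    "storage-cluster-B": {"free_cpus": 64},
    "lab-machine-C": {"free_cpus": 8},
}

def submit(job_cpus):
    """Run a job on the site with the most free CPUs (naive policy)."""
    name = max(sites, key=lambda s: sites[s]["free_cpus"])
    if sites[name]["free_cpus"] < job_cpus:
        raise RuntimeError("no single site can run this job")
    sites[name]["free_cpus"] -= job_cpus
    return name

# The user never names a machine; the pool chooses one.
first = submit(400)   # lands on supercomputer-A (512 free -> 112)
second = submit(60)   # A still has the most free CPUs (112 -> 52)
third = submit(60)    # now storage-cluster-B leads (64 free -> 4)
```

The point of the sketch is the interface: callers see one `submit` operation, while the protocols and services of a real grid handle discovery, authorization, and controlled sharing across administrative domains.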


Resources on the Web

NSF Report on Cyberinfrastructure:
http://www.newswise.com/articles/2003/2/CYBER.NSF.html

TeraGrid website:
http://www.teragrid.org/

Carnegie Mellon University:
http://www.cs.cmu.edu/scs/scs.html

University of Texas at Austin:
http://www.utexas.edu/

Rice University:
http://www.rice.edu/

Department of Information and Computer Science at the University of California, Irvine:
http://www.ics.uci.edu/

Georgia Institute of Technology:
http://www.gatech.edu/

College of Computing:
http://www.cc.gatech.edu/

Institute of Electrical and Electronics Engineers, IEEE:
http://www.ieee.org/

American Association for the Advancement of Science, AAAS:
http://www.aaas.org/

Association for Computing Machinery, ACM:
http://www.acm.org/