Evolution and Revolution in the Design of Computers Based on Nanoelectronics
Main author: | Williams, Stan |
---|---|
Language: | eng |
Published: | 2004 |
Subjects: | Conferences |
Online access: | http://cds.cern.ch/record/1564579 |
_version_ | 1780930841773342720 |
---|---|
author | Williams, Stan |
author_facet | Williams, Stan |
author_sort | Williams, Stan |
collection | CERN |
description | Today's computers are roughly a factor of one billion less efficient at doing their
job than the laws of fundamental physics state that they could be. How much of this
efficiency gain will we actually be able to harvest? What are the biggest obstacles
to achieving many orders of magnitude improvement in our computing hardware, rather
than the roughly factor of two we are used to seeing with each new generation of
chip? Shrinking components to the nanoscale offers both potential advantages and
severe challenges. The transition from classical mechanics to quantum mechanics is
a major issue. Others are the problems of defect and fault tolerance: defects are
manufacturing mistakes or components that irreversibly break over time, and faults
are transient interruptions that occur during operation. Both of these issues become
bigger problems as component sizes shrink and the number of components scales up
massively. In 1955, John von Neumann showed that a completely general approach to
building a reliable machine from unreliable components would require a redundancy
overhead of at least 10,000; this would completely negate any advantages of
building at the nanoscale. We have been examining a variety of defect- and fault-tolerant
techniques that are specific to particular structures or functions, and are
vastly more efficient for their particular task than the general approach of von
Neumann. Our strategy is to layer these techniques on top of each other to achieve
high system reliability even with component reliability of no more than 97% or so,
and a total redundancy of less than 3. This strategy preserves the advantages of
nanoscale electronics with a relatively modest overhead. (A worked reliability
sketch follows the record below.) |
id | cern-1564579 |
institution | European Organization for Nuclear Research (CERN) |
language | eng |
publishDate | 2004 |
record_format | invenio |
spelling | cern-1564579 2022-11-02T22:23:29Z http://cds.cern.ch/record/1564579 eng Williams, Stan Evolution and Revolution in the Design of Computers Based on Nanoelectronics CHEP04 Conferences [abstract as in the description field above] oai:cds.cern.ch:1564579 2004 |
spellingShingle | Conferences Williams, Stan Evolution and Revolution in the Design of Computers Based on Nanoelectronics |
title | Evolution and Revolution in the Design of Computers Based on Nanoelectronics |
title_full | Evolution and Revolution in the Design of Computers Based on Nanoelectronics |
title_fullStr | Evolution and Revolution in the Design of Computers Based on Nanoelectronics |
title_full_unstemmed | Evolution and Revolution in the Design of Computers Based on Nanoelectronics |
title_short | Evolution and Revolution in the Design of Computers Based on Nanoelectronics |
title_sort | evolution and revolution in the design of computers based on nanoelectronics |
topic | Conferences |
url | http://cds.cern.ch/record/1564579 |
work_keys_str_mv | AT williamsstan evolutionandrevolutioninthedesignofcomputersbasedonnanoelectronics AT williamsstan chep04 |
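
To make the reliability figures in the abstract concrete, here is a minimal sketch of the underlying arithmetic. It assumes a generic 2-of-3 majority vote applied uniformly to every unit, which is not the layered, structure-specific scheme Williams describes but only the simplest redundancy-factor-3 baseline; the unit count `n_units` and the helper `voted_reliability` are hypothetical choices made purely for illustration.

```python
from math import comb

def voted_reliability(p: float, n: int = 3, k: int = 2) -> float:
    """Probability that at least k of n independent replicas work,
    given each replica works with probability p (k-of-n majority vote)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p = 0.97        # component reliability quoted in the abstract
n_units = 1000  # hypothetical number of independent units in the system

per_unit = voted_reliability(p)      # ~0.9974 for 2-of-3 voting
plain_system = p ** n_units          # ~6e-14: almost certainly broken
voted_system = per_unit ** n_units   # ~0.07: vastly better, still not enough

print(f"per-unit reliability with 2-of-3 voting: {per_unit:.4f}")
print(f"system reliability, no redundancy:       {plain_system:.2e}")
print(f"system reliability, 2-of-3 voting only:  {voted_system:.2e}")
```

Even at a redundancy of 3, uniform voting alone leaves this hypothetical 1000-unit system unreliable when components are only 97% reliable; that gap is what the abstract's layered, task-specific techniques aim to close without resorting to the general-purpose overhead of roughly 10,000 that von Neumann's analysis would require.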