Gazing through windows at component software development
| Main Author: | |
|---|---|
| Language: | eng |
| Published: | 1996 |
| Subjects: | |
| Online Access: | https://dx.doi.org/10.1142/9789814447188_0005 http://cds.cern.ch/record/338409 |
| Summary: | There are a number of problems being actively discussed in the HEP community these days. With the disappearance of the large central mainframes, the role of centralised computing and IT organisations is being questioned. The promise of cheap commodity computing is being researched as a solution to the ever-expanding computing needs of the physics community. Farms of cheap batch-processing engines potentially offer the scalable, replicable data-processing units needed to deal with the increasing volumes of data. Finally, with the LHC some 10 years away, the question of how software engineering and data processing will look in the future is inevitably vague, and yet we are required to make strategic investment decisions now. While these questions seem unrelated, they represent a set of concerns being voiced by industry in general. These concerns arise because recent changes in computing technology and business methods now provide an opportunity to apply technology in new ways, but without clear indication of how to do so effectively. It is not surprising that the purveyor of commodity computing, Microsoft, has realised the opportunity that these changes will bring. This paper examines the technologies we can expect to see from Microsoft, and others, within the next few years, and shows that these technologies will play an increasingly significant role in HEP computing as they mature. |