XEL Elastic

Many scientific problems, e.g., the simulation of quantum mechanical systems, the non-linear optimization used in weather forecasting, and the optimization and minimization of logic circuits, require computing resources that exceed those of any single computer by many orders of magnitude. The increasing adoption of these methods in today's science has been the driving force behind finding ways to create environments where such problems can be solved in a reasonable amount of time. As early as the 1990s, the term "grid computing" emerged: an approach that pools the computational resources of many individual computers into one holistic entity which can deliver its combined compute power as easily as drawing power from the electrical grid.

While setting up a local grid-computing cluster seems straightforward, it comes at a considerable cost. Not only do you have to purchase enough equipment to deliver the resources you need, but you also have to pay the recurring costs of maintenance and operation. This is not very economical for occasional calculations and one-time projects.

A few approaches to decentralize grid computing, however, were presented in the late 1990s. They build on the fact that many people today own very powerful computers, all of which are connected to the internet. If you think about it, that is a considerable amount of computing power, located not in supercomputer centers or laboratories but in people's homes. Platforms like Distributed.net and, later, SETI@Home and BOINC have shown ways to use these users' devices to form a more powerful grid computer with almost no initial setup costs. However, these platforms mostly relied (and still rely) on volunteer computing, i.e., on participants who are willing to contribute their computational power to projects they care about. Needless to say, less glamorous projects (such as your homework assignment simulation) have little chance of attracting any reasonable amount of computing resources.