johnm wrote:I don't get this, it was technique in use a few years back with some limited success but the advent of large scale cloud computing from AWS, Azure, Google et al effectively made it redundant. You can run up a zillion instances on AWS for pennies to run these sort of simulation models, so what am I missing???
Some arithmetic (for continuous use)?
Cloud computing isn't "cheap". It's cost-effective in certain use cases, in the same way renting a car for a short period is cost-effective relative to buying one.
Sustained usage, which is exactly the use case for large-scale distributed computing, gets expensive. Renting the equivalent of a home-spec PC in the cloud would cost somewhere between £2k and £5k a year. Compare that with something close to zero per host for a distributed computing project, and the approach makes perfect sense.
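A quick back-of-envelope sketch of that comparison (the hourly rate is an illustrative assumption, not a quoted AWS/Azure price):

```python
# Back-of-envelope: renting cloud compute 24/7 vs. a volunteer
# distributed-computing project. All figures are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365  # 8760

# Assumed on-demand price for a cloud VM roughly matching a home-spec PC
cloud_hourly_rate = 0.40  # £/hour (assumption)
cloud_annual = cloud_hourly_rate * HOURS_PER_YEAR

print(f"Cloud, sustained use: £{cloud_annual:,.0f}/year")  # → £3,504/year

# A volunteer donates idle cycles, so the project's marginal cost
# per host is close to zero (central coordination servers aside).
volunteer_annual_per_host = 0.0
print(f"Distributed project, per host: £{volunteer_annual_per_host:,.0f}/year")
```

Even at a modest assumed rate, continuous rental lands squarely in the £2k–5k range quoted above, while each donated host costs the project essentially nothing.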
For example, the compute donated to SETI@Home equates to approximately £25bn of cloud computing resource, obtained for only the cost of running the project itself.
By the way, many projects (like Vodafone's DreamLab) exist to exploit unused compute resource on your phone.