Saturday, July 19, 2014

Estimating the energy cost of evolution

Want to create human-equivalent AI? Well, broadly speaking, there are three approaches open to you: design it, reverse-engineer it or evolve it. The third of these - artificial evolution - is attractive because it sidesteps the troublesome problem of having to understand how human intelligence works. It's a black box approach: create the initial conditions, then let the blind watchmaker of artificial evolution do the heavy lifting. This approach has some traction. For instance David Chalmers, in his philosophical analysis of the technological singularity, writes "if we produce an AI by artificial evolution, it is likely that soon after we will be able to improve the evolutionary algorithm and extend the evolutionary process, leading to AI+". And since we can already produce simple AI by artificial evolution, all that's needed is to 'improve the evolutionary algorithm'. Hmm. If only it were that straightforward.

About six months ago I asked myself (and anyone else who would listen): ok, but even if we had the right algorithm, what would be the energy cost of artificially evolving human-equivalent AI? My hunch was that the energy cost would be colossal; so great perhaps as to rule out the evolutionary approach altogether. That thinking, and some research, resulted in me submitting a paper to ALIFE 14. Here is the abstract:
This short discussion paper sets out to explore the question: what is the energy cost of evolving complex artificial life? The paper takes an unconventional approach by first estimating the energy cost of natural evolution and, in particular, the species Homo sapiens sapiens. The paper argues that such an estimate has value because it forces us to think about the energy costs of co-evolution, and hence the energy costs of evolving complexity. Furthermore, an analysis of the real energy costs of evolving virtual creatures in a virtual environment leads the paper to suggest an artificial life equivalent of Kleiber's law - relating neural and synaptic complexity (instead of mass) to computational energy cost (instead of real energy consumption). An underlying motivation for this paper is to counter the view that artificial evolution will facilitate the technological singularity, by arguing that the energy costs are likely to be prohibitively high. The paper concludes by arguing that the huge energy cost is not the only problem. In addition we will require a new approach to artificial evolution in which we construct complex scaffolds of co-evolving artificial creatures and ecosystems.
The full proceedings of ALIFE 14 have now been published online, and my paper Estimating the Energy Cost of (Artificial) Evolution can be downloaded here.
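
For intuition only, here is a toy sketch of the shape of relation the abstract alludes to: Kleiber's law says metabolic rate scales roughly as mass to the 3/4 power, and the paper's proposed analogue swaps mass for neural and synaptic complexity, and metabolic rate for computational energy cost. The 3/4 exponent and the coefficient below are placeholders of my own for illustration, not figures from the paper.

```python
# Toy illustration only: a Kleiber-style power law with mass replaced by a
# neural/synaptic complexity measure and metabolic rate replaced by
# computational energy cost. The exponent (3/4, borrowed from Kleiber's law)
# and the coefficient a are placeholders, not figures from the paper.

def computational_energy_cost(complexity, a=1.0, exponent=0.75):
    """Hypothetical cost ~ a * complexity**exponent (arbitrary units)."""
    return a * complexity ** exponent

# Three arbitrary complexity scales, just to show the sub-linear growth:
for n in (1e3, 1e8, 1e14):
    print(f"complexity {n:.0e} -> relative cost {computational_energy_cost(n):.3g}")
```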

And here's a very short (30 second) video introduction on YouTube:


My conclusion? Well, I reckon that the computational energy cost of simulating and fitness testing something with an artificial neural and synaptic complexity equivalent to humans could be around 10^14 kJ, or 0.1 EJ. But evolution requires many generations and many individuals per generation, and - as I argue in the paper - many co-evolving artificial species. Also taking account of the fact that many evolutionary runs will fail (to produce smart AI), the whole process would almost certainly need to be re-run from scratch many times over. If multiplying those population sizes, generations, species and re-runs gives us a (very optimistic) factor of 1,000,000, then the total energy cost would be 100,000 EJ. In 2010 total human energy use was about 539 EJ. So, artificially evolving human-equivalent AI would need the whole of humanity's energy generation output for about 200 years.
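
For anyone who wants to check the arithmetic, here is a minimal Python sketch of that back-of-envelope calculation. The input figures and the combined factor of 1,000,000 are the ones quoted above; the variable names are mine.

```python
# Back-of-envelope check of the figures quoted above. All inputs come from
# the paragraph itself; the variable names are mine.

energy_per_individual_kj = 1e14            # ~10^14 kJ to simulate and fitness-test one
                                           # human-equivalent individual
energy_per_individual_ej = energy_per_individual_kj * 1e3 / 1e18    # kJ -> J -> EJ

combined_factor = 1e6                      # populations x generations x species x re-runs
total_ej = energy_per_individual_ej * combined_factor

world_energy_ej_2010 = 539                 # total human energy use in 2010 (EJ)
years_of_world_output = total_ej / world_energy_ej_2010

print(f"per individual: {energy_per_individual_ej:.1f} EJ")            # 0.1 EJ
print(f"total:          {total_ej:,.0f} EJ")                           # 100,000 EJ
print(f"years of world energy output: {years_of_world_output:.0f}")    # ~186, i.e. roughly 200
```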


The full paper reference:

Winfield AFT, Estimating the Energy Cost of (Artificial) Evolution, pp 872-875 in Proceedings of the Fourteenth International Conference on the Synthesis and Simulation of Living Systems, Eds. H Sayama, J Rieffel, S Risi, R Doursat and H Lipson, MIT Press, 2014.


5 comments:

  1. Thanks, very interesting angle. Have the 'human brain project' people ignored that?

    1. Thanks! Yes I wondered that too. Supercomputers consume megawatts, i.e. megajoules/s. But there's a huge gap between mega- and exa- (10^12 s = ~32,000 years).
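
      A quick sketch of that conversion, assuming a steady 1 MW machine delivering energy at the exajoule scale (the 10^12 is just exa divided by mega):

      ```python
      # 1 MW delivers 1 MJ per second, so one exajoule (10^18 J) takes 10^12 seconds
      seconds_per_ej = 1e18 / 1e6
      years = seconds_per_ej / (3600 * 24 * 365.25)
      print(f"{years:,.0f} years")   # ~31,700, i.e. roughly 32,000 years per EJ
      ```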

  2. It's interesting to compare your estimates with "How Hard is Artificial Intelligence? Evolutionary Arguments and Selection Effects" http://www.nickbostrom.com/aievolution.pdf :
    "The argument from evolutionary algorithms requires an estimate of how much computing power it would take to match the amount of optimization power provided by natural selection over geological timescales. We explored one way of placing an upper bound on the relevant computational demands and found it to correspond to more than a century’s worth of continuing progress along Moore’s Law—an impractically vast amount of computing power. Large efficiency gains are almost certainly possible, but they are difficult to quantify in advance. It is doubtful that the upper bound down calculated in our paper could be reduced sufficiently to enable the argument from evolutionary algorithms to succeed."

    They then discuss problem difficulty - the paper does explore a lot of good angles.

    I do have doubts about how long Moore's Law can continue. If you make a quick line graph of the land- or air-speed records (I used Wikipedia), they progress fairly steadily, then hit a plateau until some physical limit is overcome by new technology (e.g. by switching to rocket power), and then get stuck again. What you can't assume is that physical limits will keep being overcome indefinitely on the strength of a few decades of progress, when everything eventually runs up against some finite physical scale.

    What estimates of future computing power often leave out is the physical cost of storing and processing the bits (per second).

    Shulman and Bostrom use the Japanese K computer, then the world's most powerful supercomputer, to work out their figures above. However, it consumes around 10 MW, so even when you feed in Moore's Law, the numbers don't look so good set against world energy consumption (or potential energy production).

    Very stimulating entry in your web log. I may yet leave another comment about simulations!

    1. Many thanks Paul - I'm grateful for the reference - and now reading Nick Bostrom's paper!

  3. The 10MW of what was the most powerful computer at the time of the paper doesn't seem that bad until you look at the number of them you'd need to get to the bottom end of the massive range they give as their estimate. In addition to that, how many supercomputers could you support (with all the human, material and energy resources that lie behind them) to work on the simulation?

    We also can't assume that the energy-efficiency of these computers will keep pace with Moore's Law, although if it does manage to, then it doesn't seem so implausible that by 2050 (as I think Moravec suggests) there could be computers of comparable processing power to ourselves.

    I assume what you're trying to get at is not to rule out some sort of artificial general intelligence in the future, but that it won't be achieved by evolutionary algorithms, rather by the usual cultural evolution that has brought humans our existing technological progress.

    However, an earlier paper by Bostrom (preceded by an idea of Moravec's) tried to assess the likelihood that we're all living in a computer simulation - see http://www.simulation-argument.com/

    I've been very suspicious of these in the past, because the more accurate the simulation gets, the more it would seem to involve 1:1 mapping. Now it also looks like the energy requirements might be very fanciful. I suspect that the less an organism suspects it's in a simulation, the greater the possibility it is; conversely, the more we think about it, the less likely it is to be true!
