Abstract

Many believe that the future of innovation lies in simulation. However, as computers become ever more powerful, so does the hyperbole used to discuss their potential for modelling across a vast range of domains, from subatomic physics to chemistry, climate science, epidemiology, economics and cosmology. As we enter the era of quantum and exascale computing, machine learning and artificial intelligence have entered the field in a significant way. In this article we give a brief history of simulation, discuss how machine learning can be more powerful when underpinned by deeper mechanistic understanding, outline the potential of exascale and quantum computing, highlight the limits of digital computing, both classical and quantum, and distinguish rhetoric from reality in assessing the future of modelling and simulation, in which we believe analogue computing will play an increasingly important role.

Highlights

  • We first glimpsed the potential of computers to model the world eight decades ago, in 1936, when Alan Turing devised a hypothetical computing machine while studying the foundations of mathematics [1].

  • In many of these applications, the computer is programmed to solve partial differential equations that lack analytical solutions; computers can also describe discrete systems, such as lattice gases and other models of fluids, gene regulatory networks with small numbers of molecules [9], agent-based simulations in ecology [10], economics [11] and epidemiology, population dynamics [12], and so on.

  • The enormous interest in applying machine learning (ML) and artificial intelligence (AI) has proved so intoxicating that some researchers claim we can dispense with conventional approaches to science, relying on big data to “teach us” how the world works [22,23].

Introduction

We first glimpsed the potential of computers to model the world eight decades ago, in 1936, when Alan Turing devised a hypothetical computing machine while studying the foundations of mathematics [1]. New approaches predicated on machine learning (ML) and artificial intelligence (AI)—terms often used interchangeably in association with “big data”—have become prominent in tackling a range of complex problems and are sometimes regarded as unbounded in the scope of their domains of application. All this has created the widespread expectation among the general public that we can effortlessly use computers to create virtual worlds across a range of domains: from cosmic associations of galaxies stretching over one hundred million light-years, to the mesoscale that is most directly accessible to our senses, and from the molecular machines in our cells down to structures within the heart of an atom and inside the particles that comprise its nucleus. As we move into the exascale and quantum eras and start to discern what lies beyond them for modelling, there is considerable potential for surpassing the limitations of digital descriptions by falling back on a form of computation which dates back millennia: analogue computing.

Dirac’s dream
Big data need big theory
Classical computational chemistry
Quantum chemistry
Multiscale modelling
Exascale computing
Quantum computing
Coping with chaos
Findings
Universe in a computer
