The World as a Neural Network

Yuri Barzov
4 min read · Apr 29, 2021


Photo by Andre Moura from Pexels

The simplest mathematical model of a neural network can generate the dynamics of both quantum and classical processes. Thus, the hypothesis that the universe is a neural network can become a unifying theory of everything. Some people call it God’s Algorithm. This is the essence of Vitaly Vanchurin’s hypothesis.

If it were otherwise, humanity would still be unaware that quantum and classical processes exist. If the neural networks of at least some individual people could not reproduce the dynamics of these processes, no one would ever have recognized them.

And this is not only about science: the fruit fly, for example, or the E. coli bacterium reproduces the dynamics of both quantum and classical processes in the networks of its cells, molecules, and atoms, ensuring its survival and reproduction with great success.

Neural networks are just a special case of the ubiquitous proliferation of networks (also very often called lattices). What matters is the dynamics: the production of an ordered signal at the output from chaos (high Kolmogorov complexity) at the input.
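To make that idea concrete, here is a toy sketch of my own (not from any of the cited papers): a tiny two-layer network trained by plain gradient descent to emit the same ordered signal, a sine wave, no matter what fresh random noise arrives at its input. All sizes and parameters are illustrative.

```python
import numpy as np

# Toy sketch: a small network that learns to turn chaotic input
# into an ordered output signal (a fixed sine wave).
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 16, 32, 64

W1 = rng.normal(0, 0.1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_out)); b2 = np.zeros(n_out)
target = np.sin(np.linspace(0, 2 * np.pi, n_out))  # the "ordered" signal

lr = 0.05
for step in range(5000):
    x = rng.normal(size=n_in)            # chaos at the input
    h = np.tanh(x @ W1 + b1)             # hidden layer
    y = h @ W2 + b2                      # output signal
    err = y - target
    # backpropagation of the squared error
    grad_h = (err @ W2.T) * (1 - h ** 2)
    W2 -= lr * np.outer(h, err); b2 -= lr * err
    W1 -= lr * np.outer(x, grad_h); b1 -= lr * grad_h

print("mean squared error:", float(np.mean((y - target) ** 2)))
```

After training, the output stays close to the sine wave even though every input is fresh noise: order out of chaos, in miniature.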

The more complex an organism is, the more ordered signals it can produce, and the more complex and extended in time and space its universe becomes.

I keep setting aside the question of where the chaos at the input comes from, because we have no scientific way to refute any of the competing hypotheses: from matter, from God, by itself. And I want to limit this book to scientific hypotheses and theories.

A truly singular group of scientists, ranging from an astrophysicist and jazz musician to a quantum computer architect at Microsoft, published a paper on April 9 arguing that the universe is a neural network that learns its own laws. At the center of the list of seven co-authors stands Lee Smolin, author of the theory that every black hole contains a universe. Four of the authors are Microsoft employees.

The authors cited Vitaly Vanchurin's paper arguing that the universe is a neural network, published in August 2020, but for some reason they called his theory “a new framework for the presentation of known laws,” apparently hinting that the “deep, recurrent, cyclic neural network” they propose is much cooler than the basic model Vanchurin used.

The field seems to be heating up, because for the second day now I have been wading through a paper written by Vanchurin together with Mikhail Katsnelson and published in December 2020. It shows that the behavior of learning neurons in a neural network can, under certain conditions, be described by the Schrödinger equation; in other words, it can be quantum and correspond to process 2 as defined by John von Neumann, albeit at the macro level.
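For reference, von Neumann's process 2 is simply the deterministic, unitary evolution of the wave function under the Schrödinger equation (the standard textbook form; the paper's own derivation is, of course, more involved):

```latex
% Von Neumann's process 2: deterministic, unitary evolution
i\hbar\,\frac{\partial \psi(x,t)}{\partial t} = \hat{H}\,\psi(x,t),
\qquad \psi(t) = e^{-i\hat{H}t/\hbar}\,\psi(0)
```

Process 1, by contrast, is the abrupt, irreversible collapse upon measurement; the claim here concerns only the smooth, reversible process 2.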

But that was not the only thing that impressed me. In fact, there is essentially just one condition: the free energy of the hidden layer of the network must take an ensemble of values rather than a single value. It immediately occurred to me that Karl Friston's variational free energy is also an ensemble of values.
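For comparison, Friston's variational free energy is a functional of a whole recognition distribution q over hidden states, not a single number attached to one state; that is the sense in which it carries an ensemble of values (this is the standard form, not a formula from the Katsnelson-Vanchurin paper):

```latex
% Variational free energy of a recognition density q(\theta)
% for observed data y (Friston et al., 2007)
F[q] = \mathbb{E}_{q(\theta)}\!\left[\ln q(\theta) - \ln p(y,\theta)\right]
     = D_{\mathrm{KL}}\!\left(q(\theta)\,\|\,p(\theta \mid y)\right) - \ln p(y)
```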

This multivaluedness is achieved by giving the neural network access to a source of new neurons, so that the number of neurons in it can both increase and decrease. Neurogenesis goes on in the hippocampus constantly, as we remember, but only the neurons that get connected into the network survive. Neurodegeneration happens all the time too, and the network recruits more new neurons the harder it learns.

The number of neurons in Lana Sinapayen's ε-network likewise grows or shrinks depending on the complexity of the data being processed.
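Here is a schematic sketch of that grow-and-prune dynamic. It is my own toy, not Sinapayen's actual ε-network and not the model from the Katsnelson-Vanchurin paper: a hidden layer that recruits a fresh neuron while its running error stays high, and prunes neurons whose outgoing weights never wire into the network. All thresholds are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 8, 4

# hidden layer held as lists so neurons can be added and removed
W_in = [rng.normal(0, 0.1, n_in) for _ in range(4)]    # input -> neuron
W_out = [rng.normal(0, 0.1, n_out) for _ in range(4)]  # neuron -> output

lr, grow_above, prune_below = 0.05, 0.5, 1e-2
running_loss = 1.0

for step in range(5000):
    x = rng.normal(size=n_in)
    target = np.tanh(x[:n_out])                # stand-in learning task
    h = np.tanh(np.array([w @ x for w in W_in]))
    y = sum(hi * wo for hi, wo in zip(h, W_out))
    err = y - target
    running_loss = 0.99 * running_loss + 0.01 * float(err @ err)

    # gradient step for every neuron's weights
    for i in range(len(W_in)):
        grad_h = (err @ W_out[i]) * (1 - h[i] ** 2)
        W_out[i] = W_out[i] - lr * h[i] * err
        W_in[i] = W_in[i] - lr * grad_h * x

    # "neurogenesis": recruit a new neuron while learning is hard
    if running_loss > grow_above and step % 100 == 0:
        W_in.append(rng.normal(0, 0.1, n_in))
        W_out.append(rng.normal(0, 0.1, n_out))
    # "neurodegeneration": drop neurons that never connected
    keep = [np.linalg.norm(w) > prune_below for w in W_out]
    W_in = [w for w, k in zip(W_in, keep) if k]
    W_out = [w for w, k in zip(W_out, keep) if k]

print("neurons:", len(W_in), " running loss:", round(running_loss, 4))
```

The neuron count settles wherever the task's difficulty demands: harder data keeps the running loss high and the layer growing, easier data lets pruning win.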

As a cherry on top, quantum learning is reversible, which allows the network to maintain a constant level of entropy: the negative entropy produced (that is, the entropy destroyed) during learning is balanced by the entropy produced during unlearning.
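Stated as simple bookkeeping (my schematic reading of the claim, not a formula from the paper), over a full learn-unlearn cycle the entropy changes cancel, so the total stays constant:

```latex
% Schematic entropy balance over one learn/unlearn cycle
\Delta S_{\text{learn}} < 0, \qquad
\Delta S_{\text{unlearn}} = -\Delta S_{\text{learn}} > 0, \qquad
\Delta S_{\text{total}} = \Delta S_{\text{learn}} + \Delta S_{\text{unlearn}} = 0
```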

So it goes, straight according to Alvin Toffler: learn, unlearn, and relearn.

References:

  1. Vanchurin, V. (2020). The world as a neural network. Entropy, 22(11), 1210. https://doi.org/10.3390/e22111210
  2. Alexander, S., Cunningham, W. J., Lanier, J., Smolin, L., Stanojevic, S., Toomey, M. W., & Wecker, D. (2021). The Autodidactic Universe. arXiv:2104.03902 [hep-th]
  3. Katsnelson, M. I., & Vanchurin, V. (2020). Emergent Quantumness in Neural Networks. arXiv:2012.05082 [quant-ph]
  4. Friston, K. J., Mattout, J., Trujillo-Barreto, N., Ashburner, J., & Penny, W. (2007). Variational free energy and the Laplace approximation. NeuroImage, 34(1), 220–234.
  5. Shors, T. J., Anderson, M. L., Curlik, D. M., 2nd, & Nokia, M. S. (2012). Use it or lose it: How neurogenesis keeps the brain fit for learning. Behavioural Brain Research, 227(2), 450–458. https://doi.org/10.1016/j.bbr.2011.04.023
  6. Shors, T. J. (2008). From stem cells to grandmother cells: How neurogenesis relates to learning and memory. Cell Stem Cell. https://doi.org/10.1016/j.stem.2008.08.010
  7. Semënov, M. V. (2019). Adult hippocampal neurogenesis is a developmental process involved in cognitive development. Frontiers in Neuroscience, 13, 159. https://doi.org/10.3389/fnins.2019.00159
  8. Sinapayen, L., & Ikegami, T. (2017). Online fitting of computational cost to environmental complexity: Predictive coding with the ε-network. Proceedings of ECAL 2017.
  9. Toffler, A. (1970). Future Shock. New York: Random House.
