Advanced Photonics, Vol. 6, Issue 5, 050500 (2024)
Md Sadman Sakib Rahman1,2,3 and Aydogan Ozcan1,2,3,*
Author Affiliations
  • 1University of California, Electrical and Computer Engineering Department, Los Angeles, California, United States
  • 2University of California, Bioengineering Department, Los Angeles, California, United States
  • 3University of California, California NanoSystems Institute, Los Angeles, California, United States
DOI: 10.1117/1.AP.6.5.050500
Md Sadman Sakib Rahman, Aydogan Ozcan, "Physics and artificial intelligence: illuminating the future of optics and photonics," Adv. Photon. 6, 050500 (2024).
Fig. 1. Some of the major contributions of John Hopfield and Geoffrey Hinton, with their connections to physical systems. (a) Left: the Hopfield model, a recurrent neural network capable of storing and retrieving patterns. Drawing intuition from the energy dynamics of spin glasses, John Hopfield proposed this energy-based memory model, whose dynamics are driven by minimizing the energy of the network state, E(s). Center: the energy landscape of a Hopfield network, depicting how the network converges to stable states (attractors), represented as valleys in the landscape. Right: a spin glass system, i.e., a disordered material with magnetic interactions among atomic spins. The energy of a state in the Hopfield model is analogous to the Hamiltonian H(σ) of the spin glass state σ. (b) Left: the Boltzmann machine, a stochastic generalization of the Hopfield network that includes hidden units (gray nodes) to enhance its representational capability. By incorporating the Boltzmann distribution from statistical mechanics, Geoffrey Hinton introduced stochasticity into his neural network models, enhancing their ability to learn complex patterns. Right: the Boltzmann distribution, which governs the probability of a state based on its energy and plays a key role in the stochastic activation of artificial neurons in Boltzmann machines.
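To make the energy picture of Fig. 1(a) concrete, below is a minimal Python sketch of a Hopfield network (an illustrative assumption, not code from the laureates): patterns are stored in the weights via a Hebbian rule, and retrieval proceeds by asynchronous single-neuron updates that can only lower the energy E(s) = −½ sᵀWs, so a corrupted pattern rolls downhill into the nearest attractor.

```python
import numpy as np

# Minimal Hopfield network sketch (illustrative; all names are our own).
# Hebbian storage plus asynchronous retrieval that lowers E(s) = -1/2 s^T W s.

def train_hopfield(patterns):
    """Hebbian weight matrix for a set of +/-1 patterns (rows)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / patterns.shape[0]

def energy(W, s):
    """Energy of state s; attractors sit at local minima of this landscape."""
    return -0.5 * s @ W @ s

def retrieve(W, s, steps=100, rng=None):
    """Asynchronous updates: one neuron at a time aligns with its local field."""
    rng = rng or np.random.default_rng(0)
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1  # each flip can only lower E(s)
    return s

# Store one pattern, corrupt two bits, and watch it roll back into the valley.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1
recovered = retrieve(W, noisy)
print(energy(W, noisy), energy(W, recovered))   # energy decreases
print(np.array_equal(recovered, pattern))       # True: pattern recovered
```

The asynchronous update rule is what guarantees the descent: because the weight matrix is symmetric with a zero diagonal, each single-neuron flip can never increase E(s), which is why the valleys in the energy landscape of Fig. 1(a) act as stable memories.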
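The stochastic activation of Fig. 1(b) can be sketched the same way, here with a hypothetical fully visible two-unit Boltzmann machine (parameters chosen purely for illustration; hidden units follow the identical update rule): each unit switches on with a logistic probability derived from the Boltzmann distribution, and long-run Gibbs sampling visits states with frequencies approaching P(s) ∝ exp(−E(s)/T).

```python
import numpy as np

# Toy two-unit Boltzmann machine (hypothetical weights W and biases b).
# Empirical state frequencies from Gibbs sampling should approach the
# Boltzmann distribution P(s) ∝ exp(-E(s)/T), E(s) = -1/2 s^T W s - b^T s.

rng = np.random.default_rng(1)
W = np.array([[0.0, 1.5], [1.5, 0.0]])
b = np.array([-0.5, -0.5])
T = 1.0

def energy(s):
    return -0.5 * s @ W @ s - b @ s

def gibbs_sweep(s):
    """One pass of stochastic activation over all units."""
    for i in range(len(s)):
        delta_E = W[i] @ s + b[i]            # E(s_i=0) - E(s_i=1), others fixed
        p_on = 1.0 / (1.0 + np.exp(-delta_E / T))  # Boltzmann -> logistic
        s[i] = 1.0 if rng.random() < p_on else 0.0
    return s

# Compare empirical visit frequencies against the exact Boltzmann probabilities.
states = [np.array(x, dtype=float) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]]
Z = sum(np.exp(-energy(st) / T) for st in states)
counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
s = np.zeros(2)
n_samples = 20000
for _ in range(n_samples):
    s = gibbs_sweep(s)
    counts[tuple(int(v) for v in s)] += 1
for st in states:
    key = tuple(int(v) for v in st)
    exact = np.exp(-energy(st) / T) / Z
    print(key, round(float(exact), 3), counts[key] / n_samples)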