Quantum Boltzmann Machines (QBMs) are quantum-enhanced probabilistic models that build on classical Boltzmann Machines, using quantum effects to improve learning and sampling.
Algorithm Details
Quantum Boltzmann Machines (QBMs) are a class of quantum machine learning models that generalize classical Boltzmann machines to the quantum domain [1]. Boltzmann machines are probabilistic graphical models that learn the probability distribution underlying a set of input data and can be used for tasks such as unsupervised learning, generative modeling, and combinatorial optimization.
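For orientation, the sketch below shows the classical starting point: a tiny Boltzmann machine's energy function and the Boltzmann distribution it induces, computed by brute force in numpy. All sizes and parameter values are illustrative, not taken from any particular implementation.

```python
import numpy as np
from itertools import product

# Minimal classical (restricted) Boltzmann machine: E(v, h) = -a.v - b.h - v.W.h,
# with p(v, h) proportional to exp(-E(v, h)). Sizes and values are illustrative.
rng = np.random.default_rng(0)
n_visible, n_hidden = 3, 2
a = rng.normal(scale=0.1, size=n_visible)              # visible biases
b = rng.normal(scale=0.1, size=n_hidden)               # hidden biases
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # couplings

def energy(v, h):
    return -(a @ v + b @ h + v @ W @ h)

# Brute-force partition function and visible marginal p(v) for this tiny model.
states_v = [np.array(s) for s in product([0, 1], repeat=n_visible)]
states_h = [np.array(s) for s in product([0, 1], repeat=n_hidden)]
Z = sum(np.exp(-energy(v, h)) for v in states_v for h in states_h)
p_v = {tuple(v): sum(np.exp(-energy(v, h)) for h in states_h) / Z for v in states_v}
print(p_v)  # probabilities over the 2^3 visible configurations; they sum to 1
```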
Problem Target
QBMs exploit the power of quantum computing to represent and manipulate complex probability distributions more efficiently than classical Boltzmann machines, particularly for high-dimensional and strongly correlated data [2]. The key idea behind QBMs is to use quantum states and quantum operations to represent the model parameters and perform the learning and inference tasks.
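Concretely, in the transverse-field formulation of Amin et al. [1], the model is defined by a parameterised Hamiltonian and its Gibbs state, with visible-unit probabilities obtained by projecting onto classical configurations:

```latex
% Transverse-field QBM Hamiltonian, Gibbs state, and visible-unit probabilities [1]
H = -\sum_{a} \Gamma_a \,\sigma^x_a \;-\; \sum_{a} b_a \,\sigma^z_a \;-\; \sum_{a<b} w_{ab}\,\sigma^z_a \sigma^z_b,
\qquad
\rho = \frac{e^{-H}}{\operatorname{Tr}\!\left[e^{-H}\right]},
\qquad
P(\mathbf{v}) = \operatorname{Tr}\!\left[\Lambda_{\mathbf{v}}\,\rho\right]
```

Here the transverse fields Γ_a, biases b_a, and couplings w_ab are the trainable parameters, Λ_v projects the visible qubits onto the classical configuration v, and training minimises the negative log-likelihood of the data under P(v).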
Quantum Approach
A QBM consists of a network of quantum nodes, each representing a qubit or a group of qubits, and connected by quantum edges that encode the interactions between the nodes [3]. The quantum state of the QBM represents the joint probability distribution of the variables in the model, and the goal of training is to adjust the parameters of the quantum edges to minimize the difference between the model distribution and the target distribution of the input data.
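As a small, hedged illustration of this picture, the sketch below builds a three-qubit transverse-field Ising Hamiltonian with random parameters and computes its Gibbs state by dense matrix exponentiation. This is only a classical simulation of the state a quantum device would prepare, and every value is illustrative.

```python
import numpy as np
from scipy.linalg import expm

# Dense simulation of a 3-qubit transverse-field QBM Gibbs state (illustrative
# parameters; a real QBM would rely on a quantum sampler, not expm).
n = 3
I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(qubit_op, site, n_qubits):
    """Embed a single-qubit operator at position `site` in an n-qubit register."""
    ops = [I2] * n_qubits
    ops[site] = qubit_op
    out = ops[0]
    for o in ops[1:]:
        out = np.kron(out, o)
    return out

rng = np.random.default_rng(1)
gamma = rng.uniform(0.5, 1.0, size=n)   # transverse fields (the quantum term)
bias = rng.normal(scale=0.2, size=n)    # longitudinal biases
w = rng.normal(scale=0.2, size=(n, n))  # couplings (upper triangle used)

H = sum(-gamma[i] * op_on(sx, i, n) - bias[i] * op_on(sz, i, n) for i in range(n))
for i in range(n):
    for j in range(i + 1, n):
        H += -w[i, j] * op_on(sz, i, n) @ op_on(sz, j, n)

rho = expm(-H)
rho /= np.trace(rho).real
probs = np.real(np.diag(rho))  # computational-basis measurement probabilities
print(probs, probs.sum())
```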
Implementation Steps
State preparation
The input data is encoded into a quantum state, typically using amplitude encoding or qubit encoding. In amplitude encoding, each data sample is represented by a quantum state, where the amplitudes of the basis states correspond to the feature values. In qubit encoding, each feature is assigned to a qubit, and the feature values are encoded in the qubit states.
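A minimal sketch of the two encodings, assuming a classically pre-processed feature vector (the values and the binarisation threshold below are arbitrary):

```python
import numpy as np

# Amplitude encoding: a length-2^n feature vector becomes the amplitudes of an
# n-qubit state (here n = 2) after L2 normalisation.
x = np.array([0.2, 0.5, 0.1, 0.8])
amplitudes = x / np.linalg.norm(x)        # |psi> = sum_i amplitudes[i] |i>
print(amplitudes, np.sum(amplitudes**2))  # squared amplitudes sum to 1

# Qubit (basis) encoding: each binarised feature is assigned to one qubit, so a
# sample maps to a single computational-basis state |x_1 x_2 ... x_n>.
features = np.array([0.7, 0.2, 0.9])
bits = (features > 0.5).astype(int)       # simple thresholding for illustration
basis_index = int("".join(map(str, bits)), 2)
print(bits, basis_index)                  # e.g. |101> -> index 5
```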
Model initialisation
The parameters of the QBM, such as the weights of the quantum edges and the biases of the quantum nodes, are initialised to random values or based on prior knowledge.
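For example, a simple initialisation might look like the following (the parameter names, scales, and shapes are hypothetical):

```python
import numpy as np

# Illustrative initialisation of QBM parameters: small random couplings,
# zeroed biases, and uniform transverse-field strengths.
rng = np.random.default_rng(42)
n_units = 4
weights = rng.normal(scale=0.01, size=(n_units, n_units))
weights = np.triu(weights, k=1)  # keep one coupling per pair of units
biases = np.zeros(n_units)       # biases often start at zero
gammas = np.full(n_units, 1.0)   # transverse fields; tuned or annealed later
```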
Quantum sampling
A quantum sampling algorithm, such as quantum annealing or quantum Gibbs sampling, is used to generate samples from the model distribution [4]. These algorithms exploit the quantum superposition and quantum tunnelling effects to explore the state space more efficiently than classical sampling methods.
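On small instances this step can be mimicked classically by drawing computational-basis samples from the diagonal of the Gibbs state built in the earlier sketch; on real hardware the bit strings would come directly from annealer readouts or Gibbs-state measurements. A hedged stand-in:

```python
import numpy as np

# Classical stand-in for quantum Gibbs sampling on a small system: draw
# computational-basis bit strings from the diagonal of the Gibbs state rho
# (see the earlier sketch); on hardware these would be measurement outcomes.
def sample_basis_states(rho, n_samples, rng=np.random.default_rng(7)):
    probs = np.real(np.diag(rho))
    probs = probs / probs.sum()  # guard against rounding error
    n_qubits = int(np.log2(len(probs)))
    idx = rng.choice(len(probs), size=n_samples, p=probs)
    bits = (idx[:, None] >> np.arange(n_qubits - 1, -1, -1)) & 1
    return 1 - 2 * bits          # map bit 0 -> +1, bit 1 -> -1 (sigma_z eigenvalues)

# samples = sample_basis_states(rho, n_samples=1000)   # rho from the earlier sketch
```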
Gradient estimation
The gradients of the objective function, such as the log-likelihood or the Kullback-Leibler divergence, with respect to the model parameters are estimated using the quantum samples and classical post-processing. This can be done using techniques such as quantum back-propagation or quantum natural gradient.
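In the classical limit (and, with the caveats discussed in [1], as an approximation in the quantum case), the log-likelihood gradient is the familiar difference between data and model expectations, which can be estimated from samples as in the sketch below (array names and shapes are assumptions):

```python
import numpy as np

# Log-likelihood gradient as "positive phase minus negative phase":
# d logL / d w_ij ~= <s_i s_j>_data - <s_i s_j>_model, and analogously for biases.
# data_samples and model_samples hold +/-1 spins, shape (n_samples, n_units).
def estimate_gradients(data_samples, model_samples):
    corr_data = data_samples.T @ data_samples / len(data_samples)
    corr_model = model_samples.T @ model_samples / len(model_samples)
    grad_w = corr_data - corr_model                                   # couplings
    grad_b = data_samples.mean(axis=0) - model_samples.mean(axis=0)   # biases
    return grad_w, grad_b
```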
Parameter update
The model parameters are updated based on the estimated gradients, using classical optimization algorithms such as gradient descent or Adam.
Steps three to five are repeated until the model converges or a maximum number of iterations is reached. After training, the QBM can be used for tasks such as data generation, anomaly detection, and classification, by sampling from the learned distribution or computing the probabilities of the input data.
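A hypothetical training skeleton tying steps three to five together might look as follows, where draw_model_samples stands in for the quantum sampling routine and estimate_gradients is the sketch shown above:

```python
# Hypothetical QBM training loop (steps 3-5). `draw_model_samples` represents
# the quantum sampling step (annealer or Gibbs-state preparation), and
# `estimate_gradients` is the earlier sketch; all hyperparameters are illustrative.
def train_qbm(data_samples, weights, biases, draw_model_samples,
              estimate_gradients, n_iterations=200, learning_rate=0.05):
    for _ in range(n_iterations):
        model_samples = draw_model_samples(weights, biases)               # step 3
        grad_w, grad_b = estimate_gradients(data_samples, model_samples)  # step 4
        weights = weights + learning_rate * grad_w                        # step 5
        biases = biases + learning_rate * grad_b
    return weights, biases
```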
Practical Applications
The potential advantages of QBMs over classical Boltzmann machines include some of the usual themes we might expect in such quantum explorations. For certain types of data and model architectures, QBMs may offer an exponential speedup over classical Boltzmann machines in terms of training time and model capacity [5]. This is due to the ability of quantum computers to represent and manipulate exponentially large state spaces with a linear number of qubits.
Quantum states and operations can potentially capture more complex and expressive probability distributions than classical models, owing to entanglement and interference effects [6]. Likewise, QBMs may generalise better, learning more robust representations of the input data by exploiting quantum superposition and parallelism to explore a larger hypothesis space.
Implementation Challenges
QBMs face the usual collection of limitations given their reliance on near-term quantum devices. One challenge is data encoding: developing efficient methods for loading large-scale, high-dimensional classical data into quantum states while preserving the relevant features and correlations. A related challenge is designing QBM architectures that can be implemented efficiently on near-term quantum hardware with limited qubit counts and gate fidelities. Another is noise-resilient training: robust QBM training algorithms that can operate in the presence of noise and errors in the quantum hardware require continual advances in techniques such as error mitigation and quantum error correction.
Integration with classical machine learning is also a practical concern in the current era [7]. There is much work to be done in exploring hybrid quantum-classical approaches that combine QBMs with classical machine learning techniques, such as pre-training, fine-tuning, and transfer learning, to draw on the strengths of both paradigms. Experimental demonstrations of QBMs have been reported on various quantum computing platforms, including superconducting qubits, trapped ions, and quantum annealers, showing promising results for small-scale datasets. However, the scalability and performance of QBMs on larger and more realistic datasets remain open research questions.
Bottom Line
Quantum Boltzmann Machines are a promising class of quantum machine learning models that harness the power of quantum computing to learn and represent complex probability distributions more efficiently than classical models. By exploiting quantum superposition, entanglement, and interference effects, QBMs have the potential to provide exponential speedups and enhanced expressivity compared to classical Boltzmann machines, with applications ranging from unsupervised learning and generative modeling to optimization and decision-making.
However, significant research efforts are still needed to address the challenges of efficient data encoding, scalable model architectures, noise-resilient training, and integration with classical machine learning techniques, before QBMs can be deployed in real-world scenarios. As quantum technologies continue to advance, QBMs are expected to play an important role in the emerging field of quantum-enhanced artificial intelligence.
References
1. Amin, M. H., Andriyash, E., Rolfe, J., Kulchytskyy, B., & Melko, R. (2018). Quantum Boltzmann machine. Physical Review X, 8(2), 021050.
2. Kieferová, M., & Wiebe, N. (2017). Tomography and generative training with quantum Boltzmann machines. Physical Review A, 96(6), 062327.
3. Benedetti, M., Realpe-Gómez, J., Biswas, R., & Perdomo-Ortiz, A. (2017). Quantum-assisted learning of hardware-embedded probabilistic graphical models. Physical Review X, 7(4), 041052.
4. Johnson, M. W., Amin, M. H. S., Gildert, S., Lanting, T., Hamze, F., Dickson, N., Harris, R., Berkley, A. J., Johansson, J., Bunyk, P., Chapple, E. M., Enderud, C., Hilton, J. P., Karimi, K., Ladizinsky, E., Ladizinsky, N., Oh, T., Perminov, I., Rich, C., ... & Rose, G. (2011). Quantum annealing with manufactured spins. Nature, 473(7346), 194-198.
5. Adachi, S. H., & Henderson, M. P. (2015). Application of quantum annealing to training of deep neural networks. arXiv preprint arXiv:1510.06356.
6. Korenkevych, D., Xue, Y., Bian, Z., Chudak, F., Macready, W. G., Rolfe, J., & Andriyash, E. (2016). Benchmarking quantum hardware for training of fully visible Boltzmann machines. arXiv preprint arXiv:1611.04528.
7. Khoshaman, A., Vinci, W., Denis, B., Andriyash, E., Sadeghi, H., & Amin, M. H. (2018). Quantum variational autoencoder. Quantum Science and Technology, 4(1), 014001.