A quantum ML model is trained using quantum data, which is generated, or simulated, on quantum processors. This kind of data is found in a number of different research fields, including quantum matter, quantum chemistry, and others.
We’ve used TensorFlow Quantum for hybrid quantum-classical convolutional neural networks, machine learning for quantum control, layer-wise learning for quantum neural networks, quantum dynamics learning, generative modeling of mixed quantum states, and learning to learn with quantum neural networks via classical recurrent neural networks.
The key differentiator between quantum and classical data lies in superposition and entanglement, two quantum properties that make the states of particles interdependent on one another. As a consequence, quantum data must be described by joint probability distributions, which quickly become numerically intractable on classical hardware. This is where the promise of quantum machine learning comes into play: making it feasible to handle such complexity.
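To make the joint-distribution point concrete, here is a minimal sketch in plain Python (no quantum libraries) of the two-qubit Bell state. The numbers show why entangled outcomes cannot be factored into independent per-qubit distributions, and why the full joint state grows exponentially with qubit count:

```python
import math

# Amplitudes of the Bell state (|00> + |11>) / sqrt(2), written out by hand.
amp = 1 / math.sqrt(2)
state = {"00": amp, "01": 0.0, "10": 0.0, "11": amp}

# Joint measurement probabilities are the squared amplitudes.
joint = {bits: a * a for bits, a in state.items()}

# Marginal probabilities for each individual qubit.
p0_is_0 = joint["00"] + joint["01"]   # P(qubit0 = 0) = 0.5
p1_is_0 = joint["00"] + joint["10"]   # P(qubit1 = 0) = 0.5

# If the qubits were independent, P(00) would be 0.5 * 0.5 = 0.25,
# but the actual joint probability is 0.5: the outcomes are entangled.
print(joint["00"], p0_is_0 * p1_is_0)

# A general n-qubit state needs 2**n complex amplitudes, which is why
# tracking the full joint distribution classically becomes intractable.
for n in (10, 30, 50):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```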
TFQ provides the quantum abstractions commonly used to create quantum experiments, such as qubits, gates, circuits, etc., and makes them usable on noisy intermediate-scale quantum (NISQ) processors, which many organizations, including Google, are building and operating today. In a sense, TFQ provides yet another platform which can be used with TensorFlow along with CPUs, GPUs, and TPUs.
Let’s try to clarify the relationship between TensorFlow and the underlying quantum platform provided by Cirq. In TFQ, quantum data is represented through tensors, with each tensor associated with a Cirq circuit that generates quantum data on the fly. The first step is thus creating a quantum dataset by executing those circuits on a quantum computer. That dataset is then used to create a quantum neural network model, which essentially attempts to disentangle the information encoded in the quantum data and make it available for classical ML processing. This enables the creation of a classical neural network with an associated cost function, which is used to fine-tune the parameters and feed them back into the ML pipeline.
TFQ does not require a physical quantum processor to be available, since it can also be used in conjunction with a quantum simulator. Relatedly, Google has also open-sourced qsim, a high-performance quantum circuit simulator able to simulate up to 32 qubits on a single Google Cloud node.
Read more about TensorFlow Quantum in Google’s TFQ white paper.