Time: Tuesday 22-Sept-2020 14:00
Motivation / Abstract
Recent advances in quantum computing hardware have made it possible to run the first algorithms on experimental quantum devices. These quantum computers, referred to as noisy intermediate-scale quantum (NISQ) devices, still have a small number of qubits and no error correction. One type of algorithm that is believed to cope well with the limitations of NISQ devices is the quantum neural network (QNN). In this talk, we will explore what QNNs are, discuss why they suffer from vanishing gradients like their classical counterparts, and introduce layerwise learning, a new method to dampen the effect of vanishing gradients in QNNs.