Quantum Algorithms for Matrix Problems and Machine Learning

Authors: Subramanian, Sathyawageeswar

Abstract

This dissertation presents a study of quantum algorithms for problems that can be posed as matrix function tasks. In Chapter 1 we demonstrate a simple unifying framework for implementing smooth functions of matrices on a quantum computer. This framework captures a variety of problems that can be solved by evaluating properties of some function of a matrix, and we identify speedups over classical algorithms for some problem classes. The analysis combines ideas from the classical theory of function approximation with the quantum algorithmic primitive of implementing linear combinations of unitary operators. In Chapter 2 we continue this study by examining the role of sparsity of input matrices in constructing efficient quantum algorithms. We show that classically preprocessing an input matrix by spectral sparsification can be profitable for quantum Hamiltonian simulation algorithms without compromising the simulation error or complexity. Such preprocessing incurs a one-time cost linear in the size of the matrix, but can be exploited to exponentially speed up subsequent subroutines such as inversion. In Chapter 3 we give an application of this theory of matrix functions to the problem of estimating the Rényi entropy of an unknown quantum state. We combine matrix function techniques with mixed-state quantum computation in the one-clean-qubit model, and are able to bound the expected runtime of our algorithm in terms of the unknown target quantity. In addition to the theme of analysing the complexity of our algorithms, we also identify instances of practical relevance, leading us to problems in machine learning. In Chapter 4 we investigate kernel-based learning methods using random features. We work with the QRAM input model suitable for big data, and show how matrix functions and the quantum Fourier transform can be used to devise a quantum algorithm for sampling random features that are optimised for the given input data and choice of kernel. We obtain a potential exponential speedup over the best known classical algorithm, even without explicit assumptions of sparsity or low rank. Finally, in Chapter 5 we consider the technique of beam search decoding used in natural language processing. We work in the query model, and show how quantum search with advice can be used to construct a quantum search decoder that finds the optimal parse (which may, for instance, be a best translation or a speech-to-text transcript) at least quadratically faster than the best known classical algorithms, with super-quadratic speedups in the expected runtime.
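
As a rough illustration of the linear-combination-of-unitaries (LCU) primitive mentioned above (a sketch of the standard construction, given here for orientation rather than as the dissertation's specific framework): suppose a smooth function f admits a truncated Fourier-series approximation on the spectrum of a Hermitian matrix A,

    f(A) \;\approx\; \sum_{k=-K}^{K} c_k \, e^{i k t A} .

Each term e^{iktA} is a unitary that can be produced by Hamiltonian simulation, and the weighted sum can be applied, up to subnormalisation, by preparing an ancilla register in the state \sum_k \sqrt{|c_k|/\lVert c \rVert_1}\,|k\rangle, applying e^{iktA} controlled on the ancilla value k (with the phases of the c_k absorbed into the controlled unitaries), and then uncomputing the ancilla. The success amplitude scales as 1/\lVert c \rVert_1, so the quality of the classical approximation theory directly controls the quantum cost.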

Keywords

Quantum Algorithms and Computational Complexity, Quantum Computing
