Today's fastest machines used in high-performance computing can have hundreds of thousands of cores. To take advantage of all the available processing power, algorithms that perform computations in parallel must be used. Two of the limiting factors of these algorithms are the need for synchronization between different computational units and the amount of communication between them. In addition, as the number of cores grows, useful results may become unattainable because of the failure of individual cores.

In this thesis, titled "Iterative Methods for Solving Systems of Linear Equations: Hyperpower, Conjugate Gradient and Monte Carlo Methods" and submitted to the University of Manchester by Lukas Steiblys for the degree of Master of Philosophy in May 2014, modifications to the Newton-Schultz iteration and the Conjugate Gradient algorithm are investigated for their applicability to solving large systems of linear equations, with potentially better performance on multi-core machines than the standard versions of the algorithms. The notions of hard error and soft error are also explained, a solution for mitigating soft errors in the Conjugate Gradient method is proposed, and its efficacy is examined. Finally, a different method for solving systems of linear equations that incorporates random sampling is introduced.
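For orientation, the two classical iterations named in the title can be sketched as below. These are the standard textbook forms, not the modified variants the thesis investigates; the matrices `A` and `A_spd`, the right-hand side `b`, and the iteration counts are illustrative assumptions.

```python
import numpy as np

def newton_schulz_inverse(A, iters=30):
    """Approximate A^{-1} with the order-2 Newton-Schultz (hyperpower)
    iteration X_{k+1} = X_k (2I - A X_k); each step needs only
    matrix-matrix products, which parallelise well."""
    n = A.shape[0]
    # Classical starting guess that guarantees convergence:
    # X_0 = A^T / (||A||_1 * ||A||_inf)
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(n)
    for _ in range(iters):
        X = X @ (2 * I - A @ X)
    return X

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive definite A by the
    (unmodified) Conjugate Gradient method."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Illustrative data (not from the thesis):
A = np.array([[4.0, 1.0], [2.0, 3.0]])
X = newton_schulz_inverse(A)          # X @ A is approximately the identity

A_spd = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_cg = conjugate_gradient(A_spd, b)   # A_spd @ x_cg is approximately b
```

In exact arithmetic CG converges in at most n steps for an n-by-n system, and the Newton-Schultz iteration converges quadratically whenever the initial residual ||I - A X_0|| is below 1, which the starting guess above ensures.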
| Date of Award | 31 Dec 2014 |
|---|---|
| Original language | English |
| Awarding Institution | The University of Manchester |
| Supervisor | David Silvester (Supervisor) & Jack Dongarra (Supervisor) |
- iterative
- fault tolerance
- monte carlo
- matrix
- hyperpower
- linear algebra
- conjugate gradient