## SIAM ALA 15 – Minisymposium on Matrix Functions

Last week Nick Higham, Edvin Deadman, and I ran a minisymposium on matrix functions at the SIAM Applied Linear Algebra 2015 conference (link). This post gives a brief summary of each talk, links to published work, and (once they appear) links to the slides with synchronised audio.

Edit: Links to the talks are now available.

Attendance at the sessions was very good, with some high-quality questions coming from the audience.

### Session 1

• Marcel Schweitzer – Error Estimation in Krylov Subspace Methods for Matrix Functions
• Michele Benzi – Functions of Matrices with Kronecker Sum Structure
• Bruno Iannazzo – First-Order Riemannian Optimization Techniques for the Karcher Mean
• Sivan Toledo – A High Performance Algorithm for the Matrix Sign Function

### Session 2

• Peter Kandolf – The Leja Method: Backward Error Analysis and Implementation
• Massimiliano Fasi – An Algorithm for the Lambert W Function on Matrices
• Antti Koskela – An Exponential Integrator for Polynomially Perturbed Linear ODEs
• Edvin Deadman – Estimating the condition number of f(A)b

Peter Kandolf describing the famous “hump” in the matrix exponential.

## How to set up R using conda

Recently I’ve been working with some of the statistics staff at the University of Manchester on sports analytics. Specifically, we’ve been looking for useful models in football data. People from this background normally use R to analyse data and fit models.

Normally I would use Python for this kind of task but, since there was already a considerable amount of code in R, it made sense for me to do some work in R. The people at Continuum Analytics (who make the brilliant Anaconda Python distribution) recently announced support for R using their package manager conda. However, it wasn’t easy to find instructions to get a fully working environment, so here is what I did.
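The setup can be sketched with the shell commands below. Note this is an illustrative sketch, not the exact commands from the time: the channel and package names (`r-essentials`, the `r` channel) follow conda’s conventions and may differ between conda versions.

```shell
# Create a fresh conda environment containing R plus a bundle of popular
# R packages (r-essentials pulls in R itself along with dplyr, ggplot2,
# the IRkernel for Jupyter, and more).
conda create -n r_env -c r r-essentials

# Switch into the new environment and launch R.
source activate r_env
R
```

Keeping R in its own environment means the whole installation can be removed cleanly later with `conda remove -n r_env --all`.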

## The Biennial Numerical Analysis Conference

This was the 26th Biennial Numerical Analysis Conference, marking 50 years since the conference series began at the University of St Andrews in 1965. Some of the participants had been to (almost) every one of the 26 conferences, whilst for others like myself it was their first time. The conference is renowned for the high quality of the research presented, its friendly atmosphere, and the wide range of topics covered.

## How To Prepare For Your Viva

I’ve finally finished! After years of reading papers, designing algorithms, hacking at code, and writing papers, my PhD is complete.

One of the most daunting thoughts I had as a PhD student was the idea of the viva: two experts sit in a room and pick apart the fine details of your work. They ask deep and technical questions, not limited merely to your thesis content, for a few hours (I’ve heard horror stories of 8 hours!) before sending you out of the room whilst they discuss your fate. Fifteen minutes of palpitations later you get your result and (whatever the outcome) head to the pub, either to celebrate or to drown your sorrows as appropriate.

In reality, because I was well prepared, my viva was actually just a chat with some knowledgeable people who were very interested in my work. There were a few curveball questions, but nothing too serious, and the whole thing was done within an hour.

Here are some of my top tips for viva preparation.

The finished product!

## Ten Things I Learnt During My (PhD) Thesis

For the past three years I’ve been doing my PhD in applied maths at Manchester. Now that I’m almost ready to submit my thesis I thought I’d write up some tips for those who are just beginning their PhD journey.

## The NA-HPC Network

The NA-HPC Network is one of the groups funded by EPSRC Network Grants, tasked with supporting interaction and collaboration between numerical analysts, computer scientists, developers, and users of HPC systems within the UK.

Run by Nick Higham and David Silvester at Manchester, the network has held a number of events over its three-year lifespan. This post contains my highlights of the recent meeting at UCL, details of which can be found here.

## Using implicit matrices in Python

There are lots of new features in SciPy 0.13 (release notes) but for me the most important are the updated matrix functions in scipy.linalg and the one-norm estimator in scipy.sparse.linalg.

In some of my recent research (related to section 4 of this) I’ve needed to estimate the one-norm of a large (n^4 x n^2) dense matrix without computing each element. All we can assume is the ability to compute matrix-vector products (via some rather complicated function), meaning we only know the entries of the matrix implicitly.
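As a sketch of the idea, SciPy’s `LinearOperator` lets you hand `scipy.sparse.linalg.onenormest` nothing but the matrix-vector products. The small random matrix here is just a stand-in for the real implicitly defined operator, so we can check the estimate against the exact norm.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, onenormest

# A small dense matrix standing in for the operator; in practice the
# matrix would never be formed explicitly.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))

# onenormest needs the action of both A (matvec) and A.T (rmatvec);
# everything else about A stays implicit.
op = LinearOperator(A.shape,
                    matvec=lambda x: A @ x,
                    rmatvec=lambda x: A.T @ x)

est = onenormest(op)            # estimated one-norm, a lower bound
exact = np.linalg.norm(A, 1)    # exact one-norm, for comparison only
print(est, exact)
```

The estimator only ever touches the operator through a handful of matrix-vector products, which is exactly what makes it usable when the matrix itself is too large to form.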

## Gene Golub’s Summer Schools

The late Gene Golub made an enormous contribution to numerical analysis throughout his career. After becoming a professor at Stanford he published Matrix Computations with Charles Van Loan, which is one of the great textbooks in the field (the 4th edition was recently released). Gene also served as SIAM president and was the founding editor of both the SISC and SIMAX journals.

When Gene died he left a significant sum of money behind which SIAM has used to fund summer schools for PhD students on topics relating to numerical linear algebra.

More information on Gene Golub can be found at Wikipedia or in a transcribed interview with Prof. Nick Higham.

## Matrix Functions and Matrix Equations

This year the summer school was held in Shanghai and focused on matrix functions and matrix equations, the former of which is the subject of my PhD research. The next few paragraphs provide a brief summary of each course.

Group photo of everyone at G2S3.