When it comes to HPC development, linear algebra libraries are essential building blocks. You often hear about BLAS and LAPACK, so what exactly are they, and how do they relate to LINPACK, the benchmark behind the TOP500 list?
BLAS stands for Basic Linear Algebra Subprograms. It contains routines for basic linear algebra operations such as vector and matrix multiplications, and it has been around for roughly 30 years. LAPACK, you guessed it, is an abbreviation for Linear Algebra PACKage. It is a larger library of higher-level routines (factorizations, eigensolvers, linear system solvers) built on top of BLAS. Heavily utilized in high performance computing, BLAS/LAPACK have been implemented, optimized, and re-implemented many times over for many platforms. LINPACK, the official TOP500 benchmark for HPC, solves a dense system of linear equations, and most of its runtime is spent in the BLAS matrix-multiply routine DGEMM, which is how it obtains a measurement of FLOPS (floating point operations per second). The LINPACK library itself, however, has little practical use today, as it has been superseded by LAPACK; only the benchmark survives. There are many commercial and non-commercial versions of BLAS/LAPACK. I'll mention a few that work on Windows.
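To make the FLOPS idea concrete, here is a small sketch in Python. NumPy's matrix multiply dispatches to the underlying BLAS DGEMM for double-precision arrays, and a dense n-by-n multiply costs roughly 2n³ floating point operations, so timing one gives a crude LINPACK-style rate (this is an illustration, not the actual HPL benchmark, which times an LU factorization):

```python
import time
import numpy as np

n = 500
rng = np.random.default_rng(0)
a = rng.random((n, n))   # float64 arrays, so NumPy dispatches to BLAS dgemm
b = rng.random((n, n))

t0 = time.perf_counter()
c = a @ b                # dense matrix-matrix multiply
elapsed = time.perf_counter() - t0

flops = 2.0 * n**3       # a dense n x n multiply performs ~2n^3 flops
print(f"{flops / elapsed / 1e9:.2f} GFLOPS")
```

The number you see depends entirely on which BLAS your NumPy was linked against, which is exactly why so many tuned implementations exist.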
refblas: The official reference implementation from Netlib, with C and Fortran implementations.
Intel MKL: The Intel Math Kernel Library.
ACML: The AMD Core Math Library.
ATLAS: Automatically Tuned Linear Algebra Software. (uses Cygwin as its build environment)
CUDA SDK: The NVIDIA CUDA SDK includes BLAS functionality for writing C programs that run on GeForce 8 Series graphics cards.
Goto BLAS: Kazushige Goto’s implementation of BLAS.
uBLAS: A generic C++ template class library providing BLAS functionality. Part of the Boost library. Note that, unlike other implementations, uBLAS focuses on correctness of the algorithms using advanced C++ features, rather than high performance.
GSL: The GNU Scientific Library. Contains a multi-platform implementation in C, and it builds on Windows.
BLAS/LAPACK are the very foundation of HPC numerical libraries. Many solver libraries, such as PETSc and Trilinos, build on top of them. Here are links back to my previous blogs that explain linear, non-linear, and time-stepping problems.
Math under the hood (linear solvers)
Explicit and Implicit (time stepping, or PDE solvers)
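As a quick illustration of how solver code sits on top of these libraries: SciPy wraps LAPACK, and `scipy.linalg.solve` calls LAPACK's dense solver routines (the *gesv family for general matrices) under the hood. A minimal sketch:

```python
import numpy as np
from scipy import linalg  # SciPy wraps LAPACK; solve() uses *gesv for general matrices

# Solve the dense linear system A x = b
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = linalg.solve(A, b)    # LU factorization + substitution, performed inside LAPACK
print(x)                  # the exact solution is [2, 3]
```

Higher-level packages like PETSc do essentially the same thing at scale: they delegate the dense inner kernels to whichever BLAS/LAPACK you link against.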