tsnnls is a fast solver for least-squares problems of the form Ax = b under the constraint that all entries in the solution vector x are non-negative. tsnnls is written in portable ANSI C and designed to be linked easily into larger applications. Solution quality is comparable to MATLAB's, but tsnnls is much faster and avoids the overhead of interfacing with a large commercial math package. The tsnnls solver is based on the standard LAPACK and BLAS libraries, the supernodal multifrontal sparse Cholesky code from TAUCS, and the Stanford LSQR least-squares solver. Both TAUCS and LSQR are built into the tsnnls distribution for simplicity. tsnnls uses the GNU autoconf process to detect and link with a local copy of LAPACK and BLAS. We have designed tsnnls to work with the limited LAPACK functionality provided by ATLAS; a full LAPACK enables better self-testing of tsnnls.

We have tested the build of tsnnls on RedHat Linux and on Mac OS X 10.4-10.11 systems (we don't anticipate problems with 10.12, but users should report any issues). It should build without modification on most POSIX platforms with the gcc 3.x or 4.x toolchain. The tsnnls source is freely available under the GNU LGPL. Users should note that the included Stanford LSQR (license) and TAUCS (license) libraries are covered under different free-use licenses.

tsnnls is the computational core of our ridgerunner software for simulating the tightening of knotted strings and other self-contact problems in physics and graphics. Some animations of ridgerunner tightenings are posted on the web. Let us know how you use tsnnls in your applications!

We have recently upgraded tsnnls to version 2.3.4. The interface in the current version is backwards compatible with the original release of tsnnls. The internals of tsnnls are quite different: we have replaced the original solver with a more robust design based on the block3 algorithm of Adlers. (See Adlers' thesis, Sparse Least Squares Problems with Box Constraints, for a complete description of the algorithm.) The new release also includes a win32 version of the code, courtesy of Tom Finnigan of tsplines, inc. Improvements to the current build to link better with the Accelerate framework on Mac OS 10.11 were funded by the generous support of Perkin-Elmer, Inc. The original authors of the package are Jason Cantarella, Michael Piatek, and Eric Rawdon.

tsnnls works well with reasonably conditioned matrices of sizes up to a few thousand by a few thousand. It is not designed for much larger matrices, though it may still work on them. It is also not designed for underdetermined least-squares problems: if the input matrix is not of full column rank, you will need to reformulate the problem to eliminate the indeterminacy (or find another solver!).

tsnnls is mature software. It is quite stable and, as far as we know, free of significant bugs. Newly discovered bugs will be fixed, but we do not have plans for major upgrades.

## Download

Brave souls can always get the latest version of the source from the public subversion repository at http://www.jasoncantarella.com/tsnnls/. Otherwise, help yourself to one of the point releases below.

- tsnnls README
- tsnnls 2.3.4 distribution adds support for Mac OS 10.11, better linking with the Accelerate framework, and improvements to the way tsnnls detects the sizes of integer types. Some additional potential bugs found by stricter compiler checks were also fixed (though it is not clear they ever occurred in practice).
- tsnnls 2.3.3 distribution fixes a few minor bugs; now compiles and checks correctly even if argtable2 is not present.
- tsnnls 2.2.3 distribution first release under LGPL, includes win32 support
- tsnnls 2.01.2 distribution fixes rarely seen bugs in 2.0.2
- tsnnls 2.0.2 distribution
- tsnnls 1.0.1 distribution includes the same contents as the 1.0 release, upgraded with minor bug patches and a GNU Autotools installation.
- tsnnls 1.0 distribution (4.7 megs) with Stanford SOL LSQR, limited TAUCS, and generic LAPACK/BLAS included (MD5: a20dca4923aa5bed73c8fac0f58cb637)

We have not published a paper on tsnnls, but we did post the following preprint describing tsnnls on arXiv. It is the best academic citation for work involving tsnnls: tsnnls preprint.

## Funding

We were supported by the NSF during the development of tsnnls. This work was funded by the University of Georgia VIGRE grant DMS-00-8992 and by DMS-02-04826 (to Cantarella and Fu). Piatek was supported through DMS-0311010 to Eric Rawdon. We also received support from DMS-0810415. Improvements to the build process for OS X were funded by Perkin-Elmer, Inc. in 2016.