Numerical Range for Some Complex Upper Triangular Matrices

This Demonstration gives a portrait of the neighborhood of the spectrum of a matrix A, including an approximation of the numerical range, for matrices acting in ℂ^n with low dimension n.
We restrict our attention to upper triangular matrices, since neither the numerical range nor the spectrum is altered by a unitary similarity transformation. For every matrix B there exists a unitary similarity transformation that brings B to upper triangular form (in Mathematica this can be achieved by SchurDecomposition[B]). Note that the diagonal elements of an upper triangular matrix are exactly its eigenvalues, and that an upper triangular matrix that is normal must be diagonal.
Choose a dimension! Then some of the entries of a default matrix A are depicted as locators: all diagonal entries (with a red inner point), some strictly upper triangular entries (with a blue inner point), and no others. Moreover, for your convenience, you can see the convex hulls of the entries of A (in blue) and of the diagonal entries of A (in red), which, at the same time, are the eigenvalues of A.
In addition, the whole matrix A is shown below the graphic and is updated as you drag any locator.
A yellow point with a red boundary marks the mean of the diagonal entries of A, which is also the mean of the eigenvalues of A. In the literature this is usually expressed as tr(A)/n. We call this point the hub of A, because it is the natural pivot point for all matrices similar to A. Indeed, one could claim that the whole question of similarity hinges on this point! For contrast, a light blue point with a dark blue boundary marks the mean of all the entries of A.
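The hub is easy to compute directly from the diagonal. The following pure-Python sketch is not the Demonstration's code; the helper name hub is ours:

```python
# Sketch: the "hub" of A is the mean of the diagonal entries, i.e. tr(A)/n.
# For an upper triangular A this is also the mean of the eigenvalues,
# since the eigenvalues of a triangular matrix are its diagonal entries.
def hub(A):
    n = len(A)
    return sum(A[i][i] for i in range(n)) / n

# Upper triangular example with eigenvalues 1, 2 + 1j, and -1:
A = [[1, 5, 2],
     [0, 2 + 1j, 3],
     [0, 0, -1]]
print(hub(A))  # (2 + 1j)/3, approximately 0.667 + 0.333j
```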
Most importantly, a family of yellow rectangles of different orientations is drawn, each with a green frame. Their intersection constitutes an outer approximation of the field of values. The number of these rectangles can be controlled with the slider for the minimal rotational angle.
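The idea behind such bounding rectangles can be sketched as follows (our own pure-Python approximation, not the Demonstration's code): for every angle θ, the field of values lies in the half-plane Re(e^(-iθ) z) ≤ λmax(H(e^(-iθ) A)), where H(B) = (B + B*)/2 is the Hermitian part of B; combining the bounds for θ and θ + π/2 with those for the opposite directions yields a rotated bounding rectangle. Here λmax is found by a shifted power iteration; all helper names are ours:

```python
import cmath

def hermitian_part(B):
    """H(B) = (B + B*)/2."""
    n = len(B)
    return [[(B[i][j] + B[j][i].conjugate()) / 2 for j in range(n)]
            for i in range(n)]

def lambda_max(H, iters=500):
    """Largest eigenvalue of a Hermitian matrix via shifted power iteration."""
    n = len(H)
    # Gershgorin-type shift so all eigenvalues of H + shift*I are nonnegative.
    shift = max(sum(abs(H[i][j]) for j in range(n)) for i in range(n))
    M = [[H[i][j] + (shift if i == j else 0) for j in range(n)]
         for i in range(n)]
    x = [1.0 / (i + 1) for i in range(n)]  # generic starting vector
    for _ in range(iters):
        y = [sum(M[i][j] * x[j] for j in range(n)) for i in range(n)]
        s = max(abs(v) for v in y)
        x = [v / s for v in y]
    num = sum(sum(M[i][j] * x[j] for j in range(n)) * x[i].conjugate()
              for i in range(n))
    den = sum(abs(v) ** 2 for v in x)
    return (num / den).real - shift

def support_bound(A, theta):
    """F(A) lies in the half-plane { z : Re(e^(-i*theta) * z) <= bound }."""
    n = len(A)
    B = [[cmath.exp(-1j * theta) * complex(A[i][j]) for j in range(n)]
         for i in range(n)]
    return lambda_max(hermitian_part(B))

A = [[1, 2], [0, 1]]
print(support_bound(A, 0))          # rightmost extent of F(A), about 2
print(support_bound(A, cmath.pi))   # about 0: F(A) stays right of Re(z) = 0
```

For this 2×2 shear, F(A) is known to be the disk of radius 1 centered at 1, so the two bounds above are sharp.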
To go back to the original matrices, click "reset".
To be able to compare results for different dimensions more easily, you can fix the plot range.
Now, you can experiment by dragging some of the locators. You will probably be surprised!
You can neither create nor delete locators; you can only drag the ones given.
Almost every item in the graphics is annotated, so mouseover them to see explanations.




When you try to locate the eigenvalues of complex matrices in the complex plane, or to separate characteristic features of nonnormal matrices from normal ones, you will inevitably encounter terms like field of values, numerical range, Wertebereich, or Rayleigh quotient, and maybe also the important newer notion of pseudospectra. These ideas are in part more than a hundred years old, but up to and including Mathematica 7 there are no built-in functions for them. When starting to fill this gap, it seems appropriate to give more explanations than usual; more details can be found in any of the books mentioned in the references.
Normal matrices are usually characterized as those that commute with their adjoints, A A* = A* A, but this description appears anemic. Some of the really attractive properties that make them interesting are: they are not deficient, that is, the algebraic and geometric multiplicities of each eigenvalue agree, and the eigenspaces of different eigenvalues are orthogonal. So, within each eigenspace, you can find an orthonormal basis of eigenvectors spanning that particular eigenspace. Since different eigenspaces are orthogonal too, for a normal matrix A there is an orthonormal set of eigenvectors that spans the whole space; that is, for a normal matrix A there exists an orthonormal basis consisting purely of eigenvectors of A. For a discussion of normal matrices, see, for instance, Teil 1, §16 of [1], and Chapter 2 of [2].
Nonnormal matrices are either deficient, in which case there are not even enough linearly independent eigenvectors to build a basis (never mind an orthonormal basis), or, if they are diagonalizable, lack the property that eigenspaces belonging to different eigenvalues are orthogonal. In the latter case there certainly exists a basis consisting purely of eigenvectors, even normalized ones, but not orthonormal ones. For if they were orthonormalized, some of them would not be eigenvectors anymore, because a linear combination of eigenvectors belonging to different eigenvalues can never be an eigenvector!
For a normal matrix A, all that matters is the knowledge of its spectrum spec(A). Geometrically, the spectrum of a matrix is a point set in the complex plane consisting of at least one and at most n points, where n is the dimension (over ℂ) of the space in which A acts.
Within an orthonormal basis of eigenvectors of A, the action of the normal matrix A reduces to a simple multiplication of the eigenvectors by their corresponding eigenvalues; that is, for an eigenvector x with corresponding eigenvalue λ, the equation A x = λ x holds, which says that the image of an eigenvector under the mapping x ↦ A x is just the λ-fold of itself. Taking the Fourier coefficient with respect to x on both sides, one gets ⟨A x, x⟩ = λ ⟨x, x⟩, or λ = ⟨A x, x⟩ / ⟨x, x⟩, where ⟨·, ·⟩ denotes the inner product of ℂ^n.
But the ratio ⟨A x, x⟩ / ⟨x, x⟩ is well defined for any nonzero vector x and any matrix A; it is called the Rayleigh quotient of x with respect to A. Its relevance lies in showing how strongly any vector is magnified in its own direction under the action of the mapping x ↦ A x, regardless of whether x is an eigenvector or not! Note that, as long as x is not an eigenvector, this is not the same as comparing the norm of an image to the norm of its preimage, because then, by the Cauchy–Schwarz inequality, |⟨A x, x⟩| / ⟨x, x⟩ < ‖A x‖ / ‖x‖!
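A minimal pure-Python sketch of the Rayleigh quotient may help (the helper name rayleigh is ours; the inner product used is ⟨u, v⟩ = Σ u_i conj(v_i)):

```python
import math

def rayleigh(A, x):
    """Rayleigh quotient <Ax, x> / <x, x> for a nonzero vector x."""
    n = len(A)
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    return (sum(Ax[i] * x[i].conjugate() for i in range(n))
            / sum(abs(v) ** 2 for v in x))

A = [[1, 2], [0, 1]]
x = [1 + 0j, 1 + 0j]          # not an eigenvector of A

r = rayleigh(A, x)            # magnification of x in its own direction: 2
# A x = (3, 1), so the norm ratio ||Ax|| / ||x|| is sqrt(5) ~ 2.236:
gain = math.hypot(3, 1) / math.hypot(1, 1)
print(abs(r), gain)           # |<Ax,x>|/<x,x> is strictly smaller than ||Ax||/||x||
```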
Since the Rayleigh quotient is constant on any one-dimensional subspace, it suffices to restrict its domain to the unit sphere {x ∈ ℂ^n : ‖x‖ = 1}, from where it acts as a continuous mapping into the complex numbers.
The whole set of self-magnifiers of A, the set of images under this mapping, that is, F(A) = {⟨A x, x⟩ : ‖x‖ = 1}, is called the field of values of A, the numerical range of A, or, in German, der Wertebereich von A.
It is clear that the spectrum of A is part of the field of values of A, that is, spec(A) ⊆ F(A), but more can be said (see Chapter 1, pp. 5–13 of [4]): F(A) is a compact, connected, and convex subset of the complex numbers that includes the closure of the convex hull of the spectrum, that is, conv(spec(A)) ⊆ F(A). (The convex hull conv(S) of a set S is the intersection of all convex sets that include S.)
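For the finite point sets relevant here (the eigenvalues, or the matrix entries, as drawn in the graphic), the convex hull can be computed with Andrew's monotone chain algorithm. This pure-Python sketch (our own, treating points as complex numbers) is one way to do it:

```python
def convex_hull(points):
    """Andrew's monotone chain: hull vertices of a finite set of complex points."""
    pts = sorted(set(points), key=lambda z: (z.real, z.imag))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means no left turn
        return (a - o).real * (b - o).imag - (a - o).imag * (b - o).real
    lower = []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    upper = []
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

hull = convex_hull([0, 1, 1j, 0.25 + 0.25j])
print(hull)  # the interior point 0.25 + 0.25j is dropped
```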
Because the diagonal elements a_11, …, a_nn of A can be found as the Rayleigh quotients of the standard orthonormal basis vectors e_1, …, e_n with respect to A, that is, a_jj = ⟨A e_j, e_j⟩, the whole diagonal of A is also part of the field of values, that is, {a_11, …, a_nn} ⊆ F(A).
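This is easy to check numerically; the following pure-Python sketch (our own, with the example matrix chosen for illustration) verifies a_jj = ⟨A e_j, e_j⟩ for each j:

```python
# Upper triangular example; its eigenvalues are the diagonal entries.
A = [[1, 5, 2],
     [0, 2 + 1j, 3],
     [0, 0, -1]]
n = len(A)

# For the j-th standard basis vector e_j (which has norm 1), the Rayleigh
# quotient <A e_j, e_j> picks out exactly the diagonal entry a_jj.
for j in range(n):
    e = [1.0 if i == j else 0.0 for i in range(n)]
    Ae = [sum(A[i][k] * e[k] for k in range(n)) for i in range(n)]
    quotient = sum(Ae[i] * e[i] for i in range(n))  # <A e_j, e_j>
    assert abs(quotient - A[j][j]) < 1e-12
```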
For a normal matrix A, F(A) = conv(spec(A)), so these two sets actually coincide! In this case, the strongest magnification that can occur must take place in the eigenspace belonging to the eigenvalue with the biggest modulus; for all other vectors the magnification will be smaller!
Not so for a nonnormal matrix! In the nonnormal case, the eigenvalue with the biggest modulus need not represent the strongest amplification factor at all!
Consider, for example, the upper triangular matrix A = [[1, t], [0, 1]], with t > 0. A is deficient; it has one two-fold eigenvalue 1, but only one corresponding eigenvector (up to scalar multiples), which happens to be just the standard unit vector e_1 = (1, 0)^T. So {e_1, e_2} is an orthonormal basis of ℂ². Let x = (x_1, x_2)^T be any vector in ℂ² with positive coefficients. Then A x = (x_1 + t x_2, x_2)^T.
But ‖A x‖² = (x_1 + t x_2)² + x_2² > x_1² + x_2² = ‖x‖², so ‖A x‖ > ‖x‖! And the bigger t is, the bigger ‖A x‖ is!
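The growth in this 2×2 example is easy to verify numerically (a pure-Python sketch; the helper name shear_gain is ours):

```python
import math

def shear_gain(t, x1, x2):
    """Norm ratio ||A x|| / ||x|| for A = [[1, t], [0, 1]] and x = (x1, x2)."""
    return math.hypot(x1 + t * x2, x2) / math.hypot(x1, x2)

# Every eigenvalue of A is 1, yet for positive coefficients the gain
# exceeds 1 and grows with t:
print(shear_gain(1, 1, 1))    # about 1.58
print(shear_gain(10, 1, 1))   # about 7.81
```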
So, for a nonnormal matrix A, the knowledge of its spectrum alone might lead to premature conclusions! To be able to understand the action of the underlying mapping, additional information must be taken into account. An important item in that direction is the determination of the numerical range.
But even for normal matrices A, finding more or less crude supersets of the numerical range is a big gain if one is to enclose the spectrum spec(A); see [3], for instance.
Manipulate Graphic: Get familiar with what is shown! Mouseover the graphics to see explanations, press the buttons one after the other, and drag any of the locators. After any change, mouseover the graphic again. Almost every item is explained, but some annotations only show up when the respective items are not hidden behind others.
[1] R. Zurmühl and S. Falk, Matrizen und ihre Anwendungen, Berlin/Heidelberg/New York: Springer-Verlag, 1984.
[2] R. A. Horn and C. R. Johnson, Matrix Analysis, New York: Cambridge University Press, 1985.
[3] R. S. Varga, Gershgorin and His Circles, Berlin/Heidelberg/New York: Springer-Verlag, 2004.
[4] R. A. Horn and C. R. Johnson, Topics in Matrix Analysis, New York: Cambridge University Press, 1991.
[5] L. N. Trefethen and M. Embree, Spectra and Pseudospectra: The Behavior of Nonnormal Matrices and Operators, Princeton, NJ: Princeton University Press, 2005.