
Jacobi's formula

In matrix calculus, Jacobi's formula expresses the derivative of the determinant of a matrix A in terms of the adjugate of A and the derivative of A.[1]

If A is a differentiable map from the real numbers to n × n matrices, then

$$\frac{d}{dt} \det A(t) = \operatorname{tr}\left(\operatorname{adj}(A(t)) \, \frac{dA(t)}{dt}\right) = \det(A(t)) \, \operatorname{tr}\left(A(t)^{-1} \, \frac{dA(t)}{dt}\right),$$

where tr(X) is the trace of the matrix X and adj(X) is its adjugate matrix. (The latter equality only holds if A(t) is invertible.)

As a special case,

$$\frac{\partial \det(A)}{\partial A_{ij}} = \operatorname{adj}^{\mathrm T}(A)_{ij} = \operatorname{adj}(A)_{ji}.$$

Equivalently, if dA stands for the differential of A, the general formula is

$$d \det(A) = \operatorname{tr}\bigl(\operatorname{adj}(A)\, dA\bigr).$$
The formula is named after the mathematician Carl Gustav Jacob Jacobi.
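
As a quick numerical illustration (a sketch, not part of the formula's statement), Jacobi's formula can be checked with NumPy for a simple differentiable path A(t) = A0 + t A1, whose derivative is A1; the particular matrices, the evaluation point t, and the step size h below are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    A0 = rng.standard_normal((4, 4))
    A1 = rng.standard_normal((4, 4))          # dA/dt along the path A(t) = A0 + t*A1

    def adjugate(M):
        # adj(M) = det(M) * M^{-1}, valid when M is invertible
        return np.linalg.det(M) * np.linalg.inv(M)

    A = lambda s: A0 + s * A1
    t, h = 0.3, 1e-6

    # Left-hand side: central finite difference of det A(t)
    lhs = (np.linalg.det(A(t + h)) - np.linalg.det(A(t - h))) / (2 * h)
    # Right-hand side: Jacobi's formula, tr(adj(A(t)) dA/dt)
    rhs = np.trace(adjugate(A(t)) @ A1)
    print(lhs, rhs)   # the two values should agree up to finite-difference error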

Derivation

Via matrix computation

Theorem. (Jacobi's formula) For any differentiable map A from the real numbers to n × n matrices,

$$\frac{d}{dt} \det A(t) = \operatorname{tr}\left(\operatorname{adj}(A(t)) \, \frac{dA(t)}{dt}\right).$$

Proof. Laplace's formula for the determinant of a matrix A can be stated as

$$\det(A) = \sum_j A_{ij} \operatorname{adj}^{\mathrm T}(A)_{ij}.$$

Notice that the summation is performed over some arbitrary row i of the matrix.

The determinant of A can be considered to be a function of the elements of A:

$$\det(A) = F(A_{11}, A_{12}, \ldots, A_{21}, A_{22}, \ldots, A_{nn}),$$

so that, by the chain rule, its differential is

$$d \det(A) = \sum_i \sum_j \frac{\partial F}{\partial A_{ij}} \, dA_{ij}.$$

This summation is performed over all n×n elements of the matrix.

To find ∂F/∂A_ij, consider that on the right-hand side of Laplace's formula the index i can be chosen at will. (In order to simplify the calculation: any other choice would eventually yield the same result, but it could be much harder.) In particular, it can be chosen to match the first index of ∂/∂A_ij:

$$\frac{\partial \det(A)}{\partial A_{ij}} = \frac{\partial \sum_k A_{ik} \operatorname{adj}^{\mathrm T}(A)_{ik}}{\partial A_{ij}}.$$

Thus, by the product rule,

$$\frac{\partial \det(A)}{\partial A_{ij}} = \sum_k \frac{\partial A_{ik}}{\partial A_{ij}} \operatorname{adj}^{\mathrm T}(A)_{ik} + \sum_k A_{ik} \frac{\partial \operatorname{adj}^{\mathrm T}(A)_{ik}}{\partial A_{ij}}.$$

Now, if an element of a matrix A_ij and a cofactor adj^T(A)_ik of element A_ik lie on the same row (or column), then the cofactor will not be a function of A_ij, because the cofactor of A_ik is expressed in terms of elements not in its own row (nor column). Thus,

$$\frac{\partial \operatorname{adj}^{\mathrm T}(A)_{ik}}{\partial A_{ij}} = 0,$$

so

$$\frac{\partial \det(A)}{\partial A_{ij}} = \sum_k \operatorname{adj}^{\mathrm T}(A)_{ik} \frac{\partial A_{ik}}{\partial A_{ij}}.$$

All the elements of A are independent of each other, i.e.

$$\frac{\partial A_{ik}}{\partial A_{ij}} = \delta_{jk},$$

where δ is the Kronecker delta, so

$$\frac{\partial \det(A)}{\partial A_{ij}} = \sum_k \operatorname{adj}^{\mathrm T}(A)_{ik} \, \delta_{jk} = \operatorname{adj}^{\mathrm T}(A)_{ij}.$$

Therefore,

$$\frac{d \det(A)}{dt} = \sum_i \sum_j \operatorname{adj}^{\mathrm T}(A)_{ij} \frac{dA_{ij}}{dt} = \sum_j \sum_i \operatorname{adj}(A)_{ji} \frac{dA_{ij}}{dt} = \operatorname{tr}\left(\operatorname{adj}(A) \, \frac{dA}{dt}\right).$$
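
The identity ∂det(A)/∂A_ij = adj^T(A)_ij derived above lends itself to an entry-by-entry finite-difference check; the following is a small sketch with NumPy, where the test matrix and step size are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 3))
    adjT = (np.linalg.det(A) * np.linalg.inv(A)).T   # adj^T(A) for an invertible A

    h = 1e-6
    grad = np.empty_like(A)
    for i in range(3):
        for j in range(3):
            E = np.zeros_like(A)
            E[i, j] = h                              # perturb the single entry A_ij
            grad[i, j] = (np.linalg.det(A + E) - np.linalg.det(A - E)) / (2 * h)

    print(np.max(np.abs(grad - adjT)))               # should be close to zero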

Via chain rule

Lemma 1. $\det'(I) = \operatorname{tr}$, where $\det'$ is the differential of $\det$.

This equation means that the differential of det, evaluated at the identity matrix, is equal to the trace. The differential $\det'(I)$ is a linear operator that maps an n × n matrix to a real number.

Proof. Using the definition of a directional derivative together with one of its basic properties for differentiable functions, we have

$$\det'(I)(T) = \lim_{\varepsilon \to 0} \frac{\det(I + \varepsilon T) - \det I}{\varepsilon}.$$

$\det(I + \varepsilon T)$ is a polynomial in $\varepsilon$ of order n. It is closely related to the characteristic polynomial of $T$. The constant term in that polynomial (the term with $\varepsilon = 0$) is 1, while the linear term in $\varepsilon$ is $\operatorname{tr} T$, so the difference quotient tends to $\operatorname{tr} T$ as $\varepsilon \to 0$.
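
Lemma 1 can be illustrated numerically: for small ε the difference quotient (det(I + εT) - 1)/ε should approach tr(T). This sketch uses NumPy with an arbitrary matrix T.

    import numpy as np

    rng = np.random.default_rng(2)
    T = rng.standard_normal((5, 5))
    I = np.eye(5)

    for eps in (1e-2, 1e-4, 1e-6):
        quotient = (np.linalg.det(I + eps * T) - 1.0) / eps
        print(eps, quotient, np.trace(T))   # the quotient tends to tr(T) as eps -> 0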

Lemma 2. For an invertible matrix A, we have: $\det'(A)(T) = \det A \ \operatorname{tr}(A^{-1} T)$.

Proof. Consider the following function of X:

$$\det X = \det(A A^{-1} X) = \det(A) \, \det(A^{-1} X).$$

We calculate the differential of $\det X$ and evaluate it at $X = A$ using Lemma 1, the equation above, and the chain rule:

$$\det'(A)(T) = \det A \ \det'(I)(A^{-1} T) = \det A \ \operatorname{tr}(A^{-1} T).$$
Theorem. (Jacobi's formula)

$$\frac{d}{dt} \det A = \operatorname{tr}\left(\operatorname{adj} A \, \frac{dA}{dt}\right).$$

Proof. If A is invertible, by Lemma 2, with $T = dA/dt$,

$$\frac{d}{dt} \det A = \det A \ \operatorname{tr}\left(A^{-1} \frac{dA}{dt}\right) = \operatorname{tr}\left(\operatorname{adj} A \, \frac{dA}{dt}\right),$$

using the equation relating the adjugate of A to $A^{-1}$. Now, the formula holds for all matrices, since the set of invertible matrices is dense in the space of matrices.

Via diagonalization

Both sides of the Jacobi formula are polynomials in the matrix coefficients of A and A'. It is therefore sufficient to verify the polynomial identity on the dense subset where the eigenvalues of A are distinct and nonzero.

If A factors differentiably as $A = BC$, then

$$\operatorname{tr}(A^{-1} A') = \operatorname{tr}(B^{-1} B') + \operatorname{tr}(C^{-1} C').$$

In particular, if L is invertible, then $I = L^{-1} L$ and

$$0 = \operatorname{tr}(I^{-1} I') = \operatorname{tr}\bigl(L (L^{-1})'\bigr) + \operatorname{tr}(L^{-1} L').$$

Since A has distinct eigenvalues, there exists a differentiable complex invertible matrix L such that $A = L^{-1} D L$ and D is diagonal. Then

$$\operatorname{tr}(A^{-1} A') = \operatorname{tr}\bigl(L (L^{-1})'\bigr) + \operatorname{tr}(D^{-1} D') + \operatorname{tr}(L^{-1} L') = \operatorname{tr}(D^{-1} D').$$

Let $\lambda_i$, $i = 1, \ldots, n$, be the eigenvalues of A. Then

$$\frac{(\det A)'}{\det A} = \sum_{i=1}^n \frac{\lambda_i'}{\lambda_i} = \operatorname{tr}(D^{-1} D') = \operatorname{tr}(A^{-1} A'),$$
which is the Jacobi formula for matrices A with distinct nonzero eigenvalues.

Corollary

The following is a useful relation connecting the trace to the determinant of the associated matrix exponential:

$$\det e^{tB} = e^{\operatorname{tr}(tB)}.$$

This statement is clear for diagonal matrices, and a proof of the general claim follows.

For any invertible matrix $A(t)$, in the previous section "Via chain rule", we showed that

$$\frac{d}{dt} \det A(t) = \det A(t) \ \operatorname{tr}\left(A(t)^{-1} \, \frac{d}{dt} A(t)\right).$$

Considering $A(t) = e^{tB}$ in this equation yields:

$$\frac{d}{dt} \det e^{tB} = \operatorname{tr}(B) \, \det e^{tB}.$$
The desired result follows as the solution to this ordinary differential equation.
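
A direct numerical check of this corollary is straightforward, assuming SciPy's matrix exponential expm is available; the matrix B below is an arbitrary example.

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(3)
    B = rng.standard_normal((4, 4))

    lhs = np.linalg.det(expm(B))    # det(e^B)
    rhs = np.exp(np.trace(B))       # e^{tr(B)}
    print(lhs, rhs)                 # the two values should agree up to rounding error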

Applications

Several forms of the formula underlie the Faddeev–LeVerrier algorithm for computing the characteristic polynomial, and explicit applications of the Cayley–Hamilton theorem. For example, starting from the following equation, which was proved above:

$$\frac{d}{dt} \det A(t) = \det A(t) \ \operatorname{tr}\left(A(t)^{-1} \, \frac{d}{dt} A(t)\right),$$

and using $A(t) = t I - B$, we get:

$$\frac{d}{dt} \det(tI - B) = \det(tI - B) \, \operatorname{tr}\bigl[(tI - B)^{-1}\bigr] = \operatorname{tr}\bigl[\operatorname{adj}(tI - B)\bigr],$$
where adj denotes the adjugate matrix.
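
As a small numerical illustration of this identity, the derivative of the characteristic polynomial det(tI - B) can be compared with tr(adj(tI - B)) at a point t that is not an eigenvalue of B; the matrix B, the point t, and the step size below are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(4)
    B = rng.standard_normal((3, 3))
    I = np.eye(3)

    def adjugate(M):
        # adj(M) = det(M) * M^{-1} when M is invertible
        return np.linalg.det(M) * np.linalg.inv(M)

    t, h = 2.5, 1e-6   # t chosen away from the eigenvalues of B
    lhs = (np.linalg.det((t + h) * I - B) - np.linalg.det((t - h) * I - B)) / (2 * h)
    rhs = np.trace(adjugate(t * I - B))
    print(lhs, rhs)    # should agree up to finite-difference error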

Remarks

  1. ^ Magnus & Neudecker (1999, pp. 149–150), Part Three, Section 8.3

References

  • Magnus, Jan R.; Neudecker, Heinz (1999). Matrix Differential Calculus with Applications in Statistics and Econometrics (Revised ed.). Wiley. ISBN 0-471-98633-X.
  • Bellman, Richard (1997). Introduction to Matrix Analysis. SIAM. ISBN 0-89871-399-4.