Change of basis
Not to be confused with Change of base.
A linear combination of one basis of vectors (purple) obtains new vectors (red). If they are linearly independent, these form a new basis. The linear combinations relating the first basis to the other extend to a linear transformation, called the change of basis.
A vector represented by two different bases (purple and red arrows).

In mathematics, an ordered basis of a vector space of finite dimension n allows representing uniquely any element of the vector space by a coordinate vector, which is a sequence of n scalars called coordinates. If two different bases are considered, the coordinate vector that represents a vector v on one basis is, in general, different from the coordinate vector that represents v on the other basis. A change of basis consists of converting every assertion expressed in terms of coordinates relative to one basis into an assertion expressed in terms of coordinates relative to the other basis.[1][2][3]

Such a conversion results from the change-of-basis formula which expresses the coordinates relative to one basis in terms of coordinates relative to the other basis. Using matrices, this formula can be written

$$\mathbf{x}_{\mathrm{old}} = A\,\mathbf{x}_{\mathrm{new}},$$

where "old" and "new" refer respectively to the initially defined basis and the other basis, and $\mathbf{x}_{\mathrm{old}}$ and $\mathbf{x}_{\mathrm{new}}$ are the column vectors of the coordinates of the same vector on the two bases. $A$ is the change-of-basis matrix (also called transition matrix), the matrix whose columns are the coordinates of the new basis vectors on the old basis.
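As a concrete sketch (not from the article), the formula can be checked numerically with NumPy; the new basis vectors and the coordinates below are made-up values.

```python
import numpy as np

# Hypothetical example: old basis = standard basis of R^2,
# new basis vectors w1 = (1, 1), w2 = (1, -1), written in old-basis coordinates.
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])   # columns are the new basis vectors

x_new = np.array([2.0, 3.0])  # coordinates of some vector on the new basis
x_old = A @ x_new             # change-of-basis formula: x_old = A x_new

# The vector itself is 2*w1 + 3*w2 = (5, -1), which equals x_old here
# because the old basis is the standard basis.
print(x_old)                  # [ 5. -1.]
```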

A change of basis is sometimes called a change of coordinates, although it excludes many coordinate transformations. For applications in physics, and especially in mechanics, a change of basis often involves the transformation of an orthonormal basis, understood as a rotation in physical space, thus excluding translations. This article deals mainly with finite-dimensional vector spaces, although many of the principles are also valid for infinite-dimensional ones.

Change of basis formula


Let $B_{\mathrm{old}} = (v_1, \ldots, v_n)$ be a basis of a finite-dimensional vector space V over a field F.[a]

For j = 1, ..., n, one can define a vector $w_j$ by its coordinates $a_{i,j}$ over $B_{\mathrm{old}}$:

$$w_j = \sum_{i=1}^{n} a_{i,j} v_i.$$

Let

$$A = \left(a_{i,j}\right)_{i,j}$$

be the matrix whose jth column is formed by the coordinates of $w_j$. (Here and in what follows, the index i always refers to the rows of A and the $v_i$, while the index j always refers to the columns of A and the $w_j$; such a convention is useful for avoiding errors in explicit computations.)

Setting $B_{\mathrm{new}} = (w_1, \ldots, w_n)$, one has that $B_{\mathrm{new}}$ is a basis of V if and only if the matrix A is invertible, or equivalently if it has a nonzero determinant. In this case, A is said to be the change-of-basis matrix from the basis $B_{\mathrm{old}}$ to the basis $B_{\mathrm{new}}$.

Given a vector $z \in V$, let $(x_1, \ldots, x_n)$ be the coordinates of z over $B_{\mathrm{old}}$, and $(y_1, \ldots, y_n)$ its coordinates over $B_{\mathrm{new}}$; that is,

$$z = \sum_{i=1}^{n} x_i v_i = \sum_{j=1}^{n} y_j w_j.$$

(One could take the same summation index for the two sums, but systematically choosing the index i for the old basis and j for the new one makes the formulas that follow clearer, and helps avoid errors in proofs and explicit computations.)

The change-of-basis formula expresses the coordinates over the old basis in terms of the coordinates over the new basis. With the above notation, it is

$$x_i = \sum_{j=1}^{n} a_{i,j} y_j \qquad \text{for } i = 1, \ldots, n.$$

In terms of matrices, the change-of-basis formula is

$$\mathbf{x} = A\,\mathbf{y},$$

where $\mathbf{x}$ and $\mathbf{y}$ are the column vectors of the coordinates of z over $B_{\mathrm{old}}$ and $B_{\mathrm{new}}$, respectively.
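A small numerical check (hypothetical values, sketched with NumPy) that the entrywise formula and the matrix form agree:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))      # stand-in change-of-basis matrix
y = rng.standard_normal(n)           # coordinates on the new basis

# Entrywise formula: x_i = sum_j a_{i,j} y_j
x_entrywise = np.array([sum(A[i, j] * y[j] for j in range(n)) for i in range(n)])

# Matrix form: x = A y
x_matrix = A @ y

print(np.allclose(x_entrywise, x_matrix))  # True
```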

Proof: Using the above definition of the change-of-basis matrix, one has

$$\begin{aligned} z &= \sum_{j=1}^{n} y_j w_j \\ &= \sum_{j=1}^{n} \left(y_j \sum_{i=1}^{n} a_{i,j} v_i\right) \\ &= \sum_{i=1}^{n} \left(\sum_{j=1}^{n} a_{i,j} y_j\right) v_i. \end{aligned}$$

As $z = \sum_{i=1}^{n} x_i v_i$, the change-of-basis formula results from the uniqueness of the decomposition of a vector over a basis.

Example


Consider the Euclidean vector space $\mathbb{R}^2$ and a basis consisting of the vectors $v_1 = (1, 0)$ and $v_2 = (0, 1)$. If one rotates them by an angle of t, one has a new basis formed by $w_1 = (\cos t, \sin t)$ and $w_2 = (-\sin t, \cos t)$.

So, the change-of-basis matrix is
$$\begin{bmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{bmatrix}.$$

The change-of-basis formula asserts that, if $y_1, y_2$ are the new coordinates of a vector $(x_1, x_2)$, then one has

$$\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{bmatrix} \begin{bmatrix} y_1 \\ y_2 \end{bmatrix}.$$

That is,

$$x_1 = y_1 \cos t - y_2 \sin t \qquad \text{and} \qquad x_2 = y_1 \sin t + y_2 \cos t.$$

This may be verified by writing

$$\begin{aligned} x_1 v_1 + x_2 v_2 &= (y_1 \cos t - y_2 \sin t) v_1 + (y_1 \sin t + y_2 \cos t) v_2 \\ &= y_1 (\cos(t) v_1 + \sin(t) v_2) + y_2 (-\sin(t) v_1 + \cos(t) v_2) \\ &= y_1 w_1 + y_2 w_2. \end{aligned}$$
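The rotation example can also be verified numerically; the angle t and the coordinates y below are arbitrary choices, not values from the article.

```python
import numpy as np

t = 0.7                                   # arbitrary rotation angle
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])   # columns: w1, w2 on the old basis

y = np.array([1.5, -2.0])                 # new coordinates (y1, y2)
x = A @ y                                 # old coordinates (x1, x2)

# Check: x1 v1 + x2 v2 must equal y1 w1 + y2 w2 (same vector, two expansions;
# here v1, v2 is the standard basis, so x already is the vector itself)
w1, w2 = A[:, 0], A[:, 1]
print(np.allclose(x, y[0] * w1 + y[1] * w2))  # True
```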

In terms of linear maps


Normally, a matrix represents a linear map, and the product of a matrix and a column vector represents the function application of the corresponding linear map to the vector whose coordinates form the column vector. The change-of-basis formula is a specific case of this general principle, although this is not immediately clear from its definition and proof.

When one says that a matrix represents a linear map, one refers implicitly to bases of the implied vector spaces, and to the fact that the choice of a basis induces an isomorphism between a vector space and $F^n$, where F is the field of scalars. When only one basis is considered for each vector space, it is convenient to leave this isomorphism implicit and to work up to an isomorphism. As several bases of the same vector space are considered here, more accurate wording is required.

Let F be a field. The set $F^n$ of n-tuples is an F-vector space whose addition and scalar multiplication are defined component-wise. Its standard basis is the basis whose ith element is the tuple with all components equal to 0 except the ith, which is 1.

A basis $B = (v_1, \ldots, v_n)$ of an F-vector space V defines a linear isomorphism $\phi \colon F^n \to V$ by

$$\phi(x_1, \ldots, x_n) = \sum_{i=1}^{n} x_i v_i.$$

Conversely, such a linear isomorphism defines a basis, which is the image by $\phi$ of the standard basis of $F^n$.
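A minimal sketch of this isomorphism for V = R^2, with a hypothetical basis stored as the columns of a matrix; the names `phi` and `phi_inv` are illustrative, not standard API.

```python
import numpy as np

# Hypothetical basis B = (v1, v2) of R^2, stored column-wise
B = np.array([[2.0, 0.0],
              [1.0, 3.0]])                # columns v1, v2 (linearly independent)

def phi(coords):
    """The isomorphism F^n -> V: coordinates to vector, sum_i x_i v_i."""
    return B @ coords

def phi_inv(vector):
    """Inverse isomorphism: recover the coordinates of a vector on B."""
    return np.linalg.solve(B, vector)

e1 = np.array([1.0, 0.0])                 # a standard basis vector of F^2
print(phi(e1))                            # image of e1 is v1: [2. 1.]
print(np.allclose(phi_inv(phi(np.array([4.0, -1.0]))), [4.0, -1.0]))  # True
```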

Let $B_{\mathrm{old}} = (v_1, \ldots, v_n)$ be the "old basis" of a change of basis, and $\phi_{\mathrm{old}}$ the associated isomorphism. Given a change-of-basis matrix A, one may consider it the matrix of an endomorphism $\psi_A$ of $F^n$. Finally, define

$$\phi_{\mathrm{new}} = \phi_{\mathrm{old}} \circ \psi_A$$

(where $\circ$ denotes function composition), and

$$B_{\mathrm{new}} = \phi_{\mathrm{new}}(\phi_{\mathrm{old}}^{-1}(B_{\mathrm{old}})).$$

A straightforward verification shows that this definition of $B_{\mathrm{new}}$ is the same as that of the preceding section.

Now, by composing the equation $\phi_{\mathrm{new}} = \phi_{\mathrm{old}} \circ \psi_A$ with $\phi_{\mathrm{old}}^{-1}$ on the left and $\phi_{\mathrm{new}}^{-1}$ on the right, one gets

$$\phi_{\mathrm{old}}^{-1} = \psi_A \circ \phi_{\mathrm{new}}^{-1}.$$

It follows that, for $v \in V$, one has

$$\phi_{\mathrm{old}}^{-1}(v) = \psi_A(\phi_{\mathrm{new}}^{-1}(v)),$$

which is the change-of-basis formula expressed in terms of linear maps instead of coordinates.

Function defined on a vector space


A function that has a vector space as its domain is commonly specified as a multivariate function whose variables are the coordinates on some basis of the vector on which the function is applied.

When the basis is changed, the expression of the function changes. This change can be computed by substituting the "old" coordinates with their expressions in terms of the "new" coordinates. More precisely, if f(x) is the expression of the function in terms of the old coordinates, and if x = Ay is the change-of-basis formula, then f(Ay) is the expression of the same function in terms of the new coordinates.

The fact that the change-of-basis formula expresses the old coordinates in terms of the new ones may seem unnatural, but it is useful, as no matrix inversion is needed here.
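A short illustration of this substitution, with a made-up function f and a made-up change-of-basis matrix A:

```python
import numpy as np

# Hypothetical function on R^2, given by its expression in the old coordinates
def f_old(x):
    return x[0] ** 2 + 3 * x[1]

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])               # made-up (invertible) change-of-basis matrix

# Expression of the same function in the new coordinates: f_new(y) = f_old(A y)
def f_new(y):
    return f_old(A @ y)

y = np.array([1.0, 1.0])                 # new coordinates of some vector
x = A @ y                                # the same vector's old coordinates
print(np.isclose(f_old(x), f_new(y)))    # True: same value, two expressions
```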

As the change-of-basis formula involves only linear functions, many function properties are preserved by a change of basis. This allows defining these properties as properties of functions of a variable vector that are not related to any specific basis. So, a function whose domain is a vector space or a subset of it is

  • a linear function,
  • a polynomial function,
  • a continuous function,
  • a differentiable function,
  • a smooth function,
  • an analytic function,

if the multivariate function that represents it on some basis—and thus on every basis—has the same property.

This is especially useful in the theory of manifolds, as it allows extending the concepts of continuous, differentiable, smooth and analytic functions to functions that are defined on a manifold.

Linear maps


Consider a linear map T: W → V from a vector space W of dimension n to a vector space V of dimension m. It is represented on "old" bases of V and W by an m×n matrix M. A change of bases is defined by an m×m change-of-basis matrix P for V and an n×n change-of-basis matrix Q for W.

On the "new" bases, the matrix representation of T is

$$P^{-1} M Q.$$

This is a straightforward consequence of the change-of-basis formula.
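A numerical sketch of this formula, with random matrices as stand-ins (P and Q are assumed invertible, which holds for this seed):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 2
M = rng.standard_normal((m, n))          # matrix of T on the old bases
P = rng.standard_normal((m, m))          # change-of-basis matrix for V
Q = rng.standard_normal((n, n))          # change-of-basis matrix for W

M_new = np.linalg.inv(P) @ M @ Q         # matrix of T on the new bases

# Consistency check: for new coordinates y of a vector of W, applying T and
# converting coordinates commute: P (M_new y) = M (Q y)
y = rng.standard_normal(n)
print(np.allclose(P @ (M_new @ y), M @ (Q @ y)))  # True
```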

Endomorphisms


Endomorphisms are linear maps from a vector space V to itself. For a change of basis, the formula of the preceding section applies, with the same change-of-basis matrix on both sides of the formula. That is, if M is the square matrix of an endomorphism of V over an "old" basis, and P is a change-of-basis matrix, then the matrix of the endomorphism on the "new" basis is

$$P^{-1} M P.$$

As every invertible matrix can be used as a change-of-basis matrix, this implies that two matrices are similar if and only if they represent the same endomorphism on two different bases.
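A quick check with made-up matrices M and P: the similar matrix $P^{-1}MP$ shares basis-independent invariants with M, such as trace and eigenvalues.

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])               # endomorphism on the old basis
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])               # invertible change-of-basis matrix

M_new = np.linalg.inv(P) @ M @ P         # similar matrix: same map, new basis

# Similar matrices share basis-independent invariants.
print(np.isclose(np.trace(M_new), np.trace(M)))            # True
print(np.allclose(sorted(np.linalg.eigvals(M_new).real),
                  sorted(np.linalg.eigvals(M).real)))      # True
```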

Bilinear forms


A bilinear form on a vector space V over a field F is a function V × V → F which is linear in both arguments. That is, B : V × V → F is bilinear if the maps $v \mapsto B(v, w)$ and $v \mapsto B(w, v)$ are linear for every fixed $w \in V$.

The matrix $\mathbf{B}$ of a bilinear form B on a basis $(v_1, \ldots, v_n)$ (the "old" basis in what follows) is the matrix whose entry in the ith row and jth column is $B(v_i, v_j)$. It follows that if $\mathbf{v}$ and $\mathbf{w}$ are the column vectors of the coordinates of two vectors v and w, one has

$$B(v, w) = \mathbf{v}^{\mathsf{T}} \mathbf{B} \mathbf{w},$$

where $\mathbf{v}^{\mathsf{T}}$ denotes the transpose of $\mathbf{v}$.

If P is a change-of-basis matrix, then a straightforward computation shows that the matrix of the bilinear form on the new basis is

$$P^{\mathsf{T}} \mathbf{B} P.$$

A symmetric bilinear form is a bilinear form B such that $B(v, w) = B(w, v)$ for every v and w in V. It follows that the matrix of B on any basis is symmetric. This implies that the property of being a symmetric matrix must be preserved by the above change-of-basis formula. One can also check this by noting that the transpose of a matrix product is the product of the transposes computed in reverse order. In particular,

$$(P^{\mathsf{T}} \mathbf{B} P)^{\mathsf{T}} = P^{\mathsf{T}} \mathbf{B}^{\mathsf{T}} P,$$

and the two sides of this equation equal $P^{\mathsf{T}} \mathbf{B} P$ if the matrix $\mathbf{B}$ is symmetric.
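A numerical sketch with a random symmetric matrix: the value of the form is basis-independent, and symmetry survives the congruence $P^{\mathsf{T}} \mathbf{B} P$ (P is assumed invertible, which holds for this seed).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
Bmat = rng.standard_normal((n, n))
Bmat = Bmat + Bmat.T                      # symmetric matrix of a bilinear form
P = rng.standard_normal((n, n))           # stand-in change-of-basis matrix

B_new = P.T @ Bmat @ P                    # matrix of the form on the new basis

# B(v, w) is basis-independent: old coordinates are v = P y, w = P z
y, z = rng.standard_normal(n), rng.standard_normal(n)
value_old = (P @ y) @ Bmat @ (P @ z)
value_new = y @ B_new @ z
print(np.allclose(value_old, value_new))  # True
print(np.allclose(B_new, B_new.T))        # True: symmetry is preserved
```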

If the characteristic of the ground field F is not two, then for every symmetric bilinear form there is a basis for which the matrix is diagonal. Moreover, the resulting nonzero entries on the diagonal are defined up to multiplication by a square. So, if the ground field is the field $\mathbb{R}$ of real numbers, these nonzero entries can be chosen to be either 1 or −1. Sylvester's law of inertia asserts that the numbers of 1s and −1s depend only on the bilinear form, and not on the change of basis.

Symmetric bilinear forms over the reals are often encountered in geometry and physics, typically in the study of quadrics and of the inertia of a rigid body. In these cases, orthonormal bases are especially useful; this means that one generally prefers to restrict changes of basis to those with an orthogonal change-of-basis matrix, that is, a matrix such that $P^{\mathsf{T}} = P^{-1}$. Such matrices have the fundamental property that the change-of-basis formula is the same for a symmetric bilinear form and for the endomorphism represented by the same symmetric matrix. The spectral theorem asserts that, given such a symmetric matrix, there is an orthogonal change of basis such that the resulting matrix (of both the bilinear form and the endomorphism) is diagonal, with the eigenvalues of the initial matrix on the diagonal. It follows that, over the reals, if the matrix of an endomorphism is symmetric, then it is diagonalizable.
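This can be illustrated with NumPy's `eigh`, which returns the eigenvalues of a symmetric matrix together with an orthogonal matrix of eigenvectors; the matrix S below is a made-up example.

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])               # symmetric matrix (form and endomorphism)

# eigh returns eigenvalues (ascending) and orthonormal eigenvectors as columns
eigvals, P = np.linalg.eigh(S)

print(np.allclose(P.T, np.linalg.inv(P)))          # True: P is orthogonal
# For orthogonal P the two change-of-basis formulas coincide: P^T S P = P^{-1} S P
D = P.T @ S @ P
print(np.allclose(D, np.diag(eigvals)))            # True: S is diagonalized
```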

See also

  • Active and passive transformation
  • Covariance and contravariance of vectors
  • Integral transform, the continuous analogue of change of basis.
  • Chirgwin–Coulson weights — application in computational chemistry

Notes

  1. ^ Although a basis is generally defined as a set of vectors (for example, as a spanning set that is linearly independent), the tuple notation is convenient here, since the indexing by the first positive integers makes the basis an ordered basis.

References

  1. ^ Anton (1987, pp. 221–237)
  2. ^ Beauregard & Fraleigh (1973, pp. 240–243)
  3. ^ Nering (1970, pp. 50–52)

Bibliography

  • Anton, Howard (1987), Elementary Linear Algebra (5th ed.), New York: Wiley, ISBN 0-471-84819-0
  • Beauregard, Raymond A.; Fraleigh, John B. (1973), A First Course In Linear Algebra: with Optional Introduction to Groups, Rings, and Fields, Boston: Houghton Mifflin Company, ISBN 0-395-14017-X
  • Nering, Evar D. (1970), Linear Algebra and Matrix Theory (2nd ed.), New York: Wiley, LCCN 76091646

External links

  • MIT Linear Algebra Lecture on Change of Basis, from MIT OpenCourseWare
  • Khan Academy Lecture on Change of Basis, from Khan Academy
url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url url