Row and column vectors - Wikipedia
From Wikipedia, the free encyclopedia
Matrix consisting of a single row or column

In linear algebra, a column vector with $m$ elements is an $m \times 1$ matrix[1] consisting of a single column of $m$ entries. Similarly, a row vector is a $1 \times n$ matrix, consisting of a single row of $n$ entries. For example, $\boldsymbol{x}$ is a column vector and $\boldsymbol{a}$ is a row vector:

$$\boldsymbol{x}=\begin{bmatrix}x_{1}\\x_{2}\\\vdots \\x_{m}\end{bmatrix},\quad \boldsymbol{a}=\begin{bmatrix}a_{1}&a_{2}&\dots &a_{n}\end{bmatrix}.$$

(Throughout this article, boldface is used for both row and column vectors.)

The transpose (indicated by T) of any row vector is a column vector, and the transpose of any column vector is a row vector:

$$\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}^{\rm T}=\begin{bmatrix}x_{1}\\x_{2}\\\vdots \\x_{m}\end{bmatrix},\quad \begin{bmatrix}x_{1}\\x_{2}\\\vdots \\x_{m}\end{bmatrix}^{\rm T}=\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}.$$

Taking the transpose twice returns the original (row or column) vector: $\bigl(\boldsymbol{x}^{\rm T}\bigr)^{\rm T}=\boldsymbol{x}$.

The set of all row vectors with n entries in a given field (such as the real numbers) forms an n-dimensional vector space; similarly, the set of all column vectors with m entries forms an m-dimensional vector space.

The space of row vectors with n entries can be regarded as the dual space of the space of column vectors with n entries, since any linear functional on the space of column vectors can be represented as the left-multiplication of a unique row vector.
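The shape distinction between the two kinds of vector can be made concrete in NumPy, where a column vector is an $m \times 1$ array and a row vector a $1 \times n$ array. A minimal illustrative sketch (the numeric values are arbitrary, not from the article):

```python
import numpy as np

# A column vector with m = 3 entries is a 3x1 matrix.
x = np.array([[1],
              [2],
              [3]])

# A row vector with n = 3 entries is a 1x3 matrix.
a = np.array([[4, 5, 6]])

print(x.shape)  # (3, 1)
print(a.shape)  # (1, 3)
```

Keeping both as genuinely two-dimensional arrays (rather than 1-D arrays) preserves the row/column distinction under transposition and matrix products.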

Notation


To simplify writing column vectors in line with other text, they are sometimes written as row vectors with the transpose operation applied:

$$\boldsymbol{x}=\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}^{\rm T}$$

or

$$\boldsymbol{x}=\begin{bmatrix}x_{1},x_{2},\dots ,x_{m}\end{bmatrix}^{\rm T}$$

Some authors also use the convention of writing both column vectors and row vectors as rows, but separating row vector elements with commas and column vector elements with semicolons (see alternative notation 2 in the table below).

Standard matrix notation (array spaces, no commas, transpose signs):
  Row vector: $\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}$
  Column vector: $\begin{bmatrix}x_{1}\\x_{2}\\\vdots \\x_{m}\end{bmatrix}$ or $\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}^{\rm T}$

Alternative notation 1 (commas, transpose signs):
  Row vector: $\begin{bmatrix}x_{1},x_{2},\dots ,x_{m}\end{bmatrix}$
  Column vector: $\begin{bmatrix}x_{1},x_{2},\dots ,x_{m}\end{bmatrix}^{\rm T}$

Alternative notation 2 (commas and semicolons, no transpose signs):
  Row vector: $\begin{bmatrix}x_{1},x_{2},\dots ,x_{m}\end{bmatrix}$
  Column vector: $\begin{bmatrix}x_{1};x_{2};\dots ;x_{m}\end{bmatrix}$
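The in-line transpose convention carries over directly to code: a column vector can be written as a transposed row vector. A small NumPy sketch (values are arbitrary):

```python
import numpy as np

# A column vector written in-line as a transposed row vector: x = [1 2 3]^T
x = np.array([[1, 2, 3]]).T

print(x.shape)  # (3, 1)

# Transposing twice returns the original vector.
assert (x.T.T == x).all()
```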

Operations


Matrix multiplication consists of multiplying each row vector of one matrix by each column vector of the other matrix.

The dot product of two column vectors a, b, considered as elements of a coordinate space, is equal to the matrix product of the transpose of a with b,

$$\mathbf{a}\cdot \mathbf{b}=\mathbf{a}^{\rm T}\mathbf{b}=\begin{bmatrix}a_{1}&\cdots &a_{n}\end{bmatrix}\begin{bmatrix}b_{1}\\\vdots \\b_{n}\end{bmatrix}=a_{1}b_{1}+\cdots +a_{n}b_{n}\,.$$

By the symmetry of the dot product, the dot product of two column vectors a, b is also equal to the matrix product of the transpose of b with a,

$$\mathbf{b}\cdot \mathbf{a}=\mathbf{b}^{\rm T}\mathbf{a}=\begin{bmatrix}b_{1}&\cdots &b_{n}\end{bmatrix}\begin{bmatrix}a_{1}\\\vdots \\a_{n}\end{bmatrix}=a_{1}b_{1}+\cdots +a_{n}b_{n}\,.$$
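The identity $\mathbf{a}\cdot\mathbf{b}=\mathbf{a}^{\rm T}\mathbf{b}=\mathbf{b}^{\rm T}\mathbf{a}$ can be checked numerically; a sketch in NumPy (example values are arbitrary):

```python
import numpy as np

a = np.array([[1], [2], [3]])  # column vector a (3x1)
b = np.array([[4], [5], [6]])  # column vector b (3x1)

# a^T b is a 1x1 matrix; its single entry is the dot product.
dot_ab = (a.T @ b).item()
print(dot_ab)  # 1*4 + 2*5 + 3*6 = 32

# Symmetry of the dot product: b^T a yields the same scalar.
assert dot_ab == (b.T @ a).item()
```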

The matrix product of a column and a row vector gives the outer product of two vectors a, b, an example of the more general tensor product. The matrix product of the column vector representation of a and the row vector representation of b gives the components of their dyadic product,

$$\mathbf{a}\otimes \mathbf{b}=\mathbf{a}\mathbf{b}^{\rm T}=\begin{bmatrix}a_{1}\\a_{2}\\a_{3}\end{bmatrix}\begin{bmatrix}b_{1}&b_{2}&b_{3}\end{bmatrix}=\begin{bmatrix}a_{1}b_{1}&a_{1}b_{2}&a_{1}b_{3}\\a_{2}b_{1}&a_{2}b_{2}&a_{2}b_{3}\\a_{3}b_{1}&a_{3}b_{2}&a_{3}b_{3}\end{bmatrix}\,,$$

which is the transpose of the matrix product of the column vector representation of b and the row vector representation of a,

$$\mathbf{b}\otimes \mathbf{a}=\mathbf{b}\mathbf{a}^{\rm T}=\begin{bmatrix}b_{1}\\b_{2}\\b_{3}\end{bmatrix}\begin{bmatrix}a_{1}&a_{2}&a_{3}\end{bmatrix}=\begin{bmatrix}b_{1}a_{1}&b_{1}a_{2}&b_{1}a_{3}\\b_{2}a_{1}&b_{2}a_{2}&b_{2}a_{3}\\b_{3}a_{1}&b_{3}a_{2}&b_{3}a_{3}\end{bmatrix}\,.$$
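The outer product and its transpose relation can likewise be sketched in NumPy (example values are arbitrary):

```python
import numpy as np

a = np.array([[1], [2], [3]])  # 3x1 column vector
b = np.array([[4], [5], [6]])  # 3x1 column vector

# Outer product a (x) b = a b^T: a 3x3 matrix with entries a_i * b_j.
outer_ab = a @ b.T
outer_ba = b @ a.T

print(outer_ab[0])  # first row is a_1 * [b_1 b_2 b_3] = [4 5 6]

# b (x) a is the transpose of a (x) b.
assert (outer_ba == outer_ab.T).all()
```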

Matrix transformations

Main article: Transformation matrix

An n × n matrix M can represent a linear map and act on row and column vectors as the linear map's transformation matrix. For a row vector v, the product vM is another row vector p:

$$\mathbf{v}M=\mathbf{p}\,.$$

Another n × n matrix Q can act on p,

$$\mathbf{p}Q=\mathbf{t}\,.$$

Then one can write t = pQ = vMQ, so the matrix product transformation MQ maps v directly to t. Continuing with row vectors, matrix transformations further reconfiguring n-space can be applied to the right of previous outputs.

When a column vector is transformed to another column vector under an n × n matrix action, the operation occurs to the left,

$$\mathbf{p}^{\rm T}=M\mathbf{v}^{\rm T}\,,\quad \mathbf{t}^{\rm T}=Q\mathbf{p}^{\rm T}\,,$$

leading to the algebraic expression $QM\mathbf{v}^{\rm T}$ for the composed output from the input $\mathbf{v}^{\rm T}$. In this use of a column vector for input, the matrix transformations accumulate on the left.
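Both conventions describe the same composite transformation: transposing $\mathbf{t}=\mathbf{v}MQ$ gives $\mathbf{t}^{\rm T}=Q^{\rm T}M^{\rm T}\mathbf{v}^{\rm T}$, with the transposed factors acting on the left. A NumPy sketch (M, Q, and v are arbitrary example values, not from the article):

```python
import numpy as np

v = np.array([[1.0, 2.0]])            # row vector (1x2)
M = np.array([[0.0, 1.0],
              [1.0, 0.0]])
Q = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# Row-vector convention: transformations apply on the right, t = vMQ.
t_row = v @ M @ Q

# Column-vector convention: the same composite map acts on v^T from the
# left, with each factor transposed: t^T = (vMQ)^T = Q^T M^T v^T.
t_col = Q.T @ M.T @ v.T

assert np.allclose(t_col, t_row.T)
```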

See also

  • Covariance and contravariance of vectors
  • Index notation
  • Vector of ones
  • Single-entry vector
  • Standard unit vector
  • Unit vector

Notes

  1. Artin, Michael (1991). Algebra. Englewood Cliffs, NJ: Prentice-Hall. p. 2. ISBN 0-13-004763-5.
