Vector projection
From Wikipedia, the free encyclopedia
Concept in linear algebra
For more general concepts, see Projection (linear algebra) and Projection (mathematics).

The vector projection (also known as the vector component or vector resolution) of a vector a on (or onto) a nonzero vector b is the orthogonal projection of a onto a straight line parallel to b. The projection of a onto b is often written as proj_b a or a∥b.

The vector component or vector resolute of a perpendicular to b, sometimes also called the vector rejection of a from b (denoted oproj_b a or a⊥b),[1] is the orthogonal projection of a onto the plane (or, in general, hyperplane) that is orthogonal to b. Since both proj_b a and oproj_b a are vectors, and their sum is equal to a, the rejection of a from b is given by:

oproj_b a = a − proj_b a.

Figure: Projection of a on b (a1), and rejection of a from b (a2). When 90° < θ ≤ 180°, a1 has a direction opposite to b.

To simplify notation, this article defines a1 := proj_b a and a2 := oproj_b a. Thus, the vector a1 is parallel to b, the vector a2 is orthogonal to b, and a = a1 + a2.

The projection of a onto b can be decomposed into a direction and a scalar magnitude by writing it as a1 = a1b̂, where a1 is a scalar, called the scalar projection of a onto b, and b̂ is the unit vector in the direction of b. The scalar projection is defined as[2]

a1 = ‖a‖ cos θ = a ⋅ b̂

where the operator ⋅ denotes a dot product, ‖a‖ is the length of a, and θ is the angle between a and b. The scalar projection is equal in absolute value to the length of the vector projection, with a minus sign if the direction of the projection is opposite to the direction of b, that is, if the angle between the vectors is more than 90°.

The vector projection can be calculated using the dot product of a and b as:

proj_b a = (a ⋅ b̂) b̂ = ((a ⋅ b) / ‖b‖) (b / ‖b‖) = ((a ⋅ b) / ‖b‖²) b = ((a ⋅ b) / (b ⋅ b)) b.
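The closed form ((a ⋅ b) / (b ⋅ b)) b avoids normalizing b explicitly; a minimal NumPy sketch (the helper names `project` and `reject` are illustrative, not a standard API):

```python
import numpy as np

def project(a, b):
    """Vector projection of a onto a nonzero b: ((a . b) / (b . b)) b."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return (np.dot(a, b) / np.dot(b, b)) * b

def reject(a, b):
    """Vector rejection of a from b: a - proj_b(a)."""
    a = np.asarray(a, dtype=float)
    return a - project(a, b)

a, b = [3.0, 4.0], [1.0, 0.0]
print(project(a, b))  # [3. 0.]
print(reject(a, b))   # [0. 4.]
```

By construction, `project(a, b) + reject(a, b)` recovers `a` for any nonzero `b`.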

Notation


This article uses the convention that vectors are denoted in a bold font (e.g. a1), and scalars are written in normal font (e.g. a1).

The dot product of vectors a and b is written as a ⋅ b {\displaystyle \mathbf {a} \cdot \mathbf {b} } {\displaystyle \mathbf {a} \cdot \mathbf {b} }, the norm of a is written ‖a‖, and the angle between a and b is denoted by θ.

Definitions based on angle θ


Scalar projection

Main article: Scalar projection

The scalar projection of a on b is a scalar equal to

a1 = ‖a‖ cos θ,

where θ is the angle between a and b.

A scalar projection can be used as a scale factor to compute the corresponding vector projection.

Vector projection


The vector projection of a on b is a vector whose magnitude is the scalar projection of a on b and whose direction is along b. Namely, it is defined as

a1 = a1b̂ = (‖a‖ cos θ) b̂

where a1 is the corresponding scalar projection, as defined above, and b̂ is the unit vector with the same direction as b:

b̂ = b / ‖b‖

Vector rejection


By definition, the vector rejection of a on b is:

a2 = a − a1

Hence,

a2 = a − (‖a‖ cos θ) b̂

Definitions in terms of a and b


When θ is not known, the cosine of θ can be computed in terms of a and b, by the following property of the dot product:

a ⋅ b = ‖a‖ ‖b‖ cos θ

Scalar projection


By the above-mentioned property of the dot product, the definition of the scalar projection becomes:[2]

a1 = ‖a‖ cos θ = (a ⋅ b) / ‖b‖.

In two dimensions, this becomes

a1 = (a_x b_x + a_y b_y) / ‖b‖.
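The two-dimensional formula translates directly into code; a small standard-library sketch (the function name is an illustrative choice):

```python
import math

def scalar_projection(ax, ay, bx, by):
    """Scalar projection of a = (ax, ay) onto b = (bx, by): (a . b) / ||b||."""
    return (ax * bx + ay * by) / math.hypot(bx, by)

print(scalar_projection(3.0, 4.0, 1.0, 0.0))  # 3.0
```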

Vector projection


Similarly, the definition of the vector projection of a onto b becomes:[2]

a1 = a1b̂ = ((a ⋅ b) / ‖b‖) (b / ‖b‖),

which is equivalent to either

a1 = (a ⋅ b̂) b̂,

or[3]

a1 = ((a ⋅ b) / ‖b‖²) b = ((a ⋅ b) / (b ⋅ b)) b.

Scalar rejection


In two dimensions, the scalar rejection is equivalent to the projection of a onto b⊥ = (−b_y, b_x), which is b = (b_x, b_y) rotated 90° to the left. Hence,

a2 = ‖a‖ sin θ = (a ⋅ b⊥) / ‖b‖ = (a_y b_x − a_x b_y) / ‖b‖.

Such a dot product is called the "perp dot product."
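A sketch of the perp-dot-product formula above (illustrative helper name, standard library only):

```python
import math

def scalar_rejection(ax, ay, bx, by):
    """Scalar rejection of a from b in 2D via the perp dot product:
    (a . b_perp) / ||b||, where b_perp = (-by, bx)."""
    return (ay * bx - ax * by) / math.hypot(bx, by)

print(scalar_rejection(3.0, 4.0, 1.0, 0.0))  # 4.0
```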

Vector rejection


By definition,

a2 = a − a1

Hence,

a2 = a − ((a ⋅ b) / (b ⋅ b)) b.

Using the scalar rejection with the perp dot product, this gives

a2 = ((a ⋅ b⊥) / (b ⋅ b)) b⊥
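The 2D rejection via b⊥ can be sketched as follows (hypothetical helper name; note ‖b⊥‖ = ‖b‖, so b ⋅ b serves as the normalizer):

```python
def vector_rejection_2d(ax, ay, bx, by):
    """2D vector rejection of a from b using b_perp = (-by, bx):
    a2 = ((a . b_perp) / (b . b)) b_perp."""
    px, py = -by, bx                               # b rotated 90 degrees left
    k = (ax * px + ay * py) / (bx * bx + by * by)  # (a . b_perp) / (b . b)
    return (k * px, k * py)

print(vector_rejection_2d(3.0, 4.0, 2.0, 1.0))  # (-1.0, 2.0)
```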

Properties

Figure: If 0° ≤ θ ≤ 90°, as in this figure, the scalar projection of a on b coincides with the length of the vector projection.

Scalar projection

Main article: Scalar projection

The scalar projection of a on b is a scalar which has a negative sign if 90° < θ ≤ 180°. It coincides with the length ‖a1‖ of the vector projection if the angle is smaller than 90°. More exactly:

  • a1 = ‖a1‖ if 0° ≤ θ ≤ 90°,
  • a1 = −‖a1‖ if 90° < θ ≤ 180°.

Vector projection


The vector projection of a on b is a vector a1 which is either null or parallel to b. More exactly:

  • a1 = 0 if θ = 90°,
  • a1 and b have the same direction if 0° ≤ θ < 90°,
  • a1 and b have opposite directions if 90° < θ ≤ 180°.

Vector rejection


The vector rejection of a on b is a vector a2 which is either null or orthogonal to b. More exactly:

  • a2 = 0 if θ = 0° or θ = 180°,
  • a2 is orthogonal to b if 0° < θ < 180°.

Matrix representation


The orthogonal projection can be represented by a projection matrix. To project a vector onto the unit vector a = (a_x, a_y, a_z), it would need to be multiplied with the projection matrix

    P = a aᵀ = [ a_x²      a_x a_y   a_x a_z
                 a_x a_y   a_y²      a_y a_z
                 a_x a_z   a_y a_z   a_z²    ]
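Assuming the standard outer-product form P = a aᵀ for the projection matrix onto a unit vector a, a NumPy sketch:

```python
import numpy as np

# Projection matrix onto a unit vector a: P = a a^T (outer product).
a = np.array([1.0, 2.0, 2.0])
a /= np.linalg.norm(a)        # P is a true projection only if a is a unit vector
P = np.outer(a, a)

v = np.array([3.0, 0.0, 0.0])
print(P @ v)                  # equals (v . a) a
print(np.allclose(P @ P, P))  # True: projection matrices are idempotent
```

Idempotence (P² = P) and symmetry (Pᵀ = P) characterize orthogonal projection matrices.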

Uses


The vector projection is an important operation in the Gram–Schmidt orthonormalization of vector space bases. It is also used in the separating axis theorem to detect whether two convex shapes intersect.
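As an illustration of the Gram–Schmidt use, a minimal classical Gram–Schmidt sketch built from vector rejection (not an optimized or numerically robust implementation):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize by subtracting, from each vector,
    its projection onto every basis vector found so far (i.e. keep the rejection)."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float)
        for u in basis:
            w = w - np.dot(w, u) * u   # vector rejection of w from u
        basis.append(w / np.linalg.norm(w))
    return basis

e1, e2 = gram_schmidt([[3.0, 1.0], [2.0, 2.0]])
print(abs(np.dot(e1, e2)) < 1e-10)  # True: the result is orthonormal
```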

Generalizations


Since the notions of vector length and angle between vectors can be generalized to any n-dimensional inner product space, this is also true for the notions of orthogonal projection of a vector, projection of a vector onto another, and rejection of a vector from another.

Vector projection on a plane


In some cases, the inner product coincides with the dot product; whenever they don't coincide, the inner product is used instead of the dot product in the formal definitions of projection and rejection. For a three-dimensional inner product space, the notions of projection of a vector onto another and rejection of a vector from another can be generalized to the notions of projection of a vector onto a plane, and rejection of a vector from a plane.[4] The projection of a vector on a plane is its orthogonal projection on that plane. The rejection of a vector from a plane is its orthogonal projection on a straight line which is orthogonal to that plane. Both are vectors. The first is parallel to the plane, the second is orthogonal to it.

For a given vector and plane, the sum of projection and rejection is equal to the original vector. Similarly, for inner product spaces with more than three dimensions, the notions of projection onto a vector and rejection from a vector can be generalized to the notions of projection onto a hyperplane, and rejection from a hyperplane. In geometric algebra, they can be further generalized to the notions of projection and rejection of a general multivector onto/from any invertible k-blade.

See also

  • Scalar projection
  • Vector notation

References

  1. Perwass, G. (2009). Geometric Algebra With Applications in Engineering. Springer. p. 83. ISBN 9783540890676.
  2. "Scalar and Vector Projections". www.ck12.org. Retrieved 2020-09-07.
  3. "Dot Products and Projections". Archived from the original on 2016-05-31. Retrieved 2010-09-05.
  4. Baker, M.J. (2012). "Projection of a vector onto a plane". www.euclideanspace.com.

External links

  • Projection of a vector onto a plane