Sigmoid function
From Wikipedia, the free encyclopedia
Mathematical function having a characteristic S-shaped curve or sigmoid curve

[Figure: The logistic curve]
[Figure: Plot of the error function]

A sigmoid function is any mathematical function whose graph has a characteristic S-shaped or sigmoid curve.

A common example of a sigmoid function is the logistic function, which is defined by the formula[1]

\sigma(x) = \frac{1}{1+e^{-x}} = \frac{e^{x}}{1+e^{x}} = 1 - \sigma(-x).
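As an illustration, the logistic formula can be implemented directly. The piecewise form below is a common numerical-stability trick (using the algebraically equal e^x / (1 + e^x) form for negative inputs), not part of the definition itself:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: sigma(x) = 1 / (1 + e^-x)."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    # For x < 0, use the equivalent form e^x / (1 + e^x),
    # which avoids overflow in exp(-x) for large negative x.
    z = math.exp(x)
    return z / (1.0 + z)
```

The identity σ(x) = 1 − σ(−x) gives a quick sanity check: σ(2) + σ(−2) should equal 1.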

Other sigmoid functions are given in the Examples section. In some fields, most notably in the context of artificial neural networks, the term "sigmoid function" is used as a synonym for "logistic function".

Special cases of the sigmoid function include the Gompertz curve (used in modeling systems that saturate at large values of x) and the ogee curve (used in the spillway of some dams). Sigmoid functions are defined for all real numbers and are typically monotonically increasing, although they may instead be decreasing. The return value (y-axis) most often lies in the range 0 to 1; another commonly used range is −1 to 1.

There is also the Heaviside step function, which instantaneously transitions between 0 and 1.

A wide variety of sigmoid functions including the logistic and hyperbolic tangent functions have been used as the activation function of artificial neurons. Sigmoid curves are also common in statistics as cumulative distribution functions (which go from 0 to 1), such as the integrals of the logistic density, the normal density, and Student's t probability density functions. The logistic sigmoid function is invertible, and its inverse is the logit function.
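The last point (the logistic sigmoid is invertible, with the logit as its inverse) is easy to verify numerically; a minimal standard-library sketch:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

def logit(p: float) -> float:
    """Inverse of the logistic sigmoid: logit(p) = log(p / (1 - p)), for p in (0, 1)."""
    return math.log(p / (1.0 - p))
```

Composing the two in either order returns the input (up to floating-point error), confirming the inverse relationship.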

Definition


A sigmoid function is a bounded, differentiable, real function that is defined for all real input values and has a positive derivative at each point.[1][2]

Properties


In general, a sigmoid function is monotonic, and has a first derivative which is bell shaped. Conversely, the integral of any continuous, non-negative, bell-shaped function (with one local maximum and no local minimum, unless degenerate) will be sigmoidal. Thus the cumulative distribution functions for many common probability distributions are sigmoidal. One such example is the error function, which is related to the cumulative distribution function of a normal distribution; another is the arctan function, which is related to the cumulative distribution function of a Cauchy distribution.
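The two cumulative distribution functions mentioned can be written directly from the named special functions; a small sketch using only the standard library:

```python
import math

def normal_cdf(x: float) -> float:
    # Standard normal CDF, expressed through the error function:
    # Phi(x) = (1 + erf(x / sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cauchy_cdf(x: float) -> float:
    # Standard Cauchy CDF, expressed through arctan:
    # F(x) = 1/2 + arctan(x) / pi
    return 0.5 + math.atan(x) / math.pi
```

Both are sigmoidal: monotone, bounded between 0 and 1, with a bell-shaped derivative (the corresponding density).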

A sigmoid function is constrained by a pair of horizontal asymptotes as x → ±∞.

A sigmoid function is convex for values less than a particular point, and it is concave for values greater than that point: in many of the examples here, that point is 0.

Examples

[Figure: Some sigmoid functions compared. All functions are normalized so that their slope at the origin is 1.]
  • Logistic function: f(x) = \frac{1}{1+e^{-x}}
  • Hyperbolic tangent (a shifted and scaled version of the logistic function above): f(x) = \tanh x = \frac{e^{x}-e^{-x}}{e^{x}+e^{-x}}
  • Arctangent function: f(x) = \arctan x
  • Gudermannian function: f(x) = \operatorname{gd}(x) = \int_{0}^{x} \frac{dt}{\cosh t} = 2\arctan\left(\tanh\left(\frac{x}{2}\right)\right)
  • Error function: f(x) = \operatorname{erf}(x) = \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{-t^{2}}\,dt
  • Generalised logistic function: f(x) = \left(1+e^{-x}\right)^{-\alpha}, \quad \alpha > 0
  • Smoothstep function: f(x) = \begin{cases} \left(\int_{0}^{1}(1-u^{2})^{N}\,du\right)^{-1} \int_{0}^{x}(1-u^{2})^{N}\,du, & |x| \le 1 \\ \operatorname{sgn}(x), & |x| \ge 1 \end{cases} \quad N \in \mathbb{Z}_{\ge 1}
  • Some algebraic functions, for example f(x) = \frac{x}{\sqrt{1+x^{2}}}, and in a more general form[3] f(x) = \frac{x}{\left(1+|x|^{k}\right)^{1/k}}
  • Up to shifts and scaling, many sigmoids are special cases of f(x) = \varphi(\varphi(x,\beta),\alpha), where \varphi(x,\lambda) = \begin{cases} (1-\lambda x)^{1/\lambda}, & \lambda \ne 0 \\ e^{-x}, & \lambda = 0 \end{cases} is the inverse of the negative Box–Cox transformation, and \alpha < 1 and \beta < 1 are shape parameters.[4]
  • Smooth transition function[5] normalized to (−1, 1):

f(x) = \begin{cases} \frac{2}{1+e^{-2mx/(1-x^{2})}} - 1, & |x| < 1 \\ \operatorname{sgn}(x), & |x| \ge 1 \end{cases} = \begin{cases} \tanh\left(\frac{mx}{1-x^{2}}\right), & |x| < 1 \\ \operatorname{sgn}(x), & |x| \ge 1 \end{cases}

using the hyperbolic tangent mentioned above. Here, m is a free parameter encoding the slope at x = 0, which must be at least \sqrt{3} because any smaller value yields a function with multiple inflection points, which is therefore not a true sigmoid. This function is unusual because it actually attains the limiting values −1 and 1 within a finite range: its value is constant at −1 for all x ≤ −1 and at 1 for all x ≥ 1. Nonetheless, it is smooth (infinitely differentiable, C^{\infty}) everywhere, including at x = ±1.
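Several entries from the list above can be compared numerically. The figure caption notes that the curves are normalized to slope 1 at the origin; the sketch below (an illustration — the 2/π rescalings are chosen here just to map every range onto (−1, 1)) estimates each slope with a central difference:

```python
import math

# A few sigmoids from the list, rescaled (where needed) to the range (-1, 1)
curves = {
    "tanh":         math.tanh,
    "arctan":       lambda x: (2.0 / math.pi) * math.atan(x),
    "erf":          math.erf,
    "algebraic":    lambda x: x / math.sqrt(1.0 + x * x),
    # Gudermannian gd(x) = 2*arctan(tanh(x/2)), rescaled from (-pi/2, pi/2)
    "gudermannian": lambda x: (2.0 / math.pi) * 2.0 * math.atan(math.tanh(x / 2.0)),
}

def slope_at_origin(f, h: float = 1e-6) -> float:
    """Central-difference estimate of f'(0)."""
    return (f(h) - f(-h)) / (2.0 * h)
```

Dividing each function's argument by its slope at 0 would normalize all slopes to 1, which is how the plotted comparison is typically produced.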

Applications

[Figure: Inverted logistic S-curve modeling the relation between wheat yield and soil salinity]

Many natural processes, such as those of complex system learning curves, exhibit a progression from small beginnings that accelerates and approaches a climax over time.[6] When a specific mathematical model is lacking, a sigmoid function is often used.[7]

The van Genuchten–Gupta model is based on an inverted S-curve and applied to the response of crop yield to soil salinity.

Examples of the application of the logistic S-curve to the response of crop yield (wheat) to both the soil salinity and depth to water table in the soil are shown in modeling crop response in agriculture.

In artificial neural networks, sometimes non-smooth functions are used instead for efficiency; these are known as hard sigmoids.
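A "hard sigmoid" replaces the smooth curve with a cheap piecewise-linear approximation. Exact definitions vary between libraries; the variant below, clip(x/6 + 1/2, 0, 1), is one common choice and is shown only as an illustration:

```python
def hard_sigmoid(x: float) -> float:
    """Piecewise-linear approximation of the logistic sigmoid
    (one common variant: clamp x/6 + 1/2 to [0, 1])."""
    return min(1.0, max(0.0, x / 6.0 + 0.5))
```

It matches the logistic sigmoid at 0 and saturates exactly at ±3, trading smoothness for speed.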

In audio signal processing, sigmoid functions are used as waveshaper transfer functions to emulate the sound of analog circuitry clipping.[8]
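A sketch of such a waveshaper using tanh as the transfer function (the drive parameter and the normalization are illustrative choices, not taken from the source):

```python
import math

def waveshape(samples, drive: float = 4.0):
    """Soft-clip a signal through a tanh transfer curve.

    Higher 'drive' pushes more of the signal into the saturated
    region, emulating harder analog-style clipping. Output is
    normalized by tanh(drive) so a full-scale input (+/-1) maps
    to a full-scale output.
    """
    g = math.tanh(drive)
    return [math.tanh(drive * s) / g for s in samples]
```

Because tanh is odd and monotone, the shaper preserves zero crossings and never exceeds the full-scale range.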

In digital signal processing generally, sigmoid functions, thanks to their higher order of continuity, roll off much faster asymptotically in the frequency domain than a Heaviside step function, so they are useful for smoothing discontinuities before sampling to reduce aliasing. This is used, for example, to generate square waves in many kinds of digital synthesizers.
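One way to realize this (an illustrative sketch, not any particular synthesizer's algorithm) is to pass a sine wave through a tanh sigmoid: the result approaches a square wave as the hardness grows, but its edges stay smooth, which tames the spectrum before sampling:

```python
import math

def soft_square(phase: float, hardness: float = 10.0) -> float:
    """Band-limited-ish square wave: a sine pushed through a tanh sigmoid.

    'phase' is in cycles (0.0 to 1.0 is one period). Larger 'hardness'
    gives steeper, more square-like edges; small values stay nearly sinusoidal.
    """
    return math.tanh(hardness * math.sin(2.0 * math.pi * phase))
```

With hardness around 10 the waveform sits near ±1 over most of each half-cycle, while the transitions through zero remain continuous and differentiable.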

In biochemistry and pharmacology, the Hill and Hill–Langmuir equations are sigmoid functions.

In computer graphics and real-time rendering, some of the sigmoid functions are used to blend colors or geometry between two values, smoothly and without visible seams or discontinuities.
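The smoothstep function listed earlier is the usual tool for this in graphics; a minimal sketch of a seam-free blend between two values, using the classic 3t² − 2t³ variant:

```python
def smoothstep(edge0: float, edge1: float, x: float) -> float:
    """0 below edge0, 1 above edge1, and a smooth S-curve in between."""
    t = min(1.0, max(0.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def blend(a: float, b: float, edge0: float, edge1: float, x: float) -> float:
    """Interpolate from a to b as x sweeps from edge0 to edge1."""
    t = smoothstep(edge0, edge1, x)
    return a + (b - a) * t
```

The first derivative of smoothstep vanishes at both edges, which is what prevents visible seams where the blended region meets the constant regions.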

Titration curves between strong acids and strong bases have a sigmoid shape due to the logarithmic nature of the pH scale.

The logistic function can be calculated efficiently by utilizing type III Unums.[9]

A hierarchy of sigmoid growth models of increasing complexity (number of parameters) was built[10] with the primary goal of re-analyzing kinetic data, the so-called N-t curves, from heterogeneous nucleation experiments in electrochemistry.[11] The hierarchy currently includes three models, with 1, 2, and 3 parameters respectively (not counting the maximal number of nuclei Nmax): a tanh²-based model called α21,[12] originally devised to describe diffusion-limited crystal growth (not aggregation) in 2D; the Johnson–Mehl–Avrami–Kolmogorov (JMAK) model;[13] and the Richards model.[14] Even the simplest model was shown to work for this purpose, implying that the revisited experiments are an example of two-step nucleation, with the first step being the growth of the metastable phase in which the nuclei of the stable phase form.[10]

See also

  • Step function – Linear combination of indicator functions of real intervals
  • Sign function – Function returning minus 1, zero or plus 1
  • Heaviside step function – Indicator function of positive numbers
  • Logistic regression – Statistical model for a binary dependent variable
  • Logit – Function in statistics
  • Softplus function – Smoothed ramp function
  • Soboleva modified hyperbolic tangent
  • Softmax function – Smooth approximation of one-hot arg max
  • Swish function – Mathematical activation function in data analysis
  • Weibull distribution – Continuous probability distribution
  • Fermi–Dirac statistics – Statistical description for the behavior of fermions

References

  1. ^ a b Han, Jun; Moraga, Claudio (1995). "The influence of the sigmoid function parameters on the speed of backpropagation learning". In Mira, José; Sandoval, Francisco (eds.). From Natural to Artificial Neural Computation. Lecture Notes in Computer Science. Vol. 930. pp. 195–201. doi:10.1007/3-540-59497-3_175. ISBN 978-3-540-59497-0.
  2. ^ Ling, Yibei; He, Bin (December 1993). "Entropic analysis of biological growth models". IEEE Transactions on Biomedical Engineering. 40 (12): 1193–1200. doi:10.1109/10.250574. PMID 8125495.
  3. ^ Dunning, Andrew J.; Kensler, Jennifer; Coudeville, Laurent; Bailleux, Fabrice (2015-12-28). "Some extensions in continuous methods for immunological correlates of protection". BMC Medical Research Methodology. 15 (107): 107. doi:10.1186/s12874-015-0096-9. PMC 4692073. PMID 26707389.
  4. ^ "grex --- Growth-curve Explorer". GitHub. 2022-07-09. Archived from the original on 2022-08-25. Retrieved 2022-08-25.
  5. ^ EpsilonDelta (2022-08-16). "Smooth Transition Function in One Dimension | Smooth Transition Function Series Part 1" (video). YouTube.
  6. ^ Laurens Speelman, Yuki Numata (2022). "Harnessing the Power of S-Curves". RMI. Rocky Mountain Institute.
  7. ^ Gibbs, Mark N.; Mackay, D. (November 2000). "Variational Gaussian process classifiers". IEEE Transactions on Neural Networks. 11 (6): 1458–1464. doi:10.1109/72.883477. PMID 18249869. S2CID 14456885.
  8. ^ Smith, Julius O. (2010). Physical Audio Signal Processing (2010 ed.). W3K Publishing. ISBN 978-0-9745607-2-4. Archived from the original on 2022-07-14. Retrieved 2020-03-28.
  9. ^ Gustafson, John L.; Yonemoto, Isaac (2017-06-12). "Beating Floating Point at its Own Game: Posit Arithmetic" (PDF). Archived (PDF) from the original on 2022-07-14. Retrieved 2019-12-28.
  10. ^ a b Kleshtanova, Viktoria; Ivanov, Vassil V.; Hodzhaoglu, Feyzim; Prieto, Jose Emilio; Tonchev, Vesselin (2023). "Heterogeneous Substrates Modify Non-Classical Nucleation Pathways: Reanalysis of Kinetic Data from the Electrodeposition of Mercury on Platinum Using Hierarchy of Sigmoid Growth Models". Crystals. 13 (12). MDPI: 1690. doi:10.3390/cryst13121690. hdl:10261/341589.
  11. ^ Markov, I.; Stoycheva, E. (1976). "Saturation Nucleus Density in the Electrodeposition of Metals onto Inert Electrodes II. Experimental". Thin Solid Films. 35 (1). Elsevier: 21–35. doi:10.1016/0040-6090(76)90237-6.
  12. ^ Ivanov, V.V.; Tielemann, C.; Avramova, K.; Reinsch, S.; Tonchev, V. (2023). "Modelling Crystallization: When the Normal Growth Velocity Depends on the Supersaturation". Journal of Physics and Chemistry of Solids. 181: 111542. Elsevier. doi:10.1016/j.jpcs.2023.111542.
  13. ^ Fanfoni, M.; Tomellini, M. (1998). "The Johnson–Mehl–Avrami–Kolmogorov Model: A Brief Review". Il Nuovo Cimento D. 20. Springer: 1171–1182. doi:10.1007/BF03185527.
  14. ^ Tjørve, E.; Tjørve, K.M.C. (2010). "A Unified Approach to the Richards-Model Family for Use in Growth Analyses: Why We Need Only Two Model Forms". Journal of Theoretical Biology. 267 (3). Elsevier: 417–425. doi:10.1016/j.jtbi.2010.09.008.

Further reading

  • Mitchell, Tom M. (1997). Machine Learning. WCB McGraw–Hill. ISBN 978-0-07-042807-2. (NB. In particular see "Chapter 4: Artificial Neural Networks" (in particular pp. 96–97), where Mitchell uses "logistic function" and "sigmoid function" synonymously – he also calls this the "squashing function" – and the sigmoid (aka logistic) function is used to compress the outputs of the "neurons" in multi-layer neural nets.)
  • Humphrys, Mark. "Continuous output, the sigmoid function". Archived from the original on 2022-07-14. Retrieved 2022-07-14. (NB. Properties of the sigmoid, including how it can shift along axes and how its domain may be transformed.)
Retrieved from "https://teknopedia.ac.id/w/index.php?title=Sigmoid_function&oldid=1333768420"