The Alignment Problem
From Wikipedia, the free encyclopedia
2020 non-fiction book by Brian Christian
This article is about the book. For the alignment problem in artificial intelligence, see AI alignment.

The Alignment Problem: Machine Learning and Human Values
  • Author: Brian Christian
  • Language: English
  • Subject: AI alignment
  • Publisher: W. W. Norton & Company[1]
  • Publication date: October 6, 2020
  • Publication place: United States
  • Media type: Print, e-book, audiobook
  • Pages: 496
  • ISBN: 0393635821
  • OCLC: 1137850003
  • Website: brianchristian.org/the-alignment-problem/

The Alignment Problem: Machine Learning and Human Values is a 2020 non-fiction book by the American writer Brian Christian. It is based on numerous interviews with experts trying to build artificial intelligence systems, particularly machine learning systems, that are aligned with human values.

Summary


The book is divided into three sections: Prophecy, Agency, and Normativity. Each section covers researchers and engineers working on different challenges in the alignment of artificial intelligence with human values.

Prophecy


In the first section, Christian interweaves the history of artificial intelligence research, particularly the machine learning approach of artificial neural networks such as the Perceptron and AlexNet, with examples of how AI systems can behave in unintended ways. He tells the story of Julia Angwin, a journalist whose ProPublica investigation of the COMPAS algorithm, a tool for predicting recidivism among criminal defendants, led to widespread criticism of its accuracy and of its bias against certain demographics. One of AI's main alignment challenges is its black-box nature: inputs and outputs are identifiable, but the transformation between them is opaque. This lack of transparency makes it difficult to know where a system is going right and where it is going wrong.

Agency


In the second section, Christian similarly interweaves the history of the psychological study of reward, such as behaviorism and dopamine, with the computer science of reinforcement learning, in which AI systems must develop a policy ("what to do") in light of a value function ("what rewards or punishments to expect"). He calls DeepMind's AlphaGo and AlphaZero systems "perhaps the single most impressive achievement in automated curriculum design." He also highlights the importance of curiosity, whereby reinforcement learners are intrinsically motivated to explore their environment rather than exclusively seeking the external reward.
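The policy/value split described above can be sketched with tabular Q-learning on a toy corridor (my illustration, not an example from the book): the agent's value estimates encode "what rewards to expect," and its epsilon-greedy rule over those estimates is the policy, "what to do." The environment, the optional count-based curiosity bonus, and every parameter value here are assumptions made for the sketch.

```python
import random

random.seed(0)  # deterministic run for the sketch

# Toy corridor: states 0..4, reward only upon reaching the far right end.
N_STATES = 5
ACTIONS = [-1, +1]  # step left, step right

def step(state, action):
    next_state = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    return next_state, reward, next_state == N_STATES - 1

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, curiosity=0.0):
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    visits = {s: 0 for s in range(N_STATES)}  # state counts for the curiosity bonus
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # Policy ("what to do"): epsilon-greedy over the value estimates.
            if random.random() < epsilon:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q[(s, act)])
            s2, r, done = step(s, a)
            visits[s2] += 1
            # Intrinsic motivation: optional bonus for rarely visited states.
            r += curiosity / visits[s2] ** 0.5
            # Value function ("what rewards to expect"): Q-learning update.
            target = r + (0.0 if done else gamma * max(q[(s2, b)] for b in ACTIONS))
            q[(s, a)] += alpha * (target - q[(s, a)])
            s = s2
    return q

q = train()
greedy = [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)]
print(greedy)  # → [1, 1, 1, 1]: the learned policy walks right toward the reward
```

Setting `curiosity` above zero adds an exploration reward that shrinks as a state is revisited, a crude version of the intrinsic motivation Christian discusses.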

Normativity


The third section covers training AI through the imitation of human or machine behavior, as well as philosophical debates, such as that between possibilism and actualism, which imply different ideal behaviors for AI systems. Of particular importance is inverse reinforcement learning, a broad approach by which a machine learns the objective function of a human or another agent. Christian discusses the normative challenges associated with effective altruism and existential risk, including the work of philosophers Toby Ord and William MacAskill, who are trying to devise human and machine strategies for navigating the alignment problem as effectively as possible.
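The idea of inverse reinforcement learning can be illustrated with a deliberately tiny sketch (my construction, not the book's): rather than learning behavior from a known reward, the learner recovers the hidden reward that best explains an expert's observed actions. Here it simply enumerates candidate reward locations in a corridor world and picks the one whose optimal policy best matches the demonstrations; the environment and the scoring rule are assumptions for the sketch, a toy stand-in for real IRL algorithms.

```python
N_STATES = 5
ACTIONS = [-1, +1]  # step left, step right
GAMMA = 0.9

def move(s, a):
    return min(max(s + a, 0), N_STATES - 1)

def greedy_policy(goal):
    """Value iteration for a corridor whose (hidden) reward sits at `goal`."""
    v = [0.0] * N_STATES
    for _ in range(50):
        for s in range(N_STATES):
            if s == goal:
                continue  # terminal state keeps value 0
            v[s] = max((1.0 if move(s, a) == goal else 0.0) + GAMMA * v[move(s, a)]
                       for a in ACTIONS)
    def act(s):
        return max(ACTIONS, key=lambda a: (1.0 if move(s, a) == goal else 0.0)
                                          + GAMMA * v[move(s, a)])
    return [act(s) for s in range(N_STATES)]

# Expert demonstrations: the expert always steps right, because it knows
# (and the learner does not) that the reward sits at state 4.
demo = {0: 1, 1: 1, 2: 1, 3: 1}

# Score each candidate reward location by how many demonstrated actions
# its optimal policy reproduces, then pick the best explanation.
scores = {}
for g in range(N_STATES):
    pi = greedy_policy(g)
    scores[g] = sum(pi[s] == a for s, a in demo.items())

inferred_goal = max(scores, key=scores.get)
print(inferred_goal)  # → 4: the objective that best explains the expert
```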

Reception


The book received positive reviews from critics. The Wall Street Journal's David A. Shaywitz emphasized the frequent problems when applying algorithms to real-world problems, describing the book as "a nuanced and captivating exploration of this white-hot topic."[2] Publishers Weekly praised the book for its writing and extensive research.[3]

Kirkus Reviews gave the book a positive review, calling it "technically rich but accessible", and "an intriguing exploration of AI."[4] Writing for Nature, Virginia Dignum gave the book a positive review, favorably comparing it to Kate Crawford's Atlas of AI.[5]

In 2021, journalist Ezra Klein had Christian on his podcast, The Ezra Klein Show, writing in The New York Times, "The Alignment Problem is the best book on the key technical and moral questions of A.I. that I’ve read."[6] Later that year, the book was listed in a Fast Company feature, "5 books that inspired Microsoft CEO Satya Nadella this year".[7]

In 2022, the book won the Eric and Wendy Schmidt Award for Excellence in Science Communication, given by The National Academies of Sciences, Engineering, and Medicine in partnership with Schmidt Futures.[8]

In 2024, The New York Times placed The Alignment Problem first in its list of the "5 Best Books About Artificial Intelligence," saying: "If you're going to read one book on artificial intelligence, this is the one."[9]

See also

  • Effective altruism
  • Global catastrophic risk
  • Human Compatible: Artificial Intelligence and the Problem of Control
  • Superintelligence: Paths, Dangers, Strategies

References

  1. ^ "The Alignment Problem". W. W. Norton & Company.
  2. ^ Shaywitz, David (25 October 2020). "'The Alignment Problem' Review: When Machines Miss the Point". The Wall Street Journal. Retrieved 5 December 2021.
  3. ^ "Nonfiction Book Review: The Alignment Problem: Machine Learning and Human Values by Brian Christian. Norton, $27.95 (356p) ISBN 978-0-393-63582-9". PublishersWeekly.com. Retrieved 20 January 2022.
  4. ^ "The Alignment Problem". Kirkus Reviews.
  5. ^ Dignum, Virginia (26 May 2021). "AI — the people and places that make, use and manage it". Nature. 593 (7860): 499–500. Bibcode:2021Natur.593..499D. doi:10.1038/d41586-021-01397-x. S2CID 235216649.
  6. ^ Klein, Ezra (4 June 2021). "If 'All Models Are Wrong,' Why Do We Give Them So Much Power?". The New York Times. Retrieved 5 December 2021.
  7. ^ Nadella, Satya (15 November 2021). "5 books that inspired Microsoft CEO Satya Nadella this year". Fast Company. Retrieved 5 December 2021.
  8. ^ "Winners - Eric and Wendy Schmidt Awards for Excellence in Science Communication - National Academies". National Academies. 12 October 2022. Retrieved 21 October 2022.
  9. ^ Marche, Stephen (31 January 2024). "5 Best Books About Artificial Intelligence". The New York Times. Retrieved 6 February 2024.