A sequence is a series of posts on Less Wrong on the same topic, written to coherently and fully explore a particular thesis.

The original sequences were written by Eliezer Yudkowsky with the goal of creating a book on rationality. MIRI has since collated and edited them into Rationality: From AI to Zombies, an ebook collecting six books' worth of essays on the science and philosophy of human rationality. If you are new to Less Wrong, this book is the best place to start: it is one of the best introductions to topics that crop up on Less Wrong, such as cognitive bias, the map-territory distinction, metaethics, and existential risk. Its six books in turn break down into twenty-six sections; Book II, How to Actually Change Your Mind, is an introduction to the Bayesian concept of rational belief. The ebook can be downloaded on a "pay-what-you-want" basis. See the Library page for a list of LessWrong sequences in their modern form.

Yudkowsky has also written a more recent sequence, Highly Advanced Epistemology 101 for Beginners. These essays include a discussion of truth, formal logic, causality, and metaethics, and are a good way for more ambitious readers to quickly get up to speed.

Other collections from the same time period (2006-2009) include:

- Free Will: Yudkowsky's answer to a challenge he raises in Rationality: From AI to Zombies: to come up with an explanation for the human feeling that we have free will.
- The Hanson-Yudkowsky AI-Foom Debate: A blog conversation between Eliezer Yudkowsky and Robin Hanson on the topic of intelligence explosion and how concerned we should be about superintelligent AI.

Other sequences explore more specific topics:

- One sequence examines luminosity, meaning self-awareness: a luminous mental state is one that you have and know that you have.
- Decision Theory of Newcomblike Problems: Themed around an analysis of Newcomb's problem, this sequence consolidates and gives context for the many decision theory discussions found on LessWrong at the time of writing. It formulates explicit decision theories, since decisions need to be modeled with some structure in order to be scrutinized and systematically improved; "intuiting" the answers to decision problems by ad-hoc methods is not conducive to thorough analysis.
- One sequence explains and defends a naturalistic approach to metaethics.
- Another explains how intuitions are used in mainstream philosophy and what the science of intuitions suggests about how intuitions should be used in philosophy. Each post concludes with footnotes and a long list of references from the academic literature.
- Another summarizes scientifically backed advice for "winning" at everyday life: in one's productivity, one's relationships, one's emotions, and so on.
- Sequences of essays by Scott Alexander include Positivism, Self Deception, and Neuroscience, which discusses priming: the capability of an arbitrary stimulus to commandeer your thinking and judgement for the next several minutes. There are ways to defend yourself against these kinds of intrusions, and even methods to harness them into useful testing mechanisms.