{ localUrl: '../page/mind_projection.html', arbitalUrl: 'https://arbital.com/p/mind_projection', rawJsonUrl: '../raw/4yk.json', likeableId: '0', likeableType: 'page', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], pageId: 'mind_projection', edit: '2', editSummary: '', prevEdit: '1', currentEdit: '2', wasPublished: 'true', type: 'wiki', title: 'Mind projection fallacy', clickbait: 'Uncertainty is in the mind, not in the environment; a blank map does not correspond to a blank territory. In general, the territory may have a different ontology from the map.', textLength: '3074', alias: 'mind_projection', externalUrl: '', sortChildrenBy: 'likes', hasVote: 'false', voteType: '', votesAnonymous: 'false', editCreatorId: 'NateSoares', editCreatedAt: '2016-06-30 04:51:58', pageCreatorId: 'EliezerYudkowsky', pageCreatedAt: '2016-06-30 00:49:22', seeDomainId: '0', editDomainId: '123', submitToDomainId: '0', isAutosave: 'false', isSnapshot: 'false', isLiveEdit: 'true', isMinorEdit: 'false', indirectTeacher: 'false', todoCount: '0', isEditorComment: 'false', isApprovedComment: 'true', isResolved: 'false', snapshotText: '', anchorContext: '', anchorText: '', anchorOffset: '0', mergedInto: '', isDeleted: 'false', viewCount: '98', text: '[summary: Someone commits the mind projection fallacy when they postulate that features of how their model of the world works are actually features of the world.\n\nSuppose you flip a coin, slap it against your wrist, and don't look at the result. Does it make sense to say that the probability of the coin being heads is 50%? How can this be true, when the coin has already landed, and is either definitely heads or definitely tails? One who says "the coin is fundamentally uncertain; it is a feature of the coin that it is always 50% likely to be heads" commits the mind projection fallacy. Uncertainty is in the mind, not in reality. It makes sense that brains have an internal measure of how uncertain they are about the world, but that uncertainty is not a fact about the coin; it's a fact about the uncertain person. The coin itself is not sure or unsure.]\n\nThe "mind projection fallacy" occurs when somebody expects an overly direct resemblance between the intuitive language of the mind and the language of physical reality.\n\nConsider the [map_territory map and territory] metaphor, in which the world is like a territory and your mental model of the world is like a map of that territory. In this metaphor, the mind projection fallacy is analogous to thinking that the territory can be folded up and put into your pocket.\n\nAs an archetypal example: Suppose you flip a coin, slap it against your wrist, and don't yet look at it. Does it make sense to say that the probability of the coin being heads is 50%? How can this be true, when the coin itself is already either definitely heads or definitely tails?\n\nOne who says "the coin is fundamentally uncertain; it is a feature of the coin that it is always 50% likely to be heads" commits the mind projection fallacy. Uncertainty is in the mind, not in reality. If you're ignorant about a coin, that's not a fact about the coin; it's a fact about you. It makes sense that your brain, the map, has an internal measure of how sure it is of something. But that doesn't mean the coin itself has to contain a corresponding quantity of increased or decreased sureness; it is just heads or tails.\n\n
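Here is a minimal illustrative sketch in Python (the observers and numbers are made up purely for illustration): the coin's state is a single definite fact, while each observer's probability of "heads" is a number stored in that observer's model, and it depends only on what that observer knows.\n\n    # Illustrative sketch: one fixed coin state, two observers with different credences.\n    import random\n\n    # The territory: the coin has already landed; its state is a definite fact.\n    coin = random.choice(["heads", "tails"])\n\n    # The maps: each observer stores their own credence that the coin is heads.\n    # The uncertainty lives in the observer (here, as a floating-point number),\n    # not in the coin itself.\n    credence_before_looking = 0.5  # this observer hasn't seen the coin yet\n    credence_after_looking = 1.0 if coin == "heads" else 0.0  # this observer has looked\n\n    print("Actual coin state:", coin)\n    print("Credence of someone who hasn't looked:", credence_before_looking)\n    print("Credence of someone who has looked:", credence_after_looking)\n\nSame coin, same moment, two different probabilities: the 50% is a fact about the first observer's map, not about the coin.\n\n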
The [-ontology] of a system is the elementary or basic components of that system. The ontology of your model of the world may include intuitive measures of uncertainty that it can use to represent the state of the coin, treating them as primitives just as [float floating-point numbers] are primitives in computers. The mind projection fallacy occurs whenever someone reasons as if the territory (the physical universe and its laws) must have the same sort of ontology as the map (our models of reality).\n\nSee also:\n\n- [4vr]\n- The LessWrong sequence on [Reductionism](https://wiki.lesswrong.com/wiki/Reductionism_%28sequence%29), especially:\n - [How an algorithm feels from the inside](http://lesswrong.com/lw/no/how_an_algorithm_feels_from_inside/)\n - [The Mind Projection Fallacy](http://lesswrong.com/lw/oi/mind_projection_fallacy/)\n - [Probability is in the mind](http://lesswrong.com/lw/oj/probability_is_in_the_mind/)', metaText: '', isTextLoaded: 'true', isSubscribedToDiscussion: 'false', isSubscribedToUser: 'false', isSubscribedAsMaintainer: 'false', discussionSubscriberCount: '1', maintainerCount: '1', userSubscriberCount: '0', lastVisit: '', hasDraft: 'false', votes: [], voteSummary: 'null', muVoteSummary: '0', voteScaling: '0', currentUserVote: '-2', voteCount: '0', lockedVoteType: '', maxEditEver: '0', redLinkCount: '0', lockedBy: '', lockedUntil: '', nextPageId: '', prevPageId: '', usedAsMastery: 'false', proposalEditNum: '0', permissions: { edit: { has: 'false', reason: 'You don't have domain permission to edit this page' }, proposeEdit: { has: 'true', reason: '' }, delete: { has: 'false', reason: 'You don't have domain permission to delete this page' }, comment: { has: 'false', reason: 'You can't comment in this domain because you are not a member' }, proposeComment: { has: 'true', reason: '' } }, summaries: {}, creatorIds: [ 'EliezerYudkowsky', 'NateSoares' ], childIds: [], parentIds: [ 'fallacy' ], commentIds: [], questionIds: [], tagIds: [ 'start_meta_tag' ], relatedIds: [], markIds: [], explanations: [], learnMore: [], requirements: [], subjects: [], lenses: [], lensParentId: '', pathPages: [], learnMoreTaughtMap: {}, learnMoreCoveredMap: {}, learnMoreRequiredMap: {}, editHistory: {}, domainSubmissions: {}, answers: [], answerCount: '0', commentCount: '0', newCommentCount: '0', linkedMarkCount: '0', changeLogs: [ { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '14931', pageId: 'mind_projection', userId: 'NateSoares', edit: '2', type: 'newEdit', createdAt: '2016-06-30 04:51:58', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '14836', pageId: 'mind_projection', userId: 'EliezerYudkowsky', edit: '0', type: 'newParent', createdAt: '2016-06-30 00:49:24', auxPageId: 'fallacy', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '14837', pageId: 'mind_projection', userId: 'EliezerYudkowsky', edit: '0', type: 'newTag', createdAt: '2016-06-30 00:49:24', auxPageId: 'start_meta_tag', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '14834', pageId: 'mind_projection', userId: 'EliezerYudkowsky', edit: '1', type: 'newEdit', createdAt: '2016-06-30 00:49:22', auxPageId: '', 
oldSettingsValue: '', newSettingsValue: '' } ], feedSubmissions: [], searchStrings: {}, hasChildren: 'false', hasParents: 'true', redAliases: {}, improvementTagIds: [], nonMetaTagIds: [], todos: [], slowDownMap: 'null', speedUpMap: 'null', arcPageIds: 'null', contentRequests: {} }