# Parfit's Hitchhiker

You are stranded in the desert, running out of water, and soon to die. Someone in a motor vehicle drives up to you. The driver is a selfish, ideally game-theoretic agent, and what's more, so are you. Furthermore, the driver is Paul Ekman, who has spent his whole life studying facial microexpressions and is extremely good at reading people's honesty by looking at their faces.

The driver says, "Well, as an ideal selfish rational agent, I'll convey you into town if it's in my own interest to do so. I don't want to bother dragging you to Small Claims Court if you don't pay up. So I'll just ask you this question: Can you honestly say that you'll give me \$1,000 from an ATM after we reach town?"

On some decision theories, an ideal selfish rational agent realizes that once it reaches town, it will have no further incentive to pay the driver. Agents of this type answer "Yes," whereupon the driver says "You're lying" and drives off, leaving them to die.

Would you survive? %note: Okay, fine, you'd just keep your promise because you're honest. But would you still survive even if you were an ideal selfish agent running whatever algorithm you consider to correspond to the ideal [principle_rational_choice principle of rational choice]?%

# Analysis

Parfit's Hitchhiker is noteworthy in that, unlike the alien philosopher-troll [5b2 Omega] running strange experiments, Parfit's driver acts for understandable reasons.

The [5pt Newcomblike] aspect of the problem arises from the way that your algorithm's output, once inside the city, determines both:

- Whether you actually pay up in the city;
- Your helpless knowledge, right now, of whether you'll actually pay up in the city, which you can't stop from being visible in your facial microexpressions.

We may assume that Parfit's driver also asks you questions like "Have you really thought through what you'll do?" and "Are you trying to think one thing now, knowing that you'll probably think something else in the city?", and watches your facial expression on those answers as well.

Note that quantitative changes in your *probability* of survival may be worth pursuing, even if you don't think it's *certain* that Paul Ekman could read your facial expressions correctly. Even a driver who is merely *fairly good* at reading faces makes this an important Newcomblike problem, if you value significant probability shifts in your survival at more than \$1,000.
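To see how much an imperfect face-reader can matter, here is a minimal expected-value sketch. The numbers are not part of the original problem statement: the driver's accuracy $p$, the \$1,000,000 value placed on your life, and certain death if left behind are all assumed for illustration.

```python
# Illustrative expected-value sketch for Parfit's Hitchhiker.
# Assumptions (not from the original problem statement): the driver reads your
# true disposition correctly with probability p, your life is worth $1,000,000
# to you, and you die for certain if left in the desert.

VALUE_OF_LIFE = 1_000_000
PAYMENT = 1_000

def expected_value(disposed_to_pay: bool, p: float) -> float:
    """Expected dollar-equivalent payoff, given your true disposition."""
    # The driver rescues you only if he believes you will pay.
    prob_rescued = p if disposed_to_pay else (1 - p)
    payoff_if_rescued = VALUE_OF_LIFE - (PAYMENT if disposed_to_pay else 0)
    return prob_rescued * payoff_if_rescued

for p in (0.55, 0.75, 0.99):
    print(f"p = {p}: payer {expected_value(True, p):,.0f}, "
          f"non-payer {expected_value(False, p):,.0f}")
```

On these assumed numbers, even at p = 0.55 the disposition to pay comes out ahead (549,450 versus 450,000), which is the sense in which a merely fairly-good face-reader already makes the problem bite.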
Parfit's Hitchhiker is structurally similar to the [5ry Transparent Newcomb's Problem], if you value your life at \$1,000,000.

# Responses

## [5n9 Causal decision theory]

Dies in the desert. A CDT agent knows that its future self will reason, "Now that I'm in the city, nothing I do can physically cause me to be back in the desert again," and will therefore refuse to pay. Thus the present agent is unable to answer honestly that it will pay in the future.

## [5px Evidential decision theory]

Dies in the desert. An EDT agent knows that its future self will reason, "Since I can already see that I'm in the city, my paying \$1,000 wouldn't provide me with any further good news about my being in the city," and will likewise refuse to pay.

## [58b Logical decision theory]

Survives.

- A [timeless_dt timeless decision agent], even without the [5rz updateless feature], will reason, "If-counterfactually my algorithm for what to do in the city had the logical output 'refuse to pay', then in that counterfactual case I would have died in the desert." The TDT agent therefore evaluates the expected utility of refusing to pay as very low.

- An [5rz updateless decision agent] computes that the optimal policy maps the sense data "I can see that I'm already in the city" to the action "Pay the driver \$1,000," and this computation does not change after the agent sees that it is in the city.
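As a toy illustration of the three responses above (a simplification with assumed payoffs, not the article's formalism), the outcome turns entirely on what your in-city decision procedure outputs, since that is what the driver reads off your face in the desert.

```python
# Toy model of the Responses section. Assumptions (for illustration only): the
# driver predicts your in-city choice perfectly, being left to die is worth
# -1,000,000 to you, and paying costs 1,000.

DEATH = -1_000_000
PAYMENT = 1_000

def in_city_choice_cdt_or_edt() -> bool:
    # In the city, paying neither causes the past rescue (CDT) nor provides
    # further good news about it (EDT), so both refuse to pay.
    return False

def in_city_choice_updateless() -> bool:
    # An updateless agent outputs the policy that looked best from the desert,
    # which maps "I'm already in the city" to "pay the driver".
    return True

def outcome(in_city_choice) -> int:
    will_pay = in_city_choice()  # the driver reads this disposition in the desert
    return -PAYMENT if will_pay else DEATH

print("CDT/EDT agent:   ", outcome(in_city_choice_cdt_or_edt))    # -1000000 (dies)
print("Updateless agent:", outcome(in_city_choice_updateless))    # -1000 (survives)
```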