# Hypercomputer

*Some formalisms demand computers larger than the limit of all finite computers.*

[summary: A "hypercomputer" is an imaginary artifact required to answer some crisp question that can't be answered in the limit of arbitrarily large finite computers. For example, if you have a question that depends on a general solution to the [Halting Problem](https://en.wikipedia.org/wiki/Halting_problem), we say that solving this problem requires a "hypercomputer". (In particular, it requires a level-1 halting oracle.)

It seems exceptionally unlikely that hypercomputers will ever be discovered to be embedded in our physical universe. We just use the label so we can ask, of certain impossible programs, "Supposing we had a hypercomputer and could run this program, what would be the consequences?"

For an example of interesting code that requires a hypercomputer, see [11w]. The relations between different levels of hypercomputer are also useful for crisply describing agents that have better or worse abilities to predict one another.]

A "hypercomputer" is an imaginary artifact required to answer some crisp question that can't be answered in the limit of arbitrarily large finite computers. For example, if you have a question that depends on a general solution to the [Halting Problem](https://en.wikipedia.org/wiki/Halting_problem), then we say that solving this problem requires a "hypercomputer", and in particular, a level-1 halting oracle. (If you need to determine whether programs running on level-1 halting oracles halt, you need a level-2 halting oracle, which we would also call a "hypercomputer".)

It seems exceptionally unlikely that hypercomputers will ever be discovered to be embedded in our physical universe. The term "hypercomputer" just exists as a label so we can say, "Supposing we had a hypercomputer and ran this (impossible) program, what would be the consequences?"

For some examples of conceptually illuminating code that would require a hypercomputer to actually run, see [11w] and [11v].
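To make the oracle hierarchy concrete, here is a minimal Python sketch of the classic diagonalization argument for why no ordinary program can serve as a level-1 halting oracle. The names `halts` and `diagonal`, and the `source_of` helper mentioned in the comments, are hypothetical illustrations invented for this sketch, not anything from the original page:

```python
def halts(program_source: str, program_input: str) -> bool:
    """Hypothetical level-1 halting oracle: returns True iff the program
    described by program_source halts when run on program_input."""
    raise NotImplementedError("deciding halting requires a hypercomputer")

def diagonal(program_source: str) -> None:
    """The classic diagonal program: do the opposite of whatever the
    oracle predicts about a program run on its own source code."""
    if halts(program_source, program_source):
        while True:  # oracle says "halts", so loop forever
            pass
    # oracle says "loops", so halt immediately

# If halts() were an ordinary program, then running
#     diagonal(source_of(diagonal))        # source_of is hypothetical
# would halt if and only if it doesn't halt, a contradiction. So halts()
# must sit strictly above ordinary computation: it is a hypercomputer.
# The same argument relativizes: a machine equipped with halts() cannot
# decide halting for machines like itself, which is why a level-2 oracle
# is a strictly stronger hypercomputer.
```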
[107 Unbounded analysis] of agents sometimes invokes hypercomputers because this lets us talk about multiple agents with easy-to-describe knowledge relations to each other. [has-requisite(arithmetical_hierarchy): For example, we might talk about Agent X that uses Zermelo-Fraenkel set theory as a proof system and has a $\Pi_n$ oracle, and another Agent Y that uses Peano Arithmetic and has a $\Pi_{n+1}$ oracle, to encode a set of relations where Y can directly predict and model X, and X can do proofs about Y.] [!has-requisite(arithmetical_hierarchy): For example, we might talk about Agent X that uses a weak hypercomputer and a strong proof system, and Agent Y that has a strong hypercomputer and a weak proof system, to describe a scenario where Y can directly predict and model X, and X can do proofs about Y.] In these cases, we're not trying to say that the relation between agents X and Y intrinsically requires them to have impossible powers of computation. We're just reaching for an unphysical scenario that happens to crisply encode inter-agent relations we find interesting for some reason, and that allows these inter-agent relations to have consequences about which we can easily do proofs.
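As a deliberately crude sketch of this asymmetry, suppose we caricature oracle strength as a single integer level, an assumption made purely for illustration. The `Agent` class and `can_directly_simulate` function below are invented for this example:

```python
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    oracle_level: int  # toy stand-in for "has a level-n halting oracle"
    proof_system: str  # label only, e.g. "PA" or "ZF"; not modeled further

def can_directly_simulate(predictor: Agent, target: Agent) -> bool:
    """A predictor can run the target to completion and decide its halting
    behavior only if its oracle sits strictly above the level at which the
    target's computations live."""
    return predictor.oracle_level > target.oracle_level

x = Agent("X", oracle_level=1, proof_system="ZF")  # strong proofs, weaker oracle
y = Agent("Y", oracle_level=2, proof_system="PA")  # weaker proofs, stronger oracle

print(can_directly_simulate(y, x))  # True: Y can predict X by brute simulation
print(can_directly_simulate(x, y))  # False: X must fall back on proofs about Y
```

The only point of the toy model is the asymmetry it exhibits: the agent with the stronger oracle predicts by simulation, while the agent with the stronger proof system predicts by proving theorems about the other agent's source code.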
See also [the Wikipedia page on hypercomputation](https://en.wikipedia.org/wiki/Hypercomputation).