# Relevant powerful agent

*An agent is relevant if it completely changes the course of history.*

A cognitively powerful agent is *relevant* if it is cognitively powerful enough to be a game-changer in the larger dilemma faced by Earth-originating intelligent life. Conversely, an agent is *irrelevant* if it is too weak to make much of a difference, or if the cognitive problems it can solve or the tasks it is authorized to perform don't significantly change the overall situation we face.

## Definition

Intuitively speaking, a value-aligned AI is 'relevant' to the extent it yields a 'yes' answer at the box 'Can we use this AI to produce a benefit that solves the larger dilemma?' in [this flowchart](https://i.imgur.com/JuIFAxh.png?0), or is part of a larger plan that gets us to the lower green circle without any "Then a miracle occurs" steps. A cognitively powerful agent is relevant at all only if its existence can effectuate significantly good or bad outcomes; e.g., a big neural net is not 'relevant' because it doesn't end the world one way or another.

(A better word than 'relevant' might be helpful here.)

## Examples

### Positive examples

- A hypothetical agent that can bootstrap to nanotechnology by solving the inverse protein folding problem and then shut down other AI projects, in a way its programmers can reasonably know to be safe enough to authorize, would be relevant.

### Negative examples

- An agent authorized to prove or disprove the Riemann Hypothesis, but not to do anything else, is not relevant (unless knowing whether the Riemann Hypothesis is true somehow changes everything for the basic dilemma of AI).
- An oracle that can only output verified HOL proofs (see the sketch below) is not yet 'relevant' until someone can describe theorems to prove such that firm knowledge of their truth would be a game-changer for the AI situation. (Hypothesizing that someone else will come up with a theorem like that, if you just build the oracle, is a [ hail Mary step] in the plan.)
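To make the second negative example concrete, here is a minimal sketch of what a proof-restricted output channel might look like. Everything here is hypothetical illustration, not any actual system: the names `Statement`, `ProofObject`, `verify`, and `restricted_channel` are made up, and the body of `verify` is placeholder logic standing in for a trusted HOL kernel that is assumed to exist and be sound.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Statement:
    """A theorem fixed in advance by the operators, not by the oracle."""
    text: str

@dataclass(frozen=True)
class ProofObject:
    """An opaque proof term produced by the untrusted oracle."""
    statement: Statement
    term: str

def verify(proof: ProofObject, statement: Statement) -> bool:
    """Hypothetical stand-in for a trusted HOL proof checker, assumed sound.

    A real kernel would replay proof.term against its inference rules;
    this placeholder only checks that the proof targets the right statement.
    """
    return proof.statement == statement and proof.term != ""

def restricted_channel(proof: ProofObject, statement: Statement) -> bool:
    """In this toy model, the only thing the operators ever observe:
    a single verified bit per human-chosen statement."""
    return verify(proof, statement)

# Usage: the oracle's entire observable output is this one boolean.
rh = Statement("The Riemann Hypothesis")
print(restricted_channel(ProofObject(rh, term="<proof term>"), rh))
```

The point of the sketch is the type signature: with the statement chosen by humans and the channel reduced to one verified bit, the safety problem narrows considerably, but so does the oracle's ability to do anything relevant to the larger dilemma.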
## Importance

Many proposals for AI safety, especially for [2l advanced safety], restrict the AI's applicability so severely that the AI is no longer allowed to do anything that seems like it could solve the larger dilemma. (E.g., an oracle that is only allowed to give us binary answers about whether it thinks certain mathematical statements are true, where nobody has yet said how to use that ability to save the world.)

Conversely, proposals to use AIs to do things impactful enough to solve the larger dilemma generally run smack into all the usual [2l advanced safety] problems, especially if the AI must operate in the rich domain of the real world to carry out the task (this tends to require full trust).

## Open problem

[2y It is an open problem to propose a relevant, limited AI] that would be significantly easier to handle than the general safety problem, while also being useful enough to resolve the larger dilemma.