{
localUrl: '../page/limited_agi.html',
arbitalUrl: 'https://arbital.com/p/limited_agi',
rawJsonUrl: '../raw/5b3.json',
likeableId: '3046',
likeableType: 'page',
myLikeValue: '0',
likeCount: '2',
dislikeCount: '0',
likeScore: '2',
individualLikes: [
'EricRogstad',
'RolandPihlakas'
],
pageId: 'limited_agi',
edit: '2',
editSummary: '',
prevEdit: '1',
currentEdit: '2',
wasPublished: 'true',
type: 'wiki',
title: 'Limited AGI',
clickbait: 'Task-based AGIs don\'t need unlimited cognitive and material powers to carry out their Tasks, which means their powers can potentially be limited.',
textLength: '2331',
alias: 'limited_agi',
externalUrl: '',
sortChildrenBy: 'likes',
hasVote: 'false',
voteType: '',
votesAnonymous: 'false',
editCreatorId: 'EliezerYudkowsky',
editCreatedAt: '2017-02-22 01:12:59',
pageCreatorId: 'EliezerYudkowsky',
pageCreatedAt: '2016-07-10 22:40:45',
seeDomainId: '0',
editDomainId: 'EliezerYudkowsky',
submitToDomainId: '0',
isAutosave: 'false',
isSnapshot: 'false',
isLiveEdit: 'true',
isMinorEdit: 'false',
indirectTeacher: 'false',
todoCount: '0',
isEditorComment: 'false',
isApprovedComment: 'true',
isResolved: 'false',
snapshotText: '',
anchorContext: '',
anchorText: '',
anchorOffset: '0',
mergedInto: '',
isDeleted: 'false',
viewCount: '90',
text: 'One of the reasons why a [6w Task AGI] can potentially be safer than an [1g3 Autonomous AGI] is that, since Task AGIs only need to carry out activities of limited scope, they [7tf may only need limited material and cognitive powers] to carry out those tasks. The [7g0 nonadversarial principle] still applies, but takes the form of "[7fx don\'t run the search]" rather than "make sure the search returns the correct answer".\n\n# Obstacles\n\n• Increasing your material and cognitive efficacy is [10g instrumentally convergent] in all sorts of places and would presumably need to be [2vk averted] all over the place.\n\n• Good limitation proposals are [deceptive_ease not as easy as they look] because [7vh particular domain capabilities can often be derived from more general architectures]. An Artificial *General* Intelligence doesn\'t have a handcrafted \'thinking about cars\' module and a handcrafted \'thinking about planes\' module, so you [7vk can\'t just handcraft the two modules at different levels of ability].\n\nE.g. many have suggested that \'drive\' or \'emotion\' is something that can be selectively removed from AGIs to \'limit\' their ambitions; [43h presumably] these people are using a mental model that is not the standard [18r expected utility agent] model. To know which kinds of limitations are easy, you need a sufficiently good background picture of the AGI\'s subprocesses that you understand which kinds of system capabilities will naturally carve at the joints.\n\n# Related ideas\n\nThe research avenue of [2r8 Mild optimization] can be viewed as pursuing a kind of very general Limitation.\n\n[102 Behaviorism] asks to Limit the AGI\'s ability to model other minds in non-whitelisted detail.\n\n[4mn Taskishness] can be seen as an Alignment/Limitation hybrid in the sense that it asks for the AI to only *want* or *try* to do a bounded amount at every level of internal organization.\n\n[2pf] can be seen as an Alignment/Limitation hybrid in the sense that a [4l successful impact penalty] would make the AI not *want* to implement larger-scale plans.\n\nLimitation may be viewed as yet another subproblem of the [3ps], since it seems like a type of precaution that a generic agent would want to build into a generic imperfectly-aligned subagent.\n\nLimitation can be seen as motivated by both the [7g0] and the [7tf].',
metaText: '',
isTextLoaded: 'true',
isSubscribedToDiscussion: 'false',
isSubscribedToUser: 'false',
isSubscribedAsMaintainer: 'false',
discussionSubscriberCount: '2',
maintainerCount: '1',
userSubscriberCount: '0',
lastVisit: '',
hasDraft: 'false',
votes: [],
voteSummary: 'null',
muVoteSummary: '0',
voteScaling: '0',
currentUserVote: '-2',
voteCount: '0',
lockedVoteType: '',
maxEditEver: '0',
redLinkCount: '0',
lockedBy: '',
lockedUntil: '',
nextPageId: '',
prevPageId: '',
usedAsMastery: 'false',
proposalEditNum: '0',
permissions: {
edit: {
has: 'false',
reason: 'You don\'t have domain permission to edit this page'
},
proposeEdit: {
has: 'true',
reason: ''
},
delete: {
has: 'false',
reason: 'You don\'t have domain permission to delete this page'
},
comment: {
has: 'false',
reason: 'You can\'t comment in this domain because you are not a member'
},
proposeComment: {
has: 'true',
reason: ''
}
},
summaries: {},
creatorIds: [
'EliezerYudkowsky'
],
childIds: [],
parentIds: [
'task_agi'
],
commentIds: [],
questionIds: [],
tagIds: [
'c_class_meta_tag'
],
relatedIds: [],
markIds: [],
explanations: [],
learnMore: [],
requirements: [],
subjects: [],
lenses: [],
lensParentId: '',
pathPages: [],
learnMoreTaughtMap: {},
learnMoreCoveredMap: {},
learnMoreRequiredMap: {},
editHistory: {},
domainSubmissions: {},
answers: [],
answerCount: '0',
commentCount: '0',
newCommentCount: '0',
linkedMarkCount: '0',
changeLogs: [
{
likeableId: '0',
likeableType: 'changeLog',
myLikeValue: '0',
likeCount: '0',
dislikeCount: '0',
likeScore: '0',
individualLikes: [],
id: '22165',
pageId: 'limited_agi',
userId: 'EliezerYudkowsky',
edit: '2',
type: 'newEdit',
createdAt: '2017-02-22 01:12:59',
auxPageId: '',
oldSettingsValue: '',
newSettingsValue: ''
},
{
likeableId: '0',
likeableType: 'changeLog',
myLikeValue: '0',
likeCount: '0',
dislikeCount: '0',
likeScore: '0',
individualLikes: [],
id: '16528',
pageId: 'limited_agi',
userId: 'EliezerYudkowsky',
edit: '0',
type: 'newParent',
createdAt: '2016-07-10 22:40:46',
auxPageId: 'task_agi',
oldSettingsValue: '',
newSettingsValue: ''
},
{
likeableId: '0',
likeableType: 'changeLog',
myLikeValue: '0',
likeCount: '0',
dislikeCount: '0',
likeScore: '0',
individualLikes: [],
id: '16529',
pageId: 'limited_agi',
userId: 'EliezerYudkowsky',
edit: '0',
type: 'newTag',
createdAt: '2016-07-10 22:40:46',
auxPageId: 'c_class_meta_tag',
oldSettingsValue: '',
newSettingsValue: ''
},
{
likeableId: '0',
likeableType: 'changeLog',
myLikeValue: '0',
likeCount: '0',
dislikeCount: '0',
likeScore: '0',
individualLikes: [],
id: '16526',
pageId: 'limited_agi',
userId: 'EliezerYudkowsky',
edit: '1',
type: 'newEdit',
createdAt: '2016-07-10 22:40:45',
auxPageId: '',
oldSettingsValue: '',
newSettingsValue: ''
}
],
feedSubmissions: [],
searchStrings: {},
hasChildren: 'false',
hasParents: 'true',
redAliases: {},
improvementTagIds: [],
nonMetaTagIds: [],
todos: [],
slowDownMap: 'null',
speedUpMap: 'null',
arcPageIds: 'null',
contentRequests: {}
}