{ localUrl: '../page/7m.html', arbitalUrl: 'https://arbital.com/p/7m', rawJsonUrl: '../raw/7m.json', likeableId: '2406', likeableType: 'page', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], pageId: '7m', edit: '1', editSummary: '', prevEdit: '0', currentEdit: '1', wasPublished: 'true', type: 'comment', title: '"Normally I think that you s..."', clickbait: '', textLength: '1491', alias: '7m', externalUrl: '', sortChildrenBy: 'recentFirst', hasVote: 'false', voteType: '', votesAnonymous: 'false', editCreatorId: 'PaulChristiano', editCreatedAt: '2015-06-18 18:45:53', pageCreatorId: 'PaulChristiano', pageCreatedAt: '2015-06-18 18:45:53', seeDomainId: '0', editDomainId: 'EliezerYudkowsky', submitToDomainId: '0', isAutosave: 'false', isSnapshot: 'false', isLiveEdit: 'true', isMinorEdit: 'false', indirectTeacher: 'false', todoCount: '0', isEditorComment: 'false', isApprovedComment: 'true', isResolved: 'false', snapshotText: '', anchorContext: '', anchorText: '', anchorOffset: '0', mergedInto: '', isDeleted: 'false', viewCount: '145', text: 'Normally I think that you set the bar too high for yourself. In this case, I think that you would be justified in setting the bar much higher (I guess if we disagreed in the same direction in every case, it wouldn't be clear that we were really disagreeing).\r \n\r \nIf you design a "safe" AI which is much less efficient (say 10x more expensive to do the same things) than an unsafe AI, that may be useful but it does not seem to resolve what you call the value achievement dilemma. It would need to be coupled with very good coordination to prevent people from deploying the more efficient, unsafe AI.\r \n\r \nSo I think it is reasonable to set the bar at safe systems that act in the world (acquire resources, produce things, influence politics...) nearly as effectively as any unsafe system that we could construct using the same underlying technologies.\r \n\r \nThis kind of requirement seems much more important than (e.g.) ensuring that your system remains safe if it were to suddenly become infinitely powerful.\r \n\r \nThis disagreement likely relates to our disagreement about the likely pace and dynamics of AI development. One difference is that in this case assuming a fast takeoff may actually be less conservative. 
So if you want to push the "plan for the worst" line, it seems like you should probably be pessimistic about an intelligence explosion where that would be inconvenient, but also be pessimistic about the tolerable gaps in efficiency where that would be inconvenient.', metaText: '', isTextLoaded: 'true', isSubscribedToDiscussion: 'false', isSubscribedToUser: 'false', isSubscribedAsMaintainer: 'false', discussionSubscriberCount: '2', maintainerCount: '1', userSubscriberCount: '0', lastVisit: '2016-02-26 01:58:42', hasDraft: 'false', votes: [], voteSummary: 'null', muVoteSummary: '0', voteScaling: '0', currentUserVote: '-2', voteCount: '0', lockedVoteType: '', maxEditEver: '0', redLinkCount: '0', lockedBy: '', lockedUntil: '', nextPageId: '', prevPageId: '', usedAsMastery: 'false', proposalEditNum: '0', permissions: { edit: { has: 'false', reason: 'You don't have domain permission to edit this page' }, proposeEdit: { has: 'true', reason: '' }, delete: { has: 'false', reason: 'You don't have domain permission to delete this page' }, comment: { has: 'false', reason: 'You can't comment in this domain because you are not a member' }, proposeComment: { has: 'true', reason: '' } }, summaries: {}, creatorIds: [ 'PaulChristiano' ], childIds: [], parentIds: [ 'relevant_limited_AI' ], commentIds: [ '7q' ], questionIds: [], tagIds: [], relatedIds: [], markIds: [], explanations: [], learnMore: [], requirements: [], subjects: [], lenses: [], lensParentId: '', pathPages: [], learnMoreTaughtMap: {}, learnMoreCoveredMap: {}, learnMoreRequiredMap: {}, editHistory: {}, domainSubmissions: {}, answers: [], answerCount: '0', commentCount: '0', newCommentCount: '0', linkedMarkCount: '0', changeLogs: [ { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '162', pageId: '7m', userId: 'AlexeiAndreev', edit: '1', type: 'newParent', createdAt: '2015-10-28 03:46:51', auxPageId: 'relevant_limited_AI', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '1257', pageId: '7m', userId: 'PaulChristiano', edit: '1', type: 'newEdit', createdAt: '2015-06-18 18:45:53', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' } ], feedSubmissions: [], searchStrings: {}, hasChildren: 'false', hasParents: 'true', redAliases: {}, improvementTagIds: [], nonMetaTagIds: [], todos: [], slowDownMap: 'null', speedUpMap: 'null', arcPageIds: 'null', contentRequests: {} }