{
  localUrl: '../page/probable_environment_hacking.html',
  arbitalUrl: 'https://arbital.com/p/probable_environment_hacking',
  rawJsonUrl: '../raw/5j.json',
  likeableId: '2340',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '3',
  dislikeCount: '0',
  likeScore: '3',
  individualLikes: [
    'AlexeiTurchin',
    'StephanieZolayvar',
    'DanielKokotajlo'
  ],
  pageId: 'probable_environment_hacking',
  edit: '7',
  editSummary: '',
  prevEdit: '6',
  currentEdit: '7',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Distant superintelligences can coerce the most probable environment of your AI',
  clickbait: 'Distant superintelligences may be able to hack your local AI, if your AI's preference framework depends on its most probable environment.',
  textLength: '4330',
  alias: 'probable_environment_hacking',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'true',
  voteType: 'probability',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-03-09 01:53:03',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2015-05-07 00:19:50',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '8',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '681',
text: 'A distant superintelligence can change 'the most likely environment' for your AI by simulating many copies of AIs similar to your AI, such that your local AI doesn't know it's not one of those simulated AIs.  This means that, e.g., if there is any reference in your AI's [5f preference framework] to the [ causes] of [ sense data] - like, programmers being the cause of sensed keystrokes - then a distant superintelligence can try to hack that reference.  This would place us in an [ adversarial security context versus a superintelligence], and should be avoided if at all possible.\n\n### Difficulty\n\nSome proposals for AI preference frameworks involve references to the AI's *causal environment* and not just the AI's immediate *sense events*.  For example, a [ DWIM] preference framework would putatively have the AI identify 'programmers' in the environment, model those programmers, and care about what its model of the programmers says they 'really wanted the AI to do'.  In other words, the AI would care about the causes behind its immediate sense experiences.\n\nThis potentially opens our AIs to a remote root attack by a distant superintelligence.  A distant superintelligence has the power to simulate lots of copies of our AI, or lots of AIs such that our AI doesn't think it can introspectively distinguish itself from those AIs.  It can then force the 'most likely' explanation of the AI's apparent sensory experiences to be that the AI is in such a simulation, and from there change arbitrary features of the most likely facts about the environment.\n\nThis problem was observed in a security context by [3], and prefigured by a less general suggestion from [http://www.sl4.org/archive/0708/16600.html Rolf Nelson].\n\n"Probable environment hacking" depends on the local AI trying to model distant superintelligences.  The actual proximal harm is done by the local AI's *model of* distant superintelligences, rather than by the superintelligences themselves.  However, a distant superintelligence that uses a [ logical decision theory] may model its choices as logically correlated to the local AI's model of the distant SI's choices.  Thus, a local AI that models a distant superintelligence using a logical decision theory may model that superintelligence as behaving as though, via its actual choices, it could control the local AI's model of those choices.  The local AI would therefore model the distant superintelligence as probably creating lots of AIs that the local AI can't distinguish from itself, and update accordingly on the most probable cause of its sense events.\n\nThis hack would be worthwhile, from the perspective of a distant superintelligence, if e.g. it could gain control of the whole future light cone of 'naturally arising' AIs like ours, in exchange for expending some much smaller amount of resource (small compared to our future light cone) in order to simulate lots of AIs.  (Obviously, the distant SI would prefer even more to 'fool' our AI into expecting this, while not actually expending the resources.)\n\nThis hack would be expected to go through by default if: (1) the local AI uses [ naturalized induction] or some similar framework to reason about the [ causes] of sense events, (2) the local AI models distant superintelligences as being likely to use logical decision theories and to have utility functions that would vary with respect to outcomes in our local future light cone, and (3) the local AI has a preference framework that can be 'hacked' via induced beliefs about the environment.\n\n### Implications\n\nFor any AI short of a full-scale autonomous Sovereign, we should probably try to get our AI to [1g4 not think at all about distant superintelligences], since thinking about them creates a host of [ adversarial security problems] of which "probable environment hacking" is only one.\n\nWe might also think twice about DWIM architectures that seem to permit catastrophe purely as a function of the AI's beliefs about the environment, without any check that goes through a direct sense event of the AI (a distant superintelligence cannot control the AI's beliefs about its direct sense events, since we can directly hit the sense switch ourselves).\n\nWe can also hope for any number of miscellaneous safeguards that would sound alarms at the point where the AI begins to imagine distant superintelligences imagining how to hack it.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '2',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-24 00:35:07',
  hasDraft: 'false',
  votes: [
    {
      value: '60',
      userId: 'AlexeiAndreev',
      createdAt: '2015-05-08 20:42:41'
    },
    {
      value: '50',
      userId: 'PatrickLaVictoir',
      createdAt: '2016-04-01 20:13:50'
    },
    {
      value: '75',
      userId: 'BuckShlegeris',
      createdAt: '2017-10-29 02:04:16'
    },
    {
      value: '66',
      userId: 'EliezerYudkowsky',
      createdAt: '2015-05-14 08:33:14'
    },
    {
      value: '10',
      userId: 'ChrisPasek',
      createdAt: '2016-06-23 03:44:31'
    },
    {
      value: '19',
      userId: 'KayaFallenstein',
      createdAt: '2016-07-12 18:51:31'
    },
    {
      value: '-1',
      userId: 'RyanCarey2',
      createdAt: '2017-04-25 21:39:11'
    },
    {
      value: '85',
      userId: 'DanielKokotajlo',
      createdAt: '2019-11-05 19:34:24'
    },
    {
      value: '99',
      userId: 'BrettHageft',
      createdAt: '2017-09-09 19:22:55'
    },
    {
      value: '12',
      userId: 'RobertPeetsalu',
      createdAt: '2018-04-05 05:31:21'
    },
    {
      value: '50',
      userId: 'NathanFish',
      createdAt: '2018-01-03 22:57:11'
    }
  ],
  voteSummary: 'null',
  muVoteSummary: '1',
  voteScaling: '3',
  currentUserVote: '-2',
  voteCount: '11',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky',
    'AlexeiAndreev'
  ],
  childIds: [],
  parentIds: [
    'distant_SIs'
  ],
  commentIds: [
    '181',
    '1gy',
    '79',
    '89t'
  ],
  questionIds: [],
  tagIds: [
    'behaviorist',
    'work_in_progress_meta_tag'
  ],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8394',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '7',
      type: 'newEdit',
      createdAt: '2016-03-09 01:53:03',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8393',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newAlias',
      createdAt: '2016-03-09 01:47:24',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4550',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '6',
      type: 'newEdit',
      createdAt: '2015-12-28 21:14:17',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4549',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newTag',
      createdAt: '2015-12-28 21:14:12',
      auxPageId: 'behaviorist',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4547',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newParent',
      createdAt: '2015-12-28 21:14:03',
      auxPageId: 'distant_SIs',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4545',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteParent',
      createdAt: '2015-12-28 21:13:59',
      auxPageId: 'advanced_safety',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4543',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteParent',
      createdAt: '2015-12-28 21:13:58',
      auxPageId: 'ai_alignment',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3796',
      pageId: 'probable_environment_hacking',
      userId: 'AlexeiAndreev',
      edit: '4',
      type: 'newEdit',
      createdAt: '2015-12-16 00:01:24',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3794',
      pageId: 'probable_environment_hacking',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newAlias',
      createdAt: '2015-12-16 00:01:22',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3795',
      pageId: 'probable_environment_hacking',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'turnOffVote',
      createdAt: '2015-12-16 00:01:22',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1141',
      pageId: 'probable_environment_hacking',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newUsedAsTag',
      createdAt: '2015-10-28 03:47:09',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '390',
      pageId: 'probable_environment_hacking',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newParent',
      createdAt: '2015-10-28 03:46:51',
      auxPageId: 'ai_alignment',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '427',
      pageId: 'probable_environment_hacking',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newParent',
      createdAt: '2015-10-28 03:46:51',
      auxPageId: 'advanced_safety',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '2199',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '5',
      type: 'newEdit',
      createdAt: '2015-06-08 03:37:02',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '2198',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2015-05-14 08:35:47',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '2197',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2015-05-07 00:22:50',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '2196',
      pageId: 'probable_environment_hacking',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-05-07 00:19:50',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}