{
  localUrl: '../page/Human_consulting_HCH.html',
  arbitalUrl: 'https://arbital.com/p/Human_consulting_HCH',
  rawJsonUrl: '../raw/1tw.json',
  likeableId: '774',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: 'Human_consulting_HCH',
  edit: '3',
  editSummary: '',
  prevEdit: '2',
  currentEdit: '3',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Humans consulting HCH',
  clickbait: '',
  textLength: '2963',
  alias: 'Human_consulting_HCH',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'PaulChristiano',
  editCreatedAt: '2016-03-04 00:40:32',
  pageCreatorId: 'PaulChristiano',
  pageCreatedAt: '2016-02-02 22:15:15',
  seeDomainId: '0',
  editDomainId: '705',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '27',
  text: 'Consider a human who has access to a question-answering machine. Suppose the machine answers questions by perfectly imitating what the human would do if asked that question.\n\nTo make things twice as tricky, suppose the human-to-be-imitated is herself able to consult a question-answering machine, which answers questions by perfectly imitating what the human would do if asked that question…\n\nLet’s call this process HCH, for “Humans Consulting HCH.”\n\nI’ve talked about many variants of [this process](https://arbital.com/p/1th?title=implementing-our-considered-judgment) before, but I find it easier to think about with a nice handle. (Credit to Eliezer for proposing using a recursive acronym.)\n\nHCH is easy to specify very precisely. For now, I think that HCH is our best precisely specified model for “a human’s enlightened judgment.” It’s a pretty problematic model, but we don’t yet have many contenders.\n\n### Elaborations\n\nWe can also define realizable variants of this inaccessible ideal.\n\n- For a particular prediction algorithm P, define HCHᴾ as:  \n“P’s prediction of humans consulting HCHᴾ”\n- For a reinforcement learning algorithm A, define max-HCHᴬ as:  \n“A’s output when maximizing the score assigned to that output by humans consulting max-HCHᴬ”\n- For a given market structure and participants, define HCHᵐᵃʳᵏᵉᵗ as:  \n“market estimates about humans consulting HCHᵐᵃʳᵏᵉᵗ”\n\nNote that e.g. max-HCHᴬ is totally different from “A’s output when maximizing the score assigned to that output by HCH.”\n\nThe latter proposal is essentially [abstract approval-direction](https://arbital.com/p/1w5). It seems much more likely to yield good actions than max-HCHᴬ. But we can’t provide any feedback on the scores assigned by HCH, which makes it impossible to train the abstract version using conventional techniques. On the other hand it is much easier to implement HCHᴾ or max-HCHᴬ or HCHᵐᵃʳᵏᵉᵗ.\n\n### Hope\n\nThe best case is that HCHᴾ, max-HCHᴬ, and HCHᵐᵃʳᵏᵉᵗ are:\n\n- As capable as the underlying predictor, reinforcement learner, or market participants.\n- Aligned with the enlightened judgment of the human, e.g. as evaluated by HCH.\n\n(At least when the human is suitably prudent and wise.)\n\nIt is clear from the definitions that these systems can’t be any _more_ capable than the underlying predictor/learner/market. I honestly don’t know whether we should expect them to match the underlying capabilities. My intuition is that max-HCHᴬ probably can, but that HCHᴾ and HCHᵐᵃʳᵏᵉᵗ are much dicier.\n\nIt is similarly unclear whether the system continues to reflect the human’s judgment. In some sense this is in tension with the desire to be capable — the more guarded the human, the less capable the system but the more likely it is to reflect their interests. The question is whether a prudent human can achieve both goals.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don\'t have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don\'t have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can\'t comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'PaulChristiano'
  ],
  childIds: [],
  parentIds: [
    'paul_ai_control'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8264',
      pageId: 'Human_consulting_HCH',
      userId: 'JessicaChuan',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-03-04 00:40:32',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6892',
      pageId: 'Human_consulting_HCH',
      userId: 'JessicaChuan',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-02-11 23:17:56',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6160',
      pageId: 'Human_consulting_HCH',
      userId: 'JessicaChuan',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-02-02 22:15:15',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6159',
      pageId: 'Human_consulting_HCH',
      userId: 'JessicaChuan',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-02-02 22:13:34',
      auxPageId: 'paul_ai_control',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6157',
      pageId: 'Human_consulting_HCH',
      userId: 'JessicaChuan',
      edit: '0',
      type: 'deleteParent',
      createdAt: '2016-02-02 22:13:28',
      auxPageId: 'steering_problem',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6155',
      pageId: 'Human_consulting_HCH',
      userId: 'JessicaChuan',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-02-02 22:13:00',
      auxPageId: 'steering_problem',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}