{
  localUrl: '../page/terminal_vs_instrumental.html',
  arbitalUrl: 'https://arbital.com/p/terminal_vs_instrumental',
  rawJsonUrl: '../raw/1bh.json',
  likeableId: 'ChrisWalker',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '1',
  dislikeCount: '0',
  likeScore: '1',
  individualLikes: [
    'EricBruylant'
  ],
  pageId: 'terminal_vs_instrumental',
  edit: '3',
  editSummary: '',
  prevEdit: '2',
  currentEdit: '3',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Terminal versus instrumental goals / values / preferences',
  clickbait: 'Distinguish events wanted for their consequences, from events wanted locally.',
  textLength: '2936',
  alias: 'terminal_vs_instrumental',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2017-02-08 18:37:44',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2015-12-17 22:55:32',
  seeDomainId: '0',
  editDomainId: '123',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '121',
  text: '[summary:  In a human sense, we want some things for themselves ('terminally'), and other things because of their later consequences ('instrumentally').\n\nWhen we get into a car on the way to the airport, we're not doing that because we enjoy opening car doors for their own sake, but because we want to be somewhere else later.  This is 'instrumental value'.\n\nWhen we enjoy eating chocolate, then while other goods or evils might come later from having eaten the chocolate, at least the current moment of happiness will count towards the total sum of goodness in the system.  We don't derive our preference to eat the chocolate *only* from our beliefs about the future consequences of eating the chocolate.  This is 'terminal value'.\n\nAn [-agent]'s 'instrumental goals' are preferred because of their expected future consequences.  Conversely, 'terminal goals' can be evaluated as goals by considering only the local facts.\n\nIn terms of the [18v expected utility formalism], instrumental [18t utility] is a non-local calculation that depends on [4vr subjective probabilities] and distant parts of the event graph; terminal [109 utility] is evaluated locally on the objects of value inside a single possibility.]\n\n'Instrumental goals' or 'instrumental values' are things that an agent wants for the sake of achieving other things.  For example, we might want to get into a car, not because we enjoy the act of opening car doors for their own sake, but because we want to drive somewhere else.\n\n'Terminal' goals, values, or preferences are those where the preference is derived locally rather than by looking at further or distant consequences.  If you enjoy eating chocolate (and otherwise approve of this enjoyment, etcetera) then you aren't deriving your preference based on what you believe to be the *further consequences* of eating chocolate.\n\nImagine reality as an enormous web of events, linked by cause and effect.  "Terminal value" is usually local and can be evaluated at a single event inside the graph; even if it's a nonlocal good thing, we'd evaluate it by considering the history up to some point, and then we'd have a chunk of definite goodness that would stand on its own no matter what happened later.\n\n"Instrumental value" is a nonlocal property of an event, depending on its real or expected future, and contingent on that future; if you add up all the instrumental values on the graph, you don't get a meaningful sum because you may be double-counting some value.\n\nOn a moral or ethical level, instrumental values are justified by appealing to their consequences, while terminal values are justified without appeal to their consequences.\n\nFurther reading:\n\n- [41r]\n- http://lesswrong.com/lw/l4/terminal_values_and_instrumental_values/\n- http://wiki.lesswrong.com/wiki/Terminal_value#Terminal_vs._instrumental_values\n- http://en.wikipedia.org/wiki/Intrinsic_value_(ethics) and http://en.wikipedia.org/wiki/Instrumental_value',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-01-05 22:58:01',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky'
  ],
  childIds: [],
  parentIds: [
    'metaethics'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21977',
      pageId: 'terminal_vs_instrumental',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2017-02-08 18:37:44',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '11875',
      pageId: 'terminal_vs_instrumental',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-06-07 01:56:49',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '11872',
      pageId: 'terminal_vs_instrumental',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newParent',
      createdAt: '2016-06-07 01:48:16',
      auxPageId: 'metaethics',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4126',
      pageId: 'terminal_vs_instrumental',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-12-17 22:55:32',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}