{
  localUrl: '../page/3yz.html',
  arbitalUrl: 'https://arbital.com/p/3yz',
  rawJsonUrl: '../raw/3yz.json',
  likeableId: '0',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: '3yz',
  edit: '1',
  editSummary: '',
  prevEdit: '0',
  currentEdit: '1',
  wasPublished: 'true',
  type: 'comment',
  title: '"Suggest modifying the first..."',
  clickbait: '',
  textLength: '349',
  alias: '3yz',
  externalUrl: '',
  sortChildrenBy: 'recentFirst',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EricRogstad',
  editCreatedAt: '2016-06-01 22:53:01',
  pageCreatorId: 'EricRogstad',
  pageCreatedAt: '2016-06-01 22:53:01',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: 'If we now reconsider the arguments for "rescuing the utility function", we find that we have more choices beyond "looking for the closest thing to caloric" and "giving up entirely on warm feelings"\\.  An additional option is to try to \'rescue\' the intuitive sense of warmth, but not the caloric fluid and the explicit beliefs about it\\.  We could see this as retracing our steps after being led down a garden\\-path of bad reasoning \\- we started with the intuitively good feelings about warmth, came to believe a false model about the causes of warmth, reacted emotionally to this false model, and developed an explicit moral theory about caloric fluid\\.  The extrapolated\\-volition model of normativity \\(what we would want\\* if we knew all the facts, etcetera\\) suggests that we could see the reasoning after adopting the false caloric model as \'mistaken\' and not rescue it\\.  If we were dealing with preverbal intuitions and emotions that were in some way running skew to reality, our only alternatives would be to rescue the intuitions as best we could, or else give up entirely on that emotion and value\\.  When we\'re dealing with explicit moral beliefs that grew up around a false model of the world, we have the option to "rewind and rescue" rather than just "rescuing"\\.',
  anchorText: 'If we were dealing with preverbal intuitions and emotions that were in some way running skew to reality, our only alternatives',
  anchorOffset: '879',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '2628',
  text: 'Suggest modifying the first clause (possibly by moving the \'only\' from the second into the first) to make clearer that "dealing with preverbal intuitions" is being contrasted with the situation in the previous sentence where not rescuing was an option.\n\nOr just splitting this paragraph. I was starting to lose track of where I was by the end of it.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don\'t have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don\'t have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can\'t comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EricRogstad'
  ],
  childIds: [],
  parentIds: [
    'rescue_utility'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '11602',
      pageId: '3yz',
      userId: 'EricRogstad',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-06-01 22:53:01',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '11601',
      pageId: '3yz',
      userId: 'EricRogstad',
      edit: '1',
      type: 'newParent',
      createdAt: '2016-06-01 22:50:38',
      auxPageId: 'rescue_utility',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}