{
  localUrl: '../page/death_in_damascus.html',
  arbitalUrl: 'https://arbital.com/p/death_in_damascus',
  rawJsonUrl: '../raw/5qn.json',
  likeableId: '3343',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '1',
  dislikeCount: '0',
  likeScore: '1',
  individualLikes: [
    'EricBruylant'
  ],
  pageId: 'death_in_damascus',
  edit: '9',
  editSummary: '',
  prevEdit: '8',
  currentEdit: '9',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Death in Damascus',
  clickbait: 'Death tells you that It is coming for you tomorrow.  You can stay in Damascus or flee to Aleppo.  Whichever decision you actually make is the wrong one.  This gives some decision theories trouble.',
  textLength: '12821',
  alias: 'death_in_damascus',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'DuncanWilson',
  editCreatedAt: '2017-03-21 12:39:52',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-08-02 04:05:38',
  seeDomainId: '0',
  editDomainId: '123',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '495',
text: '[summary:  In the city of Damascus, a man encounters the skeletal visage of Death.  Death, upon seeing the man, looks surprised; but then says, "I ᴀᴍ ᴄᴏᴍɪɴɢ ғᴏʀ ʏᴏᴜ ᴛᴏᴍᴏʀʀᴏᴡ."  The terrified man buys a camel and flees to Aleppo.  After being killed in Aleppo by falling roof tiles, the man looks around and sees Death waiting.\n\n"I thought you would be looking for me in Damascus," says the man.\n\n"Nᴏᴛ ᴀᴛ ᴀʟʟ," says Death.  "Tʜᴀᴛ ɪs ᴡʜʏ I ᴡᴀs sᴜʀᴘʀɪsᴇᴅ ᴛᴏ sᴇᴇ ʏᴏᴜ ʏᴇsᴛᴇʀᴅᴀʏ, ғᴏʀ I ᴋɴᴇᴡ I ʜᴀᴅ ᴀɴ ᴀᴘᴘᴏɪɴᴛᴍᴇɴᴛ ᴡɪᴛʜ ʏᴏᴜ ɪɴ Aʟᴇᴘᴘᴏ."\n\nIn the Death in Damascus dilemma, we can either stay in Damascus or flee to Aleppo.  Death, an excellent predictor of human behavior, has informed us that whatever decision we end up making after being warned will be the wrong one.\n\nIf we decide to stay in Damascus, we conclude that staying in Damascus will be fatal.  If we observe ourselves fleeing to Aleppo, we conclude that we'll die if we go to Aleppo and that Damascus would be safe.\n\nThis standard dilemma can send some decision theories into infinite loops, while other decision theories break the loop in ways that (arguably) lead to other problems.]\n\nIn the city of Damascus, a man encounters the skeletal visage of Death.  Death, upon seeing the man, looks surprised; but then says, "I ᴀᴍ ᴄᴏᴍɪɴɢ ғᴏʀ ʏᴏᴜ ᴛᴏᴍᴏʀʀᴏᴡ."  The terrified man buys a camel and flees to Aleppo.  After being killed in Aleppo by falling roof tiles, the man looks around and sees Death waiting.\n\n"I thought you would be looking for me in Damascus," says the man.\n\n"Nᴏᴛ ᴀᴛ ᴀʟʟ," says Death.  "Tʜᴀᴛ ɪs ᴡʜʏ I ᴡᴀs sᴜʀᴘʀɪsᴇᴅ ᴛᴏ sᴇᴇ ʏᴏᴜ ʏᴇsᴛᴇʀᴅᴀʏ, ғᴏʀ I ᴋɴᴇᴡ I ʜᴀᴅ ᴀɴ ᴀᴘᴘᴏɪɴᴛᴍᴇɴᴛ ᴡɪᴛʜ ʏᴏᴜ ɪɴ Aʟᴇᴘᴘᴏ."\n\nIn the Death in Damascus dilemma for decision theories, Death has kindly informed us that whatever decision we end up making will, in fact, have been the wrong one.  It's not that Death follows us wherever we go, but that Death has helpfully predicted our future decision and found that our decision takes us to a city in which a fatal accident will occur to us.\n\nIf we observe ourselves deciding to stay in Damascus, we know that staying in Damascus will be fatal and that we would be safe if only we fled to Aleppo.  If we observe ourselves fleeing to Aleppo, we will conclude that we are to die in Aleppo for no reason other than that we fled there.\n\nThis dilemma can send some decision theories into infinite loops, while other decision theories break the loop in ways that (arguably) lead to other problems.\n\nFor a related dilemma with some of the same flavor of [ratification looking for a stable policy], without involving Death or other perfect predictors, see [5qh the Absent-Minded Driver].\n\n# Analysis\n\nDeath in Damascus is a standard problem in decision theory and has a sizable literature concerning it.  (We haven't found a good online collection, so [try this Google search](https://www.google.com/search?q=death%20in%20damascus%20decision) for some analyses within the mainstream view.)\n\n## Causal decision theory\n\nThe first-order version of [5n9 CDT] just considers counterfactuals--$\operatorname{do}()$ operations--on our possible actions, meaning that we don't update our background beliefs at all at the time of calculating our action.  It's not clear in this case what we should believe about Aleppo and Damascus after Death delivers Its warning, since that update would seem to require prior probabilities on our going to Aleppo or staying in Damascus.  Let's say that we thought we only had a 0.01% chance under normal circumstances of suddenly traveling to Aleppo; then after updating on Death's statement, we'll think that Damascus has a 99.99% chance of being fatal and Aleppo has a 0.01% chance of being the fatal city, and we'll flee to Aleppo.\n\nThis does deliver a prompt answer, but it involves a false calculation of expected utility--at the time of calculating the expected utilities in the decision, we think we have a 99.99% chance of surviving (since we think Aleppo is only 0.01% likely to prove fatal).  The actual number, by hypothesis, is 0%.
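\n\nA minimal sketch of this first-order calculation (the 0.01% prior is the illustrative figure from above; the code framing is our own, not part of the standard literature):\n\n```python\n# First-order CDT: update background beliefs once on Death's warning,\n# then evaluate do(stay) and do(flee) against those *fixed* beliefs.\n\nprior_flee = 0.0001  # assumed prior chance of suddenly traveling to Aleppo\n\n# Death's warning means: whichever city we end up in is the fatal one.\np_damascus_fatal = 1 - prior_flee  # 0.9999\np_aleppo_fatal = prior_flee        # 0.0001\n\nbelieved_survival = {\n    "stay": 1 - p_damascus_fatal,  # 0.0001\n    "flee": 1 - p_aleppo_fatal,    # 0.9999\n}\n\ndecision = max(believed_survival, key=believed_survival.get)\nprint(decision, believed_survival[decision])  # flee 0.9999\n\n# By hypothesis, Death has already predicted this decision, so the\n# actual survival probability is 0%, not 99.99%.\n```\n\n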
In turn, this could let a mischievous bookie pump money out of the CDT agent.  Suppose that besides choosing between Aleppo and Damascus, the agent also needs to choose whether to buy a ticket that costs \$1, and pays out \$11 if the agent survives.  This is a good bet if you have a 99.99% chance of survival; not so much if you have a 0% chance of survival.\n\nWe can suppose the agent must choose both $D$ vs $A$ for Damascus vs. Aleppo, and simultaneously choose $Y$ vs $N$ for whether to yes-buy or not-buy the \$1 ticket that pays \$11 if the agent survives.  That is, the agent is facing four buttons $DY, AY, DN, AN$ and this outcome table:\n\n$$\n\begin{array}{r|c|c}\n& \text{Damascus fatal} & \text{Aleppo fatal} \\ \hline\nDN & \text{Die} & \text{Live} \\ \hline\nAN & \text{Live} & \text{Die} \\ \hline\nDY & \text{Die, \$-1} & \text{Live, \$+10} \\ \hline\nAY & \text{Live, \$+10} & \text{Die, \$-1}\n\end{array}\n$$\n\nA causal decision theory that doesn't update its background beliefs at all while making the decision will select $AY$ instead of $AN$.  (And then the CDT agent predictably updates afterwards to thinking that the ticket is worthless, so we can buy the ticket back for \$0.01 at a profit of \$0.99, justifying our regarding this as a "money pump".)\n\nA first response would be to allow the CDT agent to [tickle_defense observe its own initial impulse], try updating the background variables accordingly, and then reconsider its decision until it finds a decision that is stable or [ratification "self-ratifying"].\n\nThis deals with the [newcombs_tax Newcomb's Tax] dilemma, but isn't sufficient for Death in Damascus, since there is no deterministic self-ratifying decision on this problem--the decision theory goes into an infinite loop as it believes that Damascus is fatal and feels an impulse to go to Aleppo, updates to believe that Aleppo is fatal and observes an impulse to stay in Damascus, etcetera.\n\nThe standard reply is to allow the decision theory to break loops like this by deploying mixed strategies.  At the point where the agent thinks it will deploy the mixed strategy of staying in Damascus with 50% probability and going to Aleppo with 50% probability, any possible probabilistic mix of "stay in Damascus" and "flee to Aleppo" will seem equally attractive, with a 50% probability of dying given either decision.  We then modify the theory of CDT to add the rule that in cases like this, we output a self-consistent policy if one is found.  (This does require an extra rule, because the policy of {0.5 stay, 0.5 flee} is not the *only* policy that seems acceptable at the self-consistent point--*all* policies seem acceptable there--so the agent needs an explicit instruction to stop at that point and output the self-consistent policy.)
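\n\nAs a rough sketch of these ratification dynamics (assuming, as throughout, that Death's prediction exactly mirrors the agent's actual mixture):\n\n```python\n# Ratificationist CDT as a toy best-response iteration.\n\ndef best_response(p_flee):\n    """Given a believed probability p_flee of fleeing (which Death's\n    prediction mirrors), return the flee-probability that maximizes\n    believed survival: we survive staying with probability p_flee,\n    and survive fleeing with probability 1 - p_flee."""\n    if 1 - p_flee > p_flee:\n        return 1.0  # Aleppo looks safer: flee for sure\n    if 1 - p_flee < p_flee:\n        return 0.0  # Damascus looks safer: stay for sure\n    return p_flee   # indifferent: every mixture looks equally good\n\n# Deterministic ratification oscillates forever: 1.0, 0.0, 1.0, 0.0, ...\np = 0.0\nfor _ in range(6):\n    p = best_response(p)\n\n# The mixed strategy {0.5 stay, 0.5 flee} is the unique stable point--but\n# note that *at* this point all policies tie, hence the extra stopping rule.\nassert best_response(0.5) == 0.5\n```\n\n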
This is a standard addendum to CDT and also appears in e.g. the most widely accepted resolution for the [5qh].  But in this case, in addition to the worry that the extra rule in CDT could be taken as strange (why pick one policy at a point where all policies seem equally attractive?), we need to deal with two further concerns:\n\n• That the agent will immediately reverse course as soon as it notices itself fleeing to Aleppo and reconsiders this decision a second later.\n\n• (Raised by [2 Yudkowsky] in personal conversation with James M. Joyce.)  That this version of the agent will still buy for \$1 a ticket that pays \$11 if it survives, if it's offered that choice as part of the stay/flee decision.  That is, the agent stabilizes on the policy {0.5 DY, 0.5 AY} instead of the policy {0.5 DN, 0.5 AN} if it's offered all four choices.\n\n%%comment:  This objection was raised by Eliezer Yudkowsky in personal conversation with James M. Joyce at "Self-prediction in Decision Theory and Artificial Intelligence" at Cambridge 2015.  Joyce was suggesting a particular formalism for a self-ratifying CDT.  The conversation went something like the following:\n\nYudkowsky:  I think this agent is irrational, because at the point where it makes the decision to stay or flee with 0.5:0.5 probability, it thinks it has a 50% chance of survival.\n\nJoyce:  I think that's rational.  Maybe after the decision the agent realizes it won't survive, but it has no way of knowing that at the time it makes the decision.\n\nYudkowsky:  Hm.  (Goes off and thinks.)  (Returns.)  Your agent is irrational and I can pump money out of it by offering to sell it for \$1 a ticket that pays a net \$10 if it survives.\n\nJoyce:  That's because from your epistemic vantage point outside the agent, you know something the agent doesn't.  Obviously you can win bets against the agent when you're allowed to bet with knowledge it doesn't have.\n\nYudkowsky:  (Thinks.)  Your agent knows in advance that it can be money-pumped and it will pay me \$0.50 not to offer to sell it a ticket later.  So I claim that it clearly *can* know the thing you say it can't know at the time of making the decision.\n\nJoyce:  I disagree, but let me think about it.\n\n(Commented out because it would be unfair to cite this conversation without running it past Joyce, plus he may have come up with a further reply since then.)\n\n%%\n\n## Evidential decision theory\n\n[5px Evidential decision theory] evaluates its expected utility as "doomed" whether it flees to Aleppo or stays in Damascus, and will choose whichever option corresponds to spending its last days more comfortably.\n\n## Logical decision theory\n\nAn agent using the standard [5rz updateless] form of [58b logical decision theory] responds by asking:  "How exactly does Death decide whether to speak to someone?"\n\nIt's not causally possible for Death to *always* tell people when a natural death is approaching, regardless of the person's policy.  For example, there could be someone who will die if they stay in Damascus, but whose disposition causes them to flee to Aleppo (where no death waits) if they are warned.
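\n\nA small sketch of this consistency check (the disposition function and its framing are illustrative, not drawn from the literature):\n\n```python\n# Death can consistently deliver a warning only if the agent's response\n# to the warning still lands them in the fatal city.\n\ndef warning_is_consistent(disposition, fatal_city):\n    """disposition(warned) -> the city the agent ends up in."""\n    return disposition(warned=True) == fatal_city\n\n# An agent who flees to Aleppo if and only if warned:\ndef flee_on_warning(warned):\n    return "Aleppo" if warned else "Damascus"\n\n# If the fatal accident waits in Damascus, warning this agent sends them\n# to Aleppo and falsifies the warning--so no consistent warning exists:\nprint(warning_is_consistent(flee_on_warning, "Damascus"))  # False\nprint(warning_is_consistent(flee_on_warning, "Aleppo"))    # True\n```\n\n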
Two possible rules for Death would be as follows:\n\nRule K:\n\n- Each day, check whether telling a person that they have an appointment with Death will cause them to die the next day.\n- If so, tell them they have an appointment with Death the next day.\n- If not, remain silent, even if this means the person dies with no warning.\n\nRule L:\n\n- Each day, check whether telling a person that they have an appointment with Death will cause them to die the next day.\n- If so, tell them they have an appointment with Death the next day.\n- If not, remain silent and *don't kill them.*\n\nSince the [5rz UDT]-optimal policy differs depending on whether Death follows Rule K or Rule L, we need at least a prior probability distribution over which rule Death follows.  As Bayesians, we can just guess this probability if we don't have authoritative information, but we need to guess something to proceed.\n\nIf Death follows Rule K, the [5rz UDT] reply is to stay in Damascus, and this is decisively optimal--definitely superior to the option of fleeing to Aleppo!  If you always flee to Aleppo on a warning, then you are killed by any fatal event that could occur in Aleppo (Death gives you a warning, you flee, you die).  You are also killed by any fatal event that could occur in Damascus (Death checks whether It can consistently warn you, finds that It can't, stays silent, and collects you in Damascus the next day).  You will be aware, on receiving the warning, that Death awaits you in Damascus; but you'll also be aware that if-counterfactually you were the sort of person who flees to Aleppo on warning, you would have received no warning today, and might have died in Aleppo some time ago.\n\nIf Death follows Rule L, you should, upon receiving Death's warning, hide yourself in the safest possible circumstances--perhaps near the emergency room of a well-managed hospital, under medical supervision.  You'll still expect to die after taking this precaution--something fatal will happen to you despite all the nearby doctors.  However, by being the sort of person who acts like this on receiving a warning from Death, you minimize Death's probability of collecting you on any given day.  You know that if-counterfactually you were the sort of person whose algorithm's logical output says to stay in Damascus after receiving a warning, you would probably have been killed earlier in Damascus, where potentially fatal events crop up more frequently.\n\nAn [5rz updateless] LDT agent computes this reply in one sweep, without needing to observe itself or search for a self-ratifying answer.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: [
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0'
  ],
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {
Summary: 'In the city of Damascus, a man encounters the skeletal visage of Death.  Death, upon seeing the man, looks surprised; but then says, "I ᴀᴍ ᴄᴏᴍɪɴɢ ғᴏʀ ʏᴏᴜ ᴛᴏᴍᴏʀʀᴏᴡ."  The terrified man buys a camel and flees to Aleppo.  After being killed in Aleppo by falling roof tiles, the man looks around and sees Death waiting.\n\n"I thought you would be looking for me in Damascus," says the man.\n\n"Nᴏᴛ ᴀᴛ ᴀʟʟ," says Death.  "Tʜᴀᴛ ɪs ᴡʜʏ I ᴡᴀs sᴜʀᴘʀɪsᴇᴅ ᴛᴏ sᴇᴇ ʏᴏᴜ ʏᴇsᴛᴇʀᴅᴀʏ, ғᴏʀ I ᴋɴᴇᴡ I ʜᴀᴅ ᴀɴ ᴀᴘᴘᴏɪɴᴛᴍᴇɴᴛ ᴡɪᴛʜ ʏᴏᴜ ɪɴ Aʟᴇᴘᴘᴏ."\n\nIn the Death in Damascus dilemma, we can either stay in Damascus or flee to Aleppo.  Death, an excellent predictor of human behavior, has informed us that whatever decision we end up making after being warned will be the wrong one.\n\nIf we decide to stay in Damascus, we conclude that staying in Damascus will be fatal.  If we observe ourselves fleeing to Aleppo, we conclude that we'll die if we go to Aleppo and that Damascus would be safe.\n\nThis standard dilemma can send some decision theories into infinite loops, while other decision theories break the loop in ways that (arguably) lead to other problems.'
  },
  creatorIds: [
    'EliezerYudkowsky',
    'DuncanWilson'
  ],
  childIds: [],
  parentIds: [
    'newcomblike'
  ],
  commentIds: [
    '8pw'
  ],
  questionIds: [],
  tagIds: [
    'b_class_meta_tag'
  ],
  relatedIds: [],
  markIds: [],
  explanations: [
    {
      id: '5818',
      parentId: 'death_in_damascus',
      childId: 'death_in_damascus',
      type: 'subject',
      creatorId: 'EliezerYudkowsky',
      createdAt: '2016-08-02 04:08:53',
      level: '3',
      isStrong: 'true',
      everPublished: 'true'
    }
  ],
  learnMore: [],
  requirements: [
    {
      id: '5788',
      parentId: 'causal_dt',
      childId: 'death_in_damascus',
      type: 'requirement',
      creatorId: 'EliezerYudkowsky',
      createdAt: '2016-08-02 00:27:28',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    },
    {
      id: '5820',
      parentId: 'logical_dt',
      childId: 'death_in_damascus',
      type: 'requirement',
      creatorId: 'EliezerYudkowsky',
      createdAt: '2016-08-02 04:09:21',
      level: '1',
      isStrong: 'false',
      everPublished: 'true'
    }
  ],
  subjects: [
    {
      id: '5818',
      parentId: 'death_in_damascus',
      childId: 'death_in_damascus',
      type: 'subject',
      creatorId: 'EliezerYudkowsky',
      createdAt: '2016-08-02 04:08:53',
      level: '3',
      isStrong: 'true',
      everPublished: 'true'
    },
    {
      id: '5819',
      parentId: 'causal_dt',
      childId: 'death_in_damascus',
      type: 'subject',
      creatorId: 'EliezerYudkowsky',
      createdAt: '2016-08-02 04:09:06',
      level: '3',
      isStrong: 'false',
      everPublished: 'true'
    },
    {
      id: '5821',
      parentId: 'logical_dt',
      childId: 'death_in_damascus',
      type: 'subject',
      creatorId: 'EliezerYudkowsky',
      createdAt: '2016-08-02 04:09:52',
      level: '2',
      isStrong: 'false',
      everPublished: 'true'
    }
  ],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {
    '58b': [
      '5s0'
    ],
    '5n9': [
      '5qh'
    ]
  },
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '22347',
      pageId: 'death_in_damascus',
      userId: 'DuncanWilson',
      edit: '9',
      type: 'newEdit',
      createdAt: '2017-03-21 12:39:52',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18080',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '8',
      type: 'newEdit',
      createdAt: '2016-08-02 05:31:44',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18079',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '7',
      type: 'newEdit',
      createdAt: '2016-08-02 04:21:27',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18078',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '6',
      type: 'newEdit',
      createdAt: '2016-08-02 04:21:03',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18077',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '5',
      type: 'newEdit',
      createdAt: '2016-08-02 04:19:50',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18076',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2016-08-02 04:18:28',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18075',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-08-02 04:12:36',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18074',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newSubject',
      createdAt: '2016-08-02 04:09:52',
      auxPageId: 'logical_dt',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18072',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-08-02 04:09:22',
      auxPageId: 'logical_dt',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18071',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newSubject',
      createdAt: '2016-08-02 04:09:06',
      auxPageId: 'causal_dt',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18068',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTeacher',
      createdAt: '2016-08-02 04:08:54',
      auxPageId: 'death_in_damascus',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18069',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newSubject',
      createdAt: '2016-08-02 04:08:54',
      auxPageId: 'death_in_damascus',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18067',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-08-02 04:08:46',
      auxPageId: 'b_class_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18066',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-08-02 04:08:38',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18064',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-08-02 04:08:25',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18061',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-08-02 04:05:40',
      auxPageId: 'newcomblike',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18062',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-08-02 04:05:40',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18063',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-08-02 04:05:40',
      auxPageId: 'causal_dt',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18059',
      pageId: 'death_in_damascus',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-08-02 04:05:38',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}