{
  localUrl: '../page/6pp.html',
  arbitalUrl: 'https://arbital.com/p/6pp',
  rawJsonUrl: '../raw/6pp.json',
  likeableId: '3751',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '5',
  dislikeCount: '0',
  likeScore: '5',
  individualLikes: [
    'AlexeiAndreev',
    'EricBruylant',
    'TravisRivera',
    'EricRogstad',
    'ChaseRoycroft'
  ],
  pageId: '6pp',
  edit: '4',
  editSummary: '',
  prevEdit: '3',
  currentEdit: '4',
  wasPublished: 'true',
  type: 'wiki',
  title: 'On the importance of Less Wrong, or another single conversational locus',
  clickbait: '',
  textLength: '4900',
  alias: '6pp',
  externalUrl: 'http://lesswrong.com/lw/o5z/on_the_importance_of_less_wrong_or_another_single/',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'AlexeiAndreev',
  editCreatedAt: '2016-12-13 21:38:55',
  pageCreatorId: 'AlexeiAndreev',
  pageCreatedAt: '2016-12-02 18:19:54',
  seeDomainId: '0',
  editDomainId: '2069',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'false',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '29',
  text: '[summary: In this post, [@33] talks about how [6pn] used to be a locus of discussion, and argues that it is worth seeing whether Less Wrong or some similar place could be a viable locus again, in order to help with reducing [-existential_risk].]\n\n This post discusses the following claims:\n\n1. [claim([6tm])]\n> The world is locked right now in a [deadly](http://www.existential-risk.org/) [puzzle](https://books.google.com/books/about/Superintelligence.html?id=7_H8AwAAQBAJ), and needs something like a miracle of good thought if it is to have the survival odds one might wish the world to have.\n\n2. [claim([6tn])]\n>  Despite all priors and appearances, our little community (the\n> "aspiring rationality" community; the "effective altruist" project;\n> efforts to create an existential win; etc.) has a shot at seriously\n> helping with this puzzle.  This sounds like hubris, but it is at this\n> point at least partially a matter of track record. %%note: By track\n> record, I have in mind most obviously that AI risk is now relatively\n> credible and mainstream, and that this seems to have been due largely\n> to (the direct + indirect effects of) Eliezer, Nick Bostrom, and\n> others who were poking around the general aspiring rationality and\n> effective altruist space in 2008 or so, with significant help from the\n> extended communities that eventually grew up around this space.  More\n> controversially, it seems to me that this set of people has probably\n> (though not indubitably) helped with locating specific angles of\n> traction around these problems that are worth pursuing; with locating\n> other angles on existential risk; and with locating techniques for\n> forecasting/prediction (e.g., there seems to be similarity between the\n> techniques already being practiced in this community, and those Philip\n> Tetlock documented as working). %%\n\n3. [claim([6tq])]\n> To aid in solving this puzzle, we must probably find a way to think\n> together, accumulatively. We need to think about technical problems in\n> AI safety, but also about the full surrounding context -- everything\n> to do with understanding what the heck kind of a place the world is,\n> such that that kind of place may contain cheat codes and trap doors\n> toward achieving an existential win. We probably also need to think\n> about "ways of thinking" -- both the individual thinking skills, and\n> the community conversational norms, that can cause our puzzle-solving\n> to work better. %%note: Again, it may seem somewhat hubristic to claim\n> that a relatively small community can usefully add to the world\'s\n> analysis across a broad array of topics (such as the summed topics\n> that bear on "How do we create an existential win?").  But it is\n> generally smallish groups (rather than widely dispersed millions of\n> people) that can actually bring analysis together; history has often\n> involved relatively small intellectual circles that make concerted\n> progress; and even if things are already known that bear on how to\n> create an existential win, one must probably still combine and\n> synthesize that understanding into a smallish set of people that can\n> apply the understanding to AI (or what have you).%%\n\n4. [claim([6tr])]\n> One feature that is pretty helpful here, is if we somehow maintain a\n> single "conversation", rather than a bunch of people separately having\n> thoughts and sometimes taking inspiration from one another.  By "a\n> conversation", I mean a space where people can e.g. reply to one\n> another; rely on shared jargon/shorthand/concepts; build on arguments\n> that have been established in common as probably-valid; point out\n> apparent errors and then have that pointing-out be actually taken into\n> account or else replied-to.\n\n5. [claim([6ts])]\n> One feature that really helps things be "a conversation" in this way,\n> is if there is a single Schelling set of posts/etc. that people (in\n> the relevant community/conversation) are supposed to read, and can be\n> assumed to have read.  Less Wrong used to be such a place; right now\n> there is no such place; it seems to me highly desirable to form a new\n> such place if we can.\n\n6. [claim([6tt])]\n> We have lately ceased to have a "single conversation" in this way.\n> Good content is still being produced across these communities, but\n> there is no single locus of conversation, such that if you\'re in a\n> gathering of e.g. five aspiring rationalists, you can take for granted\n> that of course everyone has read posts such-and-such.  There is no one\n> place you can post to, where, if enough people upvote your writing,\n> people will reliably read and respond (rather than ignore), and where\n> others will call them out if they later post reasoning that ignores\n> your evidence.  Without such a locus, it is hard for conversation to\n> build in the correct way.  (And hard for it to turn into arguments and\n> replies, rather than a series of non sequiturs.)',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '0',
  maintainerCount: '0',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don\'t have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don\'t have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can\'t comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'AlexeiAndreev',
    'EricRogstad'
  ],
  childIds: [
    '6tm',
    '6tn',
    '6tq',
    '6tr',
    '6ts',
    '6tt'
  ],
  parentIds: [],
  commentIds: [
    '6ty'
  ],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20793',
      pageId: '6pp',
      userId: 'AlexeiAndreev',
      edit: '4',
      type: 'newEdit',
      createdAt: '2016-12-13 21:38:55',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20761',
      pageId: '6pp',
      userId: 'AlexeiAndreev',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-12-13 16:46:03',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20753',
      pageId: '6pp',
      userId: 'EricRogstad',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-12-13 06:00:46',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20749',
      pageId: '6pp',
      userId: 'EricRogstad',
      edit: '0',
      type: 'newChild',
      createdAt: '2016-12-13 05:58:40',
      auxPageId: '6tv',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20746',
      pageId: '6pp',
      userId: 'EricRogstad',
      edit: '0',
      type: 'newChild',
      createdAt: '2016-12-13 05:57:51',
      auxPageId: '6tt',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20743',
      pageId: '6pp',
      userId: 'EricRogstad',
      edit: '0',
      type: 'newChild',
      createdAt: '2016-12-13 05:56:44',
      auxPageId: '6ts',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20740',
      pageId: '6pp',
      userId: 'EricRogstad',
      edit: '0',
      type: 'newChild',
      createdAt: '2016-12-13 05:55:49',
      auxPageId: '6tr',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20737',
      pageId: '6pp',
      userId: 'EricRogstad',
      edit: '0',
      type: 'newChild',
      createdAt: '2016-12-13 05:53:52',
      auxPageId: '6tq',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20728',
      pageId: '6pp',
      userId: 'EricRogstad',
      edit: '0',
      type: 'newChild',
      createdAt: '2016-12-13 05:47:21',
      auxPageId: '6tn',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20724',
      pageId: '6pp',
      userId: 'EricRogstad',
      edit: '0',
      type: 'newChild',
      createdAt: '2016-12-13 05:45:40',
      auxPageId: '6tm',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [
    {
      domainId: '2069',
      pageId: '6pp',
      submitterId: 'AlexeiAndreev',
      createdAt: '2016-12-02 18:19:54',
      score: '63.597012925897985',
      featuredCommentId: ''
    }
  ],
  searchStrings: {},
  hasChildren: 'true',
  hasParents: 'false',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}