# Task-directed AGI

*An advanced AI that's meant to pursue a series of limited-scope goals given it by the user. In Bostrom's terminology, a Genie.*

[summary: A task-based AGI or "genie" is an AGI [6h intended] to follow a series of human orders, rather than autonomously pursuing long-term goals. A Task AGI might be easier to render safe, since:

- It's possible to [2qq query the user] before and during a Task.
- [4mn Tasks are satisficing] - they're of limited scope and can be fully accomplished using limited effort. (In other words, Tasks should not become more and more accomplished as more and more effort is put into them.)
- Adequately [2rz identifying] what it means to safely "cure cancer" might be simpler than adequately identifying [55 all normative value].
- Task AGIs can be limited in various ways, rather than self-improving as far as possible, so long as they can still carry out at least some [6y pivotal] Tasks.

The obvious disadvantage of a Task AGI is [2sb moral hazard] - it may tempt the users in ways that an autonomous AI would not.

The problem of making a safe Task AGI invokes numerous subtopics such as [2pf low impact], [2r8 mild optimization], and [2qp conservatism], as well as numerous standard AGI safety problems like [ goal identification] and [1fx reflective stability].]

A task-based AGI is an AGI [6h intended] to follow a series of human-originated orders, each of limited scope - "satisficing" in the sense that they can be [4mn accomplished using bounded amounts of effort and resources] (as opposed to goals that become better and better fulfilled as more and more effort is expended).

In [1g0 Bostrom's typology], this is termed a "Genie". It contrasts with a "Sovereign" AGI that acts autonomously in the pursuit of long-term real-world goals.

Building a safe Task AGI might be easier than building a safe Sovereign for the following reasons:

- A Task AGI can be "online": the AGI can potentially [2qq query the user] before and during Task performance (assuming an ambiguous situation arises and is successfully identified as ambiguous).
- A Task AGI can potentially be [5b3 limited] in various ways, since a Task AGI doesn't need to be *as powerful as possible* in order to accomplish its limited-scope Tasks. A Sovereign would presumably engage in all-out self-improvement. (This isn't to say Task AGIs would automatically not self-improve, only that it's possible *in principle* to limit the power of a Task AGI to the level required to do the targeted Tasks, *if* the associated safety problems can be solved.)
- Tasks, by assumption, are limited in scope - they can be accomplished and done, inside some limited region of space and time, using some limited amount of effort, after which the Task is complete. (To gain this advantage, a state of Task accomplishment should not go higher and higher in preference as more and more effort is expended on it open-endedly. A toy sketch of this bounded-preference idea appears after this list.)
- Assuming that users can figure out [6h intended goals] for the AGI that are [55 valuable] and [6y pivotal], the [2rz identification problem] of describing what constitutes a safe performance of that Task might be simpler than giving the AGI a [ complete description] of [55 normativity in general]. That is, the problem of communicating to an AGI an adequate description of "cure cancer" (without killing patients or causing other side effects), while still difficult, might be simpler than an adequate description of all normative value. Task AGIs fall on the narrow side of [1vt].
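As a purely illustrative sketch (not part of the original discussion, with invented names and numbers), the bounded-preference idea above can be pictured as a task score that saturates at a target rather than increasing without limit:

```python
# Toy sketch of a "bounded" Task preference (hypothetical illustration):
# the score saturates once the target is reached, so additional effort
# past that point buys no additional preference.

def open_ended_score(amount_accomplished: float) -> float:
    """An open-ended objective: more accomplishment is always strictly better."""
    return amount_accomplished


def bounded_task_score(amount_accomplished: float, target: float = 1000.0) -> float:
    """A limited-scope Task: preference stops increasing once the target is met."""
    return min(amount_accomplished, target)


if __name__ == "__main__":
    for amount in (500.0, 1000.0, 1e9):
        print(amount, open_ended_score(amount), bounded_task_score(amount))
    # bounded_task_score is identical at 1000.0 and at 1e9:
    # the Task is simply "done" rather than ever more accomplished.
```

A capped score like this is only a fragment of the actual problem - for instance, an expected-utility maximizer could still expend open-ended effort driving up its *probability* of having hit the target - which is part of why subproblems like [2r8 mild optimization] and [2pf low impact] appear below.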
Relative to the problem of building a Sovereign, trying to build a Task AGI instead might step down the problem from "impossibly difficult" to "insanely difficult", while still maintaining enough power in the AI to perform [6y pivotal acts].

The obvious disadvantage of a Task AGI is [2sb moral hazard] - it may tempt the users in ways that a Sovereign would not. A Sovereign has moral hazard chiefly during the development phase, when the programmers and users are perhaps not yet in a position of special relative power. A Task AGI has ongoing moral hazard as it is used.

[2] has suggested that people only confront many important problems in value alignment when they are thinking about Sovereigns, but that at the same time, Sovereigns may be impossibly hard in practice. Yudkowsky advocates that people think about Sovereigns first and list out all the associated issues before stepping down their thinking to Task AGIs, because thinking about Task AGIs may result in premature pruning, while thinking about Sovereigns is more likely to generate a complete list of problems that can then be checked against particular Task AGI approaches to see if those problems have become any easier.

Three distinguished subtypes of Task AGI are:

- **[6x Oracles]**, which are intended only to answer questions, possibly from some restricted question set.
- **[1fy Known-algorithm AIs]**, which are not self-modifying, or only very weakly self-modifying, such that their algorithms and representations are mostly known and mostly stable.
- **[102 Behaviorist Genies]**, which are meant not to model human minds, or to model them in only very limited ways, while having great material understanding (e.g., potentially the ability to invent and deploy nanotechnology).
# Subproblems

The problem of making a safe genie invokes numerous subtopics such as [2pf low impact], [2r8 mild optimization], and [2qp conservatism], as well as numerous standard AGI safety problems like [1fx reflective stability] and safe [6c identification] of [6h intended goals].

([2mx See here for a separate page on open problems in Task AGI safety that might be ready for current research.])

Some further problems beyond those appearing in the page above are:

- **Oracle utility functions** (that make the Oracle not wish to leave its box or optimize its programmers)
- **Effable optimization** (the opposite of [9f cognitive uncontainability])
- **Online checkability**
    - Explaining things to programmers [30b without putting the programmers inside an argmax] for how well you are 'explaining' things to them
- **Transparency**
- [2s1 Do What I Mean]
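As a second purely illustrative sketch (hypothetical, not taken from this page or its subpages), the user-querying and task-identification subproblems above can be caricatured as deferring to the user whenever the system's leading interpretations of an instruction are too close to call:

```python
# Hypothetical caricature of "online" user querying under ambiguous task
# identification; the function, threshold, and example are invented here.
from typing import Dict


def choose_or_ask(interpretation_probs: Dict[str, float],
                  ambiguity_margin: float = 0.2) -> str:
    """Act on the leading interpretation of an instruction, unless the
    runner-up is close enough that the instruction should count as ambiguous,
    in which case defer to the user instead of acting."""
    ranked = sorted(interpretation_probs.items(), key=lambda kv: kv[1], reverse=True)
    (top_meaning, top_p), (alt_meaning, alt_p) = ranked[0], ranked[1]
    if top_p - alt_p < ambiguity_margin:
        return f"ASK USER: did you mean '{top_meaning}' or '{alt_meaning}'?"
    return f"ACT ON: '{top_meaning}'"


print(choose_or_ask({
    "cure this patient's cancer": 0.55,
    "remove every cell that could ever become cancerous": 0.45,
}))
# -> ASK USER: ...  (too close to call, so the system defers rather than acts)
```

The hard part, as noted earlier on this page, is not the deferral rule itself but reliably noticing that a situation is ambiguous (and generating candidate interpretations at all), which is why this is listed as a subproblem rather than a solution.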