# Standard agent properties

### Boundedly rational agents

- Have probabilistic models of the world.
- Update those models in response to sensory information.
    - The ideal algorithm for updating is Bayesian inference, but this requires too much computing power, so a bounded agent must use some bounded alternative.
    - Implicitly, we assume the agent has some equivalent of a complexity-penalizing prior or Occam's Razor. Without this, specifying Bayesian inference does not much constrain the end results of epistemic reasoning.
- Have preferences over events or states of the world, quantifiable by a utility function that maps those events or states onto a scalar field.
    - These preferences must be quantitative, not just ordered, in order to combine with epistemic states of uncertainty (probabilities).
- Are consequentialist: they evaluate the expected consequences of actions and choose among actions based on preference among their expected consequences.
    - Bounded agents cannot evaluate all possible actions, and hence cannot obtain literal maxima of expected utility except in very simple cases (see the sketch after this list).
- Act in real time in a noisy, uncertain environment.
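As a deliberately toy illustration of how these ingredients fit together, here is a minimal Python sketch of a boundedly rational agent: a small, complexity-penalized prior over explicit hypotheses, an approximate Bayesian update, and expected-utility choice over a bounded sample of actions. All names and numbers here (`Hypothesis`, `BoundedAgent`, the bit-cost penalty, the action budget) are invented for illustration, not part of the page's definition.

```python
import random
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Hypothesis:
    """One candidate world-model, carrying a crude complexity penalty."""
    name: str
    complexity_bits: float              # Occam penalty: prior weight ~ 2 ** (-bits)
    likelihood: Callable[[str], float]  # P(observation | this hypothesis)
    outcome: Callable[[str], str]       # predicted consequence of taking an action


class BoundedAgent:
    def __init__(self, hypotheses: List[Hypothesis], utility: Callable[[str], float]):
        # Complexity-penalizing prior over the hypothesis set.
        weights = [2.0 ** (-h.complexity_bits) for h in hypotheses]
        total = sum(weights)
        self.beliefs: Dict[str, float] = {h.name: w / total for h, w in zip(hypotheses, weights)}
        self.hypotheses = {h.name: h for h in hypotheses}
        self.utility = utility

    def update(self, observation: str) -> None:
        """Bayesian update restricted to the small, explicit hypothesis set."""
        posterior = {name: p * self.hypotheses[name].likelihood(observation)
                     for name, p in self.beliefs.items()}
        z = sum(posterior.values()) or 1.0
        self.beliefs = {name: p / z for name, p in posterior.items()}

    def choose_action(self, candidate_actions: List[str], budget: int = 3) -> str:
        """Consequentialist choice over a bounded sample of actions, not all of them."""
        sampled = random.sample(candidate_actions, min(budget, len(candidate_actions)))

        def expected_utility(action: str) -> float:
            return sum(p * self.utility(self.hypotheses[name].outcome(action))
                       for name, p in self.beliefs.items())

        return max(sampled, key=expected_utility)
```

Real designs replace the explicit hypothesis list with learned models and the random sample with search, but the division of labor (prior, update, utility, bounded choice) is the same.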
For the arguments that sufficiently intelligent agents will appear to us as boundedly rational agents in some sense, see:

- [29 Relevant powerful agents will be highly optimized]
- [21 Sufficiently optimized agents appear coherent]

### Economic agents

- Achieve their goals by efficiently allocating limited resources, including, e.g., time, money, or negentropy;
- Try to find new paths that route around obstacles to goal achievement;
- Predict the actions of other agents;
- Try to coordinate with, manipulate, or hinder other agents (in accordance with the agent's own goals or utilities);
- Respond to both negative incentives (penalties) and positive incentives (rewards) by planning accordingly, and may also pursue strategies, unforeseen by the creators of the incentive framework, for avoiding penalties or gaining rewards (see the toy example below).
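The allocation and incentive points above can be made concrete with a toy search over bundles of plans under a fixed resource budget. The plan names, costs, and values below are invented; the `exploit_bonus_rule` entry stands in for a reward path the incentive framework's designers did not foresee.

```python
from itertools import combinations

# (plan name, resource cost, value toward the agent's goals) -- all numbers invented
plans = [
    ("ship_product",       5, 10.0),
    ("negotiate_discount", 2,  4.0),
    ("exploit_bonus_rule", 1,  6.0),  # reward path unforeseen by the rule's designers
    ("build_tooling",      4,  5.0),
]
budget = 7  # the limited resource: time, money, negentropy, ...

# Exhaustively search bundles of plans and keep the best one that fits the budget.
best_bundle, best_value = (), 0.0
for r in range(1, len(plans) + 1):
    for bundle in combinations(plans, r):
        cost = sum(c for _, c, _ in bundle)
        value = sum(v for _, _, v in bundle)
        if cost <= budget and value > best_value:
            best_bundle, best_value = bundle, value

print([name for name, _, _ in best_bundle], best_value)
# -> ['ship_product', 'exploit_bonus_rule'] 16.0
```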
### Naturalistic agents

- Naturalistic agents are embedded in a larger universe and are made of the same material as other things in the universe (the wavefunction, on our current beliefs about physics).
- A naturalistic agent's uncertainty about the environment is uncertainty about which natural universe embeds it (what material structure underlies its available sensory and introspective data).
- Some of the actions available to naturalistic agents potentially alter their sensors, actuators, or computing substrate (see the sketch below).
- Sufficiently powerful naturalistic agents may construct other agents out of resources available to them internally or in their environment, or extend their intelligence into outside computing resources.
- A naturalistic agent's sensing, cognitive, and decision/action capabilities may be distributed over space, time, and multiple substrates; the applicability of the 'agent' concept does not require a small local robot body.
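A very small toy can make the embedding point concrete: below, the agent's sensor noise and decision policy are ordinary fields of the world state, so some actions change the agent itself, and the agent's evidence arrives through hardware that is likewise part of the world. Everything here (`World`, the action names) is invented for illustration.

```python
import random
from dataclasses import dataclass


@dataclass
class World:
    """One candidate world state; the agent's own parts are ordinary pieces of it."""
    temperature: float = 20.0
    sensor_noise: float = 0.5      # a property of the agent's own sensor hardware
    policy_name: str = "cautious"  # the agent's current decision procedure


def sense(world: World) -> float:
    # Evidence about the world passes through hardware that is itself part of the world.
    return world.temperature + random.gauss(0.0, world.sensor_noise)


def act(world: World, action: str) -> World:
    if action == "heat_room":
        world.temperature += 1.0
    elif action == "upgrade_sensor":   # alters the agent's own sensors
        world.sensor_noise *= 0.5
    elif action == "rewrite_policy":   # alters the agent's own decision procedure
        world.policy_name = "bold"
    return world
```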