{ localUrl: '../page/468.html', arbitalUrl: 'https://arbital.com/p/468', rawJsonUrl: '../raw/468.json', likeableId: '0', likeableType: 'page', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], pageId: '468', edit: '6', editSummary: '', prevEdit: '5', currentEdit: '6', wasPublished: 'true', type: 'wiki', title: 'Joint probability distribution: (Motivation) coherent probabilities', clickbait: 'If you don't use joint probability distributions, none of your probabilities will make any sense. So, yeah, use joint probability distributions.', textLength: '7100', alias: '468', externalUrl: '', sortChildrenBy: 'likes', hasVote: 'false', voteType: '', votesAnonymous: 'false', editCreatorId: 'TsviBT', editCreatedAt: '2016-06-22 14:25:11', pageCreatorId: 'TsviBT', pageCreatedAt: '2016-06-11 06:48:51', seeDomainId: '0', editDomainId: 'MaloBourgon', submitToDomainId: '0', isAutosave: 'false', isSnapshot: 'false', isLiveEdit: 'true', isMinorEdit: 'false', indirectTeacher: 'false', todoCount: '0', isEditorComment: 'false', isApprovedComment: 'true', isResolved: 'false', snapshotText: '', anchorContext: '', anchorText: '', anchorOffset: '0', mergedInto: '', isDeleted: 'false', viewCount: '29', text: '$$\n\\newcommand{\\gO}{\\Omega}\n\\newcommand{\\go}{\\omega}\n\\newcommand{\\bP}{\\mathbb{P}}\n\\newcommand{\\pc}{0.4}\n\\newcommand{\\pnc}{0.6}\n\\newcommand{\\plc}{0.7}\n\\newcommand{\\fpcl}{0.9}\n\\newcommand{\\fpcnl}{0.2}\n\\newcommand{\\plnc}{0.2}\n\\newcommand{\\pscr}{0.4}\n\\newcommand{\\pscnr}{0.1}\n\\newcommand{\\psncr}{0.7}\n\\newcommand{\\psncnr}{0.9}\n\\newcommand{\\pjnclnrns}{0.0036}\n\\newcommand{\\pjclrs}{0.0784}\n\\newcommand{\\true}{\\text{True}}\n\\newcommand{\\false}{\\text{False}}\n$$\n\n\nExample: Medical Diagnosis\n===\n\nImagine you're a doctor. Today's patient, Mr. WebMD, has a terrible cough; so naturally, he either has the flu or cancer. From past experience with\nsimilar cases, you assign some prior probability $\\bP(C)$ to the [event_probability event]\n that Mr. W has cancer, and $\\bP(\\neg C) = 1-\\bP(C)$ to the event that Mr. W doesn't have cancer (and\ndoes have the flu).\n \nIf he has cancer, then you place conditional probability $\\bP(L\\mid C) = \\plc$ on finding a lump. If he doesn't have cancer and just has the flu, then you assign \n$\\bP(L \\mid \\neg C) = \\plnc$ to finding a lump. \n\nYou're going to observe whether Mr. W has a lump, and then you need to decide whether to treat him with radiation ($R$) or with $\\text{Not Radiation}^{\\text{TM}}$\n($\\neg R$).\n Whether or not the patient survives ($S$) for at least a year depends on what disease he has, and what treatment you prescribe:\n\n$$\\bP(S \\mid \\;\\; C, \\;\\; R) = \\pscr$$\n\n$$\\bP(S \\mid \\;\\; C, \\neg R) = \\pscnr$$\n\n$$\\bP(S \\mid \\neg C, \\;\\; R) = \\psncr$$\n\n$$\\bP(S \\mid \\neg C, \\neg R) = \\psncnr$$\n\nSo, for instance, if Mr. W has cancer but you don't treat him with radiation, then you believe based on similar cases that he'll only survive with probability \n$\\bP(S \\mid C, \\neg R) = \\pscnr$.\n\nNow, what do you do in this situation?\nHow do you look at a bunch of conditional probabilities, and use them to figure out\nwhich disease Mr. W is likely to have and what treatment is likely to keep him\nalive?\n\nFor example, just looking at the probability table for $\\bP(S \\mid C,R)$, we\ncan see that if we believe that Mr. W has cancer, then we want to treat him with radiation, and if he has the flu, then we do not. 
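\n\nSpelling this out with the survival probabilities above, the two comparisons are\n\n$$\bP(S \mid \;\; C, \;\; R) = \pscr \;>\; \pscnr = \bP(S \mid \;\; C, \neg R)$$\n\nand\n\n$$\bP(S \mid \neg C, \neg R) = \psncnr \;>\; \psncr = \bP(S \mid \neg C, \;\; R).$$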
\n\nThese are the choices that maximize the probability that Mr. W survives, conditioned on the disease and treatment. \n\nSo, we've got to guess whether or not Mr. W has cancer. Maybe we see a lump, which is a strong indication of cancer, so after we update our beliefs we have a conditional probability $\bP(C \mid L) = \fpcl$ of cancer. Since we now think cancer is likely, we treat with radiation. On the other hand, if we didn't see a lump, maybe cancer isn't very likely, so we have a conditional probability $\bP(C \mid \neg L) = \fpcnl$ of cancer, and we don't treat with radiation.\n\nEpistemic Chaos!\n===\n\nThe probabilities we assigned in the medical diagnosis situation might be appealing to those who wish to reason about an uncertain world in an organized way. We've written down lots of numbers that quantify the intuitive relationships between variables that we care about, like "cancer usually leads to lumps" ($\bP(L \mid C) = big$) or "if the patient has cancer, then survival is more likely if we treat with radiation than if we don't" ($\bP(S \mid C, R) > \bP(S \mid C, \neg R)$). \n\nBut there is a terrible problem lurking here.\n\nA terrible problem: incoherent beliefs\n---\n\nYou can get test subjects in a psychology experiment to say that "Linda is a bank teller and is active in the feminist movement" is *more likely* to be true than "Linda is a bank teller". (See [conjunction fallacy](https://en.wikipedia.org/wiki/Conjunction_fallacy).) \n\nThis is absurd: if Linda is a teller and an active feminist, then she is a teller; ($A$ and $B$) logically implies $A$. There is no [coherent_probability coherent] set of beliefs that could possibly lead to assigning probabilities so that $\bP(A,B) > \bP(A)$, since whenever $(A,B)$ happens, $A$ also happens. \n\nSo we can't just write down any old numbers to represent our uncertainty: our probabilities have to be [coherent_probability coherent].\n\nIncoherence isn't always obvious...\n---\n\nIt is a Serious Concern that our beliefs might be [incoherent_probability incoherent], and this won't work itself out automatically. In the medical example, we wrote down these conditional probabilities: \n\n$$\bP(L\mid \;\; C) = \plc$$\n\n$$\bP(L \mid \neg C) = \plnc$$\n\n$$\bP(C \mid \;\; L) = \fpcl$$\n\n$$\bP(C \mid \neg L) = \fpcnl$$\n\nThese probabilities can be justified with intuitive reasons. If Mr. W doesn't have cancer, he probably won't have a lump; if we see that Mr. W has a lump, then he probably has cancer.\n\nAre you seriously concerned yet? If not, I hope the following diagrams will increase the seriousness of your concern:\n\n<img src="http://i.imgur.com/Mm586rN.png" width="540" height="300">\n\nWe're using the [probability_distribution_square_visualization square visualization] of our probability distribution. The red regions are the regions where Mr. W has cancer, and the blue regions are where $\neg C$ is true. The darker regions are where $L$ is true, and the lighter regions are where $\neg L$ is true. Here we've fixed the conditional probabilities of $L$: the proportion of the red column that is darker is\n\n$$\frac{\bP(L,C)}{\bP(C)} = \bP(L \mid C) = \plc\ ,$$\n\nand likewise the proportion of the blue column that is darker is $\bP(L \mid \neg C) = \plnc$. That leaves exactly one free parameter, the prior $\bP(C)$, which is the width of the red column:\n\n<img src="http://i.imgur.com/DqdsRWf.png" width="576" height="960">\n\nAs $\bP(C)$ slides from $0$ to $1$, it fully specifies the joint distribution; in particular, it determines the conditional probabilities $\bP(C \mid L)$ and $\bP(C \mid \neg L)$. The trouble is that the values we wrote down for $\bP(C \mid L)$ and $\bP(C \mid \neg L)$ happen at *different* settings of $\bP(C)$.
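\n\nTo see this concretely, here is a quick check (working backwards through Bayes' theorem, using only the numbers above) of which prior each of our posteriors would require. By Bayes' theorem,\n\n$$\bP(C \mid L) = \frac{\bP(L \mid C)\,\bP(C)}{\bP(L \mid C)\,\bP(C) + \bP(L \mid \neg C)\,\bP(\neg C)} = \frac{\plc \cdot \bP(C)}{\plc \cdot \bP(C) + \plnc \cdot (1 - \bP(C))}\ ,$$\n\nand setting this equal to $\fpcl$ and solving gives $\bP(C) = 0.72$. The same computation for $\neg L$, using $\bP(\neg L \mid C) = 0.3$ and $\bP(\neg L \mid \neg C) = 0.8$,\n\n$$\bP(C \mid \neg L) = \frac{0.3 \cdot \bP(C)}{0.3 \cdot \bP(C) + 0.8 \cdot (1 - \bP(C))}\ ,$$\n\nset equal to $\fpcnl$, gives $\bP(C) = 0.4$.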
\n\n<img src="http://i.imgur.com/hRtSpVu.png" width="600" height="600">\n\nThere is no single setting of $\bP(C)$ at which all four of our conditional probabilities hold at once: $\bP(C \mid L) = \fpcl$ requires a prior of $0.72$, while $\bP(C \mid \neg L) = \fpcnl$ requires a prior of $0.4$. The four numbers, each perfectly sensible on its own, are jointly [incoherent_probability incoherent]: there is no probability distribution at all that assigns them simultaneously.\n\n...and incoherence is not at all obvious in real life.\n---\n\nMr. W only had two possible diseases, one possible symptom, and one possible treatment. An actual diagnosis could involve many thousands of possible diseases, symptoms, and treatments. \n\nIn real life, how can we be sure that our beliefs are even coherent? If we write down a great big collection of probabilities that look like \n\n$$\bP(\text{disease}_9 \mid \text{symptom}_2, \text{symptom}_5, \text{symptom}_{17})=0.153$$ \n\nor \n\n$$\bP(\text{outcome} = \text{survival} \mid \text{disease}_9, \text{treatment} = \text{bezoar})=0.094,$$ \n\nwhat's to stop us from writing down something nonsensical like $\bP(A,B) > \bP(A)$?\n\n\nWhat To Do?\n===\n\nIn general, if we want to use probability theory to reason in uncertain situations, there will be lots and lots of variables that we care about. So it won't be obvious that our beliefs are coherent. And, if our beliefs aren't coherent, there's nothing to stop us from doing [incoherence_properties_probability all sorts of silly things].\n\nIf only there were some way to organize our uncertainty in a nice, systematic way that is guaranteed to be consistent.\n', metaText: '', isTextLoaded: 'true', isSubscribedToDiscussion: 'false', isSubscribedToUser: 'false', isSubscribedAsMaintainer: 'false', discussionSubscriberCount: '1', maintainerCount: '1', userSubscriberCount: '0', lastVisit: '', hasDraft: 'false', votes: [], voteSummary: 'null', muVoteSummary: '0', voteScaling: '0', currentUserVote: '-2', voteCount: '0', lockedVoteType: '', maxEditEver: '0', redLinkCount: '0', lockedBy: '', lockedUntil: '', nextPageId: '', prevPageId: '', usedAsMastery: 'false', proposalEditNum: '0', permissions: { edit: { has: 'false', reason: 'You don't have domain permission to edit this page' }, proposeEdit: { has: 'true', reason: '' }, delete: { has: 'false', reason: 'You don't have domain permission to delete this page' }, comment: { has: 'false', reason: 'You can't comment in this domain because you are not a member' }, proposeComment: { has: 'true', reason: '' } }, summaries: {}, creatorIds: [ 'TsviBT' ], childIds: [], parentIds: [ 'joint_probability_distribution' ], commentIds: [], questionIds: [], tagIds: [ 'work_in_progress_meta_tag' ], relatedIds: [], markIds: [], explanations: [], learnMore: [], requirements: [], subjects: [], lenses: [], lensParentId: 'joint_probability_distribution', pathPages: [], learnMoreTaughtMap: {}, learnMoreCoveredMap: {}, learnMoreRequiredMap: {}, editHistory: {}, domainSubmissions: {}, answers: [], answerCount: '0', commentCount: '0', newCommentCount: '0', linkedMarkCount: '0', changeLogs: [ { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '14353', pageId: '468', userId: 'TsviBT', edit: '6', type: 'newEdit', createdAt: '2016-06-22 14:25:11', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '13851', pageId: '468', userId: 'TsviBT', edit: '0', type: 'newTag', 
createdAt: '2016-06-18 08:29:03', auxPageId: 'work_in_progress_meta_tag', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '12611', pageId: '468', userId: 'TsviBT', edit: '5', type: 'newEdit', createdAt: '2016-06-14 05:42:32', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '12610', pageId: '468', userId: 'TsviBT', edit: '4', type: 'newEdit', createdAt: '2016-06-14 05:17:07', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '12609', pageId: '468', userId: 'TsviBT', edit: '3', type: 'newEdit', createdAt: '2016-06-14 05:14:27', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '12583', pageId: '468', userId: 'TsviBT', edit: '2', type: 'newEdit', createdAt: '2016-06-13 21:09:55', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '12401', pageId: '468', userId: 'TsviBT', edit: '1', type: 'newEdit', createdAt: '2016-06-11 06:48:51', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '12399', pageId: '468', userId: 'TsviBT', edit: '1', type: 'newParent', createdAt: '2016-06-11 06:48:02', auxPageId: 'joint_probability_distribution', oldSettingsValue: '', newSettingsValue: '' } ], feedSubmissions: [], searchStrings: {}, hasChildren: 'false', hasParents: 'true', redAliases: {}, improvementTagIds: [], nonMetaTagIds: [], todos: [], slowDownMap: 'null', speedUpMap: 'null', arcPageIds: 'null', contentRequests: {} }