A system of probability assignments is "well-calibrated" if the things to which it assigns 70% probability happen around 70% of the time. This is the goal to which [bounded rationalists] aspire: not to predict everything perfectly, but to be neither more nor less certain than our information warrants.
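Calibration can be checked empirically: collect (stated probability, outcome) pairs, group them by stated probability, and compare each group's stated probability to the fraction of outcomes that actually occurred. Here is a minimal sketch; the function name and the choice to bin by rounding to one decimal place are illustrative, not part of any standard definition.

```python
from collections import defaultdict

def calibration_table(predictions, decimals=1):
    """Given (probability, outcome) pairs, group them by stated probability
    (rounded to `decimals` places) and report the observed frequency of true
    outcomes in each group.  A well-calibrated forecaster's observed
    frequencies track the stated probabilities."""
    bins = defaultdict(list)
    for p, outcome in predictions:
        bins[round(p, decimals)].append(outcome)
    # Observed frequency = fraction of outcomes in the bin that came true.
    return {p: sum(outcomes) / len(outcomes)
            for p, outcomes in sorted(bins.items())}

# A forecaster whose 70% predictions come true 7 times out of 10,
# and whose 90% predictions come true 9 times out of 10, is well-calibrated:
preds = ([(0.7, True)] * 7 + [(0.7, False)] * 3 +
         [(0.9, True)] * 9 + [(0.9, False)] * 1)
print(calibration_table(preds))
```

A large gap between a bin's stated probability and its observed frequency is evidence of over- or underconfidence at that probability level, which is exactly what the aspiration above asks us to correct.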