"Prior probability", "prior odds", or just "prior" refers to a state of belief that obtained before seeing a piece of new evidence. Suppose there are two suspects in a murder, Colonel Mustard and Miss Scarlet. After determining that the victim was poisoned, you think Mustard and Scarlet are respectively 25% and 75% likely to have committed the murder. Before determining that the victim was poisoned, perhaps, you thought Mustard and Scarlet were equally likely to have committed the murder (50% and 50%). In this case, your "prior probability" of Miss Scarlet committing the murder was 50%, and your "posterior probability" after seeing the evidence was 75%.
The prior probability of a hypothesis $~$H$~$ is often written with the unconditioned notation $~$\mathbb P(H)$~$, while the posterior after seeing the evidence $~$e$~$ is often denoted by the conditional probability $~$\mathbb P(H\mid e).$~$%%note: E. T. Jaynes was known to insist on the explicit notation $~$\mathbb P (H\mid I_0)$~$ for the prior probability of $~$H$~$, with $~$I_0$~$ denoting the prior information, and never to write any entirely unconditional probability $~$\mathbb P(X)$~$, since, said Jaynes, we always have some prior information.%% %%knows-requisite(Math 2): This is a heuristic rather than a law, however, and it may fail in some complicated problems. If we've already seen $~$e_0$~$ and are now updating on $~$e_1$~$, then in this new problem the new prior will be $~$\mathbb P(H\mid e_0)$~$ and the new posterior will be $~$\mathbb P(H\mid e_1 \wedge e_0).$~$ %%
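In this notation, the prior $~$\mathbb P(H)$~$ and the posterior $~$\mathbb P(H\mid e)$~$ are linked by Bayes' theorem, $~$\mathbb P(H\mid e) = \frac{\mathbb P(e\mid H)\,\mathbb P(H)}{\mathbb P(e)},$~$ so the size of the update from prior to posterior depends on how strongly each hypothesis predicted the evidence $~$e.$~$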
For questions about how priors are "ultimately" determined, see Solomonoff induction.