Imagine what happens when the oyster is blue\. $~$H$~$ predicted blueness with $~$\\frac{1}{8}$~$ of its probability mass, while $~$\\lnot H$~$ predicted blueness with $~$\\frac{1}{4}$~$ of its probability mass\. Thus, $~$\\lnot H$~$ did better than $~$H,$~$ and goes up in probability\. Previously, we've been combining both $~$\\mathbb P(e \\mid H)$~$ and $~$\\mathbb P(e \\mid \\lnot H)$~$ into unified likelihood ratios, like $~$\\left(\\frac{1}{8} : \\frac{1}{4}\\right)$~$ $~$\=$~$ $~$(1 : 2),$~$ which says that the 'blue' observation carries 1 bit of evidence against $~$H.$~$ However, we can also take the logs first, and combine second\.
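A quick sketch in Python makes the two orders of operations concrete (the probabilities are those from the oyster example above; base-2 logs throughout, so evidence is measured in bits):

```python
import math

# Probabilities from the text: H assigns blue 1/8; not-H assigns blue 1/4.
p_blue_given_h = 1 / 8
p_blue_given_not_h = 1 / 4

# Combine first, then take the log:
ratio = p_blue_given_h / p_blue_given_not_h   # 0.5, i.e. (1 : 2) against H
bits_combined = math.log2(ratio)              # -1.0: one bit against H

# Take the logs first, then combine:
bits_h = math.log2(p_blue_given_h)            # -3.0
bits_not_h = math.log2(p_blue_given_not_h)    # -2.0
bits_separate = bits_h - bits_not_h           # -1.0: the same answer

print(bits_combined, bits_separate)
```

Either way, the observation shifts the log odds by one bit against $~$H$~$: the log of the likelihood ratio equals the difference of the individual log-likelihoods.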