Specifically, if a number $~$x$~$ is $~$n$~$ digits long (in decimal notation), then its logarithm (base 10) is between $~$n-1$~$ and $~$n$~$. This follows directly from the definition of the logarithm: $~$\log_{10}(x)$~$ is the number of times you have to multiply 1 by 10 to get $~$x,$~$ and each new digit lets you write down ten times as many numbers. In other words, if you have one digit, you can write down any one of ten different things (0-9); if you have two digits, you can write down any one of a hundred different things (00-99); if you have three digits, you can write down any one of a thousand different things (000-999); and in general, each digit lets you write down ten times as many things. Thus, the number of digits you need to write $~$x$~$ is close to the number of times you have to multiply 1 by 10 to get $~$x$~$. The only difference is that, when using digits to write numbers down, you only get to use whole digits, whereas when computing logs, you can multiply 1 by 10 fractionally-many times.
I might write this as, "whereas, when multiplying 1 by 10 to get to x, you might have to multiply by 10 a fractional number of times (if x is not a power of 10), so the log base 10 of x can include a fractional part while the number of digits in the base 10 representation of x is always a whole number."
Rationale: in the previous sentence you're comparing the number of digits needed to write x to the number of times to multiply 1 by 10. So when the next sentence starts with "the only difference is…", I'm expecting it to be comparing numbers of digits and numbers of times to multiply. I can figure out that you've switched to talking about "computing logs" because logs count the number of times to multiply by 10, but it feels like one extra step of mental effort.
(This is a less confident suggestion than the amount of text I've used suggests.)
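To make the digits-vs-log relationship from the excerpt concrete, here's a minimal Python sketch (not from the article; the helper name `digit_count` is just illustrative):

```python
import math

def digit_count(x: int) -> int:
    """Number of decimal digits needed to write the positive integer x."""
    return len(str(x))

# A positive integer x with n digits satisfies 10**(n - 1) <= x < 10**n,
# so n - 1 <= log10(x) < n.  The log can have a fractional part;
# the digit count is always a whole number.
for x in (7, 10, 99, 100, 12345):
    n = digit_count(x)
    log = math.log10(x)
    assert n - 1 <= log < n
    print(f"x = {x:>6}   digits = {n}   log10(x) = {log:.3f}")
```

The assert encodes the same bound as the excerpt's first sentence: n - 1 <= log10(x) < n whenever x is n digits long.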