Then, if I had a signal that's N samples long, I'd have N times as much information (assuming the samples are independent, since entropy only adds up that way). Or perhaps think of it as a rate of information: so many bits per sample.
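As a quick sketch of what I mean (the bias p = 0.3 and the length N are just made-up numbers):

    import numpy as np

    # Per-sample entropy of a biased 0/1 source (p is an arbitrary bias).
    p = 0.3
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))  # bits per sample

    # For N independent samples the entropies just add, so total = N * h.
    N = 1000
    print(f"per-sample entropy: {h:.4f} bits")
    print(f"{N} samples: {N * h:.1f} bits total, or a rate of {h:.4f} bits/sample")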
But for real numbers and continuous distributions, this doesn't work: a single observation can take on uncountably many values, so the information in it diverges. So trying to do that directly with floating point numbers is not practical.
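You can see the divergence numerically if you quantize a continuous source and shrink the bins (a toy demo; the Gaussian source and the bin widths are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(1_000_000)  # arbitrary continuous-valued source

    # Discrete entropy of the quantized signal for finer and finer bins.
    for width in (1.0, 0.1, 0.01, 0.001):
        edges = np.arange(x.min(), x.max() + width, width)
        counts, _ = np.histogram(x, bins=edges)
        p = counts[counts > 0] / counts.sum()
        H = -(p * np.log2(p)).sum()
        print(f"bin width {width:>6}: H ~ {H:.2f} bits")

Each 10x refinement adds about log2(10) ~ 3.3 bits, with no limit as the bins shrink to zero; that's the divergence. (A 64-bit float caps it, but at up to 64 bits per observation, which says more about the encoding than the signal.)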
You often see Shannon entropy used to describe digital signals. If the signal just switches between 0 and 1, we can build an empirical distribution from the data and read the probabilities off it. The entropy of each new sample is then measured relative to that distribution. Likewise, if you know the maximum switching rate, you can figure out the maximum information rate of the signal: entropy per sample times samples per second.
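Here's roughly how that empirical estimate looks in practice (the 0/1 probabilities and the 48 kHz rate are made-up numbers):

    import numpy as np

    def empirical_entropy_bits(signal):
        # Shannon entropy in bits/sample from the empirical symbol counts.
        _, counts = np.unique(signal, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    rng = np.random.default_rng(1)
    sig = (rng.random(100_000) < 0.25).astype(int)  # toy 0/1 signal, P(1) = 0.25

    H = empirical_entropy_bits(sig)  # ~0.81 bits/sample for this bias
    sample_rate = 48_000             # made-up switching rate, samples/second
    print(f"H ~ {H:.3f} bits/sample")
    print(f"max information rate ~ {H * sample_rate:.0f} bits/second")

Note this treats the samples as independent; any correlation between samples would put the true information rate below that estimate.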
Just a few thoughts...
Chuck