Updating supposing and maxent
The maximum entropy principle is also needed to guarantee the uniqueness and consistency of probability assignments obtained by different methods, in particular by statistical mechanics and by logical inference.
The maximum entropy principle makes explicit our freedom in using different forms of prior data.
Consider the set of all trial probability distributions that would encode the prior data.
According to this principle, the distribution with maximal information entropy is the proper one.
The selected distribution is the one that makes the least claim to being informed beyond the stated prior data; that is, it is the one that admits the most ignorance beyond the stated prior data.
The principle of maximum entropy is useful explicitly only when applied to testable information: a statement about a probability distribution whose truth or falsity is well defined, such as a constraint on its expectation values.
Entropy maximization with no testable information respects the universal "constraint" that the sum of the probabilities is one. Under this constraint alone, the maximum-entropy discrete probability distribution is the uniform distribution.
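The two cases above can be sketched numerically. This is an illustrative example, not from the text: the six-sided die and the target mean of 4.5 are hypothetical choices. With no testable information the maximum-entropy distribution is uniform; adding a mean constraint yields a distribution of Gibbs form, p_i proportional to exp(λ·x_i), with the Lagrange multiplier λ found here by bisection.

```python
import math

def maxent_uniform(n):
    """With no testable information beyond sum(p) = 1,
    entropy is maximized by the uniform distribution."""
    return [1.0 / n] * n

def maxent_with_mean(values, target_mean, tol=1e-12):
    """Maximize entropy subject to sum(p) = 1 and
    sum(p_i * values_i) = target_mean.  The maximizer has the
    Gibbs form p_i ∝ exp(lam * values_i); solve for lam by
    bisection, using that the mean is increasing in lam."""
    def mean_for(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)
        return sum(wi * v for wi, v in zip(w, values)) / z
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# A die with no further information: every face equally likely.
print(maxent_uniform(6))

# Hypothetical constrained case: faces 1..6 with mean 4.5.
# The resulting probabilities increase monotonically toward 6.
p = maxent_with_mean(list(range(1, 7)), 4.5)
print([round(pi, 4) for pi in p])
```

The constrained solution makes no claim beyond the stated mean: every other feature of the distribution is left as undetermined as the constraint allows.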
Consequently, statistical mechanics should be seen just as a particular application of a general tool of logical inference and information theory.
For continuous distributions, the Shannon entropy cannot be used directly, as it is only defined for discrete probability spaces; the continuous analogue is instead a relative entropy taken with respect to a reference measure.
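As a sketch of the standard remedy (due to Jaynes, via the limiting density of discrete points), one maximizes the entropy of a density p(x) relative to an invariant reference measure m(x), which keeps the quantity well defined under a change of variables:

```latex
H_c[p] = -\int p(x)\,\log\frac{p(x)}{m(x)}\,dx
```

When m(x) is constant on a bounded support, maximizing H_c recovers the uniform density, in parallel with the discrete case.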