This paper proposes a model of linguistic analysis that makes it possible to translate qualitative and intrinsically fuzzy categories, such as grammaticalization and lexicalization, into quantifiable numerical values. The mathematical function of entropy, originally developed in Information Theory, proves a useful tool for this purpose, since it posits an explicit link between the intuitive notion of information and an empirical datum such as frequency. From this perspective, regarding natural languages as communication systems, we attempt to account for certain synchronic regularities (namely the correlation between frequency and informational value) and diachronic developments (phonetic reduction, semantic weakening, and grammaticalization). The predictions of this model are then tested against a well-known linguistic change in the transition from Latin to the Romance languages, i.e. the auxiliarization of habere.
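To make the information-theoretic link between frequency and informational value concrete, here is a minimal illustrative sketch (not taken from the paper; the toy corpus and function names are invented for illustration) computing Shannon self-information and entropy from raw word counts:

```python
import math
from collections import Counter

def self_information(counts):
    """Map each word to its Shannon self-information, -log2 p(w).

    Frequent words (e.g. grammaticalized items) receive low
    informational value; rare words receive high value.
    """
    total = sum(counts.values())
    return {w: -math.log2(c / total) for w, c in counts.items()}

def entropy(counts):
    """Shannon entropy of the distribution: the expected self-information."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy corpus: one very frequent (function-like) word vs. rarer content words.
tokens = "habet habet habet habet domum librum".split()
counts = Counter(tokens)
info = self_information(counts)
# 'habet' (p = 4/6) carries less information than 'domum' (p = 1/6),
# mirroring the synchronic correlation the model appeals to.
```

The sketch only illustrates the general mechanism (frequency determines informational value); the paper's actual measures over Latin and Romance data may of course differ.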