Consider communicating the presence of a tree. With a preserved image, for example, information on the shape, colors, & even the relative size of neighboring trees is all discernible - but is it perfectly precise? Only relatively so - a discrepancy is revealed when the image is compared against the supplementary information written language can provide. With an unlimited supply of words, language offers far more descriptive information than can be casually observed, such as where the tree was planted, by whom, & the type of soil from which it absorbs nutrients. Different communication methods, then, imply a difference in the uncertainty they communicate. Known as the Father of Information Theory, MIT mathematician Claude Shannon famously introduced a logical, mathematical way of measuring this difference across recorded communication methods: he defined it as entropy. Entropy adds a mathematical tool-set for measuring the relationship between disorder & uncertainty. A higher entropy value indicates a higher level of uncertainty - or, more practically, a larger number of possible outputs of a function.
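To make that intuition concrete: Shannon's entropy of a discrete distribution is H(X) = -Σ p(x) log₂ p(x), measured in bits. Below is a minimal sketch in Python (the helper name shannon_entropy & the example distributions are ours, chosen purely for illustration):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    Zero-probability outcomes contribute nothing, by convention.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin has two equally likely outputs: 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A fair die has more possible outputs, hence higher entropy.
print(shannon_entropy([1/6] * 6))     # ~2.585
# A heavily biased coin is nearly predictable: low entropy.
print(shannon_entropy([0.99, 0.01]))  # ~0.081
```

Note how the value grows with the number of equally likely outputs & shrinks as the distribution becomes predictable - exactly the "higher number of possible outputs" reading above.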
The caves of Indonesia's Borneo Island offer an insight into the most primitive form of recorded communication. Some 40,000 years ago, before the evolution of written language, physical illustrations on cave walls were the most precise method of recording information. As time progressed, the methodology of record-taking evolved from cave paintings to complex alphabets - offering complete expression through the elaborate structure that is language. In the English language, ideas as concrete as a “tree” & as complex as “love” are expressed through just 26 letters - symbols with zero intrinsic value besides that which society has assigned them. Information, within the context of this recently-established branch, is defined as the resolution of uncertainty (or surprise).
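That definition can be checked with simple arithmetic. The surprise (self-information) of observing an outcome with probability p is -log₂ p bits; under the simplifying assumption that all 26 letters are equally likely (real English letter frequencies are far from uniform), a single letter would resolve log₂ 26 ≈ 4.7 bits of uncertainty. A brief sketch, with the helper name surprise chosen for illustration:

```python
import math

def surprise(p):
    """Self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

# Resolving a fair coin flip yields exactly 1 bit of information.
print(surprise(0.5))     # 1.0
# If all 26 letters were equally likely, one letter would resolve
# log2(26) ~= 4.7 bits; real English text carries less per letter.
print(surprise(1 / 26))  # ~4.700
```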