Language is normally understood to be extraordinarily useful to humans, but we all share the experience of language getting in the way: we say something we don’t mean, or fail, sometimes stubbornly, to agree to use the same words to refer to the same things. An interesting and informative case of this is the role language played in kneecapping historical developments in mathematics, and in the confusion of a sizable share of secondary school students. How could language, the workhorse of human communication, hinder the development of mathematics?
One of the great achievements in mathematics was the discovery that mathematical objects, such as numbers, could be understood without there necessarily being a number of something. Ancient Greek geometers worked extensively with idealized forms such as circles or triangles, but these objects always had the property of being real, even if in an ideal sense; the Greeks were relentlessly concrete in their mathematical reasoning. Among the ancient Greeks, as well as the Egyptian, Hindu, and Arab mathematicians of the middle ages, written forms of equations were common, but, curiously, equations were almost always written out fully or partially in words (see Box 1). The practice of representing and manipulating quantities in what is called “literal notation” (x + b = a), where variables stand for unknown as well as known quantities, developed in fits and starts, but was ultimately realized. A mathematical expression such as “a quantity of sheep taken from a flock of unknown size” then became x − a (following the modern convention of using the first letters of the alphabet for known quantities and later letters for unknowns).
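To make the contrast concrete, here is a sketch of how the same problem looks in the two styles. The rhetorical phrasing is a paraphrase of a classic “heap” problem of the kind found in Egyptian sources; the particular numbers are illustrative:

```latex
% Rhetorical form: "a quantity and its seventh, added together, make nineteen."
% In literal notation, with x standing for the unknown quantity:
\[
  x + \frac{x}{7} = 19
  \quad\Longrightarrow\quad
  \frac{8x}{7} = 19
  \quad\Longrightarrow\quad
  x = \frac{133}{8}
\]
```

The point is that once the problem is written symbolically, the solution is a matter of manipulating the symbols, with no need to keep track of what, if anything, x counts.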
Once we became comfortable representing and manipulating numbers as symbols without any connection to what those numbers might quantify, we no longer had any good reason to disfavor mathematical objects like negative numbers or the square root of −1. It no longer mattered that there couldn’t be −1 sheep, because the −1 didn’t have to refer to anything. Later, of course, many real-life applications were found for these objects, even though the objects themselves hadn’t seemed immediately plausible. This contributed eventually to the development of tools such as calculus and, perhaps unsurprisingly, to solutions for many open problems left by the ancient Greeks. The long incubation of this epiphany might seem strange if you consider that written language itself was only possible because humans could relate abstract notions (e.g. the notion of a cat) to relatively concrete forms (the spoken word “cat”), and could build on this by relating symbolic written forms to the same notion. As the examples above show, written language was well established thousands of years before the development of literal notation. Why, then, was it such a leap to accept the abstraction of quantity in the absence of something to quantify, and to represent it symbolically?
My guess is that language impeded humans from viewing quantities in purely symbolic terms, and did so because neither spoken nor written language is intuitively understood as symbolic. In the first place, speakers need not be aware that their word for cat refers just as well to the concept of a cat as to any number of particular cats, or that the word “dog” could refer to cats so long as everyone in the language group agreed. The relation is more or less fully transparent to the speaker: no direct knowledge of the abstractions at work is necessary, and the ease with which speakers avail themselves of language confers a “realness” onto it. Secondly, when humans naturally used language to express mathematical relations, they transferred this intuitive “realness” of the mappings between words and objects to the relations between quantities in mathematical expressions. Only after dispensing with language were mathematicians able to shed their bias toward the real and devise a purely symbolic means of representing what were discovered to be abstract quantities.
This post was written by Daniel Sharoh, post-doctoral researcher at the Donders Institute.
Box 1: An early Egyptian way of representing numbers used hieroglyphs, in which different objects denoted quantities.