
The strange and righteous history of the equals sign

500 years ago, tragic Welsh genius Robert Recorde wanted to teach math to ordinary people.

This plaque honoring Robert Recorde is at the Tenby Museum in Wales.

Robert Recorde was one of those people so extraordinarily ahead of his time that he seemed destined to come to a tragic end. In the 16th century, he made advances in economics, medicine, theology, and poetry. But his greatest contribution is taught to every elementary school child, and it arguably laid the groundwork for modern computer science. He invented the equals sign.

From urine to popular science

Recorde was born in 1510 in Tenby, Wales. At age 14, he went to Oxford University. At age 21, he was teaching mathematics there, although scholarship wasn't his first career goal. Over the next few years, he also earned a degree in medicine and wrote the exquisitely titled monograph The Urinal of Physick, detailing what a physician could learn from a patient's urine.

Medicine evidently proved either less fascinating or less lucrative than Recorde had anticipated. Over the next decade, he moved from medicine to finance and oversaw mints in Bristol, London, and Dublin. The writer's life, however, clearly appealed to him. He produced a large and varied body of work: theological tracts defending Protestantism, poems, and, most importantly, textbooks.

Not only did Recorde explain astronomy, geometry, and arithmetic in successive textbooks, but he explained them in English. Previous works on mathematics were written in Latin, meaning the only people who could read them already had an extensive education. Recorde wrote in English for the British layman. For these busy learners, he came up with his most famous invention. His final book, The Whetstone of Witte, published in 1557, gave the world the equals sign.

=

Perhaps a man trained to study urine and to oversee currency develops a pragmatic mind. Recorde found it irritating to have to state over and over that one side of an equation was equal to the other. He wrote, with obvious annoyance and whimsical spelling, "And to avoide the tedious repetition of these woordes, is equalle to, I will sette as I doe often in woorke use, a paire of paralleles." Instead of using a phrase to convey meaning, he would convey the same meaning with a symbol. And what symbol could be more appropriate than a pair of equal-length lines? Nothing, Recorde explained, because "noe 2 thyngs, can be moare equalle."

Recorde's symbol didn't catch on at first. Latin still held sway during the 16th century, and it already had a word for the concept, "aequalis," which could be shortened to "ae" or "oe" when more concision was needed. But the equals sign's popularity was aided by Recorde's introduction of the German symbols "+" and "-" to English speakers. In combination, these signs allowed people to express a mathematical equation quickly, in symbols, and with a minimum of wasted ink. Instead of writing, "A factore added to a quantitie of thryeye is equalle to a dyffyrynte factore frome whyche is takene awaye a quantitie of foure," a mathematician could write: "x + 3 = y - 4".

At first glance, this makes Robert Recorde an excellent trivia game topic, but not much more. John V. Tucker, a professor of computer science at Swansea University and an avid researcher of computability theory, argues that Recorde is far more significant than his reputation suggests. In Robert Recorde: The Life and Times of a Tudor Mathematician, Tucker describes how Recorde's work laid the foundation for computer science, at least among English-speaking people:

Computing is largely about collecting, creating, and processing data. It is universal and ubiquitous because it is intimate with the world's work... To understand the history of computing, we may follow the data, which leads us to practical mathematics and to writers such as Recorde.

Tucker believes that Recorde introduced Britain to the concepts of quantification and data in ways that were unfamiliar and world-changing. His instruction on how to apply mathematics to areas such as commerce, land surveying, navigation, and astronomy imposed a new mindset on people who would previously have talked about quantity in vague terms such as "a goodlie amount and a fair distance." Recorde helped introduce precision, exact quantification, and accurate comparison in everyday life.

Math for everyone

It is also clear that Recorde ushered in a new way of talking about numbers. Picking and choosing from Latin and German texts, he introduced English speakers to words like "linear," "denominate," "binomial," and "equation." He gave mathematical meanings to words like "square," "compound," "rational," and "irrational." Robert Recorde taught people to speak a mathematical, or more importantly, a computational language.

His most famous invention is still part of this computational language. The equals sign was first used as part of a computer programming language in FORTRAN I, in 1957. The sign, and its variations such as == or ===, continue to be used by people who, like Robert Recorde, would prefer to "avoide the tedious repetition of these woordes, is equalle to." Today, computer programmers reap the benefit of Recorde's work in inventing a precise, quantifiable, and efficient symbolic language to describe the world.
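To see the distinction in practice, here is a minimal sketch in TypeScript (the variable names are purely illustrative, not from any particular codebase): "=" still states equality by assignment, much as FORTRAN used it, while "==" and "===" test whether two things really are equal.

```typescript
// Recorde's "paire of paralleles" survives in three modern guises.

let quantity: number = 3;   // "=" assigns: quantity is declared equal to 3

const input: unknown = "3"; // a value whose type isn't known in advance

// "==" tests equality after type coercion: the string "3" is converted
// to the number 3 before comparison, so this prints true.
console.log(input == 3);

// "===" tests strict equality with no coercion: a string is never
// strictly equal to a number, so this prints false.
console.log(input === 3);
```

Languages such as JavaScript added "===" precisely because the coercing "==" can declare two unlike things equal, an ambiguity Recorde, of all people, would have appreciated.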

Sadly, Recorde wasn't lauded as a hero of rationality and math education in his lifetime. Brilliant as he was, he forgot one of the basic rules of his day: the aristocracy always wins. Recorde was relieved of his position at the Dublin mint by the Earl of Pembroke. Whether this was because of a setback at silver mines Recorde had been assigned to develop, or because he was a Protestant when the devoutly Catholic Queen Mary rose to the throne, is up for debate.

Recorde chose to dispute his dismissal, filing a suit of malfeasance against the Earl. The Earl responded by suing for libel. The Earl won. In 1557, a destitute Recorde was thrown into debtor's prison, where he died a year later at the age of 48.
