
Volume 12, Number 9, November 1961

Zero, Key To Numbers

It was once impossible to rely on numbers, even for counting, until Arab mathematicians showed the world how to use the zero.

Picture a hillside thousands upon thousands of years ago. A man emerges from a cave. His brow is heavy, his arms long and muscular; around his waist he wears a tattered animal skin. Below him a herd of wild horses passes. Back into the cave he rushes and, with grunts and gestures, excitedly tells his clan that "many, many" horses are passing. It's the best he can do. He has no way of telling them that 30, 40 or 50 horses are in the herd, for at best he knows three numbers—one, two, and "many." Civilizations will rise and fall and even his own form will change before he learns to count with the ease and exactness of numbers like 30, 40 or 50. Developing an easy-to-use, easy-to-learn system of numbers was, indeed, a milestone reached only after long struggle. In fact, man has had such a system only for about 1,000 years—and a form of man has been on earth for an estimated 1,750,000 years.

What took so long? What is so difficult about our numbering system—the system that everyone easily learns and then takes for granted? The answer to those questions is bound up in the larger meaning and application of zero. The difference between 5 and 50 is only a zero, but that little circle is actually one of the world's greatest inventions.

The decimal system (in which each unit is ten times greater than the preceding unit) is based on nine numbers and the zero. It makes calculations with infinitely large and infinitely small numbers possible by allowing numbers to expand to infinity on either side of a decimal point—numbers greater than one to the left of the point and numbers less than one to the right. Without such a system, modern astronomy, physics and chemistry would be impossible—or, for that matter, all science. Governments could not determine annual budgets, citizens could not figure out income taxes and even totaling the weekly grocery bill would be quite a chore.

Thus, while the zero is used as a symbol for nothing, it actually means everything in combination with our nine basic numbers, providing these numbers with an infinite variety of value. The zero's creation opened the way for the entire concept of algebraic plus and minus numbers, which we use not only to calculate with, but also to identify temperature, electrical charge and discharge and to navigate planes and ships. Speaking less practically and more poetically, the zero serves as a reference point around which man can talk confidently about infinity.

Most of the ancient civilizations had numbering systems and symbols to express their numbers in written form. But without the zero even the simplest arithmetic—addition or subtraction—was next to impossible. The earliest written symbols for numbers were probably lines scratched in soft clay: one line meant one, two lines meant two and so on. Then additional symbols were invented to represent larger quantities. Sumerian merchants in 3,000 B.C. used a system of number symbols on bills, notes and receipts. A 5,000-year-old Babylonian tablet records a payment by clay check. Permanent records of numbers were improved upon by the Egyptians, who used paint instead of clay.

The Greeks had to memorize 27 different symbols just to express the numbers 1 through 999. Each 8, for example, in 888 was represented by a different symbol. Just as unwieldy was the Roman system of using the first letter of the name of the number: 100, for example, was represented by the "C" of centum and 1,000 by the "M" of mille. The Roman who wanted to write down the quantity 1,000,000 had no choice other than writing a thousand M's. And to multiply his clumsy numerals was just about impossible: XLVII x IX x MCMXIV = ? When faced with such a problem, the Roman discarded his written symbols and turned to his counting board, or abacus. All of Europe, through its Dark Ages, followed Rome's example.

Far to the east, however, as early as 200 B.C., Hindu scholars were working with nine oddly shaped symbols and a dot that eventually would bring order out of a world of mathematical chaos. The dot and nine symbols were the earliest known forerunners of the numbers 0, 1, 2, 3, 4, 5, 6, 7, 8, 9. Comprised of only ten symbols and based on multiples of ten, the Hindu system was easily learned and easily used. It was the dot that made the system unique because with the dot came a written expression of the place system of numbers—the system that allows the nine basic numbers in different combinations to represent every possible quantity and assigns a different value to the nine numbers depending on their place or position in a series.

Who first thought of using a dot as the tenth number is not known. But it can be supposed that a Hindu, working on his abacus, wanted to keep a written record of his answers. One day he used a symbol (.) which he called sunya to indicate a column on his counting board in which he had moved no beads. Sunya the dot was not zero the number. It was merely a mark to indicate empty space.

The abacus he was using had already been around a long time. On it, to represent 33, for example, he moved three beads on each of the bottom two rows to the right. For 303, he also moved three beads to the right on each of two rows—but between these rows he left an untouched, empty row. It was for the empty row that the unknown Hindu used the symbol (.). The word sunya, standing for the dot, means "empty" or "blank."
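To put that idea in modern terms, here is a minimal sketch, not part of the original article, of how a place-value reading works: each digit is weighted by a power of ten determined by its position, so a symbol for the empty row is exactly what keeps 33 and 303 distinct. The helper name value_of and the least-significant-digit-first ordering are illustrative choices, not anything drawn from the source.

    # A small illustration of place value: each digit's worth is digit * 10**position.
    def value_of(digits):
        """Interpret a list of digits, least significant first, in base ten."""
        return sum(d * 10 ** place for place, d in enumerate(digits))

    print(value_of([3, 3]))       # 33
    print(value_of([3, 0, 3]))    # 303 -- the 0 "keeps the row" for the empty tens place

Without a symbol for the empty row, both quantities would have to be recorded with the same two marks, and the distinction between 33 and 303 would be lost the moment the beads were cleared from the board.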

The concept of sunya was probably brought by traders from India to Baghdad in the ninth century, when that city was one of the world's greatest centers of learning. Arab merchants and mathematicians immediately recognized the versatility and uniqueness of sunya and further developed its concept. The modern word cipher comes from the Arabic sifr, which was derived from the Hindu sunya. Latin scholars translated sifr as zephyrum, which in Italian became zepiro and zeuero and in English was shortened to zero. The German word for zero—Ziffer—and the French chiffre also derive from the Arabic sifr. All of these words came in time to mean much more than zero. Cipher, for example, took on at least a half dozen meanings. It can refer to zero or to any one of the Arabic numerals; it also can mean to compute, or it can mean a complex system of secret writing.

When the new numbering system made its way into Europe through the Moors and became known as Arabic notation, it was already the subject of thorough exploration by Arab scholars. As early as 825 A.D. Arab mathematician al-Khowarizmi had written a book on the zero, and in 976 the scholar Muhammad ibn Ahmad had noted in his Keys of the Sciences that if in a calculation no number appeared in the place of tens, a little circle should be used "to keep the rows." The first comprehensive European analysis of the zero and the nine other Arabic numerals was made in 1202 by Italian mathematician Leonardo Fibonacci, who had studied under an Arab tutor.

Despite the advantages of a numbering system with zeros "to keep the rows," it took Europeans a long time to give up Roman numerals and an even longer time to understand the Arabic numerals, especially the zero. "It seemed impossible for them to comprehend how 3 was three in the units place and 30 in a combination such as 35. Instead they wrote 305 for 35. If 30 was thirty and 5 was five, what could be more logical? Combinations with Roman numerals . . . produced such hybrids as X5 for 15, C35 for 135, and MCCC35 for 1335." Even those who accepted Arabic numerals didn't agree on what they should look like, and it was not until well after the invention of printing in the fifteenth century that Arabic numerals were standardized in design.

Since then, the decimal system with its numbers expanding in multiples of ten on both sides of a point has proved the wisdom of the Hindu who saw the need for a symbol "to keep the rows" and the Arab scholars who recognized sunya's immense significance, developed its concept even further, and then brought it to the attention of the world.

Since then, numbers really have been something you can count on.

This article appeared on pages 14-15 of the November 1961 print edition of Saudi Aramco World.
