the idea of a fully operational zero...
August 26, 2010 6:36 PM   Subscribe

"Michel de Montaigne, whose essays transformed Western consciousness and literature, was not capable of solving basic arithmetic problems. And most other people would not be able to do so either, if not for the invention of decimal notation by an unknown mathematician in India 1500 years ago." The Greatest Mathematical Discovery? (expanded pdf) a paper written for the US Dept. of Energy makes this assertion based in part on the work of Georges Ifrah. [via]

Number geeks may also enjoy:
- Disme: The Art of Tenths (Fr.|Ned.) by Flemish mathematician Simon Stevin, describing decimal notation's usefulness in everyday life.
- Decimals and Decimalization: A Study and Sketch (The First Canadian Work of the Twentieth Century), containing a history of the progress, or lack thereof, of metric measurements.
- Period or Comma? Decimal Styles Over Time and Place (pdf) by Amelia A. Williamson
posted by jessamyn (43 comments total) 35 users marked this as a favorite
 
Wikipedia entry on Indian mathematics, with a lot of history and links.

/obligatory
posted by vidur at 7:01 PM on August 26, 2010 [1 favorite]


It's odd to me to see this presented as a "discovery" as opposed to an "invention" or maybe a "convention". It's not like numbers are inherently multiples of each power of 10 with a remainder for the ones column* any more than they're inherently groups of ones, fives, tens, fifties, hundreds, etc., as represented by Roman numerals. At different times humans have decided to represent them in these ways (and which way we represent them obviously makes some things harder or easier), but that's a convention, not something about the numbers we've discovered.

That said, hurray for the decimal system. I just think we invented it, not discovered it. (and yes, I see both words used in the articles, I just don't think discovered is really accurate).

* Was I the only one who was taught in school to write numbers in expanded form? e.g. 321 = 3×100 + 2×10 + 1.
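
(For the curious, expanded form is just reading the digits off against their place values - here's a throwaway Python sketch of my own, nothing from the links:)

    def expanded_form(n):
        # Peel off one decimal digit at a time, lowest place first.
        parts, place = [], 1
        while n:
            n, digit = divmod(n, 10)
            if digit:
                parts.append(f"{digit}x{place}" if place > 1 else str(digit))
            place *= 10
        return " + ".join(reversed(parts))

    print(expanded_form(321))  # 3x100 + 2x10 + 1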
posted by If only I had a penguin... at 7:13 PM on August 26, 2010


It might plausibly be said to be a discovery that one could write numbers that way, and that exponent-based ways of writing numbers are more convenient than Roman-style number systems.
posted by kenko at 7:19 PM on August 26, 2010 [3 favorites]


The other side of the coin - Dyscalculia (I heard about this for the first time just yesterday) - the difficulties that a math disability can bring.
posted by unliteral at 7:33 PM on August 26, 2010


Discovery vs. invention is an old mathematical argument: most mathematicians I know understand that math is invented but feel like it is discovered. In this case, the particular choice of base 10 etc feels like an invention, but the concept behind the notation itself feels like a discovery.
posted by unSane at 7:33 PM on August 26, 2010


Interesting that there seems to be some correlation between the general scale and complexity of the numbers and operations being used and the need for a new method of notation. In some sense we move from Roman numerals and addition/subtraction, through multiplication/division and decimal notation, on to exponentiation and the need for Knuth's up-arrows and other such conventions.
posted by axiom at 7:39 PM on August 26, 2010


I'm definitely more of a math hobbyist than any sort of actual math person, but I like the idea that we're still figuring out the timelines and who knew what when w/r/t math. I also like the small note in this paper, which I've now seen in a few other places, about the guy from Bell Labs who made a mechanical calculator that worked in and displayed Roman numerals. Can anyone find a photo of Claude Shannon's Throback I?
posted by jessamyn at 7:42 PM on August 26, 2010


Forwarded to my math nerd daughter (who can explain these things to me).
posted by Devils Rancher at 8:06 PM on August 26, 2010


Okay, I'm going to feel like a nerd, but the Montaigne quote is taken out of context:

“I cannot yet cast account either with penne or Counters.” That is, he could not do basic arithmetic [Ifrah2000, pg. 577].

That is a rather ... narrow reading of the passage. Montaigne, the originator of "I don't know what I don't know," was in the middle of a long passage describing himself as an idiot. My modern translation has this preceding it:

"My mind is slow and dull; it cannot penetrate the slightest cloud, so that, for example, I could never offer it any enigma easy enough for it to unravel. There is no subtlety so empty that it will not stump me. Of games in which the mind has a part - chess, cards, draughts, and others - I understand nothing but the barest rudiments."

It goes on like that for like 3 pages. The particular quote the article used took place when he was going on about how he definitely was not a farm boy, despite being raised on a farm, and couldn't tell a goat from a cow. Again, the modern translation:

"I have had affairs and management in my hands ever since my predecessors in my possession of the property I enjoy left me their place. Now I cannot reckon, either with counters or with a pen; most of our coins I do not know; nor do I know the difference between one grain and another, either in the ground or in the bar, between the cabbages and lettuces in my garden ... And since I must make my shame quite complete, not a month ago I was caught ignorant that leaven was used to make bread."

So the point of all this is, if you're sarcastic now, someone in the future isn't going to get it.
posted by geoff. at 8:28 PM on August 26, 2010 [16 favorites]


jessamyn: Throbac - THrifty ROman numeral BAckwards-looking Computer.
posted by unliteral at 8:30 PM on August 26, 2010 [2 favorites]


if you're sarcastic now, someone in the future isn't going to get it.

That's so great. As is the THROBAC machine. Whee.

My favorite part of the whole shebang is how Europe would have had a functional decimal system much earlier had it not been for racists, clerics and accountants.
The Indian system (also known as the Indo-Arabic system) was introduced to Europeans by Gerbert of Aurillac in the tenth century. He traveled to Spain to learn about the system first-hand from Arab scholars, prior to being named Pope Sylvester II in 999 CE. However, the system subsequently encountered stiff resistance, in part from accountants who did not want their craft rendered obsolete, to clerics who were aghast to hear that the Pope had traveled to Islamic lands to study the method. It was widely rumored that he was a sorcerer, and that he had sold his soul to Lucifer during his travels. This accusation persisted until 1648, when papal authorities reopened Sylvester’s tomb to make sure that his body had not been infested by Satanic forces.

The Indo-Arabic system was reintroduced to Europe by Leonardo of Pisa, also known as Fibonacci, in his 1202 CE book Liber Abaci. However, usage of the system remained limited for many years, in part because the scheme continued to be considered “diabolical,” due in part to the mistaken impression that it originated in the Arab world (in spite of Fibonacci’s clear descriptions of the “nine Indian figures” plus zero). Indeed, our modern English word “cipher” or “cypher,” which is derived from the Arabic zephirum for zero, and which alternately means “zero” or “secret code” in modern usage, is very likely a linguistic memory of the time when using decimal arithmetic was deemed evidence of dabbling in the occult, which was potentially punishable by death.
posted by jessamyn at 8:34 PM on August 26, 2010 [11 favorites]


Also: Shannon's Ultimate Machine.
posted by unliteral at 8:37 PM on August 26, 2010


Yay thanks for this!
posted by Kerasia at 8:50 PM on August 26, 2010


If only I had a penguin...: “That said, hurray for the decimal system. I just think we invented it, not discovered it. (and yes, I see both words used in the articles, I just don't think discovered is really accurate).”

I think it's neat, too. And there are all kinds of awesome links in this post that I'm going to enjoy digging through for a long time.

However, I want to point out that the system of 'arabic numerals' was by no means the first decimal number system – that is, it was not the first system of numbers represented by ten interchangeable symbols with which one could do mathematical figures. For one thing, at least seven hundred years before the common era, the Greeks were using the Greek numeral system, which was decimal as well. I'm guessing the minor differences (there are different characters used for hundreds and thousands) are what the article means when it says that the Indian was the first 'fully positional' decimal system. But the Greek system was quite well-suited to mathematics. What's more, very few educated Roman geometers, at least at the height of Roman civilization, would have used Roman numerals; Greek notation was common throughout the empire in learned circles.

I also take issue with this being labeled a 'discovery,' as 'discovery' implies that a thing is true. And assigning zero and one to the same class of numbers as two, three, four, and the rest of the multitudes is a mistake.
posted by koeselitz at 8:54 PM on August 26, 2010


"... encountered stiff resistance, in part from accountants who did not want their craft rendered obsolete ..."

Obviously, they were not very good accountants.
posted by vidur at 8:56 PM on August 26, 2010 [1 favorite]


Note that although Archimedes saw far beyond the mathematics of his time, even anticipating numerous key ideas of modern calculus and numerical analysis, and also even though he was very skilled in applying these mathematical principles to engineering and astronomy, nonetheless he used the traditional Greek-Roman numeral system for calculations [Netz2007, Marchant2008]. It is worth noting that Archimedes’ computation of pi was a tour de force of numerical interval analysis performed without either positional notation or trigonometry.

You can read this two ways:

1) gee Archimedes had it tough, doing all that stuff without decimals

2) actually, decimal numbers are just a notation and aren't all that important in the end, either in engineering, as long as you mark your straight edges evenly, or mathematically.

The point being that, with the method of exhaustion, Archimedes was effectively working with the "real" numbers, of which 'pi' is just one famous example (out of uncountably many.) I think it's really hard to overestimate the sophistication of classical Greek mathematics and engineering: from robots to accurate estimates of the size of a round earth. Considering the circumstances of the death of Archimedes, a world in which the Roman empire never existed might have been one in which calculus was "discovered" 800 years earlier.
posted by ennui.bz at 9:02 PM on August 26, 2010 [1 favorite]


Base 10 is certainly an invention, it paves the way for the discovery that all systems are indeed base 10.
posted by paisley henosis at 9:18 PM on August 26, 2010


Way cool; thank you for posting this.
posted by LobsterMitten at 9:27 PM on August 26, 2010


(Also, N.B.: Archimedes did work with decimal notation. For what it's worth.)
posted by koeselitz at 9:29 PM on August 26, 2010


koeselitz: do you have any evidence he did? Because the passage quoted by ennui.bz seems to have some (though I haven't tracked them down)
posted by claudius at 9:58 PM on August 26, 2010


the references, that is
posted by claudius at 9:58 PM on August 26, 2010


claudius: “koeselitz: do you have any evidence he did? Because the passage quoted by ennui.bz seems to have some (though I haven't tracked them down)”

The passage oddly refers to 'the Greek-Roman numeral system.' I don't know what that is. I think they want to evoke the Roman numeral system familiar to us in the west; but no serious mathematician used that system until a thousand years after the birth of Christ.

Archimedes certainly used the Greek numeral system, because that was already the standard system in use throughout the Mediterranean world. And the Greek numeral system was decimal.
posted by koeselitz at 11:20 PM on August 26, 2010


I thought the abacus implicitly used a place system of notation. Am I wrong about this? Were they used differently than they are now?
posted by Obscure Reference at 2:05 AM on August 27, 2010


I'm guessing the minor differences (there are different characters used for hundreds and thousands) are what the article means when it says that the Indian was the first 'fully positional' decimal system.

The Indian system is positional in the sense that value is also encoded in the position of a symbol: the symbol 1 in 1000 represents a different value from that in 100. This makes the calculation algorithms much simpler: you keep doing the same thing while moving from left to right. With the Greek system you can jumble up the digits without losing any information; I don't know if they even ordered digits by value, or considered, say, αι and ια to both represent the number 11.

Ancient Greeks did have a decimal system, but they used different symbols for each multiple of each power of ten - they didn't even have a symbol for zero that could have been used in a positional system. This is more complicated to learn (more symbols), has trouble handling large numbers (you run out of symbols), and doesn't work so well for fast calculation.
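
To make the contrast concrete, here's a rough Python sketch (the letter values are the standard Ionic assignments; this toy only handles 1-999):

    # Additive Greek-style numerals: one distinct symbol per multiple of each
    # power of ten, so the position of a symbol carries no information.
    GREEK = {1: 'α', 2: 'β', 3: 'γ', 4: 'δ', 5: 'ε', 6: 'ϛ', 7: 'ζ', 8: 'η', 9: 'θ',
             10: 'ι', 20: 'κ', 30: 'λ', 40: 'μ', 50: 'ν', 60: 'ξ', 70: 'ο', 80: 'π', 90: 'ϙ',
             100: 'ρ', 200: 'σ', 300: 'τ', 400: 'υ', 500: 'φ', 600: 'χ', 700: 'ψ', 800: 'ω', 900: 'ϡ'}

    def greek(n):
        assert 1 <= n <= 999
        out = []
        for place in (100, 10, 1):
            d = (n // place) % 10
            if d:
                out.append(GREEK[d * place])
        return ''.join(out)

    print(greek(11), greek(110))  # ια ρι - 110 gets a new symbol, not a shifted '1'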
posted by Dr Dracator at 5:02 AM on August 27, 2010 [1 favorite]


> But the Greek system was quite well-suited to mathematics.

Have you tried using it?
posted by languagehat at 6:17 AM on August 27, 2010 [1 favorite]


The point being that, with the method of exhaustion, Archimedes was effectively working with the "real" numbers, of which 'pi' is just one famous example (out of uncountably many.)

This is a little confused. First of all, there are rationals, which are ratios of integers. Then there are reals, which may or may not be expressible as a ratio of integers. If not, they are irrational. Pi is irrational, but that's not why it is famous. It's famous for being transcendental, which means it's not the root of any polynomial with integer coefficients, such as ax² + bx + c = 0. (I'm not going to get into the countability issue.)

So let's get back to Archimedes "effectively working with the real numbers". The Greeks didn't really understand irrationals, despite proving they existed. And the output of his process was a pair of rationals that bounded pi above and below: 3 + 10/71 < π < 3 + 1/7. In the sense that these numbers are not integers, they are "reals". But they are not irrational and they are certainly not transcendental.
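
If anyone wants to see those bounds pop out, here's a quick float sketch (it cheats by using the trigonometry Archimedes didn't have - his version was pure rational arithmetic on 96-gons):

    import math

    # Half-perimeters of regular n-gons inscribed in / circumscribed about a
    # unit circle squeeze pi from both sides; Archimedes stopped at n = 96.
    n = 96
    lower = n * math.sin(math.pi / n)   # inscribed polygon
    upper = n * math.tan(math.pi / n)   # circumscribed polygon
    print(lower, upper)                 # 3.1410... < pi < 3.1427...
    print(3 + 10/71, 3 + 1/7)           # his rational bounds: 3.1408..., 3.1428...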
posted by DU at 6:35 AM on August 27, 2010 [1 favorite]


I read a very interesting book years ago that examined the growth of numeracy in the US: A Calculating People.
posted by mareli at 6:46 AM on August 27, 2010


This just seems a bit nonsensical to me.
Mathematicians don't really use decimals, and they don't use calculators for the most part. It's the engineers who need a calculator.
posted by mary8nne at 7:59 AM on August 27, 2010


Dr Dracator: “Ancient Greeks did have a decimal system, but they used different symbols for each multiple of each power of ten - they didn't even have a symbol for zero that could have been used in a positional system. This is more complicated to learn (more symbols), has trouble handling large numbers (you run out of symbols), and doesn't work so well for fast calculation.”

That's because they didn't believe zero was a number. They had very real philosophical and mathematical reasons for this, which they explained. I happen to think they were right.

What's more, mary8nne is right; mathematics generally doesn't require fast calculation, and large numbers aren't always needed for conceptual stuff. I still don't see how that could hold a person back from understanding the deeper implications of mathematics.

languagehat: “Have you tried using it?”

I have, actually. When I was in school, we went through Euclid and Apollonius of Perga; during the Apollonius bit (his treatise on conics, which is still the most difficult and profound mathematics treatise I've ever encountered, and that's including Maxwell) I spent some time trying to do figures with Greek numerals. It's just the alphabet, so if you know the order of the letters it only takes some hacking with it to start to get it down. It's not so difficult as long as you keep it in the hundreds; after that, I was a little lost, but then again I'm not a native speaker / writer of Greek. I imagine that if I were, I'd have had a bit of an easier time.

What I'm trying to get us past is the immediate assumption that the ancients 'didn't understand' math or were held back from understanding it by the systems they were using. That's not just because I have an axe to grind for the Greeks, though I know I'm more invested in them than most; it's because this notion that they avoided computational mathematics because of its difficulty represents, I think, a fundamental misunderstanding of their work. If you simply conclude that they avoided these things because of difficulty, you're likely to miss the fact that they gave reasons why they thought mathematics ought to be approached in a particular way – in the geometrical way.

There's a very good book about the history of math that I can recommend called Greek Mathematical Thought and the Origin of Algebra. It's not about the system of numerals – it actually deals with algebra and the representational system of mathematics that we use without even thinking about it today (for example, DU used it in that last comment above) and the way that system changes our way of seeing the world in subtle and fundamental ways. The thoughtful way to approach this, I think, is to accept that we might not be right about the way to 'properly' do math. As that book points out, these ways we have of thinking about mathematics have come to dominate not just mathematics but physics, science, and our whole outlook about the world; to the point where we can't easily even see what it is we've accepted without question. It takes some work to pull back far enough to notice that there is a lot in mathematics that is conventional rather than rational. That's why I really like thinking about the history of numerals, and where decimals came from; it highlights the conventionality of our own system today.
posted by koeselitz at 9:31 AM on August 27, 2010 [3 favorites]


Base 10 is certainly an invention, it paves the way for the discovery that all systems are indeed base 10.

Wait, what? I'll cop to not knowing a lot about math, but in what way are all systems base 10? I thought different methods of counting (base 10, base 8, etc.) were fairly interchangeable. Is there some inherent sense in which base 10 is more basic?
posted by Jahaza at 9:52 AM on August 27, 2010


There is a lot of confusion in this thread. Mathematics is independent of the particular system of numerical representation you use, which is why the Greeks could get as far as they did.

Numerical representation is, however, crucial for arithmetic, and arithmetic is crucial for lots of other things, including engineering and book-keeping.

As for zero not being a number, that depends entirely on your definition of 'number', of which there is a massive profusion of varieties in the math garden. You are free to roll your own definition and see how far you get with it.
posted by unSane at 10:00 AM on August 27, 2010


Yes but what does it say about CANNIBALS??
posted by Lutoslawski at 10:30 AM on August 27, 2010


Jahaza: Base 10 is certainly an invention, it paves the way for the discovery that all systems are indeed base 10.

Wait, what? I'll cop to not knowing a lot about math, but in what way are all systems base 10? I thought different methods of counting (base 10, base 8, etc.) were fairly interchangeable. Is there some inherent sense in which base 10 is more basic?


No, it's a joke. We call it "Base Ten" because when we get to 9+1 we start over but shift everything one place to the left. But if you were counting in Base Six, you would do the same thing when you reached 5+1, and Six would be written 10 (one in the Sixes place, and nothing in the Ones place).
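
Or in code (a throwaway sketch, digits 0-9 only so bases 2 through 10):

    def to_base(n, base):
        # Repeatedly divide, collecting remainders: the same "start over and
        # shift everything left" rule, just triggered at a different threshold.
        digits = []
        while n:
            n, r = divmod(n, base)
            digits.append(str(r))
        return ''.join(reversed(digits)) or '0'

    print(to_base(6, 6), to_base(8, 8), to_base(10, 10))  # 10 10 10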
posted by paisley henosis at 10:51 AM on August 27, 2010


There is a lot of confusion in this thread. Mathematics is independent of the particular system of numerical representation you use, which is why the Greeks could get as far as they did.

Numerical representation is, however, crucial for arithmetic, and arithmetic is crucial for lots of other things, including engineering and book-keeping.


I cannot disagree with this, yet I think it is underestimating the main thrust of the argument for the significance of positional decimal representation: It introduces the idea of operations on numbers as formal operations on symbols, described by a minimal set of simple rules and independent of any significance or interpretation. I cannot see how the abstraction of algebra would have been recognized as useful and developed without first exploring this or a similar mode of thinking.
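
Grade-school long addition is a nice example of this: a purely formal procedure over digit symbols. A minimal Python sketch (mine, just to illustrate the point):

    def add_decimal(a: str, b: str) -> str:
        # The same tiny rule at every position: add two digit symbols and the
        # carry, write one symbol, pass the carry along. No step ever needs to
        # know what the whole number "means".
        a, b = a.zfill(len(b)), b.zfill(len(a))
        out, carry = [], 0
        for da, db in zip(reversed(a), reversed(b)):
            carry, digit = divmod(int(da) + int(db) + carry, 10)
            out.append(str(digit))
        if carry:
            out.append(str(carry))
        return ''.join(reversed(out))

    print(add_decimal('478', '964'))  # 1442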
posted by Dr Dracator at 12:00 PM on August 27, 2010 [1 favorite]


That's because they didn't believe zero was a number. They had very real philosophical and mathematical reasons for this, which they explained. I happen to think they were right.

Do you care to elaborate on this? I don't have any definition for "right" in these matters other than "consistent with everything else and useful on some level": how would we benefit from discarding zero as a number?
posted by Dr Dracator at 12:08 PM on August 27, 2010


The Indian system (also known as the Indo-Arabic system) was introduced to Europeans by Gerbert of Aurillac in the tenth century. He traveled to Spain to learn about the system first-hand from Arab scholars, prior to being named Pope Sylvester II in 999 CE. However, his advocacy of the system encountered stiff resistance, in part from accountants who did not want their craft rendered obsolete, to clerics who were aghast to hear that he had traveled to Islamic lands to study the method. As a result, it was widely rumored that he was a sorcerer and that he must have sold his soul to Lucifer during his travels, an accusation that persisted until 1648, when papal authorities reopened his tomb to make sure that his body had not been infested by Satan!

Mind you, Sylvester was also notorious for cracking down on simony and clerical concubinage, so there might be a little of that behind his contemporary bad press. That, and an alliance with the Holy Roman Emperor at a time when Rome itself rebelled against the Holy Roman Emperor. I'd imagine the aghast clerics would care more about that than some odd fascination with numbers (devil's tools though they be), save as another bit of tar to brush him with. Had he been more of a party pope, perhaps it would have been overlooked.

As to the shock and horror in Christian Europe, well, I'm not so sure that's true. Fibonacci was much appreciated by both politicos and merchants in his native Pisa - and Pisa was doing pretty damn well at that time.

Then we have that other clerical hero of the accounting world, Luca Pacioli, man of math and double-entry bookkeeping, whose works came along right after Gutenberg and not coincidentally set that world on fire.

See, even if the old farts tend to stick to the tried, true, and traditional, money men tend to adopt the newfangled any time there's a monetary edge to be gained. The article is interesting, but a tad tendentious.

(Years ago I saw someone demonstrate how to manipulate Roman numerals in ways you would not normally associate with Roman numerals - addition, subtraction, I think even multiplication and division. Can't recall the details, but it was not difficult to follow at all. Ring a bell?)
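
It can be done entirely mechanically, with no conversion to decimal - here's a Python sketch of one such recipe (no idea if it matches the demonstration you saw):

    # Roman addition as symbol-pushing: expand subtractive forms, pool and
    # sort the symbols, carry upward (IIIII -> V, VV -> X, ...), re-contract.
    EXPAND = [('CM', 'DCCCC'), ('CD', 'CCCC'), ('XC', 'LXXXX'),
              ('XL', 'XXXX'), ('IX', 'VIIII'), ('IV', 'IIII')]
    COMBINE = [('IIIII', 'V'), ('VV', 'X'), ('XXXXX', 'L'), ('LL', 'C'),
               ('CCCCC', 'D'), ('DD', 'M')]
    ORDER = 'MDCLXVI'

    def roman_add(a, b):
        for sub, plain in EXPAND:
            a, b = a.replace(sub, plain), b.replace(sub, plain)
        s = ''.join(sorted(a + b, key=ORDER.index))
        for run, sym in COMBINE:
            s = ''.join(sorted(s.replace(run, sym), key=ORDER.index))
        for sub, plain in EXPAND:
            s = s.replace(plain, sub)
        return s

    print(roman_add('XIX', 'XXIV'))  # XLIII (19 + 24 = 43)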
posted by IndigoJones at 6:50 PM on August 27, 2010


Dr Dracator: “Do you care to elaborate on this? I don't have any definition for "right" in these matters other than "consistent with everything else and useful on some level": how would we benefit from discarding zero as a number?”

Well, without getting too axe-grindy, I can say a few things, I guess.

To start with, I'm actually not alone in this; Heidegger, too, believed that zero was certainly not a number. And he agreed with me that one isn't a number, either. (That's probably at least a little more controversial.) This had something to do with his philosophical outlook, and something also to do with his hope to critique philosophy as a whole, from the perspective of a time before even the Greeks. But there are people who clearly disagree in a fundamental way. The chief among them, I think, is Richard Dedekind. His essay The Nature and Meaning of Numbers [pdf] is a sort of categorical denial of my thesis, although I feel as though it's less a categorical denial and more a flat denial with no proof or rational arguments for his point of view. That text, which first laid down methodically the now-ubiquitous concept of the number line, pretty much starts by accepting it as given that zero and one are numbers. (My own feeling is that the essay is utterly misnamed, since it actually sucks the nature and meaning out of numbers.)

The best place to begin, I guess, is with the definitions of Euclid, which are pretty good concise statements of the Greek mathematical approach to numbers. He doesn't begin dealing with numbers until the seventh book of his Elements, where he starts by giving definitions:
1. A unit is that by virtue of which each of the things that exist is called one.
2. A number is a multitude composed of units.
This is a pithy statement of the perspective which hints at the theoretical undercurrents behind it. First, a unit; it's a thing that allows us to call each of the things that exist one. This is an essential and momentous thing; the unit is a mechanism which allows us even to conceive of things in the world. Oneness, the perception of a unified whole which is at the same time different from what surrounds it, is a fundamental moment in the human experience. I think it's essential, if we're going to be honest about how we experience the world, for us to come to the realization that seeing the oneness of a thing is categorically different from counting. There are even animals who are observably able to do one of those things and not the other; in fact, I don't know of any animals who can actually count, although I think there exist animals who can be aware of the multiplicity of objects and can even keep track of large numbers of them. Actually counting, though – enumeration – requires language. There is some way in which accepting the oneness of an object is actually prior to language.

And the moment we accept the oneness of a thing, we are forced to accept nullity – because if a thing is one, then everything else is not that one. It's set off. And on some level it becomes apparent at that moment, I think, that nullity, nothingness, had always existed, at the very least because before that there was always an absence of oneness. That absence of oneness is zero.

Both of these things – unity and nothingness, one and zero – are categorically different from each other (as opposite as can be) and categorically different from numbers. This outlook is reflected in the way Euclid defines number: as a multitude composed of units. It's units put together, gathered up. It may seem intuitive to our minds that you might gather up a whole bunch of units, and then take away all but one; have you then changed a number into a not-number? Where did the number go? This seems odd to us, but Euclid and many of the Greek geometers insisted on this because they felt as though the one is prior to the many in a deep and fundamental way. Aristotle sometimes relates somewhat playfully (I think) that they were very hung up on the notion that numbers have different natures; the humorous side is that, for example, the Pythagoreans thought that ten was the 'perfect' number, and that some mathematicians had some phobias or perturbations about the numbers seven and fifteen. But there is something to this; numbers do have their own natures, and knowing those natures is the only way to understand their true meaning.

That's why I think it's essential to class one and zero as non-numbers; they represent fundamentally different moments in the experience of the world than the numbers do. Counting, enumeration, is very different from the recognition of oneness.
posted by koeselitz at 8:12 PM on August 27, 2010 [1 favorite]


sorry, forgot to link to the pdf: Dedekind, The Nature and Meaning of Numbers.
posted by koeselitz at 8:13 PM on August 27, 2010


(Also, on the subject of Euclid, I might note that the most beautiful publication of Euclid's Elements of Geometry in at least two generations has just taken place in Taschen's gorgeous reprint of Oliver Byrne's 1847 edition. Anybody who really loves books should check it out; a lack of funds and a recent soon-regretted promise prevent me from spending the $60 on a copy, but you should if you can.)
posted by koeselitz at 8:18 PM on August 27, 2010




I think it's essential, if we're going to be honest about how we experience the world, for us to come to the realization that seeing the oneness of a thing is categorically different from counting.

I agree, yet do not see why doing this must stop us from counting to one: this concept of unity lives in a different sphere than the numbers. If you look at, for example, the various methods of constructing the natural numbers, you can see how unity or zero could be a concept of a different order than the numbers, yet still expressible as, and corresponding to, a number.

How would you express the operation of adding another item to a set, without one as a number? Unless you can have a framework where all operations are treated equally, your notation would become cumbersome and unclear, and would be an impediment to developing other, more abstract yet perfectly useful concepts. Instead of one operation of addition, you would need six (or even 9 if we consider the order of terms):
  1. addition of two numbers
  2. addition of a number with unity
  3. addition of a number with zero
  4. addition of unity and zero
  5. addition of unity and unity
  6. addition of zero and zero
When doing symbolic calculations, where you do not know which number you are dealing with, all of these would need to be differentiated from each other. It would then be perfectly natural to define a superior set of things that can be either counting numbers, or one, or zero, collapsing all six different operations into a single one: I do not see why this is not a good thing.
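
A toy illustration of that case explosion (my own sketch, treating Zero, Unity, and Number as genuinely different kinds, per the Euclidean definitions upthread):

    from dataclasses import dataclass

    @dataclass
    class Zero: pass

    @dataclass
    class Unity: pass

    @dataclass
    class Number:          # Euclid's "multitude composed of units": 2, 3, 4, ...
        multitude: int

    def add(a, b):
        # Every combination of kinds needs its own rule.
        match (a, b):
            case (Zero(), x) | (x, Zero()):
                return x
            case (Unity(), Unity()):
                return Number(2)
            case (Unity(), Number(m)) | (Number(m), Unity()):
                return Number(m + 1)
            case (Number(n), Number(m)):
                return Number(n + m)

    print(add(Unity(), Number(4)))  # Number(multitude=5)
    # Admit 0 and 1 as numbers of the same class, and all of this collapses to a + b.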
posted by Dr Dracator at 11:56 PM on August 27, 2010


Oh yeah, Heidegger and Aristotle, those noted mathematicians ::eyeroll::

Philosophers on mathematics are about as illuminating as mathematicians on philosophy, which is to say occasionally but not usually.

Numbers are whatever the fuck we define them to be. We can define integers, positive integers, negative integers, reals, rationals, naturals, irrationals, transcendentals, complex varieties of all of these, and so on and so on. Let a million flowers bloom. You are free to define a set of numbers free of zero, or one, if you wish. You will then have to develop a mathematics which deals with the fact that the set is not closed under subtraction or division respectively, and quite possibly not addition or multiplication either, doesn't have identity elements, and so on and so on.

Go ahead, try it. Arguing about whether one or zero is a number is a philosophical/metaphysical argument; all that mathematicians care about is how interesting the consequences are.
posted by unSane at 9:25 PM on August 28, 2010 [1 favorite]


spelling by martini
posted by unSane at 9:39 PM on August 28, 2010




This thread has been archived and is closed to new comments