Many years ago, a friend of mine questioned the common practice of always rounding halves up. That led me to write another article, just a few minutes ago. It also led me to question elements of math theory and education in general, which led me down a line of reasoning that motivated me to write this article, and I ended up writing that other one first as an example. (If that logic is confusing, it's because it is. Basically, I was going to precede the following with a paragraph about the question my friend raised, but by the time I finished, I found I had written a whole article... Yeah, that happens to me sometimes.) Anyhow, on to the topic at hand.
In elementary school math, at least in the U.S., there is a point where a lot of emphasis is put on classifying numbers as even or odd. We are taught, implicitly if not explicitly, that oddness and evenness are fundamental properties of numbers, and that telling the difference is a critical math skill. Rather than exploring these ideas, let's just cut to the chase: This is all wrong. Oddness and evenness are not fundamental properties of numbers, and there is very limited value in being able to classify numbers as even or odd on sight.
The first problem is that the vast majority of numbers are neither even nor odd. I hear you saying, "But half of all numbers are even, aren't they?" Nope, not even close. In fact, so few numbers are even that one might reasonably claim that evenness doesn't exist, statistically. How can I say this, when you can just count by twos indefinitely, or even provide a mathematical proof that there are infinitely many even numbers? Consider: how many numbers are there between 0 and 1? The answer is uncountably many. 0.1, 0.01, 0.001, 0.2, and so on, and not a single one of them is even. What's more, they aren't odd either! On the other hand, the even numbers (and the odd numbers) are only countably infinite, and for all practical purposes, countably infinite divided by uncountably infinite is zero, so the percentage of all numbers that are even or odd is 0%. It's almost (but not quite) like even and odd numbers don't even exist.
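(A slightly more formal way to put that, in notation that isn't in the original post: the even integers are only countable, while even a single interval of reals is not, and in the measure-theoretic sense the integers take up zero total length inside the reals.)

    |\{\, 2k : k \in \mathbb{Z} \,\}| \;=\; \aleph_0 \;<\; 2^{\aleph_0} \;=\; |(0,1)|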
Now, if we add some qualifiers, we can make even and odd numbers relevant. Instead of saying that evenness and oddness are fundamental properties of numbers, let's instead say they are fundamental properties of integers, aka whole numbers. This is actually true, and every whole number is either even or odd. Well, that may not be precisely true... Let's further qualify it: every real whole number is either even or odd. (Imaginary numbers... There are a few ways you might try to classify complex numbers as even or odd, but they are unintuitive. Purely imaginary integers, with no real component, could be even or odd the same way real integers are. For example, 2i is even. General complex numbers, however, really break when it comes to even and odd. For example, if you go by magnitude, sqrt(2) + sqrt(2)i would count as even, because its magnitude is 2, and thus, when divided by 2, its magnitude is 1. Likewise, 6 + 8i has magnitude 10, and dividing it by 2 gives 3 + 4i, with magnitude 5; if those numbers look familiar, it's because 3-4-5 and 6-8-10 are the classic examples of the Pythagorean theorem.) So, if we limit our set of possible values to real integers, even and odd are relevant, and every possible value is either even or odd. Now it is a fundamental property.
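If you want to play with that magnitude idea, here's a minimal Python sketch. The function name, the rounding tolerance, and the "even if the magnitude is an even integer" rule itself are just the informal notion from the parenthetical above, not any standard definition.

    import math

    def is_even_by_magnitude(z: complex, tol: float = 1e-9) -> bool:
        """Rough check of the 'even if its magnitude is an even integer' idea."""
        magnitude = abs(z)          # abs() of a complex number is its magnitude
        nearest = round(magnitude)  # nearest whole number to the magnitude
        return math.isclose(magnitude, nearest, abs_tol=tol) and nearest % 2 == 0

    print(is_even_by_magnitude(2j))                                   # True: |2i| = 2
    print(is_even_by_magnitude(complex(math.sqrt(2), math.sqrt(2))))  # True: magnitude is 2
    print(is_even_by_magnitude(6 + 8j))                               # True: magnitude is 10
    print(is_even_by_magnitude(3 + 4j))                               # False: magnitude is 5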
We can make even and odd relevant by limiting our set of values, but are these designations useful? We first need to define "even" and "odd" before we can really determine their value. In school, we are taught that numbers that divide "evenly" by 2 are even, and those that don't are odd. What does this mean, though? In the real numbers, I can divide, say, 3 by 2 and get 1.5. Where's the problem? 3 seems to divide by 2 evenly. Again, we have to limit our domain to integers for this to make sense. At the deepest level, what "even" means in this context is that when we divide a number by 2, the groups this creates have an equal (or even) number of elements. So, if I divide 3 by 2, I get a group of 2 and a group of 1, thus 3 isn't even, but if I divide 4 by 2, I get two groups of 2, which are of equal size, thus 4 is even.

We can define evenness more precisely, though, using discrete math concepts. For our purposes, discrete math is merely math using only whole numbers, aka "discrete" values, rather than continuous values (real numbers). Discrete math has different mathematical operations than continuous math. The prime example is division: in continuous math, you just split numbers into smaller parts, for example, 3 / 2 = 1.5, but in discrete math, division gives two outputs. One is the number of whole groups the numerator gets split into, and the other is the size of the leftover group, otherwise known as the remainder. Thus, in discrete math, 3 / 2 = 1R1, that is, 1 with a remainder of 1. Discrete math also has an additional operation, a sort of partial division that yields only the remainder, called the "modulus" operation. In programming, the modulus operation is often represented with the percent sign, thus 3 % 2 = 1. Now we have a foundation for defining "even" and "odd": a number x is even if and only if x % 2 = 0, and x is odd if and only if x % 2 = 1. So even and odd are precisely defined by the remainder of dividing a number by 2. Even and odd are thus a really quick way to see whether a number is divisible by 2, and since we divide by 2 so much more often than by any other number, that's really useful... right? Do we, though? Sure, I think we do divide by 2 more often, but not that much more often.
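Here's a minimal Python sketch of those definitions. The helper names are mine; % and divmod() are Python's built-in remainder and quotient-plus-remainder operations.

    # Quotient-and-remainder division, as in "3 / 2 = 1R1"
    quotient, remainder = divmod(3, 2)
    print(quotient, remainder)    # 1 1

    # The modulus operation keeps only the remainder
    print(3 % 2)                  # 1

    def is_even(x: int) -> bool:
        return x % 2 == 0         # even: remainder 0 when divided by 2

    def is_odd(x: int) -> bool:
        return x % 2 == 1         # odd: remainder 1 when divided by 2

    print(is_even(4), is_odd(3))  # True True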
So, here's the problem with this, in my mind: What about division by 3 or 4 or 5...? How often do we find we need to split things into 3 groups or more? Well, plenty often! So why don't we have an equivalent of even and odd for division by 3? First, there are infinitely many numbers we could divide by, so it isn't possible to have special terminology for all potential divisors. Second, the possible outputs of the modulus operation scale with the magnitude of the divisor. For example, with a divisor of 3, the possible remainders are 0, 1, and 2: 3 % 3 = 0, 4 % 3 = 1, 5 % 3 = 2. So division by 3 wouldn't just have an "even" and an "odd". It would have an analog of even, for x % 3 = 0, and then it would need two terms for x % 3 != 0: one for x % 3 = 1, and one for x % 3 = 2. Sure, we might just combine the two uneven ones, but that's what we already do when we say something is or isn't divisible by 3, so there really wouldn't be any value in it. And the kicker: If this is true, then how do "even" and "odd" have more value than just saying something is or isn't divisible by 2?
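A small sketch of that point in Python (the remainder_classes helper is just for illustration):

    # Division by 3 partitions the integers into three remainder classes, not two.
    for x in range(9):
        print(x, x % 3)   # remainder is 0, 1, or 2

    # A divisor n has n possible remainders, so the two-way even/odd split only fits n = 2.
    def remainder_classes(n: int):
        return list(range(n))

    print(remainder_classes(2))  # [0, 1]    -> "even" and "odd"
    print(remainder_classes(3))  # [0, 1, 2] -> would need three names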
The answer is this: "Even" and "odd" are just terminology for saying whether a number is divisible by 2 or not. It's not consistent terminology, though, because 2 isn't the only number we ever want to divide by, and we don't have any equivalent terminology for any other divisor. Further, even and odd only exist within the domain of integers. In the domain of real numbers, the numbers that are even or odd in the domain of integers are evenly divisible by everything (except zero) in that larger domain. Thus, in the domain of real numbers, either 2 and 4 aren't even, or 3 and 5 are even, because fractional parts always allow them all to divide into perfectly equal groups. Mathematically, though, we can assert that evenness and oddness cannot exist within a domain that does not have a modulus operation.
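A quick illustration, using Python floats as a rough stand-in for real numbers:

    # In the "real" domain, every division by 2 comes out perfectly even,
    # so a remainder-based distinction between 2, 3, 4, and 5 never shows up.
    for x in (2, 3, 4, 5):
        print(x, "/ 2 =", x / 2)   # 1.0, 1.5, 2.0, 2.5 -- all perfectly equal halves

    # In the integer domain, the remainder reappears and even/odd becomes meaningful again.
    for x in (2, 3, 4, 5):
        print(x, "% 2 =", x % 2)   # 0, 1, 0, 1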
What it all comes down to is that even and odd are merely terminology, limited to domains within which all operations and values are discrete. Outside of computer science, discrete math is limited mostly to casual, day-to-day math involving whole, countable objects, and these terms are only useful when dividing by 2, which is common, but not so much more common that it needs special treatment. Given that, I would assert that "odd" and "even", while legitimate properties of whole numbers, are not unique properties that justify their own terminology.