Please change my view: 0/0 = 1.
I have had this argument for over five years now, and I have yet to be compelled by any logic showing that the above statement is false.
A building block of basic algebra is that x/x = 1. It’s the basic way that we eliminate variables in any given equation. We all accept this as the norm: anything divided by that same anything is 1. It’s simple division: how many parts of ‘x’ are in ‘x’? If those x things are the same, the answer is one.
But if you set x = 0, suddenly the rules don’t apply. And they should. There is one zero in zero. I understand that logically it’s abstract. How do you divide nothing by nothing? To which I say: there are countless other abstract concepts in mathematics that we all accept without question.
Negative numbers (you can show me three apples, but you can’t show me -3 apples; they’re purely representative). Yet -3 divided by -3 is positive 1, because there is exactly one part -3 in -3.
“i” (the square root of negative one), a purely conceptual number that was created and used to make mathematical equations work. Yet i/i = 1.
0.00000283727 / 0.00000283727 = 1.
(3x - 17(z^9 - 6.4y)) / (3x - 17(z^9 - 6.4y)) = 1.
But 0 is somehow more abstract or perverse than the other abstract divisions above, and 0/0 is undefined. Why?
It’s not that 0 is some untouchable integer above other rules. If you want to talk about abstract concepts that we still define: anything to the power of 0 is equal to 1.
Including 0. So we have all agreed that if you take nothing, then raise it to the power of nothing, that equals 1 (0^0 = 1). A concept far more bizarre than dividing something by itself, even nothing by itself. Yet when it comes to zero, we can’t simply and consistently hold the logic that anything divided by its exact self is one, because it’s one part itself. (There’s exactly one nothing in nothing. It’s one full part nothing. That is far logically simpler than taking nothing, raising it to the power of nothing, and having it equal exactly one something. Or even taking the absence of three apples and dividing it by the absence of three apples to get exactly one something. If there’s exactly one part -3 apples in another hypothetical absence of exactly three apples, we should all be able to agree that there is one part nothing in nothing.)
This is an illogical (and admittedly irrelevant) inconsistency in mathematics, and I’d love for someone to change my mind.
Take 10/2: imagine a 10-square-foot box. Saying 10 divided by 2 is like asking, “how many 2-square-foot boxes fit in this 10-square-foot box?” So the answer is 5.
But if you take the same box and ask “how many boxes that are infinitely small, or zero square feet, can fit in the same box?”, the answer would be infinity, not “undefined”. So 10/0 = infinity.
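To make the box-counting picture concrete, here is a small sketch (a hypothetical snippet, not from the post) that implements division as repeated subtraction, i.e. “how many b’s fit in a?”:

```python
# A minimal sketch of division as repeated subtraction ("how many b's fit in a?").
# `how_many_fit` is a hypothetical helper for illustration only.
def how_many_fit(a, b):
    if b == 0:
        # Subtracting 0 never shrinks a: for a > 0 the count grows without
        # bound (the "10/0 = infinity" intuition), and for a == 0 every
        # count works equally well, so no single answer can be picked.
        return float("inf") if a > 0 else float("nan")
    count = 0
    while a >= b:
        a -= b
        count += 1
    return count

print(how_many_fit(10, 2))  # 5 -> five 2-square-foot boxes fit in the 10-square-foot box
print(how_many_fit(10, 0))  # inf
print(how_many_fit(0, 0))   # nan: no unique count exists
```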
I understand why 2/0 can’t be 0, not only because that doesn’t make any sense, but also because it could cause terrible contradictions like 1 = 2 and such.
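For anyone who hasn’t seen it, here is the classic bogus proof that $1 = 2$; it’s the standard illustration (not specific to this post) of the damage a hidden division by zero can do. Start with $a = b$. Then $a^2 = ab$, so $a^2 - b^2 = ab - b^2$, which factors as $(a+b)(a-b) = b(a-b)$. Dividing both sides by $a - b$ gives $a + b = b$, i.e. $2b = b$, i.e. $2 = 1$. The only illegal step is the division: since $a = b$, $a - b$ is $0$.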
Ah, math is so cool. I love infinity, so if anyone wants to talk about it, drop a comment.
Edit: thanks so much to everyone for the answers. Keep leaving comments though, because I’m really enjoying seeing it explained in different ways. Also, judging by the comments, it doesn’t seem like anyone else has ever been confused by this, but if anyone is, I really liked this video: https://www.khanacademy.org/math/algebra/x2f8bb11595b61c86:foundation-algebra/x2f8bb11595b61c86:division-zero/v/why-dividing-by-zero-is-undefined
If you start with Peano axioms for the natural numbers, then $0$ is part of the language, but $1$ is not. We use $1$ as a shorthand for the term $s0$.
Now we can use the axiom $\forall x(sx\neq0)$, and infer that in particular for $x=0$ it is true that $s0\neq0$. Congratulations, we proved that $0\neq1$ axiomatically.
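As a sanity check, this one-step argument can be replayed in a proof assistant. Here is a minimal sketch in Lean 4 (core library only), where `Nat` is built from the Peano-style constructors `zero` and `succ`, and the lemma `Nat.succ_ne_zero` plays the role of the axiom $\forall x(sx\neq0)$:

```lean
-- Core Lean 4 only: `1` unfolds to `Nat.succ Nat.zero`, so the axiom
-- "a successor is never zero" applies directly.
example : (1 : Nat) ≠ 0 := Nat.succ_ne_zero 0        -- instantiate ∀ x, s x ≠ 0 at x = 0
example : (0 : Nat) ≠ 1 := (Nat.succ_ne_zero 0).symm -- flip the disequality
```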
You can choose different contexts, like set theory, field theory, ring theory, or other contexts in which we can interpret $0$ and $1$. You can also find contexts in which $0=1$ is a provable statement, for example the theory whose single axiom states $0=1$. True, this theory describes very little of what we expect from the natural numbers, or of what we expect the symbols $0,1$ to mean. But it is a mathematically valid thing to do.
1) Prove $a \times 0 = 0$ for all $a$.
$a \times 0 = a(1 - 1) = a - a = 0$.
2) If $0 = 1$, then $a = a \times 1 = a \times 0 = 0$.
So every element equals $0$.
Which isn't actually a contradiction. It just means we are working with the trivial (zero) ring, which the usual field axioms rule out. If the structure isn't trivial (say, the reals), then $0 \ne 1$.
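To see step 2 spelled out formally, here is a sketch assuming Mathlib's `Ring` class (`mul_one` and `mul_zero` are the Mathlib lemmas for $a \times 1 = a$ and $a \times 0 = 0$):

```lean
import Mathlib.Algebra.Ring.Basic

-- A sketch: in any ring R with 0 = 1, every element collapses to 0,
-- i.e. R is the trivial (zero) ring.
example {R : Type} [Ring R] (h : (0 : R) = 1) (a : R) : a = 0 :=
  calc a = a * 1 := (mul_one a).symm
       _ = a * 0 := by rw [← h]
       _ = 0     := mul_zero a
```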
First I want to say I’m doing Algebra 1 this year, and this may be covered later in math, but ever since I learned that anything to the power of 0 equals 1, I’ve been confused by it. I’ve tried searching online, but I can’t find anything that explains it; everything I can find doesn’t answer the question.
From my understanding, the way exponents work is that the exponent tells you how many of that number you have, and then you multiply them. For example, 32^4 is 32 * 32 * 32 * 32. So shouldn’t, for example, 92^0 = 0, since that would mean you don’t have any 92s?
Edit: thanks to everyone who actually answered, instead of saying that it’s not true for 0^0 or just not answering.
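One standard way to see it (the usual textbook pattern, not something from this thread): each time the exponent drops by 1, you divide by the base once, and following that pattern one more step lands on 1, not 0. With the base 92 from the question: $92^2 = 92 \cdot 92 = 8464$, then $92^1 = 92^2 / 92 = 92$, then $92^0 = 92^1 / 92 = 1$. Equivalently, the rule $x^{a+b} = x^a x^b$ forces $x^a = x^{a+0} = x^a \cdot x^0$, which for nonzero $x$ can only hold if $x^0 = 1$.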
If $0/0$ were equal to $1$, then $1=\frac{0}{0}=\frac{0+0}{0}=\frac{0}{0}+\frac{0}{0}=1+1=2$.
In lay terms, evaluating 0/0 is asking "what number, when multiplied by zero, gives zero". Since the answer to this is "any number", it cannot be defined as a specific value.
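To make "any number works" concrete, here is a tiny sketch (a hypothetical snippet, not part of the answer) that treats $a/b$ as "the unique $x$ with $b \cdot x = a$" and scans some candidate values:

```python
# Treat a/b as "the unique x with b * x == a" and scan candidate values of x.
def solutions(a, b, candidates):
    return [x for x in candidates if b * x == a]

candidates = [-2, -1, 0, 0.5, 1, 2, 3]
print(solutions(6, 3, candidates))  # [2]: 6/3 has exactly one answer
print(solutions(5, 0, candidates))  # []: 5/0 has no answer at all
print(solutions(0, 0, candidates))  # all candidates: 0/0 has no unique answer
```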