If x/0 were equal to x, then x * 0 would have to equal x.
If you divide a number by 0, you are dividing it by nothing, so you should get the same number back, right?
If this isn't true for some reason, why does this logic work for multiplication? 1 * 0 = 0 is a perfectly valid calculation, even though you are multiplying by 0.
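One way to see where the "dividing by nothing gives the same number back" intuition breaks is to remember that division is defined as the inverse of multiplication. A minimal Python sketch of that check (the numbers here are just illustrations):

```python
# Division is defined as the inverse of multiplication:
# a / b = c means c * b = a.
a, b = 7, 2
c = a / b
assert c * b == a  # 3.5 * 2 == 7: division undoes multiplication

# If 7 / 0 were 7 ("dividing by nothing"), the same check would
# require 7 * 0 == 7 -- but multiplying by zero always gives 0:
assert 7 * 0 == 0

# So no value c can satisfy c * 0 == 7, and Python refuses:
try:
    7 / 0
except ZeroDivisionError as e:
    print(e)  # division by zero
```

Multiplying by 0 is fine precisely because it doesn't need to be undone; division by 0 is the step that would have to undo it, and nothing can.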
Please change my view: 0/0 = 1.
I have had this argument for over five years now, and I have yet to be shown logic that compels me to see that the above statement is false.
A building block of basic algebra is that x/x = 1. It’s the basic way we eliminate variables in any given equation. We all accept this as the norm: anything divided by that same anything is 1. It’s simple division: how many parts of ‘x’ are in ‘x’? If those x things are the same, the answer is one.
But if you set x = 0, suddenly the rules don’t apply. And they should. There is one zero in zero. I understand that logically it’s abstract. How do you divide nothing by nothing? To which I say, there are countless other abstract concepts in mathematics we all accept with no question.
Negative numbers (you can show me three apples. You can’t show me -3 apples. It’s purely representative). Yet, -3 divided by -3 is positive 1. Because there is exactly one part -3 in -3.
“i” (the square root of negative one). A purely conceptual number that was created and used to make mathematical equations work. Yet i/i = 1.
0.00000283727 / 0.00000283727 = 1.
(3x - 17(z^9 - 6.4y)) / (3x - 17(z^9 - 6.4y)) = 1.
But 0 is somehow more abstract or perverse than the other abstract divisions above, and 0/0 = undefined. Why?
It’s not that 0 is some untouchable integer exempt from the other rules. If you want to talk about abstract concepts that we still define: anything to the power of 0 is equal to 1.
Including 0. So we have all agreed that if you take nothing and raise it to the power of nothing, that equals 1 (0^0 = 1), a concept far more bizarre than dividing something by itself, or even nothing by itself. Yet when it comes to zero, we can’t consistently hold the logic that anything divided by its exact self is one, because it’s one part itself. (There’s exactly one nothing in nothing; it’s one full part nothing. That is logically far simpler than taking nothing, raising it to the power of nothing, and having it equal exactly one something. Or even taking the absence of three apples and dividing it by the absence of three apples to get exactly one something. If there’s exactly one part -3 apples in another hypothetical absence of exactly three apples, we should all be able to agree that there is one part nothing in nothing.)
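For what it’s worth, the 0^0 = 1 convention mentioned above is exactly what most programming languages adopt, while division by zero stays an error. A quick check in Python:

```python
import math

# Most languages adopt the convention 0**0 == 1
# (combinatorially: there is exactly one empty product).
assert 0 ** 0 == 1
assert math.pow(0.0, 0.0) == 1.0

# Division by zero, by contrast, is left undefined (an error):
try:
    0 / 0
except ZeroDivisionError:
    print("0 / 0 is undefined")
```

So the asymmetry the post complains about really is baked into everyday arithmetic: 0^0 gets a defined conventional value, and 0/0 does not.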
This is an illogical (and admittedly irrelevant) inconsistency in mathematics, and I’d love for someone to change my mind.
Take 10/2: imagine a 10-square-foot box. Saying 10 divided by 2 is like asking “how many 2-square-foot boxes fit in this 10-square-foot box?” So the answer is 5.
But if you take the same box and ask “how many boxes that are infinitely small, or zero square feet, can fit in the same box?” the answer would be infinity, not “undefined”. So 10/0 = infinity.
I understand why 2/0 can’t be 0, not only because that doesn’t make any sense, but also because it could cause terrible contradictions like 1 = 2 and such.
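The contradiction hinted at here is worth spelling out. Assuming 0/0 = 1 and that the ordinary rules for multiplying fractions still hold, two lines of algebra give:

$$2 = 2 \cdot 1 = 2 \cdot \frac{0}{0} = \frac{2 \cdot 0}{0} = \frac{0}{0} = 1.$$

So any definition of 0/0 that keeps the usual fraction rules forces 2 = 1, which is why mathematicians leave it undefined rather than pick a value.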
Ah math is so cool. I love infinity so if anyone wants to talk about it drop a comment.
Edit: thanks so much everyone for the answers. Keep leaving comments though, because I’m really enjoying seeing it explained in different ways. Also, it doesn’t seem like anyone else has ever been confused by this judging by the comments, but if anyone is, I really liked this video: https://www.khanacademy.org/math/algebra/x2f8bb11595b61c86:foundation-algebra/x2f8bb11595b61c86:division-zero/v/why-dividing-by-zero-is-undefined
I understand that you can't divide anything by 0, but I can see arguments why it could be 0 (0 divided by anything is 0) or 1 (anything divided by itself is 1). Personally, before I plugged 0/0 in my calculator, I thought the answer would be 0. I'm just curious if there's a special reason why 0/0 is undefined, like how there's a special reason why 1 is not prime.
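There is indeed a special reason: 0/0 is called *indeterminate* because ratios whose top and bottom both shrink to 0 can settle on different values, so no single answer (0, 1, or anything else) is forced. A small Python sketch, using a few illustrative functions of my own choosing:

```python
def ratio_near_zero(f, g, x=1e-8):
    """Evaluate f(x)/g(x) at a small x, approximating the limit as x -> 0."""
    return f(x) / g(x)

# All three ratios are "0/0 shaped" at x = 0, yet they approach
# different values as x shrinks:
assert abs(ratio_near_zero(lambda x: x,    lambda x: x) - 1.0) < 1e-6  # x/x    -> 1
assert abs(ratio_near_zero(lambda x: 2*x,  lambda x: x) - 2.0) < 1e-6  # 2x/x   -> 2
assert abs(ratio_near_zero(lambda x: x**2, lambda x: x) - 0.0) < 1e-6  # x^2/x  -> 0
```

Picking 0 or 1 for 0/0 would make two of these three limits come out wrong, which is why no choice is made at all.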
The other comments are correct: $\frac{1}{0}$ is undefined. Similarly, the limit of $\frac{1}{x}$ as $x$ approaches $0$ is also undefined. However, if you take the limit of $\frac{1}{x}$ as $x$ approaches zero from the left or from the right, you get negative and positive infinity respectively.
$1/x$ does tend to $-\infty$ as you approach zero from the left, and $\infty$ as you approach from the right:

[graph of $y = 1/x$, showing the branch falling to $-\infty$ left of zero and rising to $\infty$ right of zero]
That these limits are not equal is why $1/0$ is undefined.
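You can watch the two one-sided limits disagree numerically; a quick Python sketch:

```python
# Approaching 0 from the right, 1/x grows without bound;
# approaching from the left, it falls without bound:
for x in [0.1, 0.01, 0.001]:
    print(f"1/{x} = {1/x:>8.1f}    1/{-x} = {1/-x:>9.1f}")

assert 1 / 0.001 > 1 / 0.01 > 1 / 0.1        # right side: heading to +infinity
assert 1 / -0.001 < 1 / -0.01 < 1 / -0.1     # left side: heading to -infinity
```

Since the right side runs off to $+\infty$ and the left side to $-\infty$, no single number can serve as "the" value of $1/0$.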