Originally Posted by Licht
It's not that I am saying current thoughts on numbers agree with me. I'm saying I think the current concept used in mathematics surrounding numbers is possibly flawed (note the word possibly). I give careful consideration to numbers every day as a result of OCD (I shouldn't need to tell you anything about it), so I am one who thinks heavily on such things. Long has been my thought on the numbers 0, 1, and 2, followed by 4, 5, 7, 9, 13, 14, 15, 19, 21, 24, and a bunch more that I consider important numbers to myself.

But 0, historically, had a much better meaning than it does today. It was considered absolute zero, whereas today it is treated as a placeholder of sorts (although it is correctly treated as an absence). That was the original concept, at least; it took man a while to come up with it, if you know much about the history of numbers and numerical computation. Just because you can apply this pattern to numbers that are actual values doesn't mean that an absence of all numerical value obeys the same rule.

It's the same as the way 1/3 isn't truly expressible in decimal digits, yet 0.333~ is literally equal to 1/3. The same goes for 0.666~ and 2/3. It's been mathematically proven that 0.999~ is equal to one, in fact (rather recently, too). These numbers aren't, in a sense, truly equal to their counterparts, but mathematically they are in fact equal at the same time, and can be treated as such by substitution for easy calculation. Now if we take the fact that such a gap exists between literal and mathematical number concepts, we must also take into account that maybe 0's literal meaning is separate from its mathematical concept. See where I am going?
Forgive me if at any point I seem to veer off, use language that is difficult to understand, or put in words that don't seem to belong; I had to edit this to fix that. I have an issue with typing random words for some reason instead of what I mean to type.
I don't really want to get into it, but 0.999~ does not equal 1, and it was never proven. It was a cheap internet myth or rumour that grew legs and went off into the abyss. 0.999~ does not exist; it is not a real number. 0.333~ is 1/3, 0.666~ is 2/3, etc. 0.333~ is how we, as humans, write out 1/3 in decimal expression. There is no fraction for 0.999~, thus it is not a real number; it does not exist. Decimal expression cannot represent fractions properly, thus we add a ~ to signify that the 3 repeats infinitely.
1/3 * 3 = 3/3.
Another way of writing this is:
1/3 * 3 = 1/1 or 1.0
And even still:
0.333~ * 3 = 1.0
But this statement is incorrect:
0.333~ * 3 = 0.999~
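If you want to check the exact-fraction arithmetic above yourself, here is a quick sketch (using Python and its standard fractions module; the language choice is mine and purely illustrative, nothing in the thread depends on it):

from fractions import Fraction

# Exact rational arithmetic: 1/3 is stored as a true fraction,
# never as a cut-off decimal expansion.
one_third = Fraction(1, 3)
product = one_third * 3      # 1/3 * 3 = 3/3

print(product)               # 1  (3/3 reduces to exactly 1/1)
print(product == 1)          # True
print(float(one_third))      # 0.3333333333333333 -- the printed decimal form is only ever a cut-off approximation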
Remember, decimal expression is just something humans invented to represent math on paper. Just because we can physically write out 0.999~ on paper does not mean the number exists or can be used in any math, the same way that writing out "dfsdfsdfsdgf" on paper does not make it an English word.
All decimal fractions have corresponding fractions that represent them (0.333~ is 1/3, 0.666~ is 2/3, and so on), as sketched below.
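As a quick illustration of that mapping (same Python sketch style as above, again just an illustration):

from fractions import Fraction

# A terminating decimal string corresponds to one exact fraction.
print(Fraction("0.5"))      # 1/2
print(Fraction("0.125"))    # 1/8

# A repeating decimal like 0.333~ can never be written out in full,
# so only the fraction 1/3 captures that value exactly.
print(Fraction(1, 3))       # 1/3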
Note that 9/9 would be 1.0, not 0.999~. Go ask any Grade 3er what the fraction 9/9 is in decimal expression; their answer will not be 0.999~. Neither would the answer of any university math professor, scientist, or other smarty-pants type.
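And in the same spirit, here is 9/9 reduced as an exact fraction (one more Python sketch, purely illustrative):

from fractions import Fraction

nine_ninths = Fraction(9, 9)   # normalises straight away to 1/1
print(nine_ninths)             # 1
print(float(nine_ninths))      # 1.0 -- the decimal expression of 9/9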