Decimals
Before we talk about comparing and ordering decimals, let's cover some definitions. A decimal number is a number that contains a decimal point. A decimal point is a dot that marks where the digits representing values smaller than 1 begin. Decimal numbers are another way to represent fractions.
For example, the decimal number 21.5 is another way of saying 21 1/2. The decimal point is placed before the 5 to separate the 21 (the whole number) from the .5 (the fraction). The point shows that the 5 stands for just a fraction of 1: we know that 0.5 is actually half of 1. The same is true of the decimal 3.14; the part after the decimal point, 0.14, is smaller than 1.
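If you'd like to check the fraction hiding behind a decimal, here is a quick sketch using Python's fractions module (just an illustration for checking, not part of the lesson itself):

```python
from fractions import Fraction

# 21.5 really is 21 and 1/2. Parsing the string keeps it exact.
print(Fraction("21.5"))                          # 43/2, i.e. 21 + 1/2
print(Fraction("21.5") == 21 + Fraction(1, 2))   # True

# The part after the decimal point is always less than 1:
print(Fraction("0.5"))                           # 1/2
print(Fraction("0.14"))                          # 7/50 (that's 14/100 reduced)
```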
Using Decimals to Count
We count decimals in a similar manner to regular numbers. For regular numbers, we start with 1, then 2, then 3, then 4, and so on. As the numbers get bigger, they also get longer; for example, 201 is bigger than 31. This is how we count the part of a decimal number that is to the left of the decimal point (the whole part of the number). But once we reach the digits to the right of the decimal point (the fraction part of the number), the way we count changes a bit.
I want you to stop for a moment and think about how we alphabetize names. Picture a filing cabinet with a drawer open. What do you see? I have a filing cabinet in my office, and the way I alphabetize is by the first letter, followed by the second letter, and so on and so forth. Each successive letter in a word helps me to get even more detailed in my filing.
A folder called 'BOY' is more specific than something called 'BO,' and I can file it after 'BO' but before 'BP.' Counting decimals is similar. The first digit after the decimal point is like our first letter. The second digit after the decimal point is like our second letter.
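You can see the same left-to-right idea in a quick Python sketch (an illustration, not part of the lesson): strings compare letter by letter, and, as long as the whole-number parts match, decimal fraction parts compare digit by digit in just the same order.

```python
# Letters compare one position at a time:
# 'BO' files before 'BOY', and 'BOY' files before 'BP'.
print('BO' < 'BOY' < 'BP')      # True

# Digits after the decimal point compare the same way, left to right:
print('0.29' < '0.3')           # True as strings ('2' comes before '3')...
print(0.29 < 0.3)               # ...and True as numbers
```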
Similar to the alphabet, we can count with just our first digit after the decimal point or we can get even more detailed. Just like we can say A, B, C, D, E, F, G, and so on, we can count with just the first digit after the decimal point like this:
0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9
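Here is a small Python sketch of that counting (again just an illustration; Python's decimal module keeps 0.1 exact, unlike ordinary floats):

```python
from decimal import Decimal

# Count by tenths: 0.1, 0.2, ..., 0.9.
# Decimal("0.1") is exact, so float artifacts like
# 0.30000000000000004 never show up in the output.
n = Decimal("0.1")
while n <= Decimal("0.9"):
    print(n)
    n += Decimal("0.1")
```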
Just like we can get more specific when alphabetizing by filing things by AA, AB, AC, AD, AE, AF, AG, and so on, we can count with another digit to be even more specific, like this:
0.11, 0.12, 0.13, 0.14, 0.15, 0.16, 0.17, 0.18, 0.19
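The second digit refines the first, just as 'AA' through 'AZ' all file between 'A' and 'B.' A short Python sketch (illustration only) confirms that every one of these hundredths sits between the tenths 0.1 and 0.2:

```python
from decimal import Decimal

# Each hundredth from 0.11 to 0.19 falls between 0.1 and 0.2.
for k in range(11, 20):
    n = Decimal(k) / 100                            # 0.11, 0.12, ..., 0.19
    print(n, Decimal("0.1") < n < Decimal("0.2"))   # each line ends in True
```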
I stopped counting when the last digit reached 9 because at this point, something unique happens to decimal numbers. For regular numbers, what happens after you reach 9 or 19? Why, the digit to the left increases by 1, and your 9 changes to a 0: the 9 becomes a 10, or the 19 becomes a 20. Then you keep counting by increasing your last digit by ones again. Decimal numbers work the same way: when you reach 0.9 or 0.19, you increase the digit to the left of the 9 by 1, and the 9 becomes a 0. Your decimal point stays put. So the 0.9 becomes a 1.0, and the 0.19 becomes a 0.20.
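You can watch the rollover happen with Python's decimal module (a sketch for checking the arithmetic, not part of the lesson):

```python
from decimal import Decimal

# Adding one more step makes the 9 roll over:
# the digit to its left goes up by 1 and the 9 becomes a 0.
print(Decimal("0.9") + Decimal("0.1"))      # 1.0
print(Decimal("0.19") + Decimal("0.01"))    # 0.20
```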
Unless the problem tells you otherwise, we don't write zeroes at the ends of decimals. So the 1.0 will be written as 1 and the 0.20 will be written as 0.2. You can compare this to alphabetizing. With alphabetizing, when you reach 'BZ,' you roll over to 'C.' After the 'C,' you can start getting specific again, with 'CA,' 'CB,' and so on.
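Python's Decimal even has a method for dropping those trailing zeroes (illustration only), and it confirms the value doesn't change:

```python
from decimal import Decimal

# Trailing zeroes don't change the value, so we normally drop them.
print(Decimal("1.0").normalize())           # 1
print(Decimal("0.20").normalize())          # 0.2
print(Decimal("0.20") == Decimal("0.2"))    # True: same number either way
```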
If we continue counting from the 0.9, we would get 1, then 1.1, then 1.2, and so on, until we reach 1.9. When we reach this point, we increase our digit to the left of the 9 by 1 to get 2. Then, we will continue again at 2.1 and so on. If we continue counting from the 0.19, we would get 0.2, then 0.21, then 0.22, and so on. When we reach 0.29, we would go to 0.3 and then continue to 0.31 and so on.
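One last sketch (again just for checking) counts straight through both of those rollovers:

```python
from decimal import Decimal

# Continue from 0.9 in steps of 0.1:
n = Decimal("0.9")
for _ in range(3):
    n += Decimal("0.1")
    print(n)                # 1.0, then 1.1, then 1.2

# Continue from 0.19 in steps of 0.01:
m = Decimal("0.19")
for _ in range(3):
    m += Decimal("0.01")
    print(m)                # 0.20, then 0.21, then 0.22
```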