When students first learn about fractions, we want them to learn that they are just numbers: new numbers, but numbers nonetheless, which fit into the same system as the whole numbers they are already familiar with. The number line can help with this, with whole numbers and fractions sitting together and located in essentially the same way: choose a unit (1, 1/3, 1/10) and then count off a number of those units. It also helps students understand that equivalent fractions are just different ways of writing the same number. When (finite) decimals come along, they get added to the list of representations.

The Common Core emphasizes this unity by treating decimals as just a different way of writing fractions, e.g. in 4.NF.C: “Understand decimal notation for fractions, and compare decimal fractions.” In this view, 0.3 is not a new sort of number, just a different way of writing the number 3/10.

This leads to some difficulties in the use of language, because at some points in the curriculum you do want to distinguish between decimals and fractions, for example when you ask a student to write 4/5 as a decimal or to write 0.125 as a fraction. (“You told me it’s already a fraction!” the smart student might reply.)
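For readers comfortable with a bit of code, the number-versus-notation point can be sketched with Python's exact `fractions.Fraction` type (this is an illustration of the idea, not part of the curriculum discussion):

```python
from fractions import Fraction

# Fraction stores the number itself, independent of how it is written.
# "Write 4/5 as a decimal": both notations name the same number.
assert Fraction(4, 5) == Fraction("0.8")

# "Write 0.125 as a fraction": again, one number, two notations.
assert Fraction("0.125") == Fraction(1, 8)

# Equivalent fractions are literally the same number.
assert Fraction(2, 4) == Fraction(1, 2)
```

The smart student's objection is visible here: the machine, like the mathematician, sees only the number, and "decimal" versus "fraction" is a property of the written form.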

The IM curriculum writing team was talking about these difficulties the other day and Cathy Kessel had a useful comment:

There’s a developmental issue. When fractions are introduced, the distinction between number represented and representation is blurred, and similarly for decimals (finite, then repeating). But, when the two types of representations are seen as representing the same thing, then the thing and its representations start to separate more.

Because we want students to develop a conception of the number behind the representation, we start out saying decimals are also fractions. Later we extend the number line in the negative direction and add the opposites of fractions. Once we have a robust conception of the number line, inhabited by rational numbers, we want to talk about different ways of expressing those numbers: fractions, decimals, infinite decimals, expressions involving square root symbols and exponents. So we start to distinguish between fractions and decimals, not as numbers, but as forms for expressing numbers. We initially suppress their role as forms in order to gain a robust conception of number; once they are firmly attached to that conception we can distinguish between them.
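A small Python sketch (again just an illustration of the point) shows one number wearing several of these forms, and a number that no finite decimal form can capture:

```python
from fractions import Fraction
from decimal import Decimal

# One number, several written forms.
x = Fraction(1, 8)
assert x == Fraction("0.125")            # finite decimal form
assert x == Fraction(Decimal("0.125"))   # a decimal type, same number
assert x == Fraction(2, 16)              # an equivalent fraction

# 1/3 has no finite decimal form: every truncation of the infinite
# decimal 0.333... names a slightly different number.
assert Fraction(1, 3) != Fraction("0.333333")
```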

The only way to do this without giving multiple meanings to the same words would be to invent new words and be consistent in their use. This harks back to the distinction between “numeral” and “number” in the New Math, which didn’t take hold.