We don't learn numbers from set cardinality
As pointed out in Where Mathematics Comes From (WMCF), we are born with an innate sense for numbers, one that gets fuzzier very quickly as the numbers grow larger. We can also subitize, that is, instantly determine the cardinality of, collections of up to 3 objects. It is likely that children learn about larger integers by playing with collections of objects: adding or removing objects, merging collections, and linking the resulting cardinalities to their innate number sense. The authors of WMCF also remark that there is a correspondence between operations on collections and basic arithmetic operations. For example, merging a collection of cardinality 1 with a collection of cardinality 2 yields a collection of cardinality 3, which maps cleanly to the addition 1 + 2 = 3.

Now to my point: conceiving of numbers as the cardinality of collections of objects is reminiscent of, though not the same as, the definition of numbers in Zermelo-Fraenkel set theory (ZFC), where a number is the set of all smaller numbers, with 0 being the empty set. However, the formal ZFC definition is not the most intuitive. A more approachable way to conceive of numbers is as the cardinality of sets. And indeed, if we take a set of 1 fruit and another set of 2 fruits, then take their union, we end up with a set of 3 fruits, mirroring the behavior of real-world collections. A neat correspondence between collections of objects, sets, and arithmetic, right?

But there is an issue. Consider a water molecule, H-O-H. Naturally, you can subitize its elements and tell that it contains 3 atoms. However, if you take the set of atoms in H-O-H, you get {H, O}, not {H, O, H}: a set of cardinality 2, which fails to map to the 3 atoms of our water molecule, because the elements of a set must be distinct. This uniqueness constraint is the issue: it breaks the mapping from collections of objects to sets. To map cleanly to real-world collections, we need a mathematical object that preserves the multiplicity of its elements.
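To make the ZFC construction concrete, here is a minimal Python sketch (the function name von_neumann is my own label for the standard von Neumann encoding), building each number as the set of all smaller numbers, with frozenset so that sets can nest:

```python
def von_neumann(n):
    """The von Neumann encoding: n is the set of all smaller numbers."""
    return frozenset(von_neumann(k) for k in range(n))

assert von_neumann(0) == frozenset()  # 0 is the empty set
assert von_neumann(2) == frozenset({von_neumann(0), von_neumann(1)})
assert len(von_neumann(3)) == 3       # cardinality recovers the number itself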
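Both the fruit example and the water-molecule failure can be checked directly with Python's built-in set type (the element names here are placeholders of mine):

```python
basket_a = {"apple"}
basket_b = {"pear", "banana"}
# merging disjoint collections: cardinalities add, mirroring 1 + 2 = 3
assert len(basket_a | basket_b) == 3

# but the atoms of H-O-H collapse: the two hydrogens count as one element
water = {"H", "O", "H"}
assert water == {"H", "O"}
assert len(water) == 2  # not the 3 atoms we subitize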
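The standard candidate for such an object is the multiset, a set whose elements carry multiplicities. As a sketch, Python's collections.Counter behaves like a multiset and restores the mapping:

```python
from collections import Counter

water = Counter(["H", "O", "H"])  # multiset: {"H": 2, "O": 1}
assert sum(water.values()) == 3   # total multiplicity matches the 3 atoms

# merging multisets adds multiplicities, just like merging real collections
merged = Counter(["H"]) + Counter(["O", "H"])
assert sum(merged.values()) == 3  # 1 + 2 = 3 again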