Many mathematicians insist that $0$ is not a natural number, motivated by the philosophical something-or-nothing dilemma. Computer scientists, on the other hand, mostly assert that $0$ is a natural number, motivated by zero-indexing in for-loops.
Most otherwise sane academics try to harmonise both views by saying that $0$ being a natural number is context-specific, and $0$ is included or excluded for purposes amounting to notational convenience.
I attempt to formalise this perspective by asserting that one thing is for certain: every natural number must be a nonnegative integer. More precisely, we can define $\mathbb{N}$ to be exactly one of the following sets: $\mathbb{Z}_{\geq 0}$ or $\mathbb{Z}_{\geq 0} \setminus \{0\}$.
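Spelled out, and writing $\mathbb{Z}^{+}$ as shorthand for the second option (a shorthand I will reuse below), the two candidates are

$$\mathbb{Z}_{\geq 0} = \{0, 1, 2, 3, \dots\} \qquad \text{or} \qquad \mathbb{Z}^{+} = \mathbb{Z}_{\geq 0} \setminus \{0\} = \{1, 2, 3, \dots\}.$$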
Theorem. Let $A$ be any set and $x$ denote any object. Then $A$ is exactly one of $A \cup \{x\}$ or $A \setminus \{x\}$. Particularising to $A = \mathbb{N}$ and $x = 0$ yields the desired result, since the only element whose membership is in dispute is $0$: whichever convention one adopts, $\mathbb{N} \cup \{0\} = \mathbb{Z}_{\geq 0}$ and $\mathbb{N} \setminus \{0\} = \mathbb{Z}^{+}$.
Proof. We observe that $x \in A \cup \{x\}$ while $x \notin A \setminus \{x\}$, so $A \cup \{x\} \neq A \setminus \{x\}$; in particular, $A$ cannot equal both. The claim now is that $A$ is exactly one of $A \cup \{x\}$ or $A \setminus \{x\}$. Now, exactly one of $x \in A$ or $x \notin A$ must hold. In the former, $A = A \cup \{x\}$ and we are done. Suppose the latter, that is, $x \notin A$. We aim to prove that $A = A \setminus \{x\}$. To that end, suppose not: since $A \setminus \{x\} \subseteq A$ always, there exists $a \in A$ such that $a \notin A \setminus \{x\}$. If $a \neq x$, then $a \in A \setminus \{x\}$, a contradiction. Thus, $a = x$. Hence, $x = a \in A$, contradicting $x \notin A$, as required.
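For readers who prefer their set-theoretic quibbles machine-checked, here is a minimal sketch of the theorem, assuming Lean 4 with Mathlib; the theorem name `eq_union_xor_eq_diff` is invented for this post, and "exactly one of" is spelled out as a disjunction of two "equals this one and not the other" conjunctions.

```lean
import Mathlib.Data.Set.Basic

open Set

variable {α : Type*}

/-- Any set `A` equals exactly one of `A ∪ {x}` or `A \ {x}`:
the first when `x ∈ A`, the second when `x ∉ A`. -/
theorem eq_union_xor_eq_diff (A : Set α) (x : α) :
    (A = A ∪ {x} ∧ A ≠ A \ {x}) ∨ (A = A \ {x} ∧ A ≠ A ∪ {x}) := by
  by_cases hx : x ∈ A
  · -- Case `x ∈ A`: adjoining `x` changes nothing, but deleting it does.
    left
    refine ⟨?_, ?_⟩
    · -- `A = A ∪ {x}`, checked elementwise.
      ext y
      simp only [mem_union, mem_singleton_iff]
      constructor
      · intro hy
        exact Or.inl hy
      · intro hy
        rcases hy with hy | rfl
        · exact hy
        · exact hx
    · -- `A ≠ A \ {x}`, since `x` lies in the left set but not the right.
      intro h
      rw [h] at hx
      simp at hx
  · -- Case `x ∉ A`: deleting `x` changes nothing, but adjoining it does.
    right
    refine ⟨?_, ?_⟩
    · -- `A = A \ {x}`: the prose proof's "a ∈ A with a ∉ A \ {x}" argument,
      -- run elementwise.
      ext y
      simp only [mem_diff, mem_singleton_iff]
      constructor
      · intro hy
        refine ⟨hy, fun hyx => hx ?_⟩
        rw [← hyx]
        exact hy
      · intro hy
        exact hy.1
    · -- `A ≠ A ∪ {x}`, since `x` lies in the right set but not the left.
      intro h
      rw [h] at hx
      simp at hx
```

The `by_cases hx : x ∈ A` split mirrors the "exactly one of $x \in A$ or $x \notin A$" step in the prose proof above.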
To conclude: Using this formulation, mathematicians use the convention $\mathbb{N} = \mathbb{Z}^{+}$, while computer scientists use the convention $\mathbb{N} = \mathbb{Z}_{\geq 0}$. This allows both of them, in their respective contexts, to write one less symbol ($\mathbb{N}$ in place of $\mathbb{Z}^{+}$ for mathematicians, and in place of $\mathbb{Z}_{\geq 0}$ for computer scientists).
Now end your foolish controversies and needless debates.
—Joel Kindiak, 3 Nov 24, 1957H