Undefined (mathematics)

In mathematics, the term undefined is often used to refer to an expression which is not assigned an interpretation or a value (such as an indeterminate form, which has the possibility of assuming different values).[1] The term can take on several different meanings depending on the context. For example:

  • In various branches of mathematics, certain concepts are introduced as primitive notions (e.g., the terms "point", "line" and "plane" in geometry). As these terms are not defined in terms of other concepts, they may be referred to as "undefined terms".
  • A function is said to be "undefined" at points outside of its domain – for example, the real-valued function f(x) = √x is undefined for negative x (that is, for x < 0).
  • In algebra, some arithmetic operations may not assign a meaning to certain values of their operands (e.g., division by zero). In such cases, expressions involving those operands are termed "undefined", as the sketch after this list illustrates.[2]
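
Most programming environments surface such undefined expressions as run-time errors. A minimal Python sketch of this behavior (illustrative only; it uses just the standard math module):

    import math

    try:
        1 / 0                    # division by zero is undefined in arithmetic
    except ZeroDivisionError as e:
        print("1/0:", e)         # -> 1/0: division by zero

    try:
        math.sqrt(-1)            # the real square root is undefined for x < 0
    except ValueError as e:
        print("sqrt(-1):", e)    # -> sqrt(-1): math domain error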

Undefined terms

In ancient times, geometers attempted to define every term. For example, Euclid defined a point as "that which has no part". In modern times, mathematicians recognize that attempting to define every word inevitably leads to circular definitions, and therefore leave some terms (such as "point") undefined (see primitive notion for more).

This more abstract approach allows for fruitful generalizations. In topology, a topological space may be defined as a set of points endowed with certain properties, but in the general setting, the nature of these "points" is left entirely undefined. Likewise, in category theory, a category consists of "objects" and "arrows", which are again primitive, undefined terms. This allows such abstract mathematical theories to be applied to very diverse concrete situations.

In arithmetic

The expression 0/0 is undefined in arithmetic, as explained in division by zero (the same expression is used in calculus to represent an indeterminate form).

Mathematicians have different opinions as to whether 0⁰ should be defined to equal 1, or be left undefined.
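
Programming languages must commit to one convention. Python, for instance, follows the common choice 0⁰ = 1 while leaving 0/0 an error (a small illustration, not a mathematical verdict):

    # Python adopts the convention that zero to the zeroth power is 1:
    print(0 ** 0)        # -> 1

    # 0/0 remains undefined; evaluating it raises an exception:
    try:
        0 / 0
    except ZeroDivisionError as e:
        print(e)         # -> division by zero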

Values for which functions are undefined

The set of numbers for which a function is defined is called the domain of the function. If a number is not in the domain of a function, the function is said to be "undefined" for that number. Two common examples are f(x) = 1/x, which is undefined for x = 0, and f(x) = √x, which is undefined (in the real number system) for negative x.
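
In code, one way to make the domain explicit is to return a sentinel value outside it. A minimal Python sketch (the function names here are illustrative, not standard):

    import math
    from typing import Optional

    # 1/x, undefined at x = 0
    def reciprocal(x: float) -> Optional[float]:
        return None if x == 0 else 1 / x

    # sqrt(x), undefined over the reals for x < 0
    def real_sqrt(x: float) -> Optional[float]:
        return None if x < 0 else math.sqrt(x)

    print(reciprocal(0.0))   # -> None (0 is outside the domain)
    print(real_sqrt(-4.0))   # -> None (negative x is outside the domain)
    print(real_sqrt(9.0))    # -> 3.0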

In trigonometry

In trigonometry, for all n ∈ ℤ, the functions tan θ and sec θ are undefined for all θ = (n + 1/2)π, since sec θ = 1/cos θ and cos θ = 0 exactly at those points; likewise, the functions cot θ and csc θ are undefined for all θ = nπ, where sin θ = 0.
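
In floating-point arithmetic this undefinedness is easy to miss, because π/2 has no exact binary representation. A short Python illustration (the sec helper is ours, not a library function):

    import math

    # math.pi / 2 only approximates π/2, so tan() returns a huge
    # finite number instead of signaling an error:
    print(math.tan(math.pi / 2))   # -> about 1.633e16

    # A domain-aware secant must test cos(theta) itself:
    def sec(theta: float) -> float:
        c = math.cos(theta)
        if c == 0.0:
            raise ValueError("sec is undefined where cos(theta) = 0")
        return 1.0 / c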

In complex analysis

In complex analysis, a point z₀ where a holomorphic function is undefined is called a singularity. One distinguishes between removable singularities (i.e., the function can be extended holomorphically to z₀), poles (i.e., the function can be extended meromorphically to z₀), and essential singularities (i.e., no meromorphic extension to z₀ can exist).
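
The three cases are illustrated by three classical functions at z₀ = 0: sin(z)/z has a removable singularity there (setting the value at 0 to 1 extends it holomorphically), 1/z has a pole there, and e^(1/z) has an essential singularity there, admitting no meromorphic extension to 0.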

In computability theory

Notation using ↓ and ↑

In computability theory, if f is a partial function on S and a is an element of S, then this is written as f(a)↓, and is read as "f(a) is defined".[3]

If a is not in the domain of f, then this is written as f(a)↑, and is read as "f(a) is undefined".
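
A minimal Python sketch of the notation, modeling a partial function as an ordinary function that returns None where it is undefined (an illustrative convention, not a standard API):

    from typing import Callable, Optional, TypeVar

    S = TypeVar("S")
    T = TypeVar("T")

    # f(a)↓ : f is defined at a (returns an actual value)
    def is_defined(f: Callable[[S], Optional[T]], a: S) -> bool:
        return f(a) is not None

    # A partial function on the integers, defined only on even numbers:
    def half(n: int) -> Optional[int]:
        return n // 2 if n % 2 == 0 else None

    print(is_defined(half, 4))   # -> True   (half(4)↓)
    print(is_defined(half, 3))   # -> False  (half(3)↑)

Note that in computability theory proper, f(a)↑ can mean that the computation of f(a) never halts, so "undefined" is not decidable in general; the sketch only models the notation, not that phenomenon.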

It is important to distinguish the "logic of existence" (the standard one) from the "logic of definiteness". Neither arrow is well-defined as a predicate in the logic of existence, which normally uses the semantics of total functions: there, every term f(x) denotes some value, for example ∅, yet ∅ can at the same time be a legitimate value of the function. The predicate "defined" therefore does not respect equality, and so it is not well-defined.

The logic of definiteness has a different predicate calculus; for example, specializing a universally quantified formula requires the term being substituted to be well-defined. Moreover, it requires the introduction of a notion of quasi-equality, which makes a reformulation of the axioms necessary.[4]
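
Concretely, in one common formulation (stated here as an illustration, not the only possible one), the specialization rule becomes (∀x A(x)) ∧ t↓ → A(t), and the quasi-equality s ≃ t (Kleene equality) holds when s and t are either both undefined, or both defined and equal.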

The symbols of infinity

In analysis, measure theory and other mathematical disciplines, the symbol ∞ is frequently used to denote an infinite pseudo-number, along with its negative, −∞. The symbol has no well-defined meaning by itself, but an expression like {aₙ} → ∞ is shorthand for a divergent sequence, one that eventually becomes larger than any given real number.
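
Spelled out, {aₙ} → ∞ means: for every real number M there is an index N such that aₙ > M for all n ≥ N. For example, aₙ = n² diverges to ∞, whereas aₙ = (−1)ⁿ n does not, since it is unbounded but not eventually larger than every M.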

Performing standard arithmetic operations with the symbols ±∞ is undefined. Some extensions, though, define the following conventions of addition and multiplication:

  • x + ∞ = ∞ for all x ∈ ℝ ∪ {∞}.
  • x + (−∞) = −∞ for all x ∈ ℝ ∪ {−∞}.
  • x · ∞ = ∞ for all x ∈ ℝ⁺ ∪ {∞}.

No sensible extension of addition and multiplication with ∞ exists in the following cases:

  • ∞ − ∞
  • 0 · ∞ (although in measure theory, this is often defined as 0)
  • ∞/∞
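
IEEE 754 floating-point arithmetic, which Python's float follows, mirrors these conventions: the defined cases propagate infinity, while the cases with no sensible extension yield NaN ("not a number"). An illustrative snippet:

    import math

    inf = float("inf")

    print(inf + 1)      # -> inf  (x + ∞ = ∞)
    print(2 * inf)      # -> inf  (x · ∞ = ∞ for positive x)
    print(inf - inf)    # -> nan  (∞ − ∞ has no sensible value)
    print(0.0 * inf)    # -> nan  (0 · ∞ has no sensible value)
    print(inf / inf)    # -> nan  (∞/∞ has no sensible value)

    print(math.isnan(inf - inf))   # -> True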

For more detail, see extended real number line.

References

  1. ^ Weisstein, Eric W. "Undefined". mathworld.wolfram.com. Retrieved 2019-12-15.
  2. ^ Bogomolny, Alexander. "Undefined vs Indeterminate in Mathematics". Cut-the-Knot. Retrieved 2019-12-15.
  3. ^ Enderton, Herbert B. (2011). Computability: An Introduction to Recursion Theory. Elsevier. pp. 3–6. ISBN 978-0-12-384958-8.
  4. ^ Farmer, William M.; Guttman, Joshua D. (October 2000). "A Set Theory with Support for Partial Functions" (PDF). Studia Logica. 66 (1, Partiality and Modality): 59–78. doi:10.1023/A:1026744827863.

Further reading

  • Smart, James R. (1988). Modern Geometries (Third ed.). Brooks/Cole. ISBN 0-534-08310-2.