I feel that continuity is best understood when we consider convergence at different levels of abstraction. It’s fairly easy to understand the continuity of real-valued functions of a single real variable (whose graphs we draw in R2), using criteria like:
- The left-hand limit and the right-hand limit must agree at each point, and both must equal the function’s value there (see the sketch after this list).
- The function must take a finite value at every point of its domain.
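For instance, here is a minimal numerical sketch (Python; the step function and sample points are my own illustration, not from the original text) of how the first criterion fails:

```python
# The Heaviside step function fails the first criterion at x = 0:
# the left-hand limit (0) and the right-hand limit (1) disagree.
def step(x):
    return 0.0 if x < 0 else 1.0

h = 1e-8
print(step(0 - h), step(0 + h))  # 0.0 1.0 -> the one-sided limits differ,
                                 # so step is not continuous at 0
```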
Definitions like the ones I’ve mentioned above are also called definitions of continuity built on point-wise convergence. In general, point-wise convergence looks something like this:

Point-wise Convergence: A sequence {fn} of functions fn: A→R converges point-wise on A to f: A→R if for every x∈A and every ε>0 there exists N∈N (which may depend on both x and ε) such that n>N implies |fn(x)−f(x)|<ε.
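To see the x-dependence hidden in that N, here is a quick sketch (Python; the sequence fn(x) = x^n is my own illustrative choice):

```python
import math

# For f_n(x) = x^n on (0, 1), the point-wise limit is 0, but the N needed
# to force |f_n(x) - 0| < eps is N > log(eps) / log(x), which depends on x.
eps = 0.01
for x in [0.5, 0.9, 0.99, 0.999]:
    N = math.ceil(math.log(eps) / math.log(x))
    print(x, N)
# N grows without bound as x approaches 1: no single N works for every x
# at once, which is the central weakness of point-wise convergence.
```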
There are other, more rigorous definitions of continuity. One of the most common, taught in beginner-level calculus classes, is the ε–δ definition:

ε–δ Definition of Continuity: A function f: A→R is continuous at c∈A if for every ε>0 there exists δ>0 such that |x−c|<δ implies |f(x)−f(c)|<ε.
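As a worked instance of this definition (my own example, not from the original post): for f(x) = 2x at c = 3, choosing δ = ε/2 works, since |2x−6| = 2|x−3| < ε whenever |x−3| < δ. A quick numerical check:

```python
# epsilon-delta check for f(x) = 2x at c = 3 with delta = eps / 2:
# |f(x) - f(c)| = 2 * |x - c| < eps whenever |x - c| < delta.
def f(x):
    return 2 * x

c, eps = 3.0, 0.1
delta = eps / 2
for x in [c - 0.99 * delta, c - delta / 3, c + delta / 2, c + 0.99 * delta]:
    assert abs(f(x) - f(c)) < eps  # holds at every sampled x within delta
print("epsilon-delta condition holds at the sampled points")
```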
All the conceptual analysis set aside, there are problems with these point-wise definitions of continuity and convergence: point-wise convergence preserves neither boundedness nor continuity, so the point-wise limit of a sequence of continuous functions need not be continuous. Anyone who’s taken higher-level functional analysis knows that there are much more general definitions of convergence that don’t lead to such counter-intuitive, or downright false, conclusions.
These are called uniform-convergence definitions of continuity.
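To make the failure concrete, here is a minimal sketch (Python; my own illustration) of the classic counterexample fn(x) = x^n on [0, 1], where every fn is continuous but the point-wise limit is not:

```python
# f_n(x) = x^n on [0, 1]: each f_n is continuous, but the point-wise
# limit is 0 for x < 1 and 1 at x = 1, a discontinuous function.
def f_n(n, x):
    return x ** n

for x in [0.5, 0.9, 0.99, 1.0]:
    print(x, f_n(1000, x))
# Prints values near 0 for every x < 1 but exactly 1.0 at x = 1.0,
# so the limit function jumps at x = 1.
```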
Uniform Convergence and Continuity
To be precise, uniform convergence of a sequence of continuous functions implies continuity of the limit function on the whole domain. This notion doesn’t track convergence at each value of x separately; rather, it looks at the convergence of a sequence of functions as a whole. Put simply, if you can show that the function in question is the uniform limit of a sequence of continuous functions on a domain, then the function is continuous across that entire domain as well. The formal statement is as follows:
Definition A: If a sequence {fn} of continuous functions fn: A→R converges uniformly on A⊂R to f: A→R, then f is continuous on A.
As a method of proof, if you’re looking to prove that a certain function is continuous, all you need to show is that it is the uniform limit of a sequence of continuous functions on a domain A⊂R. Definition A, as I’ve written it above, preserves boundedness and continuity in the limit, which is not guaranteed under the point-wise definition of convergence.
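For a concrete positive example (my own, not from the original post): fn(x) = sin(x)/n converges uniformly to f = 0, since the worst-case error sup|fn(x)−f(x)| ≤ 1/n shrinks independently of x. A short sketch estimating that sup on a sample grid:

```python
import math

# f_n(x) = sin(x) / n converges uniformly to 0: the worst-case error
# sup over x of |f_n(x) - 0| is at most 1/n, independent of x.
grid = [i * 0.01 for i in range(-1000, 1001)]  # sample points in [-10, 10]
for n in [1, 10, 100, 1000]:
    sup = max(abs(math.sin(x) / n) for x in grid)
    print(n, sup)  # roughly 1/n: a single N serves every x at once
```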
Definition A is closely tied to Cauchy’s criterion for uniform convergence, which is as follows:
Cauchy’s Definition of Uniform Convergence: A sequence {fn} of functions fn: A→R is uniformly Cauchy on A if for every ε>0 there exists N∈N such that m, n>N implies |fm(x)−fn(x)|<ε for all x∈A.
Loosely speaking, if the absolute difference between any two functions far enough along in the sequence can be made arbitrarily small, simultaneously for every x in the domain, then the sequence converges uniformly to some limit function. And if every function in the sequence is continuous, Definition A guarantees that this uniform limit is continuous as well.
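To see the criterion fail (an illustration of my own), return to fn(x) = x^n on [0, 1]: writing t = x^n, |fn(x)−f2n(x)| = t−t², which peaks at 1/4 (at t = 1/2) no matter how large n is, so the sequence is not uniformly Cauchy; this matches its discontinuous point-wise limit.

```python
# f_n(x) = x^n on [0, 1] is NOT uniformly Cauchy: with t = x^n,
# |f_n(x) - f_{2n}(x)| = t - t^2 peaks at 1/4 (at t = 1/2) for every n.
grid = [i / 10000 for i in range(10001)]
for n in [5, 50, 500]:
    sup = max(x ** n - x ** (2 * n) for x in grid)
    print(n, sup)  # stays near 0.25, so no N can meet eps < 1/4
```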
If you’re interested in studying continuity or the limits of functions, you should read Algebraic General Topology Volume 1. This book introduces my mathematical theory, which generalizes limits to arbitrary discontinuous functions. As always, I’m open to new ideas and thoughts on my work and welcome open debate on any issue you find.
Learn more about my general topology book here.