### Infinitesimal Calculus in Reverse in My Book “Limit of a Discontinuous Function”

Traditional calculus, first developed in the 17th century by Isaac Newton and Leibniz (though some say Leibniz stole Newton’s idea) and formalized about 150 years later by Cauchy and Weierstrass, is based on limits.

Initially calculus was called “infinitesimal calculus”, but nowadays the term “infinitesimal calculus” usually refers to a more modern formulation in which infinitesimals (infinitely small quantities) are used instead of traditional limits. A letter such as δ denotes an infinitely small quantity in an axiomatic system extending the traditional real numbers. The word “infinitesimal” goes back to the ancient Greek mathematicians, though they did not fully understand what they were talking about.
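As a small illustration (not taken from the book), one can compute with a formal infinitesimal δ satisfying δ² = 0, the so-called dual numbers. This is only a sketch of the “infinitesimals instead of limits” idea; the class and function names here are mine, chosen for the example:

```python
# Dual numbers a + b*delta with delta**2 == 0: a minimal sketch of
# computing with a formal infinitesimal (names are illustrative).

class Dual:
    def __init__(self, real, infinitesimal=0.0):
        self.real = real              # standard part
        self.inf = infinitesimal      # coefficient of the infinitesimal delta

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.inf + other.inf)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*delta)(c + d*delta) = ac + (ad + bc)*delta, since delta**2 = 0
        return Dual(self.real * other.real,
                    self.real * other.inf + self.inf * other.real)

    __rmul__ = __mul__

def derivative(f, x):
    """Read the derivative off the delta coefficient of f(x + delta)."""
    return f(Dual(x, 1.0)).inf

# f(x) = x**2 + 3x  =>  f'(2) = 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Here differentiation needs no limit at all: the derivative appears as the coefficient of δ, exactly the way infinitesimal calculus treats infinitely small quantities as first-class objects.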

In the 21st century, Victor Porton discovered something similar to, yet different from, infinitesimal calculus: the generalized limit of an arbitrary function.

For continuous functions this generalized limit behaves just like an ordinary number. But if the function goes to infinity, the generalized limit takes infinite values.

So infinitesimal calculus considers infinitely small quantities, while Victor Porton’s book “Limits of a Discontinuous Function” considers such things as infinitely big quantities and arbitrary discontinuities, including a function tending to more than one value. In a sense, the limit of a discontinuous function is infinitesimal calculus “in reverse”: the science of the infinitely big rather than the infinitely small.
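To see what “tending to more than one value” means, here is a purely numerical sketch (my own illustration, not the book’s construction): approximate the set of values a function approaches at a point by sampling on both sides of it. A jump discontinuity yields two values, while a continuous function yields one:

```python
# Numerical sketch: approximate the set of values f "tends to" at a point
# by sampling on both sides of it. This is an illustration only, not the
# book's definition of the generalized limit.

import math

def limit_values(f, a, eps=1e-6, samples=1000):
    """Collect rounded values of f near a (excluding a itself)."""
    values = set()
    for k in range(1, samples + 1):
        h = eps * k / samples
        for x in (a - h, a + h):
            values.add(round(f(x), 6))
    return sorted(values)

sign = lambda x: math.copysign(1.0, x)
print(limit_values(sign, 0.0))             # [-1.0, 1.0]  -- two limit values
print(limit_values(lambda x: x * x, 0.0))  # [0.0]        -- one limit value
```

The sign function has no ordinary limit at 0, but it clearly “tends to” the pair {−1, 1}; a generalized limit is designed to make such multivalued behavior a legitimate mathematical object.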

Although these limits have multiple values, they behave algebraically just like ordinary numbers or vectors; for example, y − y = 0 if y is such a generalized limit.

This makes it easy to define such things as the derivative of an arbitrary function, the integral of an arbitrary function, and so on. Arbitrary!

The book further considers an infinite hierarchy of more and more infinite infinities and defines non-differentiable solutions of differential equations.

This is going to revolutionize all quantitative sciences and mathematics itself. Read the book.