In computer science, the iterated logarithm of $n$, written $\log^* n$ (usually read "log star"), is the number of times the logarithm function must be iteratively applied before the result is less than or equal to $1$.[1] The simplest formal definition is the result of this recurrence relation:
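The standard recurrence reads:

$$\log^* n := {\begin{cases} 0 & {\text{if }} n \le 1 \\ 1 + \log^{*}(\log n) & {\text{if }} n > 1 \end{cases}}$$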
In computer science, lg* is often used to indicate the binary iterated logarithm, which iterates the binary logarithm (with base $2$) instead of the natural logarithm (with base $e$). Mathematically, the iterated logarithm is well defined for any base greater than $e^{1/e} \approx 1.444667$, not only for base $2$ and base $e$. The "super-logarithm" function $\mathrm{slog}_b(n)$ is "essentially equivalent" to the base-$b$ iterated logarithm (although differing in minor details of rounding) and forms an inverse to the operation of tetration.[2]
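The binary iterated logarithm follows directly from the recurrence; a minimal sketch in Python (the function name `lg_star` is illustrative):

```python
import math

def lg_star(n):
    """Binary iterated logarithm: the number of times log2 must be
    applied before the result is less than or equal to 1."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count
```

For example, `lg_star(65536)` returns 4, since 65536 → 16 → 4 → 2 → 1 takes four applications of log2.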
The iterated logarithm is useful in the analysis of algorithms and computational complexity, appearing in the time and space complexity bounds of some algorithms, such as Fürer's algorithm for integer multiplication and the Cole–Vishkin distributed algorithm for 3-coloring a cycle.
The iterated logarithm grows at an extremely slow rate, much slower than the logarithm itself or than repeated applications of it. Because tetration grows much faster than the iterated exponential,

$${^{y}b}=\underbrace {b^{b^{\cdot ^{\cdot ^{b}}}}} _{y}\gg \underbrace {b^{b^{\cdot ^{\cdot ^{b^{y}}}}}} _{n},$$

the inverse grows much slower: $\log _{b}^{*}x\ll \log _{b}^{n}x$.
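To make the growth of tetration concrete, a short sketch (the function name `tetration` is illustrative) that computes $^{h}b$ by iterating exponentiation:

```python
def tetration(b, height):
    """Compute the power tower b^(b^(...^b)) with `height` copies of b."""
    result = 1
    for _ in range(height):
        result = b ** result
    return result
```

Already at small heights the values explode: `tetration(2, 4)` is 65536, and `tetration(2, 5)` is $2^{65536}$, a number with nearly 20,000 decimal digits.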
For all values of n relevant to counting the running times of algorithms implemented in practice (i.e., $n \le 2^{65536}$, which is far more than the estimated number of atoms in the known universe), the iterated logarithm with base 2 has a value of no more than 5.
Higher bases give smaller iterated logarithms.
The iterated logarithm is closely related to the generalized logarithm function used in symmetric level-index arithmetic. The additive persistence of a number, the number of times one must replace the number by the sum of its digits before reaching its digital root, is $O(\log^* n)$.
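Additive persistence is easy to compute directly; a minimal sketch in Python (the function name `additive_persistence` is illustrative):

```python
def additive_persistence(n):
    """Number of times n must be replaced by its digit sum
    before reaching a single-digit number (its digital root)."""
    count = 0
    while n >= 10:
        n = sum(int(d) for d in str(n))
        count += 1
    return count
```

For example, `additive_persistence(199)` is 3, via the chain 199 → 19 → 10 → 1.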
In computational complexity theory, Santhanam[6] shows that the computational resources DTIME (computation time for a deterministic Turing machine) and NTIME (computation time for a non-deterministic Turing machine) are distinct up to $n{\sqrt {\log ^{*}n}}$.