In ZFC, we usually define the natural numbers as finite ordinals formed by inductive nesting from the empty set (each $n$ is the set of all smaller naturals); from them one models the integers as equivalence classes of pairs of naturals (interpreting $[a,b]$ as $a-b$), and the rationals as equivalence classes of pairs of integers $(m,n)$ with $n\ne 0$. The reals can be identified abstractly with $\mathcal{P}(\mathbb{N})$ by coding each real as an infinite binary sequence, i.e., the characteristic function of a subset of $\mathbb{N}$ (a bijection only up to the countably many dyadic rationals, which have two expansions); one then “dresses” this set with the usual analytic structure by interpreting sequences as binary expansions, or as Cauchy sequences or Dedekind cuts of $\mathbb{Q}$, thereby obtaining the standard order and complete ordered-field operations.
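As a concrete illustration of these codings, here is a small Python sketch (a toy, not a formalization; the names `von_neumann` and `int_eq` are mine) of the von Neumann naturals and the pair coding of integers:

```python
def von_neumann(n):
    """The n-th von Neumann ordinal: 0 is {} and n+1 is n ∪ {n},
    so each n is literally the set of all smaller naturals."""
    s = frozenset()
    for _ in range(n):
        s = s | frozenset([s])
    return s

def int_eq(p, q):
    """Integers as pairs (a, b) read as a - b; two pairs code the
    same integer iff a + d == b + c (no subtraction of naturals needed)."""
    (a, b), (c, d) = p, q
    return a + d == b + c

print(len(von_neumann(3)))     # → 3: the ordinal 3 has exactly 3 elements
print(int_eq((2, 5), (0, 3)))  # → True: both pairs code -3
```

The equivalence test rewrites $a-b=c-d$ as $a+d=b+c$, which is exactly how the construction avoids defining subtraction on the naturals.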
Algebraic numbers are the reals that are roots of nonzero integer-coefficient polynomials; this set is countable and dense in $\mathbb{R}$, and every algebraic real is computable via effective root approximation, uniformly from a polynomial together with isolating data (e.g., a rational interval containing exactly the chosen root).
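One effective root-approximation method is exact rational bisection on an isolating interval; a minimal sketch (the name `approx_root` and the sign-change assumption are mine, and bisection is only one of several effective procedures):

```python
from fractions import Fraction

def approx_root(poly, lo, hi, n):
    """Bisect a rational isolating interval [lo, hi], containing exactly
    one root of poly (coefficients listed low-to-high degree, with a sign
    change on the interval), until its width is at most 2**-n."""
    def p(x):
        return sum(c * x**i for i, c in enumerate(poly))
    lo, hi = Fraction(lo), Fraction(hi)
    while hi - lo > Fraction(1, 2**n):
        mid = (lo + hi) / 2
        if p(mid) == 0:
            return mid
        if (p(lo) < 0) == (p(mid) < 0):
            lo = mid          # root lies in the upper half
        else:
            hi = mid          # root lies in the lower half
    return (lo + hi) / 2

# sqrt(2) is the unique root of x^2 - 2 isolated by [1, 2]
q = approx_root([-2, 0, 1], 1, 2, 20)
print(float(q))
```

The procedure is uniform in exactly the data mentioned above: the polynomial and a rational isolating interval.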
A real $x$ is computable if there is a Turing machine which, given $n$, outputs a rational $q_n$ with $|x-q_n|\le 2^{-n}$. This class is countable, contains all algebraics and many transcendentals such as $e$, $\pi$, and $\ln 2$, and every computable real is definable and algorithmically compressible, since a finite program generates its digits.
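For instance, $e=\sum_{k\ge 0} 1/k!$ admits such a machine, since the tail $\sum_{k\ge m}1/k!$ is bounded by $2/m!$; a minimal sketch (the name `approx_e` is mine):

```python
from fractions import Fraction

def approx_e(n):
    """Rational q with |e - q| <= 2**-n: sum 1/k! until the tail
    bound 2/m! (twice the next term) drops to 2**-n or below."""
    q, term, m = Fraction(0), Fraction(1), 0   # term == 1/m!
    while 2 * term > Fraction(1, 2**n):
        q += term
        m += 1
        term /= m
    return q

print(float(approx_e(30)))
```

The same pattern — a computable term sequence plus a computable tail bound — witnesses the computability of $\pi$, $\ln 2$, and the other standard constants.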
The prefix-free Kolmogorov complexity $K(\sigma)$ of a finite binary string $\sigma$ is the length of the shortest program on a fixed universal prefix-free Turing machine that outputs $\sigma$. A real $x\in[0,1]$ (identified with its binary sequence) is Martin–Löf random iff there exists a constant $c$ such that for all $n$, $K(x\upharpoonright n)\ge n-c$ (equivalently: $x$ avoids every effectively presented null $G_\delta$ set). Every computable real is non-random; in fact $K(x\upharpoonright n)\le K(n)+O(1)$ for all $n$, since given $n$ a fixed program can print the first $n$ bits.
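$K$ itself is uncomputable, but a general-purpose compressor gives a crude empirical upper-bound proxy for it; a sketch contrasting a prefix of a computable (here periodic) expansion with random bits:

```python
import os
import zlib

# 4096-bit prefix of the computable binary expansion of 1/3 = 0.010101...,
# packed into 512 bytes, versus 512 bytes of OS-supplied random bits.
periodic = bytes([0b01010101]) * 512
random_bytes = os.urandom(512)

# zlib output length is only an upper-bound proxy for K, but the gap is stark:
print(len(zlib.compress(periodic)))      # tiny: the prefix is highly compressible
print(len(zlib.compress(random_bytes)))  # near 512: random data resists compression
```

This is only an illustration of the compressibility gap, not a randomness test: no algorithm can certify that a given string has high $K$.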
Definable but non-computable reals are those uniquely specified by some formula over a fixed second-order structure under full semantics, yet not generated by any Turing machine. There are only countably many such reals, since there are only countably many formulas; they remain descriptively compressible by virtue of their finite definitions, though some, such as Chaitin’s halting probability $\Omega_U$ for a fixed universal prefix-free machine $U$, are Martin–Löf random (their prefixes are incompressible in the algorithmic sense), with no algorithm that computes their digits — $\Omega_U$ is merely left-c.e., approximable from below but with no computable rate of convergence. Note that full-semantics (“identified”) definability is an external, meta-theoretic second-order notion relative to the ambient set-theoretic universe $V$; while ZFC can formalize definability relative to a structure by coding formulas, and there are pointwise definable models of ZFC (every element definable without parameters), there is nevertheless no single first-order ZFC formula that, inside $V$, picks out exactly the parameter-free definable reals of $V$. For further explanation of these topics, see: Is the analysis as taught in universities in fact the analysis of definable numbers?
The uncountable remainder consists of the non-definable transcendentals. It may at first seem that these must all be incompressible; however, one can construct compressible non-definable reals by modifying the digits of an arbitrary real on a fixed computable sparse set of positions so as to impose simple structure there. There are continuum-many such compressible non-definable reals, but they form a Lebesgue measure-zero set; almost all non-definable transcendentals are Martin–Löf random.
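A sketch of the digit-modification step, taking the sparse set to be the squares (one convenient choice; the name `impose_structure` is mine):

```python
def impose_structure(bits):
    """Force bit 0 at every square index i*i, a computable sparse set,
    leaving all other bits unchanged.  Applied to an arbitrary (possibly
    non-definable) bit sequence, the result keeps full information off the
    squares, yet an n-bit prefix is compressible by roughly sqrt(n) bits,
    since the values at the ~sqrt(n) square positions are known in advance."""
    out = list(bits)
    i = 0
    while i * i < len(out):
        out[i * i] = 0
        i += 1
    return out

print(impose_structure([1] * 10))  # → [0, 0, 1, 1, 0, 1, 1, 1, 1, 0]
```

Since continuum-many distinct inputs still yield distinct outputs, this produces continuum-many compressible reals, almost all of them non-definable.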
From this identified second-order point of view, we may call the parameter-free definables in the fixed structure $(\mathbb{R},+,\times,<,0,1,\ldots)$ the bright reals, and their complement the dark reals. Bright reals are closed under any operation that is parameter-free definable in the language; in particular they form a subfield of $\mathbb{R}$ (closed under $+$, $\times$, additive inverse, and reciprocal on nonzero elements). In the pure ordered-field language $(+,\times,<,0,1)$ the parameter-free definables are exactly the real algebraic numbers, which form a real-closed (not algebraically closed) field. Enriching the language by adding standard function symbols (e.g. $\exp$, $\sin$) enlarges the bright set and preserves closure under those definable maps, but still does not yield algebraic closure. The bright/dark divide is external and $V$-relative. The dark reals (the non-definables) form a comeager, Lebesgue-measure-one class containing almost all Martin–Löf random reals.
For specific examples of known definable but non-computable transcendentals, refer to the detailed list provided in Hamkins' answer, which includes recursion-theoretic constructs such as the halting problem encoded as $0'$, Kleene's $O$ for computable well-orderings, higher Turing jumps like $0''$ and $0'''$, the set of total computable functions (Tot), true arithmetic (TA), and more exotic ones like $0^\sharp$ assuming large cardinals.
Regarding the Hartmanis–Stearns conjecture: it asserts that if a real’s base-$b$ expansion is real-time computable by a multitape Turing machine, then the number is rational or transcendental (so no irrational algebraic has a real-time expansion); this remains open as of September 2025. There are many real-time computable transcendentals via automatic base-$b$ expansions (finite automata), and more generally via certain low-complexity morphic/pushdown-generated expansions. Notably, if the conjecture holds, then there is no $O(n)$ algorithm for $n$-bit integer multiplication; equivalently, the existence of an $O(n)$ multiplication algorithm would refute the conjecture. For classical transcendental constants such as $e$ (and similarly for $\pi$ in most bases), whether there exist real-time digit algorithms remains open.
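A standard example of an automatic expansion is the Thue–Morse sequence, whose $n$-th bit is the parity of the 1-bits in the binary representation of $n$; the associated Thue–Morse constant $0.0110100110010110\ldots$ is known to be transcendental. A minimal sketch:

```python
def thue_morse_bit(n):
    """n-th bit of the Thue-Morse sequence: the parity of 1-bits in n.
    The sequence is 2-automatic — a finite automaton reading the base-2
    digits of n emits the bit — so the Thue-Morse constant is a
    transcendental with a very low-complexity binary expansion."""
    return bin(n).count("1") % 2

prefix = [thue_morse_bit(n) for n in range(16)]
print(prefix)  # → [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```

Computing bit $n$ here takes $O(\log n)$ work from $n$ alone; emitting the bits in order, an automaton-driven generator runs in real time, which is the sense in which automatic expansions witness real-time computable transcendentals.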