This is a soft paradox, in the vein of Simpson's paradox, the boy-or-girl paradox, and the Monty Hall problem. I will first present the problem, let the reader think about it for a bit, and then present the solution and show why it's counterintuitive. Let $\{C_n\}_{n \in \mathbb N} \overset{\text{i.i.d.}}{\sim} U(0,1]$ be a collection of independent and identically distributed random variables, uniformly distributed on $(0,1]$. Let $$f(x) = \sum_{n=1}^\infty (C_n)^n x^n.$$ What is the expected radius of convergence, $\mathbb E[R_c]$, of $f(x)$?
This problem is actually underspecified, and which answer you arrive at depends on exactly how you make the question precise. If you want to, try thinking about the problem and coming up with a solution before looking at the hints.
For this problem you need to understand the uniform distribution and the radius of convergence of a power series, in particular the root test. The most advanced piece of mathematics needed here is the Kolmogorov zero-one law. The intuitive content of the zero-one law is not too complex: for an infinite sequence of independent random variables $X_n$, any event that is independent of every finite subcollection of the $X_n$ (usually called a tail event) either almost surely happens or almost never happens.
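To make the "infinitely often" intuition behind the zero-one law concrete, here is a minimal simulation sketch (numpy assumed; the choice of $\varepsilon$ and the sample sizes are arbitrary). Each draw exceeds $1-\varepsilon$ with probability $\varepsilon$, so the number of hits among the first $N$ draws grows like $\varepsilon N$, without bound:

```python
import numpy as np

# Count how often C_n > 1 - eps among the first N draws of a uniform variable.
# Hits accumulate at rate ~eps*N, so the event "C_n > 1 - eps" keeps recurring,
# matching the zero-one law's verdict that it happens infinitely often a.s.
rng = np.random.default_rng(0)
eps = 0.01
for N in (1_000, 10_000, 100_000):
    hits = int((rng.uniform(size=N) > 1 - eps).sum())
    print(f"N={N:>6}: {hits} hits (expected ~{eps * N:.0f})")
```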
Question 1: Let $\{C_n\}_{n \in \mathbb N} \overset{\text{i.i.d.}}{\sim} U(0,1]$ be a collection of i.i.d. random variables. Following the root test, we define the radius of convergence, $R_c$, as the random variable $$\begin{aligned} R_c = \frac{1}{\limsup_n \left((C_n)^n\right)^{1/n}} = \frac{1}{\limsup_n C_n} \end{aligned}$$ What is $\mathbb E[R_c]$?
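To see this definition in action on a non-random series, take $C_n = 1/2$ for every $n$, so that the series is the geometric series $\sum (x/2)^n$: $$R_c = \frac{1}{\limsup_n \left((1/2)^n\right)^{1/n}} = \frac{1}{1/2} = 2,$$ and the geometric series indeed converges exactly when $|x| < 2$.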
Question 2: Let $\{C_n\}_{n \in \mathbb N} \overset{\text{i.i.d.}}{\sim} U(0,1]$ be a collection of i.i.d. random variables, and let $C:\prod_{n \in \mathbb N} \Omega \to \mathbb C[[x]]$ be a random variable mapping an infinite product measure space to the space of formal power series, defined by $$\begin{aligned}&P(C = g(x)) :=\\ &P\left(\sum_{n=1}^\infty (C_n)^n x^n = g(x)\right)\end{aligned}$$ What is the radius of convergence, $R_c$, of $f(x) = \mathbb E[C]$?
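Unpacking this: the expectation of a power-series-valued random variable is taken coefficient by coefficient, so the object whose radius of convergence we are asked for is the deterministic series $$f(x) = \mathbb E[C] = \sum_{n=1}^\infty \mathbb E\left[(C_n)^n\right] x^n.$$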
A very natural guess here is that on average $C_n$ will be $1/2$, so that this series should have the same radius of convergence as $\sum (1/2)^n x^n$, which from the root test (as in the example above) is $2$. If we use the phrasing of Question 1, it is tempting to treat the $\limsup$ as an ordinary limit and push the expectation inside it, writing $\mathbb E[R_c] = \lim\limits_{n \to \infty} 1/{\mathbb E}[C_n] = 2$ as though this were an application of the dominated convergence theorem.
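To see why no convergence theorem can rescue this interchange, note that expectation and reciprocal do not even commute for a single term: $$\mathbb E\left[\frac{1}{C_n}\right] = \int_0^1 \frac{1}{c}\,dc = \infty \neq 2 = \frac{1}{\mathbb E[C_n]}.$$ In particular, $1/C_n$ is not dominated by any integrable function, so the dominated convergence theorem does not apply.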
In fact, the value of $C_n$ is going to land in the range $(1-\varepsilon, 1]$ infinitely often, so we should expect the radius of convergence to be $1$. The event $$\begin{aligned} \{&\text{there exist infinitely many } C_n \\ &\text{ with } 1-\varepsilon < C_n \le 1\} \end{aligned}$$ is a tail event, so by the Kolmogorov zero-one law it has probability $0$ or $1$; since the $C_n$ are independent and $\sum_n P(1-\varepsilon < C_n \le 1) = \sum_n \varepsilon = \infty$, the second Borel–Cantelli lemma pins the probability at $1$. Hence $\limsup_n C_n = 1$ almost surely, so under the phrasing of Question 1 we get $R_c = 1$ almost surely and $\mathbb E[R_c] = 1$. (Taking a $\limsup$ rather than a $\lim$ is vital here: $\lim_n C_n$ does not exist.) The phrasing of Question 2 arrives at the same answer by a different route: we cannot take $1/\mathbb E[C_n]$ for the radius of convergence, because $\mathbb E[(C_n)^n]^{1/n} \neq \mathbb E[C_n]$. Instead, $\mathbb E[(C_n)^n] = \frac{1}{n+1}$, so $$\limsup_n \mathbb E[(C_n)^n]^{1/n} = 1$$ and the radius of convergence of $\mathbb E[C]$ is also $1$.
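As a numerical sanity check, here is a small Monte Carlo sketch (numpy assumed; the truncation level $N$ and the number of trials are arbitrary choices). It approximates the Question 1 radius $1/\limsup_n C_n$ by $1/\max_{n \le N} C_n$ on each sample path, and evaluates the Question 2 coefficient root $\mathbb E[(C_n)^n]^{1/n} = (1/(n+1))^{1/n}$ at $n = N$; both land near $1$, not $2$:

```python
import numpy as np

rng = np.random.default_rng(1)
N, trials = 100_000, 100

# Question 1: approximate the a.s. radius 1/limsup C_n by 1/max_{n<=N} C_n per path.
C = rng.uniform(size=(trials, N))  # U(0,1); drawing exactly 0 has probability 0
radii = 1.0 / C.max(axis=1)
print("mean truncated radius:", radii.mean())  # ~1.0, not 2.0

# Question 2: the n-th coefficient of E[C] is E[C_n^n] = 1/(n+1).
print("coefficient root at n=N:", (1.0 / (N + 1)) ** (1.0 / N))  # ~1.0
```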
We see here that a phrasing of a question that's entirely well-defined in the finite setting can be very ill-behaved in the infinite setting. One definition may split into multiple definitions, and which one you end up with can depend entirely on how you phrase the question and interrogate the nature of your words. Mathematics is full of such examples, the most notable being the Tychonoff theorem and the distinction between infinite ordinals and cardinals. While this is more abstract than most probability paradoxes, I still think it's quite counterintuitive and deserving of the name. Let me know what you think!