**Theorem** [Erdős and Rényi 1959] A threshold function for the connectedness of a Poisson random network is $t(n) = \log(n)/n$; i.e., the probability of forming a link is proportional to $\log(n)/n$.

Show that if $p(n)/t(n) \rightarrow 0$, then there will be isolated nodes with probability approaching 1.

Show that if $p(n)/t(n) \rightarrow \infty$, then there will not be any components of size less than $n/2$ with probability approaching 1.

We show (1); the intuition for the rest is that the threshold for having an isolated node is the same as the threshold for having a small component.
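As a sanity check on the theorem (not part of the original notes), here is a minimal pure-Python simulation sketch: it samples $G(n,p)$ graphs with $p = c \cdot \log(n)/n$ and estimates the fraction that are connected for $c$ below and above 1. The helper names (`gnp_connected`, `connected_fraction`) are our own.

```python
import math
import random

def gnp_connected(n, p, rng):
    """Sample one G(n, p) graph and test connectedness via union-find."""
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:  # each link forms independently with prob p
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
    # connected iff all nodes share one root
    return len({find(i) for i in range(n)}) == 1

def connected_fraction(n, c, trials=50, seed=0):
    """Fraction of sampled graphs that are connected when p = c*log(n)/n."""
    rng = random.Random(seed)
    p = c * math.log(n) / n
    return sum(gnp_connected(n, p, rng) for _ in range(trials)) / trials

n = 200
below = connected_fraction(n, 0.5)  # p well below the threshold log(n)/n
above = connected_fraction(n, 2.0)  # p well above the threshold
print(below, above)
```

Even at a modest $n = 200$, the two fractions should sit near 0 and near 1 respectively, illustrating the sharpness of the threshold.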

Definition of exponential function:

$$ e^x = \lim_{n \to \infty} (1 + x/n)^n $$
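This limit can be checked numerically (a quick illustration of our own, not from the notes): the error of $(1+x/n)^n$ shrinks as $n$ grows.

```python
import math

x = 1.5
for n in (10, 1_000, 100_000):
    approx = (1 + x / n) ** n
    # error shrinks roughly like 1/n
    print(n, approx, abs(approx - math.exp(x)))
```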

Taylor series for the exponential function:

$$e^x = 1 + x + x^2/2! + x^3/3! + \cdots = \sum_{n=0}^{\infty} x^n/n!$$

General Taylor series expansion around $a$: $$f(x) = f(a) + f'(a)(x-a)/1! + f''(a)(x-a)^2/2! + \cdots$$
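The partial sums of the exponential series converge quickly, which a few lines of Python make visible (an illustration of ours; `exp_taylor` is a hypothetical helper name):

```python
import math

def exp_taylor(x, terms):
    """Partial sum of the Taylor series for e^x around a = 0."""
    total, term = 0.0, 1.0  # term starts at x^0/0! = 1
    for k in range(terms):
        total += term
        term *= x / (k + 1)  # x^(k+1)/(k+1)! from x^k/k!
    return total

for terms in (2, 5, 10, 20):
    print(terms, exp_taylor(2.0, terms), math.exp(2.0))
```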

Let us show that $E[d] = \log(n)$ is the threshold above which we expect each node to have some links.

In fact, above this threshold we expect each node to have many links

Once every node has many links, the chance of disconnected components vanishes

Rewrite $E[d] = p(n-1) = r + \log(n)$ for some $r$

The probability that a given node is isolated is the probability that it has no links

The probability that a given link is not present is $(1-p)$

Links are independent, so the probability that a node is isolated is $(1-p)^{n-1}$

The probability that a given node is isolated is $$(1-p)^{n-1} = (1 - (r + \log(n))/(n-1))^{n-1}$$

Recall that $(1-x/n)^n$ approaches $e^{-x}$ as $n$ grows

- This approximation requires that $x/n$ vanishes, so let us consider that case; the other cases are more extreme, and it is easy to fill in the missing steps.

The probability that a given node is isolated is $$(1 - (r + \log(n))/(n-1))^{n-1} \approx e^{-r-\log(n)} = e^{-r}/n$$
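The quality of this approximation improves with $n$, as a small numeric check (our own illustration; `isolation_prob` is a hypothetical helper name) shows by comparing the exact probability with $e^{-r}/n$:

```python
import math

def isolation_prob(n, r):
    """Exact probability that a given node is isolated when E[d] = r + log(n)."""
    p = (r + math.log(n)) / (n - 1)
    return (1 - p) ** (n - 1)

for n in (100, 10_000, 1_000_000):
    exact = isolation_prob(n, r=1.0)
    approx = math.exp(-1.0) / n  # the e^{-r}/n approximation
    print(n, exact, approx)
```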

The expected number of isolated nodes is then $n \cdot e^{-r}/n = e^{-r}$

$E(d) - \log(n) = r \rightarrow \infty$ implies that the expected number of isolated nodes goes to 0

$E(d) - \log(n) = r \rightarrow -\infty$ implies that the expected number of isolated nodes becomes infinite.

- E.g., $E(d)$ bounded by $M$ implies $r \le -\log(n) + M$, so the expected number of isolated nodes is at least $e^{\log(n) - M} = n\,e^{-M}$, which goes to infinity

So, the expected number of isolated nodes, $e^{-r(n)}$, goes to 0 if $r(n)$ tends to infinity and to infinity if $r(n)$ tends to minus infinity.
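A Monte Carlo check (our own sketch; `count_isolated` and `mean_isolated` are hypothetical helper names) confirms that the simulated mean number of isolated nodes tracks $e^{-r}$ already at moderate $n$:

```python
import math
import random

def count_isolated(n, p, rng):
    """Count nodes with no links in one G(n, p) sample."""
    degree = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                degree[i] += 1
                degree[j] += 1
    return sum(d == 0 for d in degree)

def mean_isolated(n, r, trials=100, seed=1):
    """Average isolated-node count when E[d] = p(n-1) = r + log(n)."""
    rng = random.Random(seed)
    p = (r + math.log(n)) / (n - 1)
    return sum(count_isolated(n, p, rng) for _ in range(trials)) / trials

n = 250
means = {r: mean_isolated(n, r) for r in (-1.0, 0.0, 1.0)}
for r, m in means.items():
    print(f"r={r:+.1f}: simulated mean {m:.3f} vs e^(-r) = {math.exp(-r):.3f}")
```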

If the expected number tends to 0, then the probability of having at least one isolated node tends to 0 (by Markov's inequality)

If the expected number tends to infinity, then an extra step, using Chebyshev's inequality and showing that the variance is no more than twice the mean, shows that the probability of having at least one isolated node goes to 1
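Spelling out that last step (a sketch of the standard second-moment argument, where $X$ denotes the number of isolated nodes and we use the claimed bound $\mathrm{Var}(X) \le 2E[X]$):

$$P(X = 0) \le P\big(|X - E[X]| \ge E[X]\big) \le \frac{\mathrm{Var}(X)}{E[X]^2} \le \frac{2E[X]}{E[X]^2} = \frac{2}{E[X]} \rightarrow 0,$$

so the probability of having at least one isolated node tends to 1.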