
A Galton-Watson branching process

sabri

New member
May 7, 2019
9
A Galton-Watson branching process $B$ is defined by $P_0 = 1/3$, $P_1 = 2(1-c)/3$ and $P_2 = 2c/3$, where $P_i$ is the probability of having $i$ offspring ($c \in [0,1]$).

a) Compute the critical value $c^*$ such that for $c \le c^*$ the process $B$ will die out with probability one, and for $c > c^*$ the process will survive with positive probability.

b) Compute the survival probability $p_1$ that the process $B$, starting with one individual, will not die out, as a function of $c$.

c) Compute the survival probability $p_{10}$ that the process $B$, starting with 10 individuals, will not die out.

I need help (Smile)
 

steep

Member
Dec 17, 2018
51
What sort of tools do you have at your disposal? Branching processes can be taught at many different stages in learning stochastics...

Personally I have a very strong preference for using martingale (plus some Markov chain) methods here, though I don't know if you've covered them yet. Branching processes are often also covered using ordinary generating functions, among other tools. If you look at your setup and consider the spawning process one organism at a time, you should also see that this is a simple random walk (though not necessarily symmetric unless $c = \frac{1}{2}$), so if you have covered random walks and/or gambler's ruin, you can use results from that too -- and though this walk is in 'lazy' form with self-loops at each stage, that doesn't change the end results of 'bankruptcy' or 'escape'.

So there are many different ways to proceed here depending on what you know.
= = = = =
To get this started, supposing you start with exactly 1 organism:
what is the expected population size after $n$ generations / epochs? Note: if your expected population size is contracting, and the underlying random variable is non-negative (it is) you could apply markov inequality to show with probability 1 it goes extinct.
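Not part of the thread, but a quick Monte Carlo sketch can make this hint concrete: simulate the process under the offspring law from the problem statement and estimate how often it dies out. The function names, population cap, and trial counts here are my own choices.

```python
import random

def simulate_extinct(c, start=1, max_gens=200, cap=1_000):
    """One run of the branching process with offspring law
    p0 = 1/3, p1 = 2(1-c)/3, p2 = 2c/3; returns True if the line dies out."""
    pop = start
    for _ in range(max_gens):
        if pop == 0:
            return True
        if pop > cap:            # treat a very large population as survival
            return False
        nxt = 0
        for _ in range(pop):
            u = random.random()
            if u < 1/3:          # 0 offspring
                pass
            elif u < 1/3 + 2 * (1 - c) / 3:   # 1 offspring
                nxt += 1
            else:                # 2 offspring
                nxt += 2
        pop = nxt
    return pop == 0

def extinction_estimate(c, trials=500, seed=0):
    """Fraction of runs that die out; a crude estimate of P(extinction)."""
    random.seed(seed)
    return sum(simulate_extinct(c) for _ in range(trials)) / trials
```

Sweeping `c` over a grid should show the estimate sitting near 1 up to some critical value and dropping below 1 past it, which is exactly the phase transition part (a) asks about.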
 
Last edited:

sabri

New member
May 7, 2019
9
I'm taking a course called "Math for Electronics", where we've covered:

  1. probability space
  2. calculation of probabilities of events
  3. independence
  4. random variables
  5. law of large numbers
  6. cumulative distribution function
  7. probability density function
  8. binomial distribution
  9. expectation
  10. variance
  11. central limit theorem
  12. Poisson distribution
  13. Markov inequality
  14. Chebyshev's theorem

    Still, I have no idea how to solve this task (Worried)
 

steep

Member
Dec 17, 2018
51

  First question -- what does "law of large numbers" include? Does it include the Strong Law of Large Numbers? You may be able to sneakily get an answer using the SLLN, though not with the WLLN.

    What has been covered in the week that this problem was assigned? There must be more relevant information here. E.g. I just looked through Karlin and Taylor's A First Course in Stochastic Processes. They have an entire chapter on branching processes, fairly late in the book, and attack branching processes from many different angles -- but you didn't mention a single topic/technique used in that chapter...

    More to the point, the form of your problem statement makes this look like a Markov chain, and in any case, to even understand a branching process I think you need to know the Markov property...
    = = = =
    For a problem like this always try starting with the simplest interesting case -- i.e. when the initial state has only one organism for this problem.

    Supposing you know the Markov property, you could likely skate by to an answer using "first step analysis"

    i.e.

    letting $\pi$ be the probability of ultimate extinction (your questions ask for the complement of this but that's a minor detail), supposing you start at state 1 (1 organism) you should be able to justify the equation
    $\pi = p_0 + p_1 \pi +p_2 \pi^2$

    If you can justify and understand this, then you have the complement of the answer** of (b) given by $\pi$, and can immediately justify $\pi^{10}$ as the complement of the answer to (c). Why?


    ** There are some thorns though. When you solve this quadratic you'll get two roots, and in some cases either root could plausibly be the answer. You'll want to always take the smaller of the two roots, but this needs justification -- typically by a continuity theorem on the underlying generating function or by a monotone-convergence-related argument. This is one of the reasons why I worry about the machinery you have at your disposal... though perhaps in 'math for electronics' you don't need a proof to support why you throw out a root?
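Purely as a numerical sketch (not from the thread; the function name is mine), the first-step equation with this problem's offspring law can be solved with the quadratic formula, taking the smaller root as discussed:

```python
import math

def extinction_prob(c):
    """Smaller nonnegative root of pi = p0 + p1*pi + p2*pi^2 for the
    offspring law p0 = 1/3, p1 = 2(1-c)/3, p2 = 2c/3.
    Rearranged: p2*pi^2 + (p1 - 1)*pi + p0 = 0."""
    p0, p1, p2 = 1/3, 2 * (1 - c) / 3, 2 * c / 3
    if p2 == 0:                     # c = 0: the equation is linear, pi = 1
        return 1.0
    a, b = p2, p1 - 1
    disc = b * b - 4 * a * p0       # works out to ((1 - 2c)/3)^2 >= 0
    r1 = (-b - math.sqrt(disc)) / (2 * a)
    r2 = (-b + math.sqrt(disc)) / (2 * a)
    return min(r1, r2)              # take the smaller root, as discussed
```

Per the argument for (c), the survival probability starting from 10 individuals would then be `1 - extinction_prob(c)**10`.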

    - - - - -
    Other ideas:
    There are other combinatorial approaches if you recognize the underlying random walk -- you could treat this as a ballot problem, or even compute expected visits to zero and apply Borel-Cantelli to deal with the case of expected growth per stage $\gt 1$. And, as I said before, if the expected size contracts over time you can apply the Markov inequality to show extinction WP1.

    The Chebyshev inequality is too weak to use for this problem, I'm afraid.
 

sabri

New member
May 7, 2019
9
Unfortunately we didn't cover the SLLN, and yes, you are right, we had Karlin and Taylor in Stochastic Processes.

Thank you for your time. I'll go through the hints you gave me and try to solve it (Smile)
 

steep

Member
Dec 17, 2018
51
Let me know how it goes..

If you had Karlin and Taylor's "A First Course", this problem should be a snack! That book's "Elementary Problems" are challenging, while its "Problems" are just brutal.

= = = = =
Looking through some other books: if you have an engineering background, probably the most accessible and commonly presented approach for Galton-Watson branching processes is to use OGFs / z-transforms.
 

sabri

New member
May 7, 2019
9
Last edited:

steep

Member
Dec 17, 2018
51
Could you please check in this PDF whether what I have done is correct or not?
Hi, I did a whole long write-up, hit submit, and then got a message that I'd been logged out... after I logged back in it took me to a blank page and I lost my whole response. :mad: ...

= = = =
Back to the matter at hand: I think it is pretty good. One thing in particular is to consider your general case

$q = \sum_{i=0}^n p_i q^i$

you can think about this as $E\big[q^{X_1}\big] $ for $q \in [0,1]$ and you want a fixed point where $E\big[q^{X_1}\big] = q$
or it can be convenient to instead think of the function given by $r(q) = E\big[q^{X_1}\big] - q$.

I'd suggest sketching out $r(q)$. In particular $r(1) = 0$ as you've observed. Note that $r'(1) = E\big[X_1\big]-1$ so this is a positive slope when the mean is $\gt 1$.

When $q \to 0^+$ you have $r(q) \to p_0 \gt 0$; since $r(1) = 0$ with positive slope forces $r \lt 0$ just left of $1$, this gives the existence of another root $q^* \in (0,1)$. You can differentiate twice to discover that $r''(q) \gt 0$ (for $c \gt 0$), so it is a convex function and has only those two roots. (Given the slope at $q=1$, in the case of mean $\leq 1$ the tangent line at $q=1$ is a linear lower bound showing there can never be another root in $(0,1)$.)

I'd strongly suggest sketching this out on paper and on the whiteboard -- the curvature makes for a nice picture and discussion with a class...
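Before sketching it on paper, a few numeric spot-checks can confirm the shape, using this thread's offspring law (the helper `r` follows the post's notation; the sample constants are mine):

```python
def r(q, c):
    """r(q) = E[q^{X_1}] - q with p0 = 1/3, p1 = 2(1-c)/3, p2 = 2c/3."""
    return 1/3 + 2 * (1 - c) / 3 * q + 2 * c / 3 * q ** 2 - q

c = 0.8                  # a supercritical example: mean = 2/3 + 2c/3 > 1
print(r(0.0, c))         # = p0 = 1/3 > 0
print(r(1.0, c))         # ~ 0 (up to float error): q = 1 is always a root
print(r(0.9, c))         # negative: 0.9 sits between the two roots
# r''(q) = 4c/3 > 0 for c > 0, so r is convex and has at most two roots.
```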

    I had a whole write-up on this fact, but here is a possible suggestion -- it may be worth trying to prove that if $q^*$ is a fixed point of $E\big[q^{X_1}\big]$ then it is a fixed point of $E\big[q^{X_n}\big]$, i.e. for all generations... This implies that $q^*$ is an upper bound on the total probability of extinction by time $n$ (why? hint: at any generation we have a linear combination involving strictly positive terms that sums to this fixed-point value). The fact that $q^* \lt 1$ is an upper bound then gives you an excuse for throwing out the junk root of $1$ in the case $E\big[X_1\big] \gt 1$. (It also has nice monotone convergence properties: your cumulative probability of extinction is a monotone non-decreasing sequence bounded above by $q^*$, so the limit $L$ exists and is at most $q^*$; but the limiting value must obey the 'first step analysis' you did, and hence it must be the case that $L = q^*$.)
= = = =

Double-check your (c) at the very end -- it seems like you didn't take the complementary probability correctly...
 

sabri

New member
May 7, 2019
9
thanks a lot i appreciate your help (Smile)