wiki:UWSummerStatsWorkshop

Version 3 (modified by iovercast, 9 years ago)

--

Homepage for the workshop. They have great scholarship opportunities for grad students to cover registration and travel.

Notes from the time I went (July 2015)

SISG 9 - Population Genetics

Taught by Bruce Weir and Jerome Goudet. Homepage for the R scripts from the workshop.

  • R basics (very quick).
    • It did help me learn about data.frame and lists; they make much more sense now.
  • Allele Frequency and Hardy-Weinberg
    • I want to know if there's a continuous function that describes the maximum value of the binomial distribution as p runs from 0 to 1. Seems like there should be.
    • The EM algorithm for two loci isn't guaranteed to converge; sometimes it gets stuck flipping back and forth between two intermediate values. Seems easy to fix, but I'd be curious to know whether there's some distribution of genotypes that is guaranteed to break it.
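The two-locus EM can be sketched roughly like this (a minimal illustration, assuming two biallelic loci where only the double heterozygote is phase-ambiguous; `em_haplotypes` and the genotype encoding are my own, not from the workshop scripts):

```python
def em_haplotypes(counts, n_iter=100):
    """counts maps (gA, gB) -> number of individuals, where gA/gB are
    copies (0/1/2) of the A and B alleles. Returns estimated frequencies
    of the four haplotypes AB, Ab, aB, ab."""
    # Start from equal haplotype frequencies.
    p = {"AB": 0.25, "Ab": 0.25, "aB": 0.25, "ab": 0.25}
    n = sum(counts.values())
    for _ in range(n_iter):
        # E-step: expected haplotype counts given current frequencies.
        h = dict.fromkeys(p, 0.0)
        for (gA, gB), c in counts.items():
            if (gA, gB) == (1, 1):
                # Double heterozygote: phase AB/ab vs Ab/aB is ambiguous,
                # so split the count by the relative phase probabilities.
                w1 = p["AB"] * p["ab"]
                w2 = p["Ab"] * p["aB"]
                frac = w1 / (w1 + w2)
                h["AB"] += c * frac
                h["ab"] += c * frac
                h["Ab"] += c * (1 - frac)
                h["aB"] += c * (1 - frac)
            else:
                # Every other genotype determines its haplotype pair.
                alleles_A = ["A"] * gA + ["a"] * (2 - gA)
                alleles_B = ["B"] * gB + ["b"] * (2 - gB)
                h[alleles_A[0] + alleles_B[0]] += c
                h[alleles_A[1] + alleles_B[1]] += c
        # M-step: normalize expected counts to frequencies.
        p = {k: v / (2 * n) for k, v in h.items()}
    return p
```

With only one ambiguous genotype class this usually converges quickly; the back-and-forth flipping mentioned above would have to come from the E-step weights oscillating between the two phase resolutions.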

SISG 18 - MCMC for Genetics

Taught by Eric Anderson and Matthew Stephens

  • Monday AM
    • Probability as representation of uncertainty vs long range frequency.
    • Mean of the beta distribution is alpha/(alpha+beta)
    • Jeffreys prior - Beta(a, b) with a = b = 0.5
    • Marginal distribution of y integrating out over theta
    • "Propagating uncertainty" - Take uncertainty into account down the line.
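A tiny sketch of how the beta mean and the Jeffreys prior fit together for binomial data (function names are mine, not from the course):

```python
def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution: a / (a + b)."""
    return a / (a + b)

def jeffreys_posterior(y, n):
    """Posterior Beta parameters after observing y successes in n
    binomial trials, starting from the Jeffreys Beta(0.5, 0.5) prior."""
    return y + 0.5, n - y + 0.5
```

For example, 7 successes in 10 trials gives a Beta(7.5, 3.5) posterior with mean 7.5/11.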
  • Monday AM II
    • Monte Carlo Method - "In search of a definition..." - Approximate expectation based on sample mean of simulated random variables.
      • "Simple sample mean..."
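The "sample mean of simulated random variables" definition can be sketched in a few lines (a toy illustration; `mc_expectation` is my own name):

```python
import random

def mc_expectation(g, sampler, n=100_000):
    """Approximate E[g(X)] by the sample mean of g over simulated draws
    of X from `sampler` (a zero-argument function returning one draw)."""
    return sum(g(sampler()) for _ in range(n)) / n
```

For X ~ Uniform(0, 1), `mc_expectation(lambda x: x * x, random.random)` should land near the exact value E[X^2] = 1/3.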
    • Wright-Fisher Model
      • Sampling with replacement between generations
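A minimal Wright-Fisher sketch of the "sampling with replacement between generations" idea, assuming a diploid population of size N where each generation's 2N gene copies are drawn binomially from the previous generation's allele frequency:

```python
import random

def wright_fisher(n_gen, pop_size, p0):
    """Simulate an allele-frequency trajectory: each generation draws
    2N gene copies with replacement from the previous generation."""
    p, traj = p0, [p0]
    for _ in range(n_gen):
        copies = sum(random.random() < p for _ in range(2 * pop_size))
        p = copies / (2 * pop_size)
        traj.append(p)
    return traj
```

With no mutation or selection, drift alone eventually carries the frequency to 0 or 1.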
    • Markov Chains
      • Transition probability matrices. Do they have to be symmetric?
      • Limiting distribution (ergodic Markov chain): regardless of where you start, as t->inf the probability of being in any given state converges to the same value.
      • Time averaging over the chain converges to the limiting distribution.
      • "known only up to scale" - shape but not normalizing constant?
      • Reversible jump mcmc? Bridge sampling? Importance sampling?
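The time-averaging point can be checked numerically: run a small two-state chain from each starting state and compare the fractions of time spent in each state (the transition matrix here is an arbitrary example of mine):

```python
import random

def simulate_chain(P, start, n_steps):
    """Run a Markov chain with transition matrix P (rows sum to 1) and
    return the fraction of steps spent in each state."""
    counts = [0] * len(P)
    state = start
    for _ in range(n_steps):
        counts[state] += 1
        # Draw the next state from row P[state].
        r, cum = random.random(), 0.0
        for j, pij in enumerate(P[state]):
            cum += pij
            if r < cum:
                state = j
                break
    return [c / n_steps for c in counts]
```

For P = [[0.9, 0.1], [0.2, 0.8]] the limiting distribution is (2/3, 1/3), and the visit fractions approach it from either starting state. (And no, transition matrices don't have to be symmetric; this one isn't.)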
    • Ergodicity
      • No transient states - no states that the chain eventually leaves and never returns to.
      • Irreducible - any state is reachable from any other state in a finite number of steps
      • Aperiodic - Can't get stuck in a loop
    • Stationary distribution of Markov chain
      • General balance equation: πP = π, where P is a transition probability matrix and π is the stationary distribution.
    • Detailed balance (π_i P_ij = π_j P_ji) implies general balance; a Markov chain satisfying detailed balance is time-reversible.
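A quick numerical check of the general balance equation: find π for a small example chain by power iteration and confirm πP = π (the matrix is my own example; note that any two-state chain also happens to satisfy detailed balance):

```python
def stationary(P, n_iter=1000):
    """Approximate the stationary distribution pi of transition matrix P
    by repeatedly applying pi <- pi P (power iteration)."""
    k = len(P)
    pi = [1 / k] * k
    for _ in range(n_iter):
        pi = [sum(pi[i] * P[i][j] for i in range(k)) for j in range(k)]
    return pi
```

For P = [[0.9, 0.1], [0.2, 0.8]] this gives π = (2/3, 1/3), and π_0 P_01 = π_1 P_10 as detailed balance requires.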
    • Metropolis-Hastings Algorithm
      • Take state i, propose state j, accept the proposed move with probability min{1, Hastings ratio}
      • Hastings ratio: f(j)/f(i) x q(i|j)/q(j|i)
        • Ratio of target densities x ratio of proposal densities
      • If f(j) is more likely than f(i), the acceptance probability increases
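A minimal sketch of the algorithm above, assuming a symmetric random-walk proposal so the q(i|j)/q(j|i) term cancels and only f(j)/f(i) remains (the target and all names here are illustrative, not from the course materials):

```python
import random

def metropolis(f, x0, n_samples, step=0.2):
    """Random-walk Metropolis sampler. f is the target density, known
    only up to scale; the proposal is symmetric, so the Hastings ratio
    reduces to the ratio of target densities f(j)/f(i)."""
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + random.uniform(-step, step)
        fx, fp = f(x), f(proposal)
        # Accept the move with probability min(1, f(j)/f(i)).
        if fp > 0 and random.random() < min(1.0, fp / fx):
            x = proposal
        samples.append(x)
    return samples

# Target known only up to a normalizing constant:
# an unnormalized Beta(3, 2) density on (0, 1).
def target(x):
    return x**2 * (1 - x) if 0.0 < x < 1.0 else 0.0
```

Because only the ratio f(j)/f(i) is used, the normalizing constant never needs to be computed - the "known only up to scale" point from the notes. The sample mean should approach the Beta(3, 2) mean of 3/5.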

quotes

  • "Out of all the tomorrows we might experience...."
  • "Uncertainty is, intrinsically, personal."