Changes between Version 13 and Version 14 of UWSummerStatsWorkshop


Timestamp: Jul 23, 2015, 1:02:18 PM
Author: iovercast

  • UWSummerStatsWorkshop

* ML tends to overfit to the data when you have a large parameter space relative to the sample size...
=== Wednesday PM2 ===
* "You've got your data, it's not random anymore.... You've run your experiment, your data certainly aren't random."
* A partition is a collection of disjoint sets that together cover the whole sample space.
* Axioms
 * Total probability: the probabilities of all events in a partition sum to 1.
 * Marginal probability: sum of the probabilities of event E intersected with each possible event in the partition.
* Likelihood ratio x prior odds = posterior odds
* Standard distributions
 * Discrete random variable
 * Pr(Y=y) = p(y), the probability mass function; must satisfy 0 <= p(y) <= 1 and sum of all p(y) = 1
 * Probability densities
  * Always >= 0, total area under the curve = 1
* Binary (Bernoulli) distribution
 * Y = {1, 0}
 * Pr(Y=y|θ) = p(y|θ) = θ^y(1-θ)^(1-y)
* Binomial distribution
* Likelihood inference
* Poisson distribution
* Why use a gamma vs a beta prior?
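The Bernoulli pmf and the "likelihood ratio x prior odds = posterior odds" identity above can be sketched in Python. This is an illustrative sketch, not workshop code; the 7-of-10 example data and even prior odds are made up for the demo.

```python
from math import comb

def bernoulli_pmf(y, theta):
    """Pr(Y=y | theta) = theta^y * (1 - theta)^(1 - y), for y in {0, 1}."""
    return theta**y * (1.0 - theta)**(1 - y)

def binomial_pmf(y, n, theta):
    """Pr(Y=y | n, theta): y successes in n independent Bernoulli(theta) trials."""
    return comb(n, y) * theta**y * (1.0 - theta)**(n - y)

def posterior_odds(lik1, lik2, prior_odds=1.0):
    """Likelihood ratio x prior odds = posterior odds."""
    return (lik1 / lik2) * prior_odds

# Hypothetical data: 7 successes in 10 trials; compare theta = 0.7 vs
# theta = 0.5, starting from even (1:1) prior odds.
odds = posterior_odds(binomial_pmf(7, 10, 0.7), binomial_pmf(7, 10, 0.5))
```

Both pmfs satisfy the axiom noted above: the probabilities over all possible values of y sum to 1.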

=== Thursday AM1 ===
* Posterior is proportional to the likelihood times the prior (colloquially).
* Estimation, hypothesis testing, prediction.
* Beta distribution as a prior:
 * Mean of Beta(a, b) is a/(a+b)
 * Beta is flexible, to a degree
* Uniform prior is tricky: even though it's "uninformative", it's not uniform on all scales.
* Conjugate: the posterior has the same form as the prior.
 * θ^(y+a-1) * (1-θ)^(N-y+b-1)
 * Posterior mean: (y+a)/(N+a+b)
 * A weighted estimator of the sample mean and the prior mean
  * As sample size increases, the weight on the sample mean increases
  * With fixed sample size, increasing a+b (the beta parameters) increases the weight on the prior.
  * Nonsymmetric vs asymmetric?
 * Averaging over all the possible values θ could take, weighted by the posterior (averaging out the uncertainty).
* Hypothesis testing:
 * Ratio of the probabilities of the data given model 1 vs model 2 (or the null).
* "Bayesian modelling can be very intoxicating."
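The conjugate Beta-Binomial update above can be sketched in Python; this is a minimal illustration (not workshop code) showing that the posterior mean (y+a)/(N+a+b) really is a weighted average of the sample mean y/N and the prior mean a/(a+b).

```python
def beta_binomial_posterior(y, N, a, b):
    """Beta(a, b) prior x Binomial(N, theta) likelihood -> Beta(y+a, N-y+b)
    posterior, with kernel theta^(y+a-1) * (1-theta)^(N-y+b-1)."""
    return y + a, N - y + b

def posterior_mean(y, N, a, b):
    """Posterior mean (y+a)/(N+a+b), written as a weighted average of the
    sample mean y/N and the prior mean a/(a+b)."""
    w = N / (N + a + b)  # weight on the data; grows toward 1 as N increases
    return w * (y / N) + (1 - w) * (a / (a + b))
```

For example, with 7 successes in 10 trials and a uniform Beta(1, 1) prior, the posterior is Beta(8, 4) and the posterior mean is 8/12.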
=== Thursday AM2 ===

==== quotes ====
     
* "You're going to be wrong whatever you do. These are cartoons of reality, none of these models are right."
* "There's only one tomorrow, or maybe there are many tomorrows... Every day is a new tomorrow."
* "Improper posteriors are a no-no, that's just anarchy."
* "The only way to do this is to do it."