=== Friday AM1 ===
* Linear regression
** Model selection
*** Backward elimination: iteratively drop the least significant regressor until every remaining term clears some significance threshold. It can be good... it can be better than doing nothing.
*** In regression, the t-statistic is the regression coefficient divided by its standard error. Roughly, any coefficient whose t-statistic exceeds 2 (more than 2 standard errors from zero) looks significant.
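The backward-elimination idea above can be sketched with ordinary least squares, computing each t-statistic as the coefficient over its standard error. The simulated data, the four-regressor setup, and the threshold of 2 are all invented for illustration:

```python
# Sketch of backward elimination by t-statistic (threshold of 2 assumed).
# The regressors and data below are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))                                    # four candidate regressors
y = 1.0 + 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)  # last two are pure noise

def t_stats(X, y):
    """OLS fit; returns coefficients and t-stats (coefficient / standard error)."""
    Xd = np.column_stack([np.ones(len(y)), X])   # add an intercept column
    XtX_inv = np.linalg.inv(Xd.T @ Xd)
    beta = XtX_inv @ Xd.T @ y
    resid = y - Xd @ beta
    sigma2 = resid @ resid / (len(y) - Xd.shape[1])
    se = np.sqrt(sigma2 * np.diag(XtX_inv))
    return beta, beta / se

keep = list(range(X.shape[1]))
while len(keep) > 0:
    _, t = t_stats(X[:, keep], y)
    worst = np.argmin(np.abs(t[1:]))      # least significant (ignore the intercept)
    if np.abs(t[1:][worst]) >= 2:         # everything left looks significant; stop
        break
    keep.pop(worst)

print("retained regressors:", keep)
```

With strong signals like these, the truly nonzero coefficients survive elimination; noise regressors usually (not always) get dropped.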
* Bayesian model selection
** Goal: identify the predictors that do have an effect and eliminate the predictors that don't.
** Posterior odds = prior odds * Bayes factor
*** BF (Bayes factor) - how well the data are fit by the model
** Balance goodness of fit of the observed data against our prior belief that most of the coefficients are probably zero.
** A model is penalized if it is too complex (too many terms turned on in the regression model) or if it fits poorly (the sum of squared residuals (SSR) is too big).
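The complexity-vs-fit trade-off above can be sketched with a BIC approximation to the Bayes factor. Note the BIC is an assumption here (a common large-sample approximation), not necessarily the prior used in the lecture; the data are simulated:

```python
# Sketch: comparing two regression models with a BIC approximation to the
# Bayes factor. BIC is an assumed stand-in for the lecture's actual prior.
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                  # irrelevant predictor
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

def bic(X, y):
    """BIC = n*log(SSR/n) + k*log(n): penalizes both big SSR and big k."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    ssr = resid @ resid
    k = X.shape[1]
    return len(y) * np.log(ssr / len(y)) + k * np.log(len(y))

ones = np.ones(n)
bic_small = bic(np.column_stack([ones, x1]), y)      # true model
bic_big = bic(np.column_stack([ones, x1, x2]), y)    # extra useless term

# Approximate BF of small vs big model; values > 1 favor the small model.
bf_approx = np.exp((bic_big - bic_small) / 2)
posterior_odds = 1.0 * bf_approx     # prior odds of 1 (equal prior belief)
print(bf_approx, posterior_odds)
```

The extra term shrinks the SSR slightly but pays a complexity penalty of log(n), so the smaller model is typically favored.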
* Gibbs sampling
** Sample in turn from the full conditional distributions p(x|y,z), p(y|x,z), etc.
** The resulting sequence of samples approximates the true joint distribution p(x,y,z)
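The two bullets above can be sketched with the textbook two-variable case: for a bivariate normal with correlation rho, both full conditionals are univariate normals, so Gibbs sampling alternates two easy draws (the rho and sample-size values are chosen arbitrarily):

```python
# Minimal Gibbs sampler for a standard bivariate normal with correlation rho:
# both full conditionals p(x|y) and p(y|x) are univariate normals.
import numpy as np

rng = np.random.default_rng(42)
rho = 0.8
n_samples = 20000
x, y = 0.0, 0.0
draws = np.empty((n_samples, 2))
for i in range(n_samples):
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))   # draw from p(x | y)
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))   # draw from p(y | x)
    draws[i] = (x, y)

draws = draws[1000:]                  # discard burn-in
print(np.corrcoef(draws.T)[0, 1])     # sample correlation, close to rho
```

The empirical correlation of the chain approaches rho, illustrating that the sequence of conditional draws approximates the joint distribution.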
=== Friday AM2 ===
* GLM/GLMM/INLA
** The random variable is the exposure status
** ε<sub>ij</sub> - a latent variable that induces overdispersion
* GLM
** The response follows an exponential family distribution
** The mean model is linear in the covariates
** A link function relates the mean to the covariates
** The link function is the log for Poisson, the identity for normal, and the logit for binomial
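The GLM structure above can be sketched for the Poisson/log-link case, fit by iteratively reweighted least squares (IRLS, the standard GLM fitting algorithm, assumed here rather than taken from the lecture); the data are simulated:

```python
# Sketch of a Poisson GLM with log link, fit by iteratively reweighted
# least squares (IRLS). The data and true coefficients are invented.
import numpy as np

rng = np.random.default_rng(3)
n = 500
x = rng.uniform(-1, 1, size=n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([0.5, 1.2])
y = rng.poisson(np.exp(X @ beta_true))    # log link: mean = exp(X @ beta)

beta = np.zeros(2)
for _ in range(25):                        # IRLS iterations
    mu = np.exp(X @ beta)                  # inverse of the log link
    W = mu                                 # Poisson: variance equals the mean
    z = X @ beta + (y - mu) / mu           # working response on the link scale
    WX = X * W[:, None]
    beta = np.linalg.solve(X.T @ WX, X.T @ (W * z))

print(beta)   # recovered coefficients, near beta_true
```

Swapping the link and variance lines gives the normal (identity link, constant variance) or binomial (logit link) cases within the same loop.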
* INLA - integrated nested Laplace approximation
** Good docs at the [http://www.r-inla.org R-INLA homepage]
** [https://catalyst.uw.edu/workspace/file/download/60a7af36f75e0e62c8f390abfb283ab21668f1a756e6f4ca72c6f0e1727b5adf Rmd code for INLA]
* GLMM
** Extends the GLM to include random effects (wobble/measurement error)
** Mixed = a mixture of fixed effects and random effects
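The role of the random effect (the latent ε<sub>ij</sub> mentioned earlier) can be sketched by simulation: adding a normal latent term to the linear predictor of a Poisson model inflates the variance above the mean, which is exactly the overdispersion a GLMM's random effect captures. The lognormal form and the specific values are illustrative assumptions:

```python
# Sketch: a latent normal effect (epsilon_ij) on the log scale makes Poisson
# counts overdispersed (variance > mean). Values are made up for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 100000
mu = 5.0

plain = rng.poisson(mu, size=n)                      # ordinary Poisson counts
eps = rng.normal(0.0, 0.5, size=n)                   # latent epsilon_ij
mixed = rng.poisson(mu * np.exp(eps - 0.5**2 / 2))   # same mean, extra wobble

print(plain.var() / plain.mean())    # variance/mean ratio near 1 (Poisson)
print(mixed.var() / mixed.mean())    # ratio well above 1 (overdispersed)
```

The -0.5² / 2 correction keeps the marginal mean equal to mu, so the two samples differ only in their variance.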
* Approximate Bayes inference
** Approximate the Bayes factor from 2 estimates
** Bayes factors are not independent
** "Shouldn't say mindless, it's judgmental."
** It looks like a complicated formula, but it's intuitive, and it's just a formula.