# Linear Regression, GLMs and GAMs with R


Linear Regression, GLMs and GAMs with R. How to extend linear regression to specify and estimate generalized linear models and additive models.


This course, Linear Regression, GLMs and GAMs with R, teaches how to extend linear regression to specify and estimate generalized linear models and additive models.
Beyond internalizing the concepts, you will learn to apply them in a range of fields. The instructor is Geoffrey Hubona, Ph.D., an expert in this field.

## Description of this course: Linear Regression, GLMs and GAMs with R

Linear Regression, GLMs and GAMs with R demonstrates how to use R to extend the basic assumptions and constraints of linear regression in order to specify, model, and interpret the results of generalized linear models (GLMs) and generalized additive models (GAMs). The course demonstrates the estimation of GLMs and GAMs by working through a series of practical examples from the book Generalized Additive Models: An Introduction with R by Simon N. Wood (Chapman & Hall/CRC Texts in Statistical Science, 2006).

Linear statistical models have a univariate response modeled as a linear function of predictor variables plus a zero-mean random error term. The assumption of linearity is a critical (and limiting) characteristic. Generalized linear models (GLMs) relax this assumption: they permit the expected value of the response variable to be a smooth (e.g. non-linear) monotonic function of the linear predictor. GLMs also relax the assumption that the response variable is normally distributed by allowing for many distributions (e.g. normal, Poisson, binomial, log-linear, etc.).

Generalized additive models (GAMs) are extensions of GLMs. GAMs allow the effects of predictors to take the form of non-parametric smoothers. Non-parametric smoothers like lowess (locally weighted scatterplot smoothing) fit a smooth curve to data using localized subsets of the data.

This course provides an overview of modeling GLMs and GAMs using R. GLMs, and especially GAMs, have evolved into standard statistical methodologies of considerable flexibility. The course addresses recent approaches to modeling, estimating, and interpreting GAMs, with a focus on doing so in R. Use of the freely available R software illustrates the practicalities of linear, generalized linear, and generalized additive models.
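As a minimal sketch of the progression described above, the three model families can be fit side by side in R on simulated count data. The `mgcv` package used here is the one accompanying Wood's book; the data-generating function and sample size are illustrative choices, not course material.

```r
# Sketch (simulated data): linear model -> GLM -> GAM.
# mgcv accompanies Wood's "Generalized Additive Models: An Introduction with R".
library(mgcv)

set.seed(1)
x <- runif(200, 0, 3)
# Poisson counts whose log-mean is a non-linear function of x
y <- rpois(200, lambda = exp(sin(2 * x) + 0.5 * x))

fit_lm  <- lm(y ~ x)                        # assumes linearity and normal errors
fit_glm <- glm(y ~ x, family = poisson)     # relaxes the distribution; log link
fit_gam <- gam(y ~ s(x), family = poisson)  # replaces the linear term with a smooth

summary(fit_gam)  # effective degrees of freedom and significance of the smooth
plot(fit_gam)     # the estimated smooth function of x
AIC(fit_lm, fit_glm, fit_gam)
```

Because the true log-mean is non-linear in `x`, the GAM typically fits markedly better than the straight-line GLM, which is exactly the kind of comparison the course works through.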


## Requirements of this course: Linear Regression, GLMs and GAMs with R

What are the requirements? Students will need to install R and R Commander, but ample instruction for doing so is provided.
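In practice, the setup amounts to installing R from CRAN and then adding R Commander from within R (`Rcmdr` is the standard CRAN package name for R Commander):

```r
# One-time setup, run inside an R session:
install.packages("Rcmdr")  # installs the R Commander GUI from CRAN
library(Rcmdr)             # loading the package launches the R Commander window
```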

## What will you learn in this course: Linear Regression, GLMs and GAMs with R?

What am I going to get from this course?

- Understand the assumptions of ordinary least squares (OLS) linear regression.
- Specify, estimate, and interpret linear (regression) models using R.
- Understand how the assumptions of OLS regression are modified (relaxed) in order to specify, estimate, and interpret generalized linear models (GLMs).
- Specify, estimate, and interpret GLMs using R.
- Understand the mechanics and limitations of specifying, estimating, and interpreting generalized additive models (GAMs).

## Target audience of this course: Linear Regression, GLMs and GAMs with R

Who is the target audience? This course is useful for anyone involved in linear model estimation, including graduate students and working professionals in quantitative modeling and data analysis. The focus, and the majority of the content, is on generalized additive modeling, so anyone who wishes to learn how to specify, estimate, and interpret GAMs will especially benefit.