Sale!

Unsupervised Deep Learning in Python

120.00

Unsupervised Deep Learning in Python. Autoencoders + Restricted Boltzmann Machines for Deep Neural Networks in Theano, plus t-SNE and PCA

ENROLL

Description

Don't miss this great online course, Unsupervised Deep Learning in Python. It is 100% online and starts the moment you enroll, and you set your own learning pace.

Short description of the course Unsupervised Deep Learning in Python

Autoencoders + Restricted Boltzmann Machines for Deep Neural Networks in Theano, plus t-SNE and PCA

The instructor of this 100% online course is Lazy Programmer Inc., a genuine expert in the field, with whom you will learn everything you need to become more competitive. The course is offered in English.

Full description of the course Unsupervised Deep Learning in Python

Course Description

This course is the next logical step in my deep learning, data science, and machine learning series. I’ve done a lot of courses about deep learning, and I just released a course about unsupervised learning, where I talked about clustering and density estimation. So what do you get when you put these two together? Unsupervised deep learning!

In this course we’ll start with some very basic stuff: principal components analysis (PCA), and a popular nonlinear dimensionality reduction technique known as t-SNE (t-distributed stochastic neighbor embedding).

Next, we’ll look at a special type of unsupervised neural network called the autoencoder. After describing how an autoencoder works, I’ll show you how you can link a bunch of them together to form a deep stack of autoencoders, which leads to better performance of a supervised deep neural network. Autoencoders are like a non-linear form of PCA.

Last, we’ll look at restricted Boltzmann machines (RBMs). These are yet another popular unsupervised neural network that you can use in the same way as autoencoders to pretrain your supervised deep neural network. I’ll show you an interesting way of training restricted Boltzmann machines, known as Gibbs sampling, a special case of Markov Chain Monte Carlo, and I’ll demonstrate how, even though this method is only a rough approximation, it still ends up reducing other cost functions, such as the one used for autoencoders. This method is also known as Contrastive Divergence or CD-k. As in physical systems, we define a concept called free energy and attempt to minimize this quantity.

Finally, we’ll bring all these concepts together, and I’ll show you visually what happens when you use PCA and t-SNE on the features that the autoencoders and RBMs have learned. We’ll see that even without labels, the results suggest that a pattern has been found.

All the materials used in this course are FREE. Since this course is the 4th in the deep learning series, I will assume you already know calculus, linear algebra, and Python coding. You’ll want to install Numpy and Theano for this course. These are essential items in your data analytics toolbox.

If you are interested in deep learning and you want to learn about modern deep learning developments beyond just plain backpropagation, including using unsupervised neural networks to interpret what features can be automatically and hierarchically learned in a deep learning system, this course is for you.

This course focuses on “how to build and understand”, not just “how to use”. Anyone can learn to use an API in 15 minutes after reading some documentation. It’s not about “remembering facts”, it’s about “seeing for yourself” via experimentation. It will teach you how to visualize what’s happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

NOTES:

All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples, in the directory: unsupervised_class2. Make sure you always “git pull” so you have the latest version!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

calculus
linear algebra
probability
Python coding: if/else, loops, lists, dicts, sets
Numpy coding: matrix and vector operations, loading a CSV file

TIPS (for getting through the course):

Watch it at 2x.
Take handwritten notes. This will drastically increase your ability to retain the information.
Write down the equations. If you don’t, I guarantee it will just look like gibberish.
Ask lots of questions on the discussion board. The more the better!
Realize that most exercises will take you days or weeks to complete.

USEFUL COURSE ORDERING:

(The Numpy Stack in Python)
Linear Regression in Python
Logistic Regression in Python
(Supervised Machine Learning in Python)
(Bayesian Machine Learning in Python: A/B Testing)
Deep Learning in Python
Practical Deep Learning in Theano and TensorFlow
(Supervised Machine Learning in Python 2: Ensemble Methods)
Convolutional Neural Networks in Python
(Easy NLP)
(Cluster Analysis and Unsupervised Machine Learning)
Unsupervised Deep Learning
(Hidden Markov Models)
Recurrent Neural Networks in Python
Natural Language Processing with Deep Learning in Python
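As a taste of the first topic in the description above, here is a minimal sketch of reducing a feature matrix to 2 dimensions with PCA and t-SNE for visualization. It uses scikit-learn purely for brevity (the course itself works in Numpy and Theano), and the toy data and dimensions are placeholders, not anything taken from the course:

```python
# Minimal PCA and t-SNE sketch (illustrative only; not the course's code).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# Toy data standing in for learned features: 500 samples, 50 dimensions.
X = np.random.randn(500, 50)

# Linear projection onto the top 2 principal components.
X_pca = PCA(n_components=2).fit_transform(X)

# Nonlinear 2-D embedding with t-SNE (t-distributed stochastic neighbor embedding).
X_tsne = TSNE(n_components=2, perplexity=30).fit_transform(X)

print(X_pca.shape, X_tsne.shape)  # (500, 2) (500, 2)
```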
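The autoencoder section of the description is about learning to reconstruct the input through a hidden layer. The sketch below is one possible single-layer autoencoder in Theano with tied weights; the layer sizes, learning rate, activation, and cost shown here are illustrative assumptions, not the course's own implementation:

```python
# A minimal single autoencoder in Theano (illustrative sketch).
import numpy as np
import theano
import theano.tensor as T

D, M = 784, 300                                   # input / hidden sizes (assumed)
W = theano.shared(np.random.randn(D, M) * np.sqrt(2.0 / D))
bh = theano.shared(np.zeros(M))                   # hidden bias
bo = theano.shared(np.zeros(D))                   # output bias

X = T.matrix('X')
Z = T.nnet.sigmoid(X.dot(W) + bh)                 # encoder
X_hat = T.nnet.sigmoid(Z.dot(W.T) + bo)           # decoder with tied weights
cost = T.mean((X - X_hat) ** 2)                   # squared reconstruction error

lr = 0.1
params = [W, bh, bo]
updates = [(p, p - lr * T.grad(cost, p)) for p in params]
train = theano.function(inputs=[X], outputs=cost, updates=updates)

data = np.random.rand(100, D)                     # stand-in for real inputs
for epoch in range(10):
    c = train(data)
print("final reconstruction cost:", c)
```

A deep stack of these can be built by training one autoencoder, using its hidden activations as the input to the next, and so on, before fine-tuning the whole network with labels.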
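For the RBM material, the description mentions Gibbs sampling, Contrastive Divergence (CD-k), and free energy. The rough Numpy sketch below performs CD-1 updates on a binary RBM and prints the free energy of the data; all names, sizes, and hyperparameters are illustrative assumptions rather than the course's implementation:

```python
# Rough CD-1 (contrastive divergence) sketch for a binary RBM (illustrative only).
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

D, M, lr = 784, 300, 0.1                          # visible units, hidden units, step size
W = np.random.randn(D, M) * 0.01
b = np.zeros(D)                                   # visible bias
c = np.zeros(M)                                   # hidden bias

V = (np.random.rand(100, D) > 0.5).astype(np.float64)   # stand-in binary data

for epoch in range(10):
    # Positive phase: hidden activations driven by the data.
    p_h0 = sigmoid(V.dot(W) + c)
    h0 = (np.random.rand(*p_h0.shape) < p_h0).astype(np.float64)

    # One Gibbs step: reconstruct the visibles, then recompute the hiddens.
    p_v1 = sigmoid(h0.dot(W.T) + b)
    p_h1 = sigmoid(p_v1.dot(W) + c)

    # CD-1 update: data-driven statistics minus reconstruction-driven statistics.
    N = V.shape[0]
    W += lr * (V.T.dot(p_h0) - p_v1.T.dot(p_h1)) / N
    b += lr * (V - p_v1).mean(axis=0)
    c += lr * (p_h0 - p_h1).mean(axis=0)

    # Free energy F(v) = -b.v - sum_j log(1 + exp(c_j + (v.W)_j));
    # training tends to push this down on the data.
    F = -V.dot(b) - np.log1p(np.exp(V.dot(W) + c)).sum(axis=1)
    print("mean free energy:", F.mean(),
          "reconstruction error:", np.mean((V - p_v1) ** 2))
```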

Additional information

Instructor

Lazy Programmer Inc.

Lessons

31

Duration

3

Level

Intermediate

Language

English

Includes

Lifetime access
30-day money-back guarantee
Available on iOS and Android
Certificate of completion

Reviews

There are no reviews yet.

Be the first to review “Unsupervised Deep Learning in Python”
