About this Course

100% online

Start instantly and learn at your own schedule.

Flexible deadlines

Reset deadlines to fit your schedule.

Advanced level

Approx. 24 hours to complete

English

Subtitles: English

Skills you will gain

Algorithms, Expectation–Maximization (EM) Algorithm, Graphical Model, Markov Random Field

Syllabus - What you will learn in this course

Week 1
16 minutes to complete

Learning: Overview

This module presents some of the learning tasks for probabilistic graphical models that we will tackle in this course....
1 video (Total 16 minutes)
1 video
1 hour to complete

Review of Machine Learning Concepts from Prof. Andrew Ng's Machine Learning Class (Optional)

This module contains some basic concepts from the general framework of machine learning, taken from Professor Andrew Ng's Stanford class offered on Coursera. Many of these concepts are highly relevant to the problems we'll tackle in this course....
6 videos (Total 59 minutes)
6 videos
Regularization: Cost Function 10m
Evaluating a Hypothesis 7m
Model Selection and Train Validation Test Sets 12m
Diagnosing Bias vs Variance 7m
Regularization and Bias Variance 11m
2 hours to complete

Parameter Estimation in Bayesian Networks

This module discusses the simplest and most basic of the learning problems in probabilistic graphical models: that of parameter estimation in a Bayesian network. We discuss maximum likelihood estimation, and the issues with it. We then discuss Bayesian estimation and how it can ameliorate these problems....
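To make the two estimators concrete, here is a minimal Python sketch (illustrative only, not course material; the two-node network, variable names, and data are hypothetical). It estimates the conditional probability table P(C | S) from fully observed samples, either by maximum likelihood from raw counts or with a symmetric Dirichlet prior, whose pseudo-counts keep sparse-data estimates away from zero.

```python
from collections import Counter

# Toy fully observed data for a two-node network S -> C
# (variable names and samples are hypothetical, for illustration only).
data = [
    {"S": 1, "C": 1}, {"S": 1, "C": 1}, {"S": 1, "C": 0},
    {"S": 0, "C": 0}, {"S": 0, "C": 0}, {"S": 0, "C": 1},
]

def estimate_cpd(data, child, parent, alpha=0.0):
    """Estimate P(child | parent) from counts.

    alpha = 0  -> maximum likelihood estimate (relative frequencies)
    alpha > 0  -> Bayesian estimate with a symmetric Dirichlet prior,
                  i.e. alpha pseudo-counts added to every table entry.
    """
    joint = Counter((d[parent], d[child]) for d in data)
    child_vals = sorted({d[child] for d in data})
    parent_vals = sorted({d[parent] for d in data})
    cpd = {}
    for p in parent_vals:
        total = sum(joint[(p, c)] for c in child_vals) + alpha * len(child_vals)
        cpd[p] = {c: (joint[(p, c)] + alpha) / total for c in child_vals}
    return cpd

print(estimate_cpd(data, "C", "S"))             # MLE: raw frequencies
print(estimate_cpd(data, "C", "S", alpha=1.0))  # Bayesian: smoothed toward uniform
```

With alpha > 0, parent configurations seen only a few times are pulled toward the uniform distribution, which is the smoothing effect the Bayesian estimation lectures discuss.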
5 videos (Total 77 minutes), 2 quizzes
5 videos
Maximum Likelihood Estimation for Bayesian Networks 15m
Bayesian Estimation 15m
Bayesian Prediction 13m
Bayesian Estimation for Bayesian Networks 17m
2 practice exercises
Learning in Parametric Models 18m
Bayesian Priors for BNs 8m
Week 2
21 hours to complete

Learning Undirected Models

In this module, we discuss the parameter estimation problem for Markov networks - undirected graphical models. This task is considerably more complex, both conceptually and computationally, than parameter estimation for Bayesian networks, due to the issues presented by the global partition function....
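As a rough illustration of why the global partition function makes this harder, the sketch below (a hypothetical toy, not course code; the chain network and log-potential values are made up) evaluates the log-likelihood of one data case in a three-variable pairwise Markov network by brute-force enumeration of all joint assignments. In a realistic network that sum is intractable, which is why parameter estimation for Markov networks needs inference inside the learning loop.

```python
import itertools
import math

# A tiny pairwise Markov network over binary variables A - B - C (a chain).
# Log-potentials below are hypothetical numbers, for illustration only.
theta = {
    ("A", "B"): {(0, 0): 0.5, (0, 1): -0.2, (1, 0): -0.2, (1, 1): 1.0},
    ("B", "C"): {(0, 0): 0.3, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.8},
}
names = ["A", "B", "C"]

def unnormalized_log_p(assignment):
    """Sum of log-potentials for one full assignment {var: value}."""
    return sum(table[(assignment[u], assignment[v])]
               for (u, v), table in theta.items())

def log_partition():
    """Brute-force log Z: sums over all 2**3 joint assignments.
    This global sum couples every parameter and grows exponentially
    with the number of variables."""
    return math.log(sum(
        math.exp(unnormalized_log_p(dict(zip(names, vals))))
        for vals in itertools.product([0, 1], repeat=len(names))))

# Log-likelihood of one observed case: unnormalized score minus log Z.
x = {"A": 1, "B": 1, "C": 0}
print(unnormalized_log_p(x) - log_partition())
```

The gradient of this log-likelihood with respect to each log-potential is the observed feature count minus its expected count under the current model, so every gradient step requires the same kind of global inference.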
3 videos (Total 52 minutes), 2 quizzes
3 videos
Maximum Likelihood for Conditional Random Fields 13m
MAP Estimation for MRFs and CRFs 9m
1 practice exercise
Parameter Estimation in MNs 6m
Week 3
17 hours to complete

Learning BN Structure

This module discusses the problem of learning the structure of Bayesian networks. We first discuss how this problem can be formulated as an optimization problem over a space of graph structures, and what good ways there are to score different structures so as to trade off fit to data against model complexity. We then talk about how the optimization problem can be solved: exactly in a few cases, approximately in most others....
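As a sketch of one such score (illustrative only; the tiny binary data set and the candidate structures are made up), the code below computes the BIC score of a candidate directed structure: the maximized log-likelihood, which decomposes over families, minus a (log M)/2 penalty per independent parameter. Comparing candidate graphs with a decomposable score like this is the basic step that the heuristic-search lectures build on.

```python
import math
from collections import Counter

# Hypothetical fully observed binary data over variables X, Y, Z.
data = [
    {"X": 0, "Y": 0, "Z": 0}, {"X": 1, "Y": 1, "Z": 1},
    {"X": 1, "Y": 1, "Z": 0}, {"X": 0, "Y": 0, "Z": 1},
    {"X": 1, "Y": 0, "Z": 0}, {"X": 0, "Y": 1, "Z": 1},
]

def family_loglik(data, child, parents):
    """Maximized log-likelihood contribution of one family (child | parents)."""
    counts = Counter((tuple(d[p] for p in parents), d[child]) for d in data)
    parent_totals = Counter(tuple(d[p] for p in parents) for d in data)
    return sum(n * math.log(n / parent_totals[pa]) for (pa, _), n in counts.items())

def bic_score(data, structure):
    """structure: dict mapping each binary variable to its list of parents.
    BIC = sum of family log-likelihoods minus (log M / 2) * #parameters."""
    m = len(data)
    loglik = sum(family_loglik(data, v, ps) for v, ps in structure.items())
    n_params = sum((2 ** len(ps)) * (2 - 1) for ps in structure.values())
    return loglik - 0.5 * math.log(m) * n_params

# Score an empty graph against a chain X -> Y -> Z.
print(bic_score(data, {"X": [], "Y": [], "Z": []}))
print(bic_score(data, {"X": [], "Y": ["X"], "Z": ["Y"]}))
```

The penalty term grows with the number of parameters, so denser graphs must earn their extra edges through a correspondingly better fit to the data.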
7 videos (Total 106 minutes), 3 quizzes
7 videos
Likelihood Scores 16m
BIC and Asymptotic Consistency 11m
Bayesian Scores 20m
Learning Tree Structured Networks 12m
Learning General Graphs: Heuristic Search 23m
Learning General Graphs: Search and Decomposability 15m
2 practice exercises
Structure Scores 10m
Tree Learning and Hill Climbing 8m
Week 4
22 hours to complete

Learning BNs with Incomplete Data

In this module, we discuss the problem of learning models in cases where some of the variables in some of the data cases are not fully observed. We discuss why this situation is considerably more complex than the fully observable case. We then present the Expectation Maximization (EM) algorithm, which is used in a wide variety of problems....
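For a feel of the algorithm, here is a minimal EM sketch (hypothetical data, not the course's assignment code; the model is a two-component mixture of Bernoullis, i.e. one hidden binary cluster variable generating one observed binary variable). The E-step softly completes the missing cluster assignment for each case, and the M-step re-estimates the parameters from the resulting expected counts.

```python
# Hypothetical binary observations whose cluster assignment is never observed.
xs = [1, 1, 1, 0, 0, 1, 0, 0, 0, 1]

# Initial guesses for P(cluster = 1) and P(x = 1 | cluster) for the two clusters.
pi, p = 0.5, [0.3, 0.7]

for _ in range(50):
    # E-step: posterior responsibility of cluster 1 for each data case.
    resp = []
    for x in xs:
        l0 = (1 - pi) * (p[0] if x else 1 - p[0])
        l1 = pi * (p[1] if x else 1 - p[1])
        resp.append(l1 / (l0 + l1))
    # M-step: re-estimate parameters from expected (soft) counts.
    n1 = sum(resp)
    pi = n1 / len(xs)
    p[1] = sum(r * x for r, x in zip(resp, xs)) / n1
    p[0] = sum((1 - r) * x for r, x in zip(resp, xs)) / (len(xs) - n1)

print(pi, p)  # each iteration never decreases the data likelihood
```

That monotone improvement of the likelihood is the property examined in the "Analysis of EM Algorithm" video.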
5 videos (Total 83 minutes), 3 quizzes
5 videos
Expectation Maximization - Intro 16m
Analysis of EM Algorithm 11m
EM in Practice 11m
Latent Variables 22m
2 practice exercises
Learning with Incomplete Data 8m
Expectation Maximization 14m
Week 5
1 hour to complete

Learning Summary and Final

This module summarizes some of the issues that arise when learning probabilistic graphical models from data. It also contains the course final....
1 video (Total 20 minutes), 1 quiz
1 video
1 practice exercise
Learning: Final Exam 24m
25 minutes to complete

PGM Wrapup

This module contains an overview of PGM methods as a whole, discussing some of the real-world tradeoffs when using this framework in practice. It refers to topics from all three of the PGM courses....
1 video (Total 25 minutes)
1 video
4.6
31 reviews

43%

started a new career after completing these courses

31%

got a tangible career benefit from this course

18%

got a pay increase or promotion

Top reviews

by LL, Jan 30th 2018

Very good course for learning PGMs and machine learning programming concepts. Just some of the descriptions in the final exam quiz are somewhat unclear, which leads to a bit of confusion.

by ZZ, Feb 14th 2017

Great course! Very informative course videos and challenging yet rewarding programming assignments. I hope the mentors can be more helpful by responding to questions in a timely manner.

Instructor

Daphne Koller

Professor
School of Engineering

About Stanford University

The Leland Stanford Junior University, commonly referred to as Stanford University or Stanford, is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto, California....

About the Probabilistic Graphical Models Specialization

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems....
Probabilistic Graphical Models

Frequently Asked Questions

  • Once you enroll for a Certificate, you get access to all videos, quizzes, and programming assignments (if applicable). Peer-reviewed assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing it, you may not be able to access certain assignments.

  • When you enroll in a course, you get access to all of the courses in the Specialization, and you earn a Certificate when you complete the work. Your electronic Certificate will be added to your Achievements page; from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

  • Compute the sufficient statistics of a data set that are necessary for learning a PGM from data

    Implement both maximum likelihood and Bayesian parameter estimation for Bayesian networks

    Implement maximum likelihood and MAP parameter estimation for Markov networks

    Formulate a structure learning problem as a combinatorial optimization task over a space of network structures, and evaluate which scoring function is appropriate for a given situation

    Utilize PGM inference algorithms in ways that support more effective parameter estimation for PGMs

    Implement the Expectation Maximization (EM) algorithm for Bayesian networks

    Honors track learners will get hands-on experience in implementing both EM and structure learning for tree-structured networks, and applying them to real-world tasks

Have more questions? Visit the Learner Help Center.