Deep-Dive into Tensorflow Activation Functions

Offered by
Coursera Project Network
In this Guided Project, you will:

Learn when, where, why, and how to use different activation functions, and which situations call for each

Code examples of each activation function from scratch in Python

2 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

You've learned how to use Tensorflow. You've learned the important functions, how to design and implement sequential and functional models, and you've completed several test projects. What's next? It's time to take a deep dive into activation functions: the essential component of every node and layer of a neural network, deciding whether a node fires and, in most cases, adding an element of non-linearity.

In this 2-hour project-based course, you will join me for a deep dive into an exhaustive list of activation functions usable in Tensorflow and other frameworks. I will explain how each activation function works, describe the differences between them along with their pros and cons, and demonstrate each function in use, both from scratch and within Tensorflow. Join me and boost your AI and machine learning knowledge, while also earning a certificate to strengthen your resume.

Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
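Purely as an illustration of the "from scratch and within Tensorflow" approach described above (a minimal sketch, not the course's own notebook code), a hand-written ReLU can be checked against Tensorflow's built-in tf.nn.relu:

    import numpy as np
    import tensorflow as tf

    def relu_from_scratch(x):
        # ReLU written by hand: element-wise max(0, x).
        return np.maximum(0.0, x)

    x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])

    print(relu_from_scratch(x))      # [0.  0.  0.  1.5 3. ]
    print(tf.nn.relu(x).numpy())     # same values via Tensorflow's built-in op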

Skills you will develop

  • Neural Network Activation Functions
  • Deep Learning
  • Artificial Neural Network
  • Python Programming
  • Tensorflow

Learn step by step

In a video that plays in a split screen alongside your work area, your instructor will guide you through each step (a short code sketch of these functions follows the list):

  1. Review the Activation Functions, Their Properties & the Principle of Nonlinearity

  2. Implementing Linear and Binary Step Activations

  3. Implementing Ridge-based Activation Functions (ReLU family)

  4. Implementing Variations of ReLU & the Swish Family of Non-Monotonic Activations

  5. Implementing Radial-based Activation Functions (RBF family)
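As a preview of the functions named in the steps above, here is a minimal NumPy-only sketch using standard textbook definitions of linear, binary-step, ReLU, leaky ReLU (one common ReLU variation), Swish, and a Gaussian radial-basis activation. It is an assumed illustration of how the from-scratch versions might look, not the instructor's actual code:

    import numpy as np

    def linear(x):
        # Identity activation: output equals input.
        return x

    def binary_step(x):
        # Fires (1) when the input is non-negative, otherwise stays off (0).
        return np.where(x >= 0, 1.0, 0.0)

    def relu(x):
        # Ridge-based: zero for negative inputs, identity for positive ones.
        return np.maximum(0.0, x)

    def leaky_relu(x, alpha=0.01):
        # ReLU variation: keeps a small slope for negative inputs.
        return np.where(x >= 0, x, alpha * x)

    def swish(x, beta=1.0):
        # Non-monotonic: x * sigmoid(beta * x).
        return x / (1.0 + np.exp(-beta * x))

    def gaussian_rbf(x, center=0.0, gamma=1.0):
        # Radial-based: response depends on distance from a center point.
        return np.exp(-gamma * (x - center) ** 2)

    x = np.linspace(-3, 3, 7)
    for fn in (linear, binary_step, relu, leaky_relu, swish, gaussian_rbf):
        print(f"{fn.__name__:>12}: {np.round(fn(x), 3)}")

Each of these can also be expressed through Tensorflow's built-in ops or a custom layer; the course demonstrates both the from-scratch and the in-framework routes.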

How Guided Projects work

Your workspace is a virtual desktop right in your browser; no download is required.

In a split-screen video, your instructor guides you step by step.

Frequently Asked Questions

More questions? Visit the Learner Help Center.