Deep AI’s data science glossary

Sunday 5th December, 2021 - Bruce Sterling

https://deepai.org/definitions

*You have to like the English language to appreciate what AI is doing to it, but AI jargon is quite poetic. This glossary not only lists some of the weirdest and most beautiful jargon terms, it also explains them. For instance: the wonderfully named “Exploding Gradient Problem.”

What is the Exploding Gradient Problem?

In machine learning, the exploding gradient problem is an issue encountered when training artificial neural networks with gradient-based learning methods and backpropagation. An artificial neural network, also called a neural network or neural net, is a learning algorithm that uses a network of functions to understand and translate data input into a specific output. This type of learning algorithm is designed to mimic the way neurons function in the human brain.
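To make the “network of functions” idea concrete, here is a minimal sketch of a small feedforward network in PyTorch; the glossary names no framework, and the layer sizes and inputs below are arbitrary choices for illustration:

```python
import torch
import torch.nn as nn

# A tiny "network of functions": each layer transforms its input and
# passes the result on, mapping an input vector to an output value.
model = nn.Sequential(
    nn.Linear(10, 32),  # 10 input features -> 32 hidden units
    nn.ReLU(),          # nonlinearity between layers
    nn.Linear(32, 1),   # 32 hidden units -> 1 output
)

x = torch.randn(4, 10)  # a batch of 4 made-up inputs
y = model(x)            # the network's output for each input
print(y.shape)          # torch.Size([4, 1])
```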

Exploding gradients are a problem in which large error gradients accumulate and result in very large updates to the neural network's weights during training. Gradients are used during training to update the network weights, and this process typically works best when the updates are small and controlled. When the magnitudes of the gradients accumulate, the network is likely to become unstable, which can lead to poor prediction results or even a model that reports nothing useful whatsoever. There are methods to fix exploding gradients, including gradient clipping and weight regularization, among others.
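Of the fixes mentioned, gradient clipping is the simplest to show in code. Below is a minimal sketch in PyTorch, assuming an arbitrary model, loss, and data; the max_norm value of 1.0 is just an illustrative threshold, not something the glossary specifies:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x, target = torch.randn(4, 10), torch.randn(4, 1)

optimizer.zero_grad()
loss = loss_fn(model(x), target)
loss.backward()

# Gradient clipping: rescale the gradients so their combined norm never
# exceeds the threshold, keeping the weight updates small and controlled.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

optimizer.step()
```

Weight regularization works differently: it penalizes large weights in the loss itself, for example via the optimizer's weight_decay argument.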

Why is this Useful?

Exploding gradients can cause problems in the training of artificial neural networks. When gradients explode, the network can become unstable and learning cannot be completed. The values of the weights can also grow so large that they overflow and become NaN values. NaN, which stands for “not a number,” denotes a value that is undefined or unrepresentable. It is therefore useful to know how to identify exploding gradients so that training can be corrected.
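One practical way to spot the problem is to monitor the loss and the overall gradient norm during training. Here is a minimal sketch, again assuming a generic PyTorch training loop; the warning threshold of 1e3 is an arbitrary illustrative value:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()
x, target = torch.randn(4, 10), torch.randn(4, 1)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), target)

    # A NaN loss means the weights have already overflowed into NaN values.
    if torch.isnan(loss):
        print(f"step {step}: loss is NaN, training has become unstable")
        break

    loss.backward()

    # clip_grad_norm_ returns the total gradient norm; with an infinite
    # max_norm nothing is actually clipped, so this just measures it.
    grad_norm = torch.nn.utils.clip_grad_norm_(
        model.parameters(), max_norm=float("inf")
    ).item()
    if grad_norm > 1e3:
        print(f"step {step}: gradient norm {grad_norm:.2e}, possible exploding gradients")

    optimizer.step()
```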

Applications of the Exploding Gradient Problem
Machine Learning
Training Recurrent Neural Networks