What is batch normalization?
Normalization is a data pre-processing technique used to bring numerical data to a common scale without distorting its shape.
Generally, when we feed data into a machine learning or deep learning algorithm, we rescale the values to a common scale. We normalize partly to ensure that the model can generalize appropriately.
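As a quick illustration, here is a minimal sketch, assuming NumPy and a made-up feature matrix X, that applies z-score normalization so features measured on very different scales (say, age and salary) end up on a comparable scale:

```python
import numpy as np

# Hypothetical feature matrix: 4 samples, 2 features on very different scales.
X = np.array([[25.0,  50_000.0],
              [32.0,  64_000.0],
              [47.0, 120_000.0],
              [51.0,  98_000.0]])

# Z-score normalization: each feature column ends up with mean 0 and
# standard deviation 1, so both features contribute on a comparable scale.
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)
print(X_norm)
```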
Coming back to batch normalization: it is a technique that makes neural networks train faster and more stably by adding extra layers to a deep neural network. The new layer standardizes and normalizes the input it receives from the previous layer before passing it on.
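To show what "adding an extra layer" looks like in practice, here is a minimal sketch assuming PyTorch; the network architecture and layer sizes are purely illustrative:

```python
import torch
import torch.nn as nn

# A small fully connected network with batch normalization layers
# inserted after each linear layer (sizes are illustrative).
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.BatchNorm1d(64),   # normalizes the 64 activations over the current batch
    nn.ReLU(),
    nn.Linear(64, 32),
    nn.BatchNorm1d(32),
    nn.ReLU(),
    nn.Linear(32, 1),
)

# Forward pass on a batch of 16 samples; in training mode BatchNorm1d uses
# the batch statistics, and at inference time (model.eval()) it uses
# running averages collected during training.
x = torch.randn(16, 20)
out = model(x)
print(out.shape)  # torch.Size([16, 1])
```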
But why the term "batch" in batch normalization? A typical neural network is trained on a collection of input examples called a batch. Likewise, the normalization in batch normalization is computed over a batch of inputs, not over a single input.
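To make the role of the batch explicit, here is a minimal NumPy sketch (the function name batch_norm and the sample values are hypothetical): the mean and variance are computed across the whole batch for each feature, not per individual input.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a batch of activations using the batch's own statistics.

    x has shape (batch_size, num_features); gamma and beta are learnable
    scale and shift parameters, one per feature.
    """
    mean = x.mean(axis=0)   # mean over the batch, per feature
    var = x.var(axis=0)     # variance over the batch, per feature
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# A batch of 8 samples with 4 features each (illustrative values).
batch = np.random.randn(8, 4) * 3.0 + 10.0
out = batch_norm(batch, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0))  # approximately 0 per feature
print(out.std(axis=0))   # approximately 1 per feature
```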