Feature Scaling

202502012204
tags: #machine-learning #preprocessing #features

Feature scaling normalizes input features to similar ranges, preventing features with larger scales from dominating the learning process.

Why it matters:

- Gradient-based optimizers converge faster when features share a similar scale.
- Distance-based algorithms such as k-NN, SVM, and k-means weight features by magnitude, so an unscaled feature with a large range dominates the distance computation.

Common scaling methods:

- Min-max normalization: x' = (x - min) / (max - min), which rescales values into [0, 1].
- Standardization (z-score): x' = (x - mean) / std, which centers features to zero mean and unit variance.
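
A minimal NumPy sketch of both formulas; the sample values are made up for illustration:

```python
import numpy as np

# Toy feature column with a wide value range (illustrative only)
x = np.array([1.0, 5.0, 10.0, 50.0, 100.0])

# Min-max normalization: rescale into [0, 1]
x_minmax = (x - x.min()) / (x.max() - x.min())

# Standardization (z-score): zero mean, unit variance
x_zscore = (x - x.mean()) / x.std()

print(x_minmax)  # all values fall in [0, 1]
print(x_zscore)  # mean ~0, standard deviation ~1
```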

Fit scaling parameters on the training data only, then apply those same parameters to the test data to avoid Data Distribution Mismatch.
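
A hedged sketch of that train/test discipline using scikit-learn's StandardScaler (the X_train and X_test arrays here are hypothetical):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Hypothetical split; values and shapes are illustrative only
X_train = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])
X_test = np.array([[1.5, 250.0]])

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)  # learn mean/std from training data
X_test_scaled = scaler.transform(X_test)        # reuse those parameters; never refit on test
```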

Feature scaling is particularly important for algorithms using Regularization: penalty terms treat all coefficients uniformly, so unscaled features with large raw ranges end up with small coefficients and are effectively under-penalized.
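
One way to sketch this with scikit-learn: putting the scaler and an L2-penalized model into a pipeline applies the penalty to comparably scaled features (the choice of Ridge here is an assumption, not from the source):

```python
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Standardizing before Ridge keeps the L2 penalty from
# under-penalizing features that merely have large raw units.
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
# model.fit(X_train, y_train)  # assuming data as in the sketch above
```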


Reference

Machine Learning Yearning by Andrew Ng