Lecture Note
Stanford University, CS229 | Machine Learning
Academic year 2023
Feature Scaling and Machine Learning: Its Importance

The range and distribution of input features can have a significant impact on machine learning algorithms. For this reason, it is crucial to scale the features that go into a machine learning model correctly. Feature scaling ensures that each feature contributes roughly the same amount to the outcome and prevents features with greater ranges from dominating the model.

What does Feature Scaling mean?

Feature scaling is the process of transforming the input features so that they have comparable value ranges. It is a crucial step in the data pre-processing stage of machine learning.

Feature Scaling Techniques

Features can be scaled in a number of ways, such as dividing by the maximum, mean normalization, and Z-score normalization.

Dividing by the maximum

To scale a feature by dividing by the maximum, simply divide each original feature value by the largest value in the range. For instance, if feature x1 has a range of 300 to 2,000, divide each original value of x1 by the maximum value in the range, which in this case is 2,000. The resulting scale for x1 runs from 0.15 to 1.

Mean Normalization

Mean normalization rescales the features so that they are centered around zero. First, find the average (mean) of the feature on your training set. Then subtract the mean from each feature value and divide the result by the difference between the maximum and minimum values. For instance, if the average of feature x1 is 600, subtract 600 from each value of x1 and divide by the difference between 2,000 and 300, where 2,000 is the maximum and 300 is the minimum. The resulting mean-normalized x1 ranges from about -0.18 to 0.82.
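The two techniques above can be sketched in a few lines of Python. The data below is hypothetical, chosen so that the numbers from the text hold (minimum 300, maximum 2,000, mean 600):

```python
# Hypothetical training values for feature x1; the text's figures apply:
# min = 300, max = 2,000, mean = 600.
x1 = [300, 300, 300, 300, 300, 400, 500, 700, 900, 2000]

x1_max, x1_min = max(x1), min(x1)
mean = sum(x1) / len(x1)  # 600.0

# Dividing by the maximum: each value ends up between min/max and 1.
scaled = [v / x1_max for v in x1]  # runs from 0.15 to 1.0

# Mean normalization: subtract the mean, divide by the range (max - min).
mean_norm = [(v - mean) / (x1_max - x1_min) for v in x1]  # about -0.18 to 0.82
```

The smallest raw value, 300, becomes 300/2,000 = 0.15 under max scaling and (300 - 600)/1,700 ≈ -0.18 under mean normalization, matching the ranges stated above.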
Z-Score Normalization

Z-score normalization requires computing each feature's standard deviation. To Z-score normalize a feature, subtract the mean from each feature value and divide by the standard deviation. For instance, if feature x1 has a mean of 600 and a standard deviation of 450, subtract 600 from each value of x1 and divide by 450 to obtain the Z-score-normalized x1.
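A minimal sketch of Z-score normalization, taking the mean of 600 and standard deviation of 450 from the text as given; the raw values are hypothetical:

```python
# Mean and standard deviation for feature x1, as given in the text.
mu, sigma = 600, 450

# Hypothetical raw values for x1.
x1 = [150, 600, 825, 1500]

# Z-score: subtract the mean, divide by the standard deviation.
z = [(v - mu) / sigma for v in x1]
# e.g. a raw value of 1500 maps to (1500 - 600) / 450 = 2.0
```

After this transformation, values equal to the mean map to 0, and each unit of z corresponds to one standard deviation away from the mean.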