Top 4 ways for Feature Scaling in Machine Learning

Feature Scaling in Machine Learning

Feature Scaling is something that really affects a Machine Learning model in many ways. I agree there are situations where Feature Scaling is optional or not required. Still, there are many Machine Learning algorithms where Feature Scaling is a must-have step. For instance, Regression, K-Means Clustering and PCA are algorithms where Feature Scaling is a must-have technique. On the other side, tree-based algorithms such as Decision Trees usually do not need Feature Scaling. Today in this tutorial we will explore the Top 4 ways for Feature Scaling in Machine Learning.

Feature Scaling in Machine Learning –

There are many ways to scale a feature or column value. Which scaler performs best is completely scenario oriented. Let's start exploring them one by one –

  1. Standardization –

This is one of the most used types of scaler in data preprocessing. It is also known as the z-score. It redistributes the data in such a way that mean = 0 and standard deviation = 1. Here is the formula for the calculation –

z-score = [current_value – mean of data(feature)]/standard_deviation

For the implementation, you may use sklearn.preprocessing.StandardScaler.

Please refer to the complete documentation on Standard Scaler here.

Another use case of standardization is removing outliers from the data set. Once you transform your data set using the standard scaler, any value that falls outside [-3, 3] can be considered an outlier in that feature.
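Here is a minimal sketch of standardization with sklearn.preprocessing.StandardScaler. The small toy array is only for illustration –

# A minimal sketch of standardization with scikit-learn's StandardScaler.
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[10.0, 200.0],
              [20.0, 400.0],
              [30.0, 600.0]])

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)   # each column now has mean ~0 and std ~1

print(X_scaled)
print(X_scaled.mean(axis=0), X_scaled.std(axis=0))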

2. Mean Normalization –

Let's understand the formula first –

normalization-score = [current_value – mean of data(feature)]/[max(feature)-min(feature)]

The range of the normalized data is [-1, 1] with mean = 0. We need this feature scaling technique when we want zero-centric data.

If you are interested in reading more on this topic, especially the implementation, here is the scikit-learn implementation of Normalization.
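Here is a minimal sketch of mean normalization. Since (as far as we know) scikit-learn does not ship a dedicated mean-normalization scaler, this sketch applies the formula above directly with NumPy on a toy array –

# Mean normalization applied column-wise with NumPy: (value - mean) / (max - min).
import numpy as np

X = np.array([[10.0, 200.0],
              [20.0, 400.0],
              [30.0, 600.0]])

X_norm = (X - X.mean(axis=0)) / (X.max(axis=0) - X.min(axis=0))

print(X_norm)               # values fall in the [-1, 1] range
print(X_norm.mean(axis=0))  # column means are ~0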

3. Min-Max Scaler Technique –

When you need to transform the feature magnitudes into the [0, 1] range, this Min-Max feature scaling technique is one of the best options. Here is the formula –

min-max-score = [current_value – min(feature)]/[max(feature)-min(feature)]

The official documentation of the Min-Max Scaler implementation in scikit-learn is here.
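Here is a minimal sketch with sklearn.preprocessing.MinMaxScaler on a toy array –

# A minimal sketch of Min-Max scaling with scikit-learn's MinMaxScaler.
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X = np.array([[10.0, 200.0],
              [20.0, 400.0],
              [30.0, 600.0]])

scaler = MinMaxScaler()          # default feature_range is (0, 1)
X_scaled = scaler.fit_transform(X)

print(X_scaled)                  # every column is now inside [0, 1]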

4. Unit Vector –

This feature scaling technique is very useful when we need to transform the feature values into unit form (vectors of length 1).
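Here is a minimal sketch with sklearn.preprocessing.Normalizer, which rescales each sample (row) to unit length. The toy array is only for illustration –

# A minimal sketch of unit-vector scaling with scikit-learn's Normalizer.
# Note that Normalizer works row-wise: each sample is scaled to unit norm.
import numpy as np
from sklearn.preprocessing import Normalizer

X = np.array([[3.0, 4.0],
              [1.0, 2.0]])

scaler = Normalizer(norm='l2')   # Euclidean (L2) norm
X_unit = scaler.fit_transform(X)

print(X_unit)
print(np.linalg.norm(X_unit, axis=1))  # each row now has length 1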

For more information on Feature Scaling techniques, especially the implementation side, please have a look at the scikit-learn official documentation on preprocessing.

Conclusion –

Feature Scaling and related facts usually create confusion for data scientists during model development. This article was an effort to resolve that confusion. As I have already mentioned, Feature Scaling is completely use-case oriented. At the very beginning we explained where feature scaling is optional and where it is required. We are also planning to create a detailed article on this point – when to apply Feature Scaling.

Anyway, how did you find this article – Top 4 ways for Feature Scaling in Machine Learning? If you face any difficulty while understanding it, please let us know. If you think some more information on feature scaling should be added that is currently not here, you may write it in the form of a guest post.

Thanks 

Data Science Learner Team

 

 


 