
Which method is used for encoding categorical variables? (1 point)
a) one hot encoder
b) category encoder
c) label encoder
d) both a & c

    Mohammed

    Guys, does anyone know the answer?


    What is Categorical Data

Dealing with categorical data is key to creating a successful model. Learn what categorical data is and the various categorical data encoding methods.

    Overview

Understand what Categorical Data Encoding is

    Learn different encoding techniques and when to use them

     

    Introduction

    The performance of a machine learning model not only depends on the model and the hyperparameters but also on how we process and feed different types of variables to the model. Since most machine learning models only accept numerical variables, preprocessing the categorical variables becomes a necessary step. We need to convert these categorical variables to numbers such that the model is able to understand and extract valuable information.

A typical data scientist spends 70-80% of their time cleaning and preparing data, and converting categorical data is an unavoidable part of that work. Done well, it not only improves model quality but also helps with feature engineering. Now the question is, how do we proceed? Which categorical data encoding method should we use?

    In this article, I will be explaining various types of categorical data encoding methods with implementation in Python.

In case you want to learn data science concepts in video format, check out our course, Introduction to Data Science.

     

Table of Contents

    What is Categorical Data?

    Label Encoding or Ordinal Encoding

One Hot Encoding

Dummy Encoding

Effect Encoding

Binary Encoding

BaseN Encoding

Hash Encoding

Target Encoding

    What is categorical data?

    Since we are going to be working on categorical variables in this article, here is a quick refresher on the same with a couple of examples. Categorical variables are usually represented as ‘strings’ or ‘categories’ and are finite in number. Here are a few examples:

    The city where a person lives: Delhi, Mumbai, Ahmedabad, Bangalore, etc.

    The department a person works in: Finance, Human resources, IT, Production.

    The highest degree a person has: High school, Diploma, Bachelors, Masters, PhD.

    The grades of a student:  A+, A, B+, B, B- etc.

    In the above examples, the variables only have definite possible values. Further, we can see there are two kinds of categorical data-

Ordinal Data: The categories have an inherent order

Nominal Data: The categories do not have an inherent order

In Ordinal data, the encoding should retain the information about the order of the categories. In the example above, the highest degree a person possesses gives vital information about their qualification, and the degree is an important feature in deciding whether the person is suitable for a post.

While encoding Nominal data, we only have to consider the presence or absence of a feature; there is no notion of order. Take, for example, the city a person lives in: it is important for the data to retain where the person lives, but there is no order or sequence among the cities. Whether a person lives in Delhi or Bangalore carries no ranking.

For encoding categorical data, we can use the Python package category_encoders. The following command installs it easily:

    pip install category_encoders

     

    Label Encoding or Ordinal Encoding

    We use this categorical data encoding technique when the categorical feature is ordinal. In this case, retaining the order is important. Hence encoding should reflect the sequence.

    In Label encoding, each label is converted into an integer value. We will create a variable that contains the categories representing the education qualification of a person.

    Python Code:

# Fit and transform train data

    df_train_transformed = encoder.fit_transform(train_df)
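The snippet above refers to an encoder and a train_df that are not defined in the excerpt, so here is a minimal self-contained sketch using category_encoders' OrdinalEncoder; the column name Degree and the sample values are assumptions made for illustration.

import pandas as pd
import category_encoders as ce

# Illustrative training data: education qualification of a person
train_df = pd.DataFrame({
    'Degree': ['High school', 'Masters', 'Diploma', 'Bachelors', 'Bachelors',
               'Masters', 'PhD', 'High school', 'High school']
})

# Ordinal encoder for the 'Degree' column
encoder = ce.OrdinalEncoder(cols=['Degree'])

# Fit and transform train data
df_train_transformed = encoder.fit_transform(train_df)
print(df_train_transformed)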

     

    One Hot Encoding

We use this categorical data encoding technique when the features are nominal (do not have any order). In one hot encoding, for each level of a categorical feature, we create a new variable. Each category is mapped to a binary variable containing either 0 or 1. Here, 0 represents the absence, and 1 represents the presence of that category.

    These newly created binary features are known as Dummy variables. The number of dummy variables depends on the levels present in the categorical variable. This might sound complicated. Let us take an example to understand this better. Suppose we have a dataset with a category animal, having different animals like Dog, Cat, Sheep, Cow, Lion. Now we have to one-hot encode this data.

After encoding, we have dummy variables, each representing a category in the feature Animal. For each category that is present, we have 1 in the column of that category and 0 in the others. Let's see how to implement one-hot encoding in Python.

     

    import category_encoders as ce

    import pandas as pd
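The snippet above stops after the imports, so here is a minimal self-contained sketch of one-hot encoding with category_encoders, using the Animal example above; the column name and the use_cat_names option are assumptions made for illustration.

import category_encoders as ce
import pandas as pd

# Illustrative data using the Animal example above
train_df = pd.DataFrame({'Animal': ['Dog', 'Cat', 'Sheep', 'Cow', 'Lion', 'Dog', 'Cat']})

# One-hot encoder for the 'Animal' column; use_cat_names keeps readable column names
encoder = ce.OneHotEncoder(cols=['Animal'], use_cat_names=True)

# Each animal gets its own 0/1 dummy column
df_encoded = encoder.fit_transform(train_df)
print(df_encoded)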

Source: www.analyticsvidhya.com


    In Machine Learning, convert categorical data into numerical data using Label-Encoder and One-Hot-Encoder


    Categorical encoding using Label-Encoding and One-Hot-Encoder

In many Machine-learning or Data Science activities, the data set might contain text or categorical values (basically non-numerical values). For example, a color feature might have values like red, orange, blue, and white, and a meal plan might have values like breakfast, lunch, snacks, dinner, and tea. A few algorithms, such as CatBoost and decision trees, can handle categorical values very well, but most algorithms expect numerical values to achieve state-of-the-art results.

As you progress in AI and Machine Learning, one thing you will notice is that most algorithms work better with numerical inputs. Therefore, the main challenge an analyst faces is converting text/categorical data into numerical data while still allowing the algorithm/model to make sense of it. Neural networks, which form the basis of deep learning, also expect input values to be numerical.

There are many ways to convert categorical values into numerical values, and each approach has its own trade-offs and impact on the feature set. Here, I will focus on two main methods: One-Hot-Encoder and Label-Encoder. Both of these encoders are part of the SciKit-learn library (one of the most widely used Python libraries) and are used to convert text or categorical data into the numerical data that the model expects and performs better with.

The code snippets in this article are in Python, since I am more comfortable with Python. If you need them in R (another widely used Machine-Learning language), say so in the comments.

    Label Encoding

This approach is very simple: it involves converting each value in a column to a number. Consider a dataset of bridges with a column named bridge-type containing the values below. Though there will be many more columns in the dataset, to understand label-encoding we will focus on this one categorical column only.

BRIDGE-TYPE
Arch
Beam
Truss
Cantilever
Tied Arch
Suspension
Cable

We choose to encode the text values by assigning a running sequence number to each text value, like below:

With this, we have completed the label-encoding of the variable bridge-type. That's all label encoding is about. But depending on the data values and the type of data, label encoding induces a new problem, since it uses number sequencing. The problem with using numbers is that they introduce a relation/comparison between the categories. There is actually no relation between the various bridge types, but looking at the numbers, one might think that the 'Cable' bridge type has higher precedence than the 'Arch' bridge type. The algorithm might misunderstand the data as having some kind of hierarchy/order 0 < 1 < 2 … < 6 and might give 'Cable' six times more weight in its calculations than the 'Arch' bridge type.

Let's consider another column named 'Safety Level'. Performing label encoding on this column also induces an order/precedence in the numbers, but this time in the right way. Here the numerical order is not out of place, and it makes sense for the algorithm to interpret the safety order as 0 < 1 < 2 < 3 < 4, i.e. none < low < medium < high < very high.
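As a rough sketch of what such an order-preserving encoding could look like (the Safety_Level column name and its values are assumptions for illustration), an ordered pandas categorical keeps the intended ranking:

import pandas as pd

# Illustrative 'Safety Level' column with a meaningful order
safety_df = pd.DataFrame({'Safety_Level': ['low', 'high', 'none', 'medium', 'very high']})

# Declare the categories in their intended order so the codes respect it
safety_order = ['none', 'low', 'medium', 'high', 'very high']
safety_df['Safety_Level'] = pd.Categorical(
    safety_df['Safety_Level'], categories=safety_order, ordered=True)

# .cat.codes now yields 0 < 1 < 2 < 3 < 4, i.e. none < low < ... < very high
safety_df['Safety_Level_Cat'] = safety_df['Safety_Level'].cat.codes
print(safety_df)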

    Label Encoding in Python

Using the category codes approach:

This approach requires the category column to be of the 'category' datatype. By default, a non-numerical column is of 'object' type, so you might have to change the type to 'category' before using this approach.

    # import required libraries

import pandas as pd
import numpy as np

    # creating initial dataframe

    bridge_types = ('Arch','Beam','Truss','Cantilever','Tied Arch','Suspension','Cable')

    bridge_df = pd.DataFrame(bridge_types, columns=['Bridge_Types'])

    # converting type of columns to 'category'

    bridge_df['Bridge_Types'] = bridge_df['Bridge_Types'].astype('category')

    # Assigning numerical values and storing in another column

    bridge_df['Bridge_Types_Cat'] = bridge_df['Bridge_Types'].cat.codes

    bridge_df

Using the scikit-learn library approach:

Another common way many data analysts perform label-encoding is by using the SciKit-learn library.

import pandas as pd
import numpy as np

    from sklearn.preprocessing import LabelEncoder

    # creating initial dataframe

    bridge_types = ('Arch','Beam','Truss','Cantilever','Tied Arch','Suspension','Cable')

    bridge_df = pd.DataFrame(bridge_types, columns=['Bridge_Types'])

    # creating instance of labelencoder

    labelencoder = LabelEncoder()

    # Assigning numerical values and storing in another column

    bridge_df['Bridge_Types_Cat'] = labelencoder.fit_transform(bridge_df['Bridge_Types'])

    bridge_df

bridge_df with the categorical column and the label-encoded column values

Source: towardsdatascience.com


    Ordinal and One-Hot Encodings for Categorical Data

    by Jason Brownlee on June 12, 2020 in Data Preparation

    Last Updated on August 17, 2020

    Machine learning models require all input and output variables to be numeric.

    This means that if your data contains categorical data, you must encode it to numbers before you can fit and evaluate a model.

    The two most popular techniques are an Ordinal Encoding and a One-Hot Encoding.

    In this tutorial, you will discover how to use encoding schemes for categorical machine learning data.

    After completing this tutorial, you will know:

    Encoding is a required pre-processing step when working with categorical data for machine learning algorithms.

    How to use ordinal encoding for categorical variables that have a natural rank ordering.

    How to use one-hot encoding for categorical variables that do not have a natural rank ordering.

    Kick-start your project with my new book Data Preparation for Machine Learning, including step-by-step tutorials and the Python source code files for all examples.

    Let’s get started.


    Tutorial Overview

    This tutorial is divided into six parts; they are:

    Nominal and Ordinal Variables

    Encoding Categorical Data

Ordinal Encoding

One-Hot Encoding

    Dummy Variable Encoding

    Breast Cancer Dataset

    OrdinalEncoder Transform

    OneHotEncoder Transform

    Common Questions

    Nominal and Ordinal Variables

    Numerical data, as its name suggests, involves features that are only composed of numbers, such as integers or floating-point values.

    Categorical data are variables that contain label values rather than numeric values.

    The number of possible values is often limited to a fixed set.

    Categorical variables are often called nominal.

    Some examples include:

    A “pet” variable with the values: “dog” and “cat“.

    A “color” variable with the values: “red“, “green“, and “blue“.

    A “place” variable with the values: “first“, “second“, and “third“.

    Each value represents a different category.

    Some categories may have a natural relationship to each other, such as a natural ordering.

    The “place” variable above does have a natural ordering of values. This type of categorical variable is called an ordinal variable because the values can be ordered or ranked.

    A numerical variable can be converted to an ordinal variable by dividing the range of the numerical variable into bins and assigning values to each bin. For example, a numerical variable between 1 and 10 can be divided into an ordinal variable with 5 labels with an ordinal relationship: 1-2, 3-4, 5-6, 7-8, 9-10. This is called discretization.
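For instance, a quick sketch of this kind of discretization with pandas (the bin edges and labels below simply mirror the 1-2, 3-4, ..., 9-10 example above):

import pandas as pd

# Numerical values between 1 and 10
values = pd.Series([1, 4, 7, 10, 3, 8])

# Divide the range into 5 ordered bins
bins = [0, 2, 4, 6, 8, 10]
labels = ['1-2', '3-4', '5-6', '7-8', '9-10']
ordinal = pd.cut(values, bins=bins, labels=labels)
print(ordinal)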

Nominal Variable (Categorical). Variable comprises a finite set of discrete values with no relationship between values.

Ordinal Variable. Variable comprises a finite set of discrete values with a ranked ordering between values.

    Some algorithms can work with categorical data directly.

    For example, a decision tree can be learned directly from categorical data with no data transform required (this depends on the specific implementation).

    Many machine learning algorithms cannot operate on label data directly. They require all input variables and output variables to be numeric.

    In general, this is mostly a constraint of the efficient implementation of machine learning algorithms rather than hard limitations on the algorithms themselves.

    Some implementations of machine learning algorithms require all data to be numerical. For example, scikit-learn has this requirement.

    This means that categorical data must be converted to a numerical form. If the categorical variable is an output variable, you may also want to convert predictions by the model back into a categorical form in order to present them or use them in some application.
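As a quick sketch of that round trip (the class labels here are illustrative), scikit-learn's LabelEncoder can map a categorical target to integers and then map integer predictions back to the original categories:

from sklearn.preprocessing import LabelEncoder

# Illustrative categorical target
y = ['cat', 'dog', 'dog', 'cat', 'bird']

le = LabelEncoder()
y_encoded = le.fit_transform(y)  # classes are sorted: bird=0, cat=1, dog=2

# Suppose the model predicts integer class labels
predictions = [0, 2, 1]
print(le.inverse_transform(predictions))  # ['bird' 'dog' 'cat']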


    Encoding Categorical Data

    There are three common approaches for converting ordinal and categorical variables to numerical values. They are:

Ordinal Encoding

One-Hot Encoding

    Dummy Variable Encoding

    Let’s take a closer look at each in turn.

    Ordinal Encoding

    In ordinal encoding, each unique category value is assigned an integer value.

    For example, “red” is 1, “green” is 2, and “blue” is 3.

    This is called an ordinal encoding or an integer encoding and is easily reversible. Often, integer values starting at zero are used.

    For some variables, an ordinal encoding may be enough. The integer values have a natural ordered relationship between each other and machine learning algorithms may be able to understand and harness this relationship.

    It is a natural encoding for ordinal variables. For categorical variables, it imposes an ordinal relationship where no such relationship may exist. This can cause problems and a one-hot encoding may be used instead.
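A minimal sketch of an ordinal encoding with scikit-learn, using the color example above (passing an explicit category order is an assumption for illustration; by default the encoder orders categories alphabetically and starts at zero):

from numpy import asarray
from sklearn.preprocessing import OrdinalEncoder

# Color data from the example above, as a 2D array with one column
data = asarray([['red'], ['green'], ['blue'], ['green']])

# Specify the category order so red -> 0, green -> 1, blue -> 2
encoder = OrdinalEncoder(categories=[['red', 'green', 'blue']])
result = encoder.fit_transform(data)
print(result)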

Source: machinelearningmastery.com
