
MACHINE LEARNING / DEEP LEARNING – XGBoost

XGBoost is short for eXtreme Gradient Boosting. In contrast to plain gradient boosting, XGBoost uses regularization parameters to help prevent overfitting.

To understand XGBoost, we first need to understand gradient boosting.

Gradient Boosting

Gradient boosting is a popular boosting approach in which each predictor corrects the errors of its predecessor. Unlike AdaBoost, which adjusts the weights of the training instances, gradient boosting trains each predictor using the residual errors of its predecessor as labels.

The Gradient Boosted Trees approach uses CART (Classification and Regression Trees) as its base learner.

Below is an example of how gradient boosted trees are trained for a regression model:

Assume the ensemble consists of n trees. Tree 1 is trained on the feature matrix X and the labels y. Its predictions, denoted ŷ1, are used to compute the tree's residual errors, r1 = y − ŷ1. Tree 2 is then trained on the same matrix X with the residual errors r1 of the previous tree as labels; its predictions are used to compute the residuals r2. This process is repeated until all n trees are trained, as shown in the sketch below.
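To make the walkthrough concrete, here is a minimal from-scratch sketch of this residual-fitting loop (assuming scikit-learn's DecisionTreeRegressor as the CART base learner and an illustrative learning rate of 0.1; these specifics are mine, not the article's):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gbt(X, y, n_trees=100, learning_rate=0.1, max_depth=3):
    """Train n_trees CART regressors, each on the residuals of the ensemble so far."""
    base = y.mean()                          # start from a constant prediction
    prediction = np.full(len(y), base)
    trees = []
    for _ in range(n_trees):
        residuals = y - prediction           # r_i: what the ensemble still gets wrong
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)               # each tree is trained on residuals, not on y
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return base, trees

def predict_gbt(base, trees, X, learning_rate=0.1):
    """Sum the shrunken contributions of all trees on top of the base value."""
    prediction = np.full(X.shape[0], base)
    for tree in trees:
        prediction += learning_rate * tree.predict(X)
    return prediction
```

Each tree only has to model what the ensemble so far gets wrong, which is exactly the residual-as-label idea described above; the learning rate shrinks each tree's contribution so that later trees can keep correcting.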

Now that we have a basic idea of what gradient boosting is, let us see what eXtreme Gradient Boost (XGBoost) is.

eXtreme Gradient Boost (XGBoost)

XGBoost is a gradient boosting library that pushes the limits of processing power for boosted tree algorithms. It is scalable and highly accurate, and it was developed primarily to improve the efficiency and performance of machine learning models. Whereas a plain GBDT (Gradient Boosted Decision Tree) implementation evaluates candidate splits sequentially, XGBoost parallelizes the construction of each individual tree: using a level-wise strategy, it scans gradient statistics across the training set and uses these partial sums to evaluate the quality of splits at each possible split point. (The trees themselves are still added one after another, since each must correct its predecessors.)
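As a quick illustration, here is a minimal sketch using the xgboost Python package (the dataset choice and hyperparameter values here are mine, not the article's; reg_lambda and reg_alpha are the regularization parameters mentioned at the start of this post):

```python
import xgboost as xgb
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split

X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = xgb.XGBRegressor(
    n_estimators=200,    # number of boosted trees
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    max_depth=4,         # depth of each CART base learner
    reg_lambda=1.0,      # L2 regularization on leaf weights
    reg_alpha=0.0,       # L1 regularization on leaf weights
    n_jobs=-1,           # parallel split finding across all CPU cores
)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```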

XGBoost gained prominence over the past few years, when it appeared in almost every notebook and competition hosted on Kaggle. Initially, implementations of XGBoost were built for both Python and R.

The main advantages and disadvantages of XGBoost are as follows:

Advantages:

1. No need to scale or normalize the data. XGBoost can also handle missing (null) values (see the sketch after these lists).

2. Has good execution speed.

3. Strong overall model performance, which is one reason it is among the most used algorithms in Kaggle competitions.

4. Handles large datasets with ease.

Disadvantages:

1. The resulting models are difficult to visualize and interpret.

2. If parameters are not appropriately set, overfitting is possible.

3. More challenging to tune due to the abundance of hyperparameters.
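As a small demonstration of the first advantage, here is a sketch (on synthetic data of my own, not from the article) showing that XGBoost accepts unscaled features and missing values directly, with no imputation step:

```python
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
# Three features on wildly different scales -- no normalization applied.
X = rng.normal(loc=[0.0, 100.0, 10000.0], scale=[1.0, 50.0, 3000.0], size=(500, 3))
y = X[:, 0] + 0.01 * X[:, 1] + rng.normal(size=500)

X[rng.random(X.shape) < 0.1] = np.nan   # knock out ~10% of the entries

model = xgb.XGBRegressor(n_estimators=50)
model.fit(X, y)                         # trains despite NaNs and mixed scales
print(model.predict(X[:5]))
```

At each split, XGBoost learns a default direction for missing values, which is why NaNs can be passed straight through.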

Applications of XGBoost:                  

Since XGBoost yields fast results, it is highly practical for projects that involve large amounts of data and require real-time output. For example, a real-time accident detection program is a natural candidate for XGBoost as its primary ML technique. A research paper published by the Department of Civil and Materials Engineering at the University of Illinois at Chicago shows that XGBoost is an optimal option for this task. The authors collected data from the Chicago metropolitan expressway between December 2016 and December 2017, covering 244 traffic accidents and 6,073 non-accident cases. The results show that XGBoost can detect accidents with:

Accuracy: 99%

Detection rate: 79%

False alarm rate: 0.16%
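For intuition about these three metrics, here is an illustrative sketch (synthetic, imbalanced data standing in for the paper's dataset, which is not reproduced here) of how such a detector and its reported metrics could be computed:

```python
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

# Synthetic stand-in: ~4% positive (accident) cases, mirroring heavy class imbalance.
X, y = make_classification(n_samples=6000, n_features=20, weights=[0.96], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = xgb.XGBClassifier(
    n_estimators=100,
    scale_pos_weight=(y_tr == 0).sum() / (y_tr == 1).sum(),  # reweight the rare class
)
clf.fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("Accuracy:        ", (tp + tn) / (tp + tn + fp + fn))
print("Detection rate:  ", tp / (tp + fn))   # recall on the accident class
print("False alarm rate:", fp / (fp + tn))   # share of non-accidents flagged
```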

If you want to read more about this model and the research paper, please find the original link below:

https://www.sciencedirect.com/science/article/abs/pii/S0001457519311790

Mathematics behind XGBoost

https://www.geeksforgeeks.org/xgboost/
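As a quick reference, here is a sketch of the core objective in the standard XGBoost formulation (notation follows the original XGBoost paper; see the link above for the full derivation):

```latex
% Regularized objective: training loss plus a per-tree complexity penalty.
\mathcal{L} = \sum_{i=1}^{n} l\!\left(y_i, \hat{y}_i\right) + \sum_{k=1}^{K} \Omega(f_k),
\qquad
\Omega(f) = \gamma T + \frac{1}{2}\,\lambda \sum_{j=1}^{T} w_j^2

% With g_i and h_i the first and second derivatives of the loss, the optimal
% weight of leaf j and the gain of a candidate split are:
w_j^{*} = -\frac{G_j}{H_j + \lambda},
\qquad
\text{Gain} = \frac{1}{2}\left[
  \frac{G_L^2}{H_L + \lambda} + \frac{G_R^2}{H_R + \lambda}
  - \frac{(G_L + G_R)^2}{H_L + H_R + \lambda}
\right] - \gamma
```

Here T is the number of leaves, w_j are the leaf weights, and G and H are sums of the gradients g_i and Hessians h_i over the instances in a leaf; γ and λ are exactly the regularization parameters mentioned at the start of this post.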

Future Applications of XGBoost

Machine learning is a highly active research field, and there are already several effective alternatives to XGBoost. Microsoft Research's recently released LightGBM gradient boosting framework shows a lot of potential, and Yandex's CatBoost has been producing remarkable benchmarking results. It is only a matter of time before a new framework emerges that beats XGBoost in prediction performance, flexibility, explainability, and pragmatism. Until such a formidable rival appears, however, XGBoost will continue to rule the machine learning world.
