Gradient Boosted Decision Trees on Hadoop
Stochastic Gradient Boosted Decision Trees (GBDT) is one of the most widely used learning algorithms in machine learning today. It is adaptable, easy to interpret, and produces highly accurate models. However, most implementations today are computationally expensive and require all training data to be held in main memory. As training data grows ever larger, there is strong motivation to parallelize the GBDT algorithm. Parallelizing decision tree training is intuitive, and various approaches have been explored in the existing literature. Stochastic boosting, on the other hand, is inherently a sequential process and has not previously been applied to distributed decision trees. In this paper, we describe a distributed implementation of GBDT that utilizes MPI on the Hadoop grid environment, as presented by us at CIKM 2009.
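To make the sequential nature of stochastic boosting mentioned above concrete, here is a minimal single-machine sketch of GBDT for squared loss. This is purely illustrative and is not the distributed MPI/Hadoop implementation described in the talk; the function names, parameters (n_trees, learning_rate, subsample), and the use of scikit-learn's DecisionTreeRegressor as the base learner are all assumptions made for the example.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gbdt(X, y, n_trees=100, learning_rate=0.1, subsample=0.5, max_depth=3):
    """Illustrative stochastic gradient boosting for squared loss (not the paper's code)."""
    trees = []
    base = y.mean()
    pred = np.full(len(y), base)          # start from the constant mean prediction
    rng = np.random.default_rng(0)
    for _ in range(n_trees):
        residual = y - pred               # negative gradient of squared loss
        # "stochastic" part: each tree is fit on a random subsample of the data
        idx = rng.choice(len(y), size=int(subsample * len(y)), replace=False)
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X[idx], residual[idx])
        pred += learning_rate * tree.predict(X)   # sequential update: each tree depends on the previous ones
        trees.append(tree)
    return base, trees

def predict_gbdt(base, trees, X, learning_rate=0.1):
    """Sum the base prediction and the shrunken contributions of all trees."""
    return base + learning_rate * sum(t.predict(X) for t in trees)

Because each tree is fit to residuals of the ensemble built so far, the boosting loop cannot be trivially run in parallel; this is the sequential dependency the paper addresses when distributing GBDT across a Hadoop grid.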
Channel: VideoLectures
Category: Educational
Video Length: 0
Date Found: January 14, 2011
Date Produced: January 13, 2011
View Count: 0
 