1 Billion Instances, 1 Thousand Machines and 3.5 Hours
Training conditional maximum entropy models on massive data sets requires significant computational resources, but by distributing the computation, training time can be significantly reduced. Recent theoretical results have demonstrated that conditional maximum entropy models trained by mixing the weights of independently trained models converge at the same rate as traditional distributed schemes, yet train significantly faster. This efficiency comes primarily from reducing network communication costs, a cost that is not usually accounted for but is in fact crucial.
Date Found: October 13, 2010
Date Produced: January 19, 2010