10 Emerging Technologies That Will Change Your World
Bayesian Machine Learning
When a computer scientist publishes genetics papers, you might think it would raise colleagues’ eyebrows. But Daphne Koller’s research using a once obscure branch of probability theory called Bayesian statistics is generating more excitement than skepticism. The Stanford University associate professor is creating programs that, while tackling questions such as how genes function, are also illuminating deeper truths about the long-standing computer science conundrum of uncertainty: learning patterns, finding causal relationships, and making predictions based on inevitably incomplete knowledge of the real world. Such methods promise to advance the fields of foreign-language translation, microchip manufacturing, and drug discovery, among others, sparking a surge of interest from Intel, Microsoft, Google, and other leading companies and universities.
How does an idea conceived by an 18th-century minister (Thomas Bayes) help modern computer science? Unlike older approaches to machine reasoning, in which each causal connection (“rain makes grass wet”) had to be explicitly taught, programs based on probabilistic approaches like Bayesian math can take a large body of data (“it’s raining,” “the grass is wet”) and deduce likely relationships, or “dependencies,” on their own. That’s crucial because many decisions programmers would like to automate (say, personalizing search engine results according to a user’s past queries) can’t be planned in advance; they require machines to weigh unforeseen combinations of evidence and make their best guesses. Says Intel research director David Tennenhouse, “These techniques are going to impact everything we do with computers, from user interfaces to sensor data processing to data mining.”
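The rain-and-wet-grass reasoning above can be sketched with Bayes’ rule itself: given how likely wet grass is with and without rain, a program can invert the relationship and estimate how likely rain is once it observes wet grass. The probabilities below are made-up illustrative values, not figures from the article.

```python
# Minimal sketch of Bayesian inference on the rain / wet-grass example.
# All probability values are assumptions chosen for illustration.

def bayes_posterior(prior, likelihood, likelihood_given_not):
    """P(cause | evidence) via Bayes' rule.

    prior:               P(cause), e.g. P(rain)
    likelihood:          P(evidence | cause), e.g. P(wet | rain)
    likelihood_given_not P(evidence | no cause), e.g. P(wet | no rain)
    """
    # Total probability of seeing the evidence at all.
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Assumed numbers: it rains on 20% of days; the grass is wet 90% of
# the time when it rains and 10% of the time otherwise (dew, sprinklers).
p_rain = 0.2
p_wet_given_rain = 0.9
p_wet_given_no_rain = 0.1

p_rain_given_wet = bayes_posterior(p_rain, p_wet_given_rain, p_wet_given_no_rain)
print(round(p_rain_given_wet, 3))  # prints 0.692
```

Observing wet grass raises the estimated chance of rain from the 20% prior to about 69%; a Bayesian network generalizes this update across many interdependent variables at once.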