Boosted Statistical Relational Learners: Introduction


Publisher: Springer International Publishing
Copyright: © The Author(s) 2014
ISBN: 978-3-319-13643-1
Pages: 1–3
DOI: 10.1007/978-3-319-13644-8_1

Abstract

The role of structure and relations within data has become increasingly important, as exemplified by Google, Facebook, and the World Wide Web. In many learning and mining tasks, information about one object can help a learner reach conclusions about other, related objects and in turn improve its overall performance. However, relations are difficult to represent using a fixed set of propositional features, i.e., vectors of fixed dimension, which is the standard approach in statistical machine learning and data mining. To overcome this, Statistical Relational Learning (SRL; Getoor and Taskar 2007) studies the combination of relational learning (e.g., inductive logic programming) and statistical machine learning. By combining the power of logic and probability, such approaches can perform robust and accurate reasoning and learning about complex relational data. Most of these methods use first-order logic to capture domain knowledge and soften the rules using probabilities or weights. These approaches can be broadly classified into directed models and undirected models. The advantage of these models is that they can succinctly represent probabilistic dependencies among the attributes of different related objects, leading to a compact representation of learned models.
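The idea of "softening" a first-order rule with a weight can be made concrete with a tiny sketch in the style of Markov logic, an undirected SRL formalism. The domain, predicate names, and weight below are all illustrative assumptions, not taken from the chapter: two people, a known friendship fact, and one weighted clause stating that friends of smokers tend to smoke.

```python
import itertools
import math

# Toy domain (assumed for illustration): two constants and one
# friendship fact; the truth values of the smokes/1 atoms are unknown.
people = ["anna", "bob"]
atoms = [f"smokes({p})" for p in people]
friends = {("anna", "bob")}

# Weighted clause: friends(X,Y) AND smokes(X) -> smokes(Y), weight w.
# A higher w makes worlds satisfying the rule more probable, but,
# unlike hard logic, violating worlds keep nonzero probability.
w = 1.5

def n_satisfied(world):
    """Count satisfied groundings of the weighted clause in a world."""
    count = 0
    for x, y in friends:
        # Material implication: not smokes(x) or smokes(y).
        count += int((not world[f"smokes({x})"]) or world[f"smokes({y})"])
    return count

# Log-linear distribution over possible worlds: P(world) is
# proportional to exp(w * number of satisfied groundings).
worlds = [dict(zip(atoms, vals))
          for vals in itertools.product([False, True], repeat=len(atoms))]
scores = {tuple(wd.items()): math.exp(w * n_satisfied(wd)) for wd in worlds}

# Conditional marginal: probability that bob smokes given anna smokes.
num = sum(s for wd, s in scores.items()
          if dict(wd)["smokes(bob)"] and dict(wd)["smokes(anna)"])
den = sum(s for wd, s in scores.items() if dict(wd)["smokes(anna)"])
print(round(num / den, 3))  # → 0.818
```

Enumerating all possible worlds only works for toy domains; the point is that the weight w interpolates between an unconstrained distribution (w = 0) and a hard logical constraint (w → ∞), which is exactly the sense in which SRL models soften first-order rules.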

Published: Mar 4, 2015

Keywords: Complex Relational Data; Capture Domain Knowledge; Inductive Logic Programming; Undirected Models; Statistical Relational Learning (SRL)
