A Concise Introduction to Models and Methods for Automated Planning: MDP Planning: Stochastic Actions and Full Feedback
Geffner, Hector; Bonet, Blai
Published: January 1, 2013
Markov Decision Processes (MDPs) generalize the model underlying classical planning by allowing actions with stochastic effects and fully observable states. In this chapter, we look at a variety of MDP models and the basic algorithms for solving them: from offline methods based on dynamic programming and heuristic search, to online methods where the action to do next is obtained by solving simplifications, like finite-horizon versions of the problem or deterministic relaxations.
http://www.deepdyve.com/lp/springer-journals/a-concise-introduction-to-models-and-methods-for-automated-planning-xalEQo9iG2
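The abstract refers to offline methods based on dynamic programming for solving MDPs. Below is a minimal value-iteration sketch in Python, assuming a discounted expected-cost formulation with a fully observable state space; the function name, parameter choices, and the two-state toy MDP are illustrative assumptions, not material taken from the chapter itself.

```python
# Minimal value-iteration sketch for a discounted-cost MDP (illustrative only).
# P[s][a] is a list of (next_state, probability) pairs, cost[s][a] is the
# immediate cost of applying action a in state s, gamma is the discount factor.

def value_iteration(states, actions, P, cost, gamma=0.95, eps=1e-6):
    """Return approximately optimal state values V and a greedy policy."""
    V = {s: 0.0 for s in states}
    while True:
        new_V = {
            s: min(
                cost[s][a] + gamma * sum(p * V[t] for t, p in P[s][a])
                for a in actions[s]
            )
            for s in states
        }
        # Stop when the Bellman updates change no value by more than eps.
        converged = max(abs(new_V[s] - V[s]) for s in states) < eps
        V = new_V
        if converged:
            break
    # Extract a policy that is greedy with respect to the computed values.
    policy = {
        s: min(
            actions[s],
            key=lambda a: cost[s][a] + gamma * sum(p * V[t] for t, p in P[s][a]),
        )
        for s in states
    }
    return V, policy


if __name__ == "__main__":
    # Hypothetical two-state example: action 'go' reaches the goal 'g' from 's0'
    # with probability 0.8, otherwise stays in 's0'; 'g' is absorbing and free.
    states = ["s0", "g"]
    actions = {"s0": ["go"], "g": ["stay"]}
    P = {"s0": {"go": [("g", 0.8), ("s0", 0.2)]}, "g": {"stay": [("g", 1.0)]}}
    cost = {"s0": {"go": 1.0}, "g": {"stay": 0.0}}
    V, pi = value_iteration(states, actions, P, cost)
    print(V, pi)  # V["s0"] converges to roughly 1.23 under these assumptions
```

In this sketch the value function is updated synchronously over all states until the updates stabilize; heuristic-search and online variants mentioned in the abstract would instead focus computation on the states reachable from the current state.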