
Springer E-books — Jul 8, 2007

/lp/springer-e-books/a-basic-course-in-probability-theory-8rP0aPhpsx

- Publisher
- Springer New York
- Copyright
- Copyright © Springer Basel AG
- DOI
- 10.1007/978-0-387-71939-9

Introductory Probability is a pleasure to read and provides a fine answer to the question: How do you construct Brownian motion from scratch, given that you are a competent analyst? There are at least two ways to develop probability theory. The more familiar path is to treat it as its own discipline and work from intuitive examples such as coin flips and conundrums such as the Monty Hall problem. An alternative is to first develop measure theory and analysis, and then add interpretation. Bhattacharya and Waymire take the second path. To illustrate the authors' frame of reference, consider the two definitions they give of conditional expectation. The first is as a projection in L² spaces: the authors rely on the reader's familiarity with Hilbert space operators, and at a glance the connection to probability may not be apparent. Subsequently, a discussion of Bayes's rule and other relevant probabilistic concepts leads to a definition of conditional expectation as an adjustment of random outcomes from a finer to a coarser information set.

A lively book that is a pleasure to read, this work gives analysts the tools to construct Brownian motion from scratch. It includes a complete overview of basic measure theory as well as analysis, with proofs.
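The two definitions sketched above can be stated side by side. The following is a standard formulation of the L² picture, not quoted from the book: conditional expectation given a sub-σ-field is the orthogonal projection onto the subspace of measurable functions for that coarser information set.

```latex
% For X in L^2(\Omega, \mathcal{F}, P) and a sub-sigma-field
% \mathcal{G} \subseteq \mathcal{F}, the conditional expectation
% E[X | G] is the (a.s. unique) G-measurable minimizer of the
% mean squared error, i.e. the orthogonal projection of X from
% L^2(\mathcal{F}) onto the closed subspace L^2(\mathcal{G}):
\mathbb{E}[X \mid \mathcal{G}]
  = \operatorname*{arg\,min}_{Y \in L^2(\Omega,\mathcal{G},P)}
    \mathbb{E}\!\left[(X - Y)^2\right],
\qquad
\mathbb{E}\!\left[\bigl(X - \mathbb{E}[X \mid \mathcal{G}]\bigr)\, Z\right] = 0
\quad \text{for all } Z \in L^2(\Omega,\mathcal{G},P).
```

The orthogonality condition on the right is what connects the Hilbert-space view to the probabilistic one: averaging X over the finer information while leaving every coarse-information quantity Z uncorrelated with the residual.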
Contents:

- Random Maps, Distribution, and Mathematical Expectation
- Independence, Conditional Expectation
- Martingales and Stopping Times
- Classical Zero–One Laws, Laws of Large Numbers and Deviations
- Weak Convergence of Probability Measures
- Fourier Series, Fourier Transform, and Characteristic Functions
- Classical Central Limit Theorems
- Laplace Transforms and Tauberian Theorem
- Random Series of Independent Summands
- Kolmogorov's Extension Theorem and Brownian Motion
- Brownian Motion: The LIL and Some Fine-Scale Properties
- Skorokhod Embedding and Donsker's Invariance Principle
- A Historical Note on Brownian Motion

From the reviews:

"Bhattacharya (Univ. of Arizona, Tucson) and Waymire (Oregon State Univ., Corvallis) write to provide the necessary probability background for studying stochastic processes. For students exposed to analysis and measure theory, the book can be used as a graduate-level course resource on probability. … Every chapter ends with a set of exercises, including numerous solved examples. Appendixes explain measure theory and integration, function spaces and topology, and Hilbert spaces and applications to measure theory. List of symbols. Summing Up: Recommended. Graduate students; faculty and researchers." (D. V. Chopra, CHOICE, Vol. 45 (7), 2008)

"The mentioned prerequisites are exposure to measure theory and analysis. Three appendices (29 pages) provide a brief but thorough introduction to the measure theory and functional analysis that is needed. … This well-written book is full of wonderful probability theory." (Kenneth A. Ross, MathDL, February, 2008)

"The book provides the fundamentals of probability theory in a measure-theoretic framework … . It is suitable for advanced undergraduate students and graduate students. The material is presented in a very dense and concise way and each chapter includes a section with exercises at the end … . Thus the book may be used very well as a reference text and companion literature for a lecture course … ." (Evelyn Buckwar, Zentralblatt MATH, Vol. 1138 (16), 2008)

"This book is a self-contained exposition of various basic elements of probability theory. It is suitable for graduate students with some background in analysis, but may also serve as a quick reference for more experienced readers. … Overall, this book is quite rich and very pleasant to read … ." (Djalil Chafaï, Mathematical Reviews, Issue 2009 e)

The book develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. With this goal in mind, the pace is lively, yet thorough. Basic notions of independence and conditional expectation are introduced relatively early on in the text, and conditional expectation is illustrated in detail in the context of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two highlights. The historic role of size-biasing is emphasized in the contexts of large deviations and in developments of Tauberian theory. The authors assume a graduate level of maturity in mathematics, but otherwise the book will be suitable for students with varying levels of background in analysis and measure theory. In particular, theorems from analysis and measure theory used in the main text are provided in comprehensive appendices, along with their proofs, for ease of reference. Rabi Bhattacharya is Professor of Mathematics at the University of Arizona. Edward Waymire is Professor of Mathematics at Oregon State University. Both authors have co-authored numerous books, including the graduate textbook Stochastic Processes with Applications.

Highlights:

- Quicker-paced introduction to the basics allows for a more in-depth treatment of such topics as convergence theory and Brownian motion
- Self-contained and suitable for students with varying levels of background in analysis and measure theory
- Includes a complete overview of basic measure theory and analysis (with proofs)
- Written in a lively and engaging style
- Contains an extensive bibliography
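The construction the description keeps returning to — Brownian motion built from scratch, with Donsker's invariance principle among the closing chapters — can be sketched numerically: a ±1 random walk, with time rescaled by 1/n and space by 1/√n, approximates a Brownian path. This is a minimal illustration of that rescaling, not code from the book; the function name and parameters are my own.

```python
import numpy as np

def brownian_by_donsker(n_steps, t_max=1.0, seed=0):
    """Approximate a Brownian path on [0, t_max] by a rescaled
    +/-1 random walk, in the spirit of Donsker's invariance
    principle: S_{floor(nt)} / sqrt(n) converges weakly to B_t."""
    rng = np.random.default_rng(seed)
    steps = rng.choice([-1.0, 1.0], size=n_steps)
    # Partial sums of the walk, starting from S_0 = 0.
    walk = np.concatenate([[0.0], np.cumsum(steps)])
    # Time grid with spacing t_max / n_steps; space scaled by
    # sqrt(t_max / n_steps), so each increment has variance equal
    # to the time spacing, as Brownian increments should.
    times = np.linspace(0.0, t_max, n_steps + 1)
    values = walk * np.sqrt(t_max / n_steps)
    return times, values

t, b = brownian_by_donsker(100_000)
# b[0] == 0.0, and every increment of b is +/- sqrt(t_max / n_steps),
# so Var(B_1) for the rescaled walk is exactly 1.
```

With 100,000 steps the path is visually indistinguishable from Brownian motion; the book's Skorokhod-embedding chapter makes the convergence precise.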

**Published:** Jul 8, 2007
