Chapters 1–6 of 6
Chapter 1: This chapter serves as a basic introduction to neural networks, covering their history and some applications in which they have achieved state-of-the-art results.
Chapter 2: Current mathematical descriptions of neural networks are either based exclusively on scalars or rely on loosely defined vector-valued derivatives, both of which leave room for improvement. In this chapter we therefore build up the framework by introducing prerequisite mathematical concepts and notation for...
Chapter 3: In the previous chapter, we took the first step toward a standard mathematical framework for neural networks by developing mathematical tools for vector-valued functions and their derivatives. In this chapter we use these tools to describe the operations employed in a generic layered...
Chapter 4: We developed an algebraic framework for a generic layered network in the preceding chapter, including a method to express error backpropagation and loss function derivatives directly over the inner product space in which the network parameters are defined. We dedicate this chapter to expressing...
Chapter 5: We applied the generic neural network framework from Chap. 3 to specific network structures in the previous chapter. Multilayer Perceptrons and Convolutional Neural Networks fit squarely into that framework, and we were also able to modify it to capture Deep Auto-Encoders. We now extend the...
Chapter 6: This chapter concludes the book and offers possible directions for future research.