Thinking Machines: Machine Learning and Its Hardware Implementation covers the theory and application of machine learning, neuromorphic computing, and neural networks. It is the first book to focus on machine learning accelerators and hardware development for machine learning. It presents not only a summary of the latest trends and examples of machine learning hardware, along with basic knowledge of machine learning in general, but also the main issues involved in its implementation. Readers will learn what is required to design machine learning hardware for neuromorphic computing and/or neural networks. The book is recommended for those who have a basic knowledge of machine learning and those who want to learn more about current trends in the field.

- Presents a clear understanding of the various machine learning hardware accelerator solutions that can be applied to selected machine learning algorithms
- Offers key insights into the development of hardware, from algorithms and software to logic circuits and hardware accelerators
- Introduces the baseline characteristics of deep neural network models that hardware must also handle
- Provides a thorough review of past research and products, explaining how to design ASIC and FPGA implementations for target machine learning models
- Surveys current trends and models in neuromorphic computing and neural network hardware architectures
- Outlines a strategy for advanced hardware development, using deep learning accelerators as an example
|Author|: Patrick K. Simpson|
|Publisher|: Institute of Electrical & Electronics Engineers (IEEE)|
|Release Date|: 1996|
|Pages|: 943 pages|
|ISBN 10|: 9780780312005|
In a little-known paper entitled 'Intelligent Machinery', Turing investigated connectionist networks, but his work was dismissed as a 'schoolboy essay' and was left unpublished until 1968, 14 years after his death. Christof Teuscher revives, analyzes, and simulates Turing's ideas, applying them to different types of problems and building and training Turing's machines using evolutionary algorithms. This is not a book about today's (classical) neural networks, but about the neural-network-like structures proposed by Turing. One of its novel features is that it actually goes beyond Turing's ideas by proposing new machines. The book also contains a Foreword by B. Jack Copeland and D. Proudfoot.
While the primary objective of the text is to provide a teaching tool, practicing engineers and scientists are likely to find the clear, concept-based treatment useful in updating their backgrounds.
This new edition also treats smart materials and artificial life. A new chapter on information and computational dynamics takes up many recent discussions in the community.
Though mathematical ideas underpin the study of neural networks, the author presents the fundamentals without the full mathematical apparatus. All aspects of the field are tackled, including artificial neurons as models of their real counterparts; the geometry of network action in pattern space; gradient descent methods, including back-propagation; associative memory and Hopfield nets; and self-organization and feature maps. The traditionally difficult topic of adaptive resonance theory is clarified within a hierarchical description of its operation. The book also includes several real-world examples to provide a concrete focus. This should enhance its appeal to those involved in the design, construction, and management of networks in commercial environments who wish to improve their understanding of network simulator packages. As a comprehensive and highly accessible introduction to one of the most important topics in cognitive and computer science, this volume should interest a wide range of readers, both students and professionals, in cognitive science, psychology, computer science, and electrical engineering.
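To make the gradient descent topic mentioned above concrete, here is a minimal sketch (not taken from the book) of a single sigmoid neuron trained by gradient descent, using the delta rule, to learn the logical OR function. All names and parameter values here are illustrative assumptions, not the book's own examples.

```python
import math

def sigmoid(x):
    """Logistic activation, squashing any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Training data: two binary inputs and the OR target.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w = [0.0, 0.0]  # synaptic weights
b = 0.0         # bias
lr = 1.0        # learning rate (an arbitrary illustrative choice)

for _ in range(2000):
    for (x1, x2), t in data:
        y = sigmoid(w[0] * x1 + w[1] * x2 + b)
        # Gradient of the squared error via the chain rule:
        # dE/dnet = (y - t) * y * (1 - y), the "delta" of the delta rule.
        delta = (y - t) * y * (1 - y)
        w[0] -= lr * delta * x1
        w[1] -= lr * delta * x2
        b -= lr * delta

# Threshold the trained neuron's outputs at 0.5.
predictions = [round(sigmoid(w[0] * x1 + w[1] * x2 + b))
               for (x1, x2), _ in data]
print(predictions)  # OR is linearly separable, so this converges to [0, 1, 1, 1]
```

Back-propagation, also covered in the book, generalizes this same chain-rule update to neurons in hidden layers.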