TY - GEN
AB - McClelland and Rumelhart's Parallel Distributed Processing was the first book to present a definitive account of the newly revived connectionist/neural net paradigm for artificial intelligence and cognitive science. While Neural Computing Architectures addresses the same issues, there is little overlap in the research it reports. These 18 contributions provide a timely and informative overview and synopsis of both pioneering and recent European connectionist research. Several chapters focus on cognitive modeling; however, most of the work covered revolves around abstract neural network theory or engineering applications, bringing important complementary perspectives to currently published work in PDP. In four parts, the chapters take up neural computing from the classical perspective, including both foundational and current work, and from the mathematical perspective (of logic, automata theory, and probability theory), presenting less well-known work in which the neuron is modeled as a logic truth function that can be implemented directly as a silicon read-only memory. They also present new material, both as analytical tools and models and as suggestions for implementation in optical form, and summarize the PDP perspective in a single extended chapter covering PDP theory, application, and speculation in US research. Each part is introduced by the editor.
T1 - Neural computing architectures : the design of brain-like machines
DA - 1989
CY - Cambridge, Mass.
AU - Aleksander, Igor
ET - 1st MIT Press ed.
CN - QA76.5
PB - MIT Press
PY - 1989
ID - 1387534
KW - Neural computers
KW - Computer architecture
KW - COMPUTER SCIENCE/Machine Learning & Neural Networks
SN - 0262255596
SN - 9780262255592
UR - https://univsouthin.idm.oclc.org/login?url=https://doi.org/10.7551/mitpress/4926.001.0001?locatt=mode:legacy
UR - http://www.oclc.org/content/dam/oclc/forms/terms/vbrl-201703.pdf
ER -