Neural computation

Neural computation is the information processing performed by networks of neurons. It is affiliated with the philosophical tradition known as the computational theory of mind, also referred to as computationalism, which advances the thesis that neural computation explains cognition. The first account of neural activity as computational was proposed by Warren McCulloch and Walter Pitts in their seminal 1943 paper, "A Logical Calculus of the Ideas Immanent in Nervous Activity". There are three general branches of computationalism: classicism, connectionism, and computational neuroscience. All three branches agree that cognition is computation, but they disagree on what sorts of computations constitute cognition. The classicist tradition holds that computation in the brain is digital, analogous to digital computing. Connectionism and computational neuroscience do not require that the computations which realize cognition be digital. However, the two branches disagree sharply on which sorts of experimental data should be used to construct explanatory models of cognitive phenomena. Connectionists rely on behavioral evidence to construct models that explain cognitive phenomena, whereas computational neuroscience leverages neuroanatomical and neurophysiological information to construct mathematical models that explain cognition.[1]

When comparing the three main traditions of the computational theory of mind, as well as the different possible forms of computation in the brain, it is helpful to define what is meant by computation in a general sense. Computation is the processing of vehicles, otherwise known as variables or entities, according to a set of rules. A rule in this sense is simply an instruction for executing a manipulation on the current state of the variable in order to produce a specified output. In other words, a rule dictates which output to produce given a certain input to the computing system. A computing system is a mechanism whose components must be functionally organized to process the vehicles in accordance with the established set of rules. The types of vehicles processed by a computing system determine which type of computations it performs. Traditionally, cognitive science has proposed two types of computation related to neural activity, digital and analog, with the vast majority of theoretical work incorporating a digital understanding of cognition. Computing systems that perform digital computation are functionally organized to execute operations on strings of digits with respect to the type and location of each digit in the string. It has been argued that neural spike-train signaling implements some form of digital computation, since neural spikes may be considered discrete units or digits, like 0 or 1: the neuron either fires an action potential or it does not. Accordingly, neural spike trains could be seen as strings of digits. Alternatively, analog computing systems perform manipulations on non-discrete, irreducibly continuous variables, that is, entities which vary continuously as a function of time. These sorts of operations are characterized by systems of differential equations.
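The contrast between the two notions can be illustrated with a brief sketch. The Python snippet below treats a spike train as a string of binary digits operated on by a simple rule, and separately integrates a differential equation for a continuously varying quantity; all numerical values are illustrative assumptions rather than empirical constants.

```python
# Minimal sketch contrasting digital and analog computation as described above.
# All parameter values are illustrative assumptions, not empirical constants.

# Digital view: a spike train treated as a string of discrete digits.
spike_train = [0, 1, 1, 0, 1, 0, 0, 1]          # 1 = action potential, 0 = no spike
spike_count = sum(spike_train)                   # a rule operating on the digit string
print("spikes in window:", spike_count)

# Analog view: a continuously varying membrane potential V(t) governed by
# a differential equation, dV/dt = (-V + I) / tau, integrated with Euler's method.
tau, dt, I = 10.0, 0.1, 1.5                      # time constant (ms), step (ms), input drive
V = 0.0
for step in range(1000):
    dV = (-V + I) / tau
    V += dV * dt
print("steady-state membrane potential:", round(V, 3))
```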

Neural computation can be studied, for example, by building and simulating models of neural computation.
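As a hedged illustration of what such a model might look like, the following sketch implements a leaky integrate-and-fire neuron, one of the simplest standard models of neural computation; the parameter values are conventional textbook-style choices, not measurements.

```python
import numpy as np

def lif_neuron(input_current, tau=20.0, v_rest=-65.0, v_thresh=-50.0,
               v_reset=-70.0, dt=0.1):
    """Leaky integrate-and-fire neuron: integrate input, spike at threshold, reset."""
    v = v_rest
    spike_times = []
    for step, I in enumerate(input_current):
        dv = (-(v - v_rest) + I) / tau        # leaky integration of the input
        v += dv * dt
        if v >= v_thresh:                     # threshold crossing: emit a spike
            spike_times.append(step * dt)
            v = v_reset                       # reset after the spike
    return spike_times

# Constant suprathreshold input produces a regular spike train.
current = np.full(5000, 20.0)                 # 500 ms of constant drive (arbitrary units)
print(lif_neuron(current)[:5])                # times (ms) of the first few spikes
```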

There is a scientific journal dedicated to this subject, Neural Computation.

Artificial neural networks (ANNs) are a subfield of machine learning. Work on ANNs has been somewhat inspired by knowledge of neural computation.
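The following minimal sketch shows the abstraction that ANNs borrow from neural computation: an artificial "neuron" forms a weighted sum of its inputs and passes it through a nonlinearity, a loose analogue of synaptic integration and firing. The weights and inputs are arbitrary random values chosen purely for illustration.

```python
import numpy as np

def artificial_neuron(x, w, b):
    """Weighted sum of inputs passed through a sigmoid nonlinearity."""
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

rng = np.random.default_rng(0)
x = rng.normal(size=3)          # three input signals
w = rng.normal(size=3)          # synaptic-like weights
b = 0.0                         # bias term
print(artificial_neuron(x, w, b))
```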


References

  1. Piccinini, Gualtiero; Bahar, Sonya (2013). "Neural Computation and the Computational Theory of Cognition". Cognitive Science. 37 (3): 453–488. doi:10.1111/cogs.12012. PMID 23126542.