It is often said that even if the brain is a computing mechanism, it need not have a von Neumann architecture (Pylyshyn 1984, Churchland and Sejnowski 1992). In these discussions, “von Neumann architecture” is used as a generic term for the functional organization of ordinary digital computers. This claim is used to discount apparent dissimilarities between the functional organization of brains and that of ordinary digital computers as irrelevant to computationalism. The idea is that brains may compute by means other than those exploited by modern digital computers.
It is true that not all computing mechanisms need have a von Neumann architecture; Turing Machines, for example, don't. But this does not eliminate the constraints that different versions of computationalism put on the functional organization of the brain, if the brain is to perform the relevant kinds of computations. In the current discussion, I am intentionally avoiding the term "von Neumann architecture" because it is so generic that it obscures the many issues of functional organization that are relevant to the design and computing power of computing mechanisms. The present account allows us to make more precise claims about computer and brain architectures by focusing on specific functional properties of computing mechanisms (and hence on their computing power).
If the brain is expected to be a programmable, stored-program, universal computer, as it is by some versions of computationalism, it must contain programs as well as components that store and execute programs. More generally, any kind of computation, even the most trivial transformation of one symbol into another (as performed by a NOT gate), requires appropriate hardware. So every nontrivial computationalist thesis, depending on the computational power it ascribes to the brain, constrains the functional properties that brains must exhibit if they are to perform the relevant computations. The following are general questions about neural hardware that apply to some or all computationalist theses about the brain:
(1) What are the symbols manipulated in the neural computation, and what are their types?
(2) What are the elementary computational operations on neural symbols, and what are the components that perform them?
(3) How are the symbols concatenated to one another, so that strings of them can be identified as inputs, internal states, and outputs of neural mechanisms, and so that nontrivial computations from input strings to output strings can be ascribed to those mechanisms?
(4) What are the compositional rules between elementary operations, and the corresponding ways to connect the components, such that complex operations can be formed out of elementary ones and performed by the mechanism?
(5) If the system stores programs, or even just data for the computations, what are the memory cells and registers, and how do they work?
(6) What are the control units that determine which operations are executed at any given time, and how do they work? This question is particularly pressing if programs are to be executed, because the required kind of control unit is particularly sophisticated and needs to correctly coordinate its behavior with the components that store the programs.
When McCulloch and Pitts (1943) initially formulated computationalism, they had answers to the relevant versions of the above questions. In answer to (1), they thought that the presence and the absence of a neural spike were the two types of symbols on which neural computations were defined. In answer to (2), they appealed to Boolean operations and claimed that they were performed by neurons. In answer to (3) and (4), they relied on a formalism they in part created and in part drew from Carnap, which is equivalent to a mixture of Boolean algebra and finite state automata. In answer to (5), McCulloch hypothesized that there were closed loops of neural activity, which acted as memory cells. In answer to (6), they largely appealed to the innate wiring of the brain.29
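McCulloch and Pitts's answer to (2), that neurons perform Boolean operations, can be illustrated with a minimal sketch of a threshold unit of the kind their formalism describes. The sketch below is my own illustration, not the 1943 paper's notation: a unit fires just in case no inhibitory input is active and the excitatory inputs reach the threshold.

```python
# A McCulloch-Pitts unit: outputs 1 (a spike) iff no inhibitory input is
# active and the sum of excitatory inputs reaches the threshold.
def mp_unit(excitatory, inhibitory, threshold):
    if any(inhibitory):
        return 0
    return 1 if sum(excitatory) >= threshold else 0

# Boolean operations realized by single units, by choice of threshold:
AND = lambda a, b: mp_unit([a, b], [], threshold=2)
OR  = lambda a, b: mp_unit([a, b], [], threshold=1)
# NOT: a constant excitatory input, gated by the inhibitory input a.
NOT = lambda a: mp_unit([1], [a], threshold=1)
```

With such units as elementary components, nets of them answer questions (3) and (4): complex operations are formed by wiring the output of one unit to the inputs of others.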
When von Neumann formulated his own version of computationalism (von Neumann 1958), he also tried to answer at least the first two of the above questions. In answer to (1), he thought that the firing rates of neurons were the symbol types. In answer to (2), he thought the elementary operations were arithmetical and logical operations on these firing rates. Although von Neumann's answers take into account the functional significance of neuronal spikes as it is understood by modern neurophysiologists, von Neumann did not have answers to questions (3) through (6), and he explicitly said that he did not know how the brain could possibly achieve the degree of computational precision that he thought it needed under his assumptions about its computational organization.30
Today’s computationalists no longer believe McCulloch’s or von Neumann’s versions of computationalism. But if computationalism is to remain a substantive, empirical hypothesis about the brain, these questions need to find convincing answers. If they don’t, it may be time to abandon computationalism in favor of other functional explanations of neural mechanisms.
Contrary to what some maintain (e.g., Churchland and Sejnowski 1992), whether something is a computer, and what kind of computer it is, is independent of observers. Computers are very special mechanisms, whose function is to perform computations that involve long sequences of primitive operations on strings of symbols, operations that can be directly performed by the computers’ processors. Whether something performs computations, and what computations it performs, can be determined by functional analysis. Moreover, different classes of computers can be programmed in different ways or compute different classes of functions. These and other useful distinctions between classes of computers can be drawn by looking at computers’ functional properties, and can be profitably used in historical and philosophical discussions pertaining to computers.
This functional account of computers has several advantages. First, it underwrites our intuitive distinctions between systems that compute and systems that don’t, and between computers and other computing mechanisms (such as calculators). Second, it explains the versatility of computers in terms of their functional organization. Third, it sheds light on why computers, not calculators or other computing mechanisms, inspired the computational theory of mind and brain. Fourth, it explicates the notion of explanation by program execution, i.e. an explanation of a system’s capacity by postulating the execution of a program for that capacity.
Explanations by program execution are invoked in the philosophy of mind literature (cf. Piccinini forthcoming c). Given the functional account of computers, explanations by program execution are a special kind of functional explanation, which relies on the special kind of functional analysis that applies to soft programmable computers. Soft programmable computers are computers with processors that respond differentially to different strings of symbols, so that different operations are performed on the data. Program execution is a process by which a (stable state of a) certain part of the mechanism, the program, affects a certain other part of the mechanism, the processor, so that the processor performs appropriate operations on a (stable state of a) certain other part of the mechanism, the data. A mechanism must be functionally analyzable in this way to be subject to explanation by program execution. Explanation by program execution is the most interesting species of the genus of explanations that appeal to the computations performed by a mechanism. Appealing to the computations performed by a mechanism is explanatory insofar as the mechanism is a computing mechanism, i.e., a mechanism subject to the relevant kind of functional analysis. By identifying more precisely the class of computers that support explanation by program execution and how they do so, the functional account of computers vindicates the use of explanation by program execution in the philosophy of mind (within the constraints of an appropriate functional analysis of the relevant mechanisms).
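The structure of soft programmability described above can be made concrete with a toy model (entirely illustrative; the instruction set and names are my own, not drawn from any actual machine): one part of the mechanism stores a program as a list of instruction strings, another part stores the data, and a processor responds differentially to the instruction strings by performing different operations on the data.

```python
# Toy soft-programmable machine: the stored program (a list of instruction
# strings) determines which operations the processor performs on the data.
def run(program, data):
    acc = data  # the data register the processor operates on
    for instruction in program:
        op, _, arg = instruction.partition(" ")
        if op == "ADD":          # differential response to the
            acc = acc + int(arg)  # instruction string determines
        elif op == "MUL":        # the operation performed
            acc = acc * int(arg)
        elif op == "NEG":
            acc = -acc
        else:
            raise ValueError(f"unknown instruction: {instruction}")
    return acc

# Changing the stored program changes what the same hardware computes:
run(["ADD 3", "MUL 2"], 5)   # (5 + 3) * 2 = 16
run(["NEG", "ADD 1"], 5)     # -5 + 1 = -4
```

Explanation by program execution, on the present account, is explanation of the mechanism's behavior by citing the stored program together with this kind of processor-program-data organization.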
Finally, the present account of computers can be used to formulate a rigorous taxonomy of computationalist theses about the brain, which makes explicit their empirical commitments to specific functional properties of brains, and to compare the strength of the different empirical commitments of different computationalist theses. This makes it ideal to ground discussions of computational theories of mind and brain.
Atanasoff, J. V. (1940). Computing Machine for the Solution of Large Systems of Linear Algebraic Equations. Ames, Iowa, Iowa State College.
Atanasoff, J. V. (1984). "Advent of Electronic Digital Computing." Annals of the History of Computing 6(3): 229-282.
Boshernitzan, M. (1986). "Universal Formulae and Universal Differential Equations." The Annals of Mathematics, 2nd Series 124(2): 273-291.
Blachowicz, J. (1997). "Analog Representation Beyond Mental Imagery." The Journal of Philosophy 94(2): 55-84.
Brennecke, A. (2000). Hardware Components and Computer Design. The First Computers-History and Architectures. R. Rojas and U. Hashagen. Cambridge, MA, MIT Press: 53-68.
Bromley, A. G. (1983). "What Defines a "General-Purpose" Computer?" Annals of the History of Computing 5(3): 303-305.
Burks, A. R. (2002). Who Invented the Computer? Amherst, Prometheus.
Burks, A. R. and A. W. Burks (1988). The First Electronic Computer: The Atanasoff Story. Ann Arbor, University of Michigan Press.
Cohen, I. B. (1999). Howard Aiken: Portrait of a Computer Pioneer. Cambridge, MA, MIT Press.
Goodman, N. (1968). Languages of Art. Indianapolis, Bobbs-Merrill.
Churchland, P. S., C. Koch, et al. (1990). What is Computational Neuroscience? Computational Neuroscience. E. L. Schwartz. Cambridge, MA, MIT Press: 46-55.
Churchland, P. S. and T. J. Sejnowski (1992). The Computational Brain. Cambridge, MA, MIT Press.
Corcoran, J., W. Frank, et al. (1974). "String Theory." The Journal of Symbolic Logic 39(4): 625-637.
Davis, M., R. Sigal, et al. (1994). Computability, Complexity, and Languages. Boston, Academic.
Duffin, R. J. (1981). "Rubel's Universal Differential Equation." Proceedings of the National Academy of Sciences USA 78(8 [Part 1: Physical Sciences]): 4661-4662.
Eliasmith, C. (2003). "Moving Beyond Metaphors: Understanding the Mind for What It Is." Journal of Philosophy C(10): 493-520.
Engelsohn, H. S. (1978). Programming Programmable Calculators. Rochelle Park, NJ, Hayden.
Fodor, J. A. (1975). The Language of Thought. Cambridge, MA, Harvard University Press.
Gustafson, J. (2000). Reconstruction of the Atanasoff-Berry Computer. The First Computers-History and Architectures. R. Rojas and U. Hashagen. Cambridge, MA, MIT Press: 91-106.
Haugeland, J. (1981). "Analog and Analog." Philosophical Topics 12: 213-225.
Jackson, A. S. (1960). Analog Computation. New York, McGraw-Hill.
Johnson, C. L. (1963). Analog Computer Techniques, Second Edition. New York, McGraw-Hill.
Hughes, R. I. G. (1999). The Ising Model, Computer Simulation, and Universal Physics. Models as Mediators. M. S. Morgan and M. Morrison. Cambridge, Cambridge University Press: 97-145.
Korn, G. A. and T. M. Korn (1972). Electronic Analog and Hybrid Computers; Second, Completely Revised Edition. New York, McGraw-Hill.
Lewis, D. K. (1971). "Analog and Digital." Noûs 5: 321-327.
Lipshitz, L. and L. A. Rubel (1987). "A Differentially Algebraic Replacement Theorem, and Analog Computability." Proceedings of the American Mathematical Society 99(2): 367-372.
McCulloch, W. S. and W. H. Pitts (1943). "A Logical Calculus of the Ideas Immanent in Nervous Activity." Bulletin of Mathematical Biophysics 5: 115-133.
Newell, A. and H. A. Simon (1976). "Computer Science as an Empirical Inquiry: Symbols and Search." Communications of the ACM 19(3): 113-126.
Patterson, D. A. and J. L. Hennessy (1998). Computer Organization and Design: The Hardware/Software Interface. San Francisco, Morgan Kauffman.
Piccinini, G. (2003a). Computations and Computers in the Sciences of Mind and Brain. Pittsburgh, PA, University of Pittsburgh. URL =
Piccinini, G. (2003b). "Alan Turing and the Mathematical Objection." Minds and Machines 13(1): 23-48.
Piccinini, G. (2003c). "Review of John von Neumann's The Computer and the Brain." Minds and Machines 13(2): 327-332.
Piccinini, G. (forthcoming a). "Functionalism, Computationalism, and Mental Contents." Canadian Journal of Philosophy.
Piccinini, G. (forthcoming b). "The First Computational Theory of Mind and Brain: A Close Look at McCulloch and Pitts's 'Logical Calculus of Ideas Immanent in Nervous Activity'." Synthese.
Piccinini, G. (forthcoming c). "Functionalism, Computationalism, and Mental States." Studies in the History and Philosophy of Science.
Pour-El, M. B. (1974). "Abstract Computability and Its Relation to the General Purpose Analog Computer (Some Connections Between Logic, Differential Equations and Analog Computers)." Transactions of the American Mathematical Society 199: 1-28.
Putnam, H. (1988). Representation and Reality. Cambridge, MA, MIT Press.
Pylyshyn, Z. W. (1984). Computation and Cognition. Cambridge, MA, MIT Press.
Rojas, R. (1998). "How to Make Zuse's Z3 a Universal Computer." IEEE Annals of the History of Computing 20(3): 51-54.
Rojas, R. and U. Hashagen, Eds. (2000). The First Computers-History and Architectures. Cambridge, MA, MIT Press.
Rubel, L. A. (1989). "Digital Simulation of Analog Computation and Church's Thesis." Journal of Symbolic Logic 54(3): 1011-1017.
Rubel, L. A. (1993). "The Extended Analog Computer." Advances in Applied Mathematics 14(1): 39-50.
Rubel, L. A. and M. F. Singer (1985). "A Differentially Algebraic Elimination Theorem with Application to Analog Computability in the Calculus of Variations." Proceedings of the American Mathematical Society 94(4): 653-658.
Searle, J. R. (1992). The Rediscovery of the Mind. Cambridge, MA, MIT Press.
Shannon, C. E. (1941). "Mathematical Theory of the Differential Analyzer." Journal of Mathematics and Physics XX(4): 337-354.
Siegelmann, H. T. (1999). Neural Networks and Analog Computation: Beyond the Turing Limit. Boston, MA, Birkhäuser.
Siegelmann, H. T. (2003). "Neural and Super-Turing Computing." Minds and Machines 13(1): 103-114.
Turing, A. M. (1936-7). On Computable Numbers, with an Application to the Entscheidungsproblem. The Undecidable. M. Davis. Hewlett, NY, Raven.
Van der Spiegel, J., J. F. Tau, et al. (2000). The ENIAC: History, Operation and Reconstruction in VLSI. The First Computers-History and Architectures. R. Rojas and U. Hashagen. Cambridge, MA, MIT Press: 121-178.
von Neumann, J. (1945). First Draft of a Report on the EDVAC. Philadelphia, PA, Moore School of Electrical Engineering, University of Pennsylvania.
von Neumann, J. (1958). The Computer and the Brain. New Haven, Yale University Press.
Wilkins, B. R. (1970). Analogue and Iterative Methods in Computation, Simulation, and Control, Chapman and Hall.
1 Many thanks to Peter Machamer for his comments on earlier drafts. Ancestors of this paper were presented at the Canadian Society for the History and Philosophy of Science, Toronto, Canada, May 2002, at Computing and Philosophy (CAP@CMU), Pittsburgh, PA, August 2002, and at the University of Pittsburgh in January 2003. Thanks to the audiences for their feedback.
2 This use of the phrase “large capacity” is due to John V. Atanasoff, who was apparently the first person to use the term “computer” for a kind of machine (Atanasoff 1940; Atanasoff 1984).
3 For a technical treatment of the theory of strings, see Corcoran et al. 1974. The present notion of computation over strings is a generalization of the rigorous notion of computation of functions over natural numbers, which was introduced by Alan Turing (Turing 1936-7) and other logicians. The present notion is relevant to the analysis of what are ordinarily called digital computers and computing mechanisms. In section 2.5 below, I will discuss so called analog computers and argue that they must be analyzed in terms of a different notion of computation.
4 For more details on these large-scale components, and for their functional analysis in terms of simpler components, see Piccinini 2003.
5 For more details on programmable calculators, and a statement that they are a kind of computer, see Engelsohn 1978.
6 Of course, each primitive operation of a calculator is performed by following a pseudo-algorithm (that is, an algorithm defined over finitely many inputs). But a calculator—unlike a computer—cannot follow any algorithm or pseudo-algorithm defined in terms of its primitive operations.
7 A consequence of this is the often-remarked fact that calculators cannot perform branches; namely, they cannot choose among different operations depending on whether some condition obtains. (Branching is necessary to compute all computable functions.)
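The role of branching invoked in the preceding note can be illustrated with a sketch of my own (the example is standard, not drawn from the text): a conditional test lets the course of a computation depend on intermediate results, whereas a calculator's fixed sequence of primitive operations always applies the same composition of operations regardless of what those results are.

```python
# Euclid's algorithm: the while-test is a branch that decides, at each
# step, whether to perform another round of arithmetic or halt. The
# number of steps depends on the data, so no fixed, unconditional
# sequence of primitive operations computes gcd for all inputs.
def gcd(a, b):
    while b != 0:  # conditional branch on an intermediate result
        a, b = b, a % b
    return a
```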
8 Atanasoff 1940. On the ABC, see also Burks and Burks 1988, Burks 2002, and Gustafson 2000.
9 A similar proposal is made by Burks and Burks 1988, chap. 5.
10 This distinction between programmability insensitive to the input and programmability sensitive to the input was inspired by a somewhat analogous distinction between executing a fixed sequence and using the output as input, which is made by Brennecke 2000, pp. 62-64.
11 On the ENIAC, see Van der Spiegel et al. 2000.
12 This point is effectively made by Brennecke 2000, pp. 64-65.
13 On the Harvard Mark I, see Cohen 1999.
14 Cf. Brennecke 2000, p. 66.
15 Alan Turing is one of the few people who discussed this feature of internally soft programmable computers; he used it in his reply to the mathematical objection to the view that machines can think. For an extended discussion, see Piccinini 2003b.
16 Patterson and Hennessy list these two ideas as the essence of modern computers (Patterson and Hennessy 1998, p. 121).
18 For a valuable attempt at distinguishing between degrees to which a computer may be called general-purpose, see Bromley 1983.
19 For an introduction to computability theory, see Davis et al. 1994.
20 Strictly speaking, branching is not necessary for computational universality, but the alternative is too impractical to be relevant (Rojas 1998).
21 For more details on virtual memory, including its many advantages, see Patterson and Hennessy 1998, pp. 579-602.
22 Cf.: “The input to a neuron is analog (continuous values between 0 and 1)” (Churchland and Sejnowski 1992, p. 51).
23 Cf. Goodman 1968, Lewis 1971, Haugeland 1981, and Blachowicz 1997.
24 Authoritative works on analog computers, which I have used as sources for the following remarks, include Jackson 1960, Johnson 1963, Korn and Korn 1972, and Wilkins 1970.
25 For some speculations in this direction, see Rubel 1993.
26 For more details on pipelining, see Patterson and Hennessy 1998, chap. 6.
27 For the notions of Boolean circuit and finite state automaton and their computing power, see Piccinini 2003.
28 Strictly speaking, (6) does not presuppose (5). For instance, universal TMs are not stored-program. In practice, however, all supporters of (6) also endorse (5), for the good reason that there is no evidence of a storage system in the environment—analogous to the tape of TMs—that would store the putative programs executed by brains.
29 For a detailed analysis of McCulloch and Pitts’s theory, see Piccinini forthcoming b.
30 For a more extended discussion of von Neumann’s theory of the brain, see Piccinini 2003a.