[The following is inspired by my reply to two questions posed by a high school student: how fast does the human brain compute, and how much information can it store? My original response to that student has since been updated.]

There are several ways to answer the question about how fast the brain processes information.

The best answer can be computed directly, because we have good estimates for the three main variables that enter into it: how many neurons (brain cells) we have, how fast a neuron can fire, and how many other cells each neuron connects to. A human being has about 100 billion brain cells. Different neurons fire at different speeds, but as a rough estimate a neuron can fire about once every 5 milliseconds, or about 200 times a second. The number of cells each neuron connects to also varies, but 1000 is a reasonable round figure, so every time a neuron fires, about 1000 other neurons receive information about that firing. If we multiply all this out, we get 100 billion neurons X 200 firings per second X 1000 connections per firing = 20 million billion (2 x 10^16) basic operations per second.
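The multiplication above is easy to check directly. Here is a minimal sketch using the three order-of-magnitude figures from the text:

```python
# Rough estimate of the brain's raw "clock speed", using the
# order-of-magnitude figures from the text.
neurons = 100e9          # ~100 billion neurons
firings_per_sec = 200    # ~1 firing every 5 milliseconds
connections = 1000       # ~1000 downstream neurons per firing

ops_per_sec = neurons * firings_per_sec * connections
print(f"{ops_per_sec:.0e} basic operations per second")  # 2e+16
```

Since every input is only an order-of-magnitude guess, the product inherits that uncertainty: changing any one input by a factor of ten changes the result by the same factor.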

This estimate might easily be off by an order of magnitude; that is, it might be 10 times too high or too low. It is also a bit misleading, because it estimates the raw 'clock speed' of the brain, which is much higher than the number of genuinely useful calculations we perform in a second. An apparently much simpler way to approach the problem is to note that the time it takes the brain to make a really simple decision, like naming a picture or reading a word aloud, is about 300-700 milliseconds. By that measure, the brain can make only about two conscious calculations per second. However, this figure is also misleading, for several reasons. One is that many well-trained brains can make incredibly complex decisions that quickly. Moreover, even simple tasks like reading a word aloud are in fact very complex, requiring huge amounts of low-level computation. Finally, note that your brain is doing all sorts of things unconsciously at the same time, maintaining your body and its relation to the world, whenever you are engaged in conscious calculation. So depending on whether you want the raw clock speed or some higher-level measure of information processing, the question has two answers that differ widely.
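The gap between the two answers can be made concrete. A small sketch, taking the midpoint of the 300-700 ms decision time as a rough working value:

```python
# "Conscious" rate: roughly one simple decision per 300-700 ms.
decision_time_s = 0.5                 # midpoint of 300-700 ms (rough assumption)
conscious_rate = 1 / decision_time_s  # ~2 decisions per second

# Raw "clock speed" rate from the estimate above.
raw_rate = 2e16                       # basic operations per second

print(conscious_rate)                             # 2.0
print(f"raw rate is {raw_rate / conscious_rate:.0e} times higher")
```

The two measures differ by sixteen orders of magnitude, which is why the question has no single answer.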

It is interesting to put this into the perspective of contemporary technology. The 'clock speed' of a neuron is abysmal by technological standards. The central processing unit in the machine on which I wrote this document has a 1 GHz clock speed, which means it runs 1000 million clock cycles per second. If we divide 1000 million by 200, we see that the CPU in my computer is 5 million times faster than the clock speed of a neuron. Of course, the computational power of our brains comes from the fact that we have a great many neurons working at once. Nevertheless, the gap between technology and neurons is closing fast. It has been estimated many times, by many different people, using uncontroversial projections into the future (the exponential growth curve suggested by Moore's law), that we will have a computer that can process as many bits per second as the human brain within a few decades at most. Soon thereafter, computers will exceed human beings in raw processing power. If you are under 50 years old as you read this, then you can reasonably bet that before you die you will own a cheap desktop computer that processes more information than your brain does (Ray Kurzweil estimates that cheap computers will process this fast by 2023). If you are under 25 years old, there is a good chance that you will own a cheap desktop computer that processes more information than the whole human race (Ray Kurzweil estimates that cheap computers will process this fast by 2049). This does not necessarily mean that such computers will be as intelligent as you, or as a small town full of people, since at the moment we have no idea how to program computers to be generally intelligent. But nobody really knows what such computers might be capable of in the long run. If you are interested in speculation on this issue, check out Ray Kurzweil's site, and (if change doesn't make you squeamish) look into the Singularity.
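The clock-speed comparison is a one-line division:

```python
# Per-"clock" comparison: a 1 GHz CPU versus a neuron firing ~200 times/sec.
cpu_hz = 1e9       # 1 GHz = 1000 million cycles per second
neuron_hz = 200    # ~1 firing every 5 ms
ratio = cpu_hz / neuron_hz
print(f"CPU is {ratio:.0e} times faster per clock")  # 5e+06: five million
```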

The second question was about the storage capacity of the brain. This is much more difficult to measure, because we don't really understand how the brain stores information, and we do know that it can use very different methods for different storage problems. For example, excellent chess players are far better at remembering chess positions than poor chess players, and trained musicians are far better at remembering music than untrained musicians.

Some people have estimated that the storage capacity of the human brain is functionally infinite; that is, we can essentially always find room to store more information if we want to, so no practical limit exists. A more principled lower estimate can be made using the numbers above. Let's assume that a change in the strength of any connection between two neurons stores one bit of information, and further assume (a huge over-simplification) that neural connections have just two possible strengths, like a bit in a computer, which is either 1 or 0. Then each neuron has 'write' access to 1000 bits of information, or about 125 bytes. So we have 100 billion neurons X 1000 bits, or 10^14 bits of storage capacity. That's about 12.5 million megabytes. Since in fact neural connections are not two-state but multi-state, and since neuron bodies can also change their properties and thereby store information, this is a very low estimate, so you can see why some people have judged the capacity to be functionally infinite.
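The lower-bound estimate works out as follows (a sketch under the one-bit-per-connection simplification described above):

```python
# Lower-bound storage estimate: one two-state connection = one bit.
neurons = 100e9                 # ~100 billion neurons
bits_per_neuron = 1000          # 1 bit x 1000 connections per neuron

total_bits = neurons * bits_per_neuron   # 1e14 bits
total_bytes = total_bits / 8
total_mb = total_bytes / 1e6
print(f"{total_bits:.0e} bits = {total_mb:.2e} megabytes")  # 1e+14 bits = 1.25e+07 MB
```

That is, about 12.5 million megabytes, remembering that this is a floor rather than a realistic figure, since real connections carry more than one bit each.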

However, we can also make the same kind of 'reality adjustment' we made for the speed question above. As you probably know, the number of bits used to store an item on a computer is not one bit per item. For example, storing one letter of text (one item) takes a theoretical minimum of seven bits, and in real computers it usually takes more. Storing one picture can take thousands or even millions of bits. The same must apply to the human brain: each memory must be composed of a great many bits. The first person to try to estimate the storage capacity of a human brain was Robert Hooke, in 1666. He estimated how fast he could think, multiplied by his lifespan, and concluded that he had 2 x 10^9 bits of storage. He had a high estimate of himself: his figure for an average person was twenty times smaller, at 10^8 bits! A psychologist named Tom Landauer wrote a paper in 1986 ("How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-term Memory", Cognitive Science, volume 10, pages 477-493) in which he tried to estimate, from a review of experimental results, how many useful distinctions a person might be able to remember in all. His estimate was one billion distinctions. At the 2002 Psychonomics conference, Landauer revisited the question. Using a novel technical method (whose details need not concern us here) to estimate how much word knowledge a person has, he arrived at a new estimate in the same ballpark as Hooke's: 4 x 10^8 bits.
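For comparison, these historical estimates can be converted to megabytes. This is a rough sketch only: treating Landauer's one billion "distinctions" as one billion bits is an assumption made here for the sake of a common unit, not something the original paper asserts.

```python
# Convert the historical bit estimates to megabytes (8 bits/byte, 1e6 bytes/MB).
def bits_to_mb(bits):
    return bits / 8 / 1e6

hooke_self = bits_to_mb(2e9)      # Hooke's 1666 estimate for himself
hooke_avg = bits_to_mb(1e8)       # ...and for an average person
landauer_1986 = bits_to_mb(1e9)   # one billion distinctions, TAKEN AS bits (assumption)
landauer_2002 = bits_to_mb(4e8)   # his revised 2002 estimate

print(hooke_self, hooke_avg, landauer_1986, landauer_2002)
# 250.0 12.5 125.0 50.0
```

On this reckoning, all of the "useful distinctions" estimates fit in a few hundred megabytes, dramatically less than the raw connection-count floor.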

These storage estimates may also be placed in a modern technological perspective. The storage capacity of a neuron is so-so by modern technological standards. The hard disk on my computer holds 60,000 megabytes (about 60 gigabytes), so by the lower estimate above the storage capacity of the human brain is equal to about 200 modern hard disks, and the brain looks pretty good. However, the near future promises massive increases in information storage capability, and the past certainly suggests this will happen. My first hard disk, purchased in the mid-1980s (for more than my newest hard disk cost, even without taking inflation into account), held 20 megabytes, or 3000 times less than my latest one. If the future rate of increase in storage capacity mirrors the rate I have seen in my own life to date, then we should be able to put the entire contents of a human brain on a cheap hard disk within about 15 years (if we can think of a way to get that information out of the brain!). Much denser storage is certainly possible: for example, the DNA molecule inside your cells holds about 750 megabytes of information. This has made it possible for someone to take a crack at estimating the amount of information in a single male ejaculate.
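The extrapolation can be sketched as a doubling-time calculation. The 17-year span is a rough assumption (mid-1980s to the time of writing), and the brain figure is the one-bit-per-connection lower bound of 10^14 bits (about 12.5 million MB):

```python
import math

# Disk growth: ~20 MB in the mid-1980s to ~60,000 MB roughly 17 years later.
growth = 60000 / 20                       # 3000x over the period
years = 17                                # rough assumption for the span
doubling_time = years / math.log2(growth) # ~1.5 years per doubling

# How long until a disk holds a brain's worth (lower-bound estimate)?
brain_mb = 1.25e7                         # 1e14 bits = 12.5 million MB
doublings_needed = math.log2(brain_mb / 60000)
years_needed = doublings_needed * doubling_time
print(f"{doubling_time:.1f} yr/doubling, ~{years_needed:.0f} years to go")
```

At roughly a doubling every year and a half, a brain-sized cheap disk arrives comfortably within the 15-year window mentioned above.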

However, molecular storage is nothing. In his book Robot, Hans Moravec discusses the possibility (not yet realized) of using individual low-energy photons to store information. If we could do this, we would be able to store one megabyte in a single hydrogen atom, and ten billion megabytes in a structure the size of a virus. By the lower estimate above, that's the equivalent of the complete memory contents of about 800 people, all contained in a speck very much smaller than a single neuron.
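The division behind that last figure, again using the one-bit-per-connection lower bound:

```python
# Moravec's hypothetical photon storage versus the per-brain lower bound.
speck_mb = 1e10      # ten billion MB in a virus-sized structure
brain_mb = 1.25e7    # one-bit-per-connection estimate (1e14 bits)
print(speck_mb / brain_mb)  # 800.0
```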

The two questions have no definite answers, so I hope these rough answers will be satisfactory.