In a recent post, PZ Myers aggressively criticizes Ray Kurzweil's prediction that the human brain will be digitally simulated by 2029. Yesterday Kurzweil responded, restating his belief that the complexity of the brain, though considerable, must not be overestimated. One of Kurzweil's arguments is that the genetic blueprint for the brain is about 50 MB. (Update: Myers has now responded to Kurzweil's reply.)
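The 50 MB figure can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, assuming the usual ~3.2 billion base pairs for the human genome at 2 bits per base; the compression ratio implied by Kurzweil's 50 MB claim is his assertion, not a measured value:

```python
# Back-of-envelope check of the "50 MB blueprint" figure.
base_pairs = 3.2e9        # approximate human genome size (assumption)
bits_per_base = 2         # four nucleotides -> 2 bits each
raw_bytes = base_pairs * bits_per_base / 8

print(f"raw genome: {raw_bytes / 1e6:.0f} MB")            # ~800 MB
# Kurzweil argues redundancy lets the genome compress losslessly
# to roughly 50 MB; that implies this ratio:
print(f"implied compression ratio: {raw_bytes / 50e6:.0f}:1")
```

So the raw information content is on the order of 800 MB, and the 50 MB figure presupposes roughly 16:1 lossless compression, which is itself an assumption doing a lot of work in the argument.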
I'm torn here. Intuitively I agree with Myers: the brain consists of biological tissue and is therefore, in principle, of unbounded complexity. Our understanding of even a single synapse, for example, is still very limited, as evidenced by entire journals being dedicated to research on synapses. But there is of course a considerable degree of noise in the brain, which suggests a limit to its useful complexity (Eliasmith 2000).
How much of the brain's complexity must we include in digital simulations for them to, say, pass the Turing test? This is where Kurzweil goes wrong. Assertions like "the cerebellum (which has been modeled, simulated and tested)" and "We have sufficiently high-resolution in-vivo brain scanners now that we can see how our brain creates our thoughts and see our thoughts create our brain" do indeed indicate that, as Myers puts it, Kurzweil does not understand the brain, and they serve only to remind us of Henry Markram's scathing (but accurate) dismissal of the IBM cat brain simulation.
The issue seems to be this: Kurzweil's successful predictions - the decoding of the human genome, the growth of the internet - concerned discrete systems, where the units (base pairs decoded, computers connected) could be clearly defined and counted. Digital simulations of brains are indeed growing exponentially in size and complexity, but we simply do not know how complex they need to be before they can be said to match their biological counterparts. Kurzweil needs to present a scientifically robust theory of brain function before neuroscientists will take his 2029 prediction seriously.