This piece from Wired reeks of the pop-science sensationalism I often find hard to stomach, but it's a cool idea, and they have flashy graphics! Wired is like the US Weekly (OK, maybe more like People) of science/tech magazines; it focuses on science that's hot, in the Paris Hilton sort of way. (Last-minute edit: NO, Paris Hilton is not hot; this refers to her catchphrase "That's Hot," which she uses to mean something cool or trendy or expensive. Fans of the movie Mean Girls may also relate to the term "Fetch," as in, that piece of clothing is soo fetch! Clearly, these terms can be applied to science-y things as well, as in, "that particle accelerator is hot!")

But Wired also simplifies the science to such an extreme that speculation gets stated as fact, seemingly removing any need for critical thought or consideration. If one only read Wired, and never examined more serious science journalism (or, maybe-just-maybe, actual science papers!), one would come to believe that researchers are curing cancer, communicating with aliens, or, in the case of the highlighted article, building computers equal to the human brain. Such computers, or computer, singular, the One Computer, a conglomerate of all the world's data-calculation machines interconnected by the world wide web, can already do more calculations than your brain. By 2040, this net of computers will supposedly have the combined processing power of all the human brains in the world. Wired's conclusion is simple: computers are totally becoming smarter than humans, which is awesome! It's so simple!
The article makes a simple analogy between a neuron (a nerve cell, as you might have heard it called) in the brain and a transistor on a computer chip. The One Computer has many, many, many more transistors, or switches, than a single brain has neurons. But neurons are only part of the brain's processing story. It's the connections between neurons, the synapses, that matter for information processing, learning, and, as a collective, thought. The synapse analog in the global computer is the hyperlink, a connection between webpages. Compared to the human brain, with roughly 100 trillion synapses, the One Computer falls short at 57 trillion hyperlinks. That's already more than half, sure, and increasing exponentially, but I don't think the basic analogy holds up. In the brain, synapses, these neuron-neuron connections, are constantly changing: strengthening, weakening, decaying, shifting, and splitting. Their chemical makeup is also in constant flux; synapses are some of the most molecularly complicated cell interfaces around. When you learn something new, taking in and processing sensory information, it doesn't just run through a circuit of static nodes, like transistors and links; the very nature of the cells changes, in infinitely minute ways. This change occurs continuously, from the time your brain develops in the womb to the day you die. THAT's a lot of information, all locked up in one three-pound mass.
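To be fair to Wired, the link-counting arithmetic is easy to sketch. Here's a back-of-envelope version, assuming (purely for illustration; the article doesn't give a rate) that the hyperlink count doubles every two years:

```python
import math

SYNAPSES = 100e12     # rough synapse count in one human brain (figure from the article)
HYPERLINKS = 57e12    # hyperlink count cited for the "One Computer"
DOUBLING_YEARS = 2.0  # hypothetical doubling time, invented here for illustration

# Solve HYPERLINKS * 2**(t / DOUBLING_YEARS) = SYNAPSES for t:
years = DOUBLING_YEARS * math.log2(SYNAPSES / HYPERLINKS)
print(f"hyperlinks outnumber synapses in about {years:.1f} years")  # ~1.6 years
```

Under these made-up numbers the crossover is almost immediate, which is exactly why the raw count is the wrong thing to compare: it says nothing about what each connection can do.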
In order to form the equivalent connection on the internet, there would have to be a much more fluid interface between nodes/machines/pages. The difference between the brain and a big computer is information density. We're talking millions of molecules, receptors, enzymes, signaling proteins, and the rest, moving, recycling, pulsating with life within a single synapse. If a computer system can ever approximate that sort of complexity at each of its connections, and do it on the twenty or so watts of power a brain runs on, I'll be impressed. What this Wired article fails to point out is that while we are busy congratulating ourselves for technological innovation, we forget that the innovation is plagiarized from the natural world. Something that's only been "online" for a couple billion years.
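The static-versus-fluid contrast can be made concrete with a toy sketch of my own (nothing from the article, and a cartoon of real plasticity at best): a hyperlink is a fixed pointer that is unchanged by use, while even the simplest caricature of a synapse carries a strength that changes every time a signal passes through it.

```python
from dataclasses import dataclass

@dataclass
class Hyperlink:
    target: str  # a static pointer; following it changes nothing about the link

@dataclass
class ToySynapse:
    weight: float = 0.5  # connection strength, in constant flux

    def transmit(self, signal: float) -> float:
        out = self.weight * signal
        # cartoon plasticity: a little decay each step, a little strengthening with use
        self.weight = min(1.0, self.weight * 0.99 + 0.05 * signal)
        return out

link = Hyperlink(target="somepage.html")
syn = ToySynapse()
for _ in range(10):
    syn.transmit(1.0)
print(link.target)            # the link is exactly what it was
print(round(syn.weight, 3))   # the connection itself has strengthened with use
```

Even this cartoon only gives each connection one mutable number; the post's point is that a real synapse is millions of molecules deep, which no table of links approximates.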
Okay, okay, I get it, that's not a "hot" message.