Monday, July 21, 2008

New Gadgets

Last Friday I turned 24 years old (nearly a quarter century!), and I ventured away from Ann Arbor for some fun with family and friends. I was given several nice gifts, but I want to highlight two that satisfy some of my greatest interests: technology and coffee.

First, the Flip Mino, a miniature camcorder.

It's about the size of a pack of cigarettes and holds up to an hour of video. The best part is the flip-out USB adapter: all I have to do is plug it into my MacBook and upload, or connect the TV Out cable and display the picture on a bigger screen. I've recorded a couple of short clips, but nothing worth posting yet. There may be an opportunity to record and post video of some of the more visually interesting things I do in lab. YouTube, here I come!

Second, a Bialetti Mukka, a stovetop device that lets you make cappuccinos and lattes without an expensive espresso machine and milk steamer. It works great, like magic actually, as you can see in the demonstration video at that Williams-Sonoma link.

This Mukka is going to save me so much money. Every morning I get coffee at the Espresso Royale outside my building. Every morning. And not just coffee; sometimes I get fancy drinks, like a soy chai latte. This amounts to $2-4 a day. Just this summer, I've spent over $200 on coffee.
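For a rough sense of the yearly damage, here's the back-of-the-envelope math (the $3 average and the five-day week are my own assumptions, not actual receipts):

```python
# Back-of-the-envelope coffee math; the average price and
# days-per-week below are assumptions, not exact figures.
avg_price = 3.00       # midpoint of the $2-4 range
days_per_week = 5      # weekday coffee runs
weeks_per_year = 50

yearly_cost = avg_price * days_per_week * weeks_per_year
print(f"~${yearly_cost:.0f} per year on coffee")  # ~$750 per year
```

At that rate, the Mukka pays for itself in a month or two.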

Monday, July 14, 2008

One computer to rule them all

This piece from Wired reeks of the pop-science sensationalism that I often find hard to stomach, but it's a cool idea, and they have flashy graphics! Wired is like the US Weekly (OK, maybe more like People) of science/tech magazines; it focuses on science that's hot, in the Paris Hilton sort of way. (Last-minute edit: NO, Paris Hilton is not hot; this refers to her catchphrase "That's hot," which she uses to indicate something cool or trendy or expensive. Fans of the movie Mean Girls may also relate to the term "fetch," as in, that piece of clothing is soo fetch! Clearly, these terms can be applied to science-y things as well, as in, "that particle accelerator is hot!")

But Wired also simplifies to such an extreme as to state speculation as fact, seemingly removing the need for critical thought or consideration. If one were only to read Wired, and never examine more serious science journalism (or, maybe-just-maybe, actual science papers!), one would come to believe that researchers are curing cancer, communicating with aliens, or, in the case of the highlighted article, building computers equal to the human brain. Such computers, or computer, singular, the One Computer that is a conglomerate of all the world's data-calculation machines, interconnected by the world wide web, can already do more calculations than your brain. By 2040, this net of computers will have the combined processing power of all the human brains in the world. Wired's conclusion is simple: computers are totally becoming smarter than humans, which is awesome! It's so simple!


The article makes a simple analogy between a neuron (a nerve cell, as you might have heard it called) in the brain, and a transistor on a computer chip. The One Computer has many many many more transistors, or switches, than a single brain has neurons. But neurons are only part of the brain's processing story. It's the connections between neurons, or synapses, that matter for information processing, learning, and, as a collective, thought. The synapse analog in the global computer is the hyperlink - a connection between webpages. Compared to the human brain, with roughly 100 trillion synapses, the One Computer falls short at 57 trillion hyperlinks. That's already more than half, sure, and it's increasing exponentially, but I don't think the basic analogy holds up. In the brain, synapses, these neuron-neuron connections, are constantly changing, strengthening, weakening, decaying, shifting, and splitting. Their chemical makeup is also in constant flux; synapses are some of the most molecularly complicated cell interfaces around. When you learn something new, taking in and processing sensory information, it doesn't just run through a circuit of static nodes, like transistors and links; the very nature of the cells changes, in infinitely minute ways. This change occurs continuously, from the time your brain develops in the womb to when you die. THAT's a lot of information, all locked up in one three-pound mass.
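Just to put those quoted numbers side by side (a quick sketch; the doubling time at the end is my own simplification of Wired's "increasing exponentially" claim, not a figure from the article):

```python
import math

# Figures quoted above: ~100 trillion synapses per brain,
# ~57 trillion hyperlinks in the "One Computer" (2008).
synapses_per_brain = 100e12
hyperlinks_now = 57e12

ratio = hyperlinks_now / synapses_per_brain
print(f"hyperlinks = {ratio:.0%} of one brain's synapse count")

# If links doubled every two years (an assumption of mine),
# how long until the link count matches one brain's synapses?
years_to_parity = 2 * math.log2(synapses_per_brain / hyperlinks_now)
print(f"parity in about {years_to_parity:.1f} years at that rate")
```

Of course, counting links says nothing about what each connection can do, which is the whole point of the paragraph above.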

In order to form the equivalent connection on the internet, there would have to be a much, much more fluid interface between nodes/machines/pages. The difference between the brain and a big computer is information density. We're talking millions of molecules - receptors, enzymes, signaling proteins, etc. - moving, recycling, pulsating with life within a single synapse. If a computer system can ever approximate that sort of complexity at each of its connections, and do it on roughly 20 watts of power, I'll be impressed. What this Wired article fails to point out is that while we are busy congratulating ourselves for technological innovation, we forget that that innovation is plagiarized from the natural world. Something that's only been "online" for a couple billion years.

Okay, okay, I get it, that's not a "hot" message.

Saturday, July 5, 2008

Check out this brain book

I'm reading Soul Made Flesh: The Discovery of the Brain--and How It Changed the World (buy it from Amazon), by Carl Zimmer. Zimmer is a science journalist and author whose work appears often in the New York Times, Discover, Scientific American, and other publications. He also has a blog, The Loom, and has written several books on eclectic science topics.

Soul Made Flesh is an account of the conceptual developments and discoveries of the past 2,000 years that led to our current understanding of the brain and nervous system, presented within their political and cultural context. From the start the focus is on Aristotle and the Four Humors, then Galen and Descartes. The meat of the book dwells on the Oxford circle in 17th-century Britain. This group of natural philosophers included Robert Hooke, Christopher Wren, Robert Boyle, and Thomas Willis, with influences from Thomas Hobbes, among others. In the midst of the English Civil War, the fall of the monarchy at the hands of Oliver Cromwell, the eventual Restoration of royal rule, and many religious feuds, Oxford remained a bastion of scientific pursuit. Willis and the others performed countless dissections and experiments on the corpses of criminals (and dogs), leading to a detailed account of the anatomy of the nervous system and the first basic understanding of where human perception, thought, emotion, and intelligence reside. For nearly 1,500 years before their discoveries, most people thought the heart was the center of human sensation and the soul was its rational intellect. Aristotle's dogma loomed large over the Catholic church of Europe, and criticism of his Humors was serious business.

Most of the men of the Oxford group are known today for other work (e.g., Boyle, of Boyle's Law fame. Remember high-school chemistry?), or not known at all, but they are largely responsible for giving brain science (and in some ways, science in general) a systematic framework. It's true, though, that most were polymaths, dabbling in alchemy/chemistry, medicine, physics, astronomy, and other fields. Willis, the unofficial leader of the group, published detailed drawings of the nerves, muscles, and brain, and is partly responsible for the idea of blood circulation. He was also the first psychiatrist, unknowingly diagnosing disorders like bipolar disorder, depression, schizophrenia, and Alzheimer's centuries before they were fully understood. Most of this seems to be forgotten today. You might have heard of Boyle, or Hooke and Hobbes, but you probably don't know Willis. Ask any medical or neuroscience graduate student, however, and they will be able to credit Willis with at least one thing: the Circle of Willis.

The Circle of Willis is a ring of blood vessels that runs along the underside of the brain and brainstem, supplying them with blood. Oh, how many times I drew this picture during neuroanatomy last spring. This is all that Willis gets, his name associated with a few (very important) blood vessels. It's a raw deal, if you ask me.

I'm two-thirds of the way through the book at the moment - still in the 17th century - so I've yet to get to the more modern work of 19th-century anatomists and the rise of the neuron in the 20th century, topics I'm more familiar with. So far this is a very nice account of the conceptual revolution underlying the shift away from a dualist perspective on the brain/mind that dates back to the ancient Greeks. The Oxford circle, though deeply religious themselves, moved the understanding of human consciousness away from the mystical soul-talk of the Middle Ages, and into the head. The cultural significance of this shift was huge, and Zimmer spends a lot of time discussing the persistent ideological struggles in Britain and elsewhere.

This obviously isn't just a neuroscience book. If you are interested in understanding how radical ideas develop within a cultural and political context, this offers an interesting perspective. Other late-Renaissance philosophers and scientists, like Galileo, revolutionized their fields amidst similar tumult. There are also plenty of good bits on England early in its colonial period, something most history classes seem to gloss over. In the middle of the book, which I just read through, the chapters sometimes meander, and you're not sure whose book this is, whom Zimmer is focusing on. I think that underscores the point, however, that big changes in the way people view themselves and their world, like the shift from a metaphysical mind to one made of matter, require a lot of time and a lot of people working to drive that change.
