Dec 31, 2010
Today marks the end of my fourth year of blogging. It is estimated that 60-80% of new blogs go dead within a month, so I’m happy to have survived for four years. NeuroLogica has fulfilled what I intended for it – to keep me writing on a regular basis, to provide a useful outlet for engaging in the online skeptical conversation, and to attract attention from journalists looking for information or story ideas.
The world of blogging has evolved a bit over the past four years. Technorati tracks blogging trends, and their “state of blogging” report for 2010 notes several of them. They note that bloggers are using more social media to spread their blogs. And blogs have been having an increasing effect on mainstream media. I have definitely noticed this myself. In many cases the science news cycle has expanded to include a phase of analysis by bloggers, followed by mainstream reporting of that analysis. In the past the media might give a completely bogus report on some science news item, and that would be the end of it, until a month or two later when the popular science journals covered the story in more depth. Now it takes a day or two for science bloggers to dissect and, if necessary, correct the story. This is often followed by the mainstream media then readdressing the story – “Hey, remember that story we told you a few days ago? Well, it turns out it’s BS.”
I also see a trend where journalists are increasingly going to popular science bloggers for information while writing the original report, rather than waiting to get smacked down after they publish. This is a good trend, and I think in order for journalists to survive they will have to take advantage of those scientists and experts who blog.
All things considered, I think blogging has had a positive impact on the flow and quality of information, and it is still not fully mature. I am glad to be a part of it.
Thanks to all my readers, especially those who take the time to regularly comment or who have sent me blogging ideas. Have a great 2011.
Dec 30, 2010
For the last half-century (more than my entire lifetime) industrialized nations have lived a relatively bedbug-free existence. This year, however, bedbugs have started to make a comeback. Reports of bedbug infestations in hotels, theaters, stores, and homes have increased dramatically. Bedbugs, it seems, are not a thing of the past. Rather, we have been living in a brief respite from these parasites, which will now resume their former levels of feasting on the blood of sleeping humans.
To me, this story was surprisingly surprising – meaning that I was initially surprised that we had not permanently dealt with the bedbug problem, and then I was curious as to why I was so surprised. Perhaps it is a result of some cognitive bias worth exploring.
But first – the story of bedbugs.
Dec 28, 2010
A new report in Science details a possible new way to harness sunlight for energy. Researchers Chueh et al. exploit the properties of ceria (cerium oxide) to use sunlight to drive chemical reactions that split CO2 and H2O into CO (carbon monoxide), H2 (hydrogen gas), and O2 (oxygen gas). The hydrogen could be collected and used as fuel. The CO is a nasty byproduct, but it could be used in further chemical reactions to create other types of fuel, such as methane.
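For readers who want to see the chemistry, the ceria approach is generally described as a two-step thermochemical cycle. The sketch below reflects my reading of how such cycles work, not a verbatim reproduction of the paper’s equations; δ denotes the oxygen non-stoichiometry of the reduced ceria, and the exact value depends on operating temperature.

```latex
% Step 1: high-temperature reduction driven by concentrated sunlight
% Step 2: lower-temperature re-oxidation that produces the fuel gases
\begin{align}
\mathrm{CeO_2} &\rightarrow \mathrm{CeO_{2-\delta}} + \tfrac{\delta}{2}\,\mathrm{O_2} \\
\mathrm{CeO_{2-\delta}} + \delta\,\mathrm{H_2O} &\rightarrow \mathrm{CeO_2} + \delta\,\mathrm{H_2} \\
\mathrm{CeO_{2-\delta}} + \delta\,\mathrm{CO_2} &\rightarrow \mathrm{CeO_2} + \delta\,\mathrm{CO}
\end{align}
```

The appeal of this design is that the ceria is regenerated each cycle, so sunlight is the only net energy input and the ceria acts as a reusable oxygen shuttle.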
The ability to derive useful energy from sunlight is very appealing. Far more solar energy falls upon the surface of the earth than we use – in fact, the sunlight striking the earth in a single year carries more energy than all of our fossil fuel reserves combined.
The trick is how to efficiently and cost-effectively access this energy. (The term “exergy” is used to refer to the portion of energy that can be used to do work – and that is what we are talking about when we discuss harnessing energy.) While sunlight is abundant, free, renewable, and clean – it also has some drawbacks. Right now our methods of utilizing this resource are inefficient and not cost-effective. Further, sunlight is an intermittent source of energy – available only during the day and limited by cloud cover. It also varies by latitude, and so may become cost-effective in Florida and Mexico long before it is in Canada.
Dec 27, 2010
Three years ago I wrote a post about a popular illusion – the spinning girl or silhouette illusion. The illusion is popular online, and that post remains my most popular. (Original illusion by Nobuyuki Kayahara here.) The popularity of this illusion seems to be tied to the fact that it is used in many online quizzes, with the claim that the direction in which you see the girl spin will tell you which side of your brain is dominant. In my prior post I primarily addressed that claim – explaining that the “left brain – right brain” thing is all nonsense, and which way the girl appears to spin tells you nothing about your personality or talents. (Briefly – while many neurological functions are lateralized to one side of the brain or the other, both hemispheres are massively connected and work together to form your abilities and personality.)
The real question prompted by this illusion is why we perceive it as rotating one way or the other, and whether there is a preference. It turns out that most people will see the girl spinning clockwise. You can get her to switch and spin opposite to your original perception – but when first looking at the illusion most people will see her spinning clockwise.
Dec 23, 2010
I recently saw the movie Tron: Legacy – the sequel that just came out last week. Not a bad sequel for a Disney movie – the plot and characters were a bit thin, but there was some nice eye-candy. The movie did try one thing that few have attempted before it – accurately simulating a realistic person in CG. They needed a young Jeff Bridges, so they created him in CG. It was pretty good, but just slightly off. I especially noticed it around the mouth when he spoke – it was creepy.
Despite advances in computer graphics and animation, it is not yet possible to create a convincing human. That’s partly why so many CG movies feature bugs, toys, robots, and dragons – they look seamless. But the human ability to discriminate human expression is remarkable, and so subtle that even the slightest imperfections are noticeable and tend to provoke an emotional response.
This phenomenon has been dubbed the “uncanny valley.” The term refers to a map of emotional acceptance on the vertical axis and accuracy of human simulation on the horizontal axis. As simulations appear more human we tend to accept them more – until they get close to realistic, but not quite. Then acceptance plummets into the uncanny valley, where it turns into revulsion.
Dec 22, 2010
If ever there were an oxymoron it is this phrase: “scientific heresy.” I understand it may be used at times as a bit of poetic license, a metaphor for a new and seemingly outrageous (but scientific) idea, but I despise it nonetheless. The phrase is more often used as a direct or implied criticism of science and scientists, and generates deliberate confusion.
The notion of heresy is – well, Wikipedia actually has a good summary:
Heresy is a controversial or novel change to a system of beliefs, especially a religion, that conflicts with established dogma. It is distinct from apostasy, which is the formal denunciation of one’s religion, principles or cause, and blasphemy, which is irreverence toward religion. The founder or leader of a heretical movement is called a heresiarch, while individuals who espouse heresy or commit heresy, are known as heretics.
Heresiology is the study of heresy.
Now, I’m no heresiologist, but it seems to me that the core of the notion of “heresy” is inextricably tied to the notion of dogma – a fixed set of beliefs promoted and sustained through authority. Dogma and heresy are anathema to science.
Dec 21, 2010
About a year ago I asked my readers and SGU listeners to send me suggestions about medical myths they would like to hear debunked. Now I get to reveal what project that was for – I was asked to record a course series for The Teaching Company. The final result is now available:
Medical Myths, Lies, and Half-Truths: What We Think We Know May Be Hurting Us
This is a series of 24 lectures, each 30 minutes long, on a variety of medical topics. It is available in audio (CD or download) and video (DVD) format. This was a year-long project, and contains a great deal of new material I have never written or lectured about before.
Dec 20, 2010
Last week on SkepticBlog Michael Shermer wrote a nice post about JFK assassination conspiracies, and not surprisingly a couple of conspiracy advocates showed up in the comments. While reading through their arguments I was struck by how consistent the tactics and tone of conspiracy theorists tend to be. They are heavy on sarcasm, ridicule, and condescension, and like to call anyone who disagrees with them “gullible.”
It also struck me that skeptics can often take a similar tone, and certainly conspiracy theorists (as with deniers) think of themselves as being the true skeptics. But they are skeptics’ evil twins – they use a tone that only the harsher skeptics use, and only when dealing with the truly absurd – those topics that we do not wish to legitimize with serious treatment, but don’t wish to ignore either. Some claims deserve ridicule, and anything less falsely elevates them.
It is true that sometimes skeptics do not properly adjust their tone when dealing with topics that range from the truly absurd to the genuinely controversial. I do think it is counterproductive and unfair to attack a well-meaning and generally scientific individual with whom you happen to disagree about a complex and controversial topic, as if they were a homeopath or creationist. This is a minor problem, for example, with the show Bullshit. Penn and Teller have created a premise for their show that does not lend itself to a nuanced discussion of a scientific controversy – and so they end up giving circumcision and second-hand smoke the same treatment as magnet therapy and feng shui.
Dec 17, 2010
This is my new favorite internet toy – the Google Books Ngram Viewer. You may already know that Google Books is a project to digitize as many books as possible. Many of the recent books are under copyright, so they cannot be made available online for free (authors and publishers have to eat too). But the word counts can be made available. The Ngram Viewer is a great example of the power of computers and the internet to facilitate research and human knowledge.
Google currently has 5,195,769 books digitized – a massive storehouse of knowledge spanning 400 years of human culture. What the Ngram Viewer allows you to do is search on words, and it will plot a graph of how many times those words appear in the books it has digitized. This allows you to see trends over time. Obviously this can be used to track word usage, and is a boon to etymologists, but those words also have meanings, and those can be tracked as well. Obviously there are multiple variables involved, but still this is a powerful window into the reflection of human culture in the written word.
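As a rough illustration of what the viewer computes, here is a minimal sketch (with made-up numbers, not actual Google Books data) of the key normalization step: raw yearly word counts are divided by the total words published that year, so that a word’s trend is not an artifact of more books simply being published over time.

```python
# Sketch of n-gram trend normalization (hypothetical counts, not real data).
def relative_frequency(word_counts, total_counts):
    """Per-year frequency of a word as a fraction of all words printed that year."""
    return {year: word_counts[year] / total_counts[year] for year in word_counts}

# Hypothetical counts for one word across three sample years
word_counts = {1900: 120, 1950: 480, 2000: 2400}
total_counts = {1900: 1_000_000, 1950: 2_000_000, 2000: 4_000_000}

freqs = relative_frequency(word_counts, total_counts)
for year in sorted(freqs):
    print(f"{year}: {freqs[year]:.6f}")
# The raw count grew 20-fold, but the normalized frequency grew only 5-fold,
# because the total corpus also grew over the same period.
```

The design point is that without this normalization nearly every word would appear to “rise” over time, simply because the corpus gets larger toward the present.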
This is already the source of a great deal of research – which goes beyond simply searching on words to comparing multiple searches looking for trends. For example, one researcher compared the use of names of famous people to track their fame. He found that over time people tend to become famous more quickly and at a younger age, and that their fame also fades faster.
I decided to do a little quick research myself. Here’s what I found:
Dec 16, 2010
A recent opinion piece in the New Scientist by Dan Hind continues the debate about democratizing science – changing or adding to the institutions of science to introduce more democratic public participation. This is an interesting debate, with several facets, and I have some mixed feelings about it.
Hind’s article is mainly about science funding, which I will get to shortly. But if you search for articles on “democratizing science” you will see that the issue extends beyond funding. For example, it also includes the notion of making journal articles available for free to the public online. I am completely in favor of this form of democratization – open access, all the way. I understand the need for journals to be viable commercial entities. I would like for journals to find business models that are viable but still allow for open public access to content.
I am spoiled because I have institutional access through my job to most journals, but when I am away from work or want access to a journal that Yale does not subscribe to, it is extremely frustrating not to have access to the full articles I need for my research. Sometimes this affects my blogging (and therefore must also affect other bloggers), and so it detracts from the public discourse on the topic. Usually the articles are available for pay – at what I think are ridiculous prices, often $50 or more for one-time access to a single article. That’s just not in my blogging budget.