Archive for the 'General Science' Category

Feb 18 2019

Warning About Big Data in Science

Published under General Science

At the AAAS meeting this past weekend, Dr. Genevera Allen from Rice University in Houston presented her findings regarding the impact of machine-learning algorithms on the evaluation of scientific data. She argues that their use is contributing to the reproducibility problem.

The core problem, which I have discussed many times before, is that if scientists do not use sufficiently rigorous methods, they will find erroneous patterns in their data that are not real. Sometimes this amounts to p-hacking, which results from methods that may seem innocent but that tweak the statistical results in order to manufacture significance. This could be something as innocuous as analyzing the data as you go and then stopping the study when you reach statistical significance. Or, similarly, if you initially plan on testing 100 subjects, and the results are not quite significant, you may decide to enroll another 20 subjects in the hopes that you will “cross the finish line.”
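This kind of optional stopping is easy to demonstrate with a quick simulation (a minimal sketch in Python; the study design, the effect size of zero, and all the thresholds here are hypothetical, not from any real study): even when there is no real effect at all, checking for significance after every subject and stopping as soon as p < 0.05 produces many more "positive" studies than the nominal 5%.

```python
import math
import random
import statistics

def z_test_p(sample):
    # Two-sided z-test of the sample mean against 0, assuming a known SD of 1
    z = statistics.mean(sample) * math.sqrt(len(sample))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def run_study(peek, n_max=100, seed=None):
    # Simulate one study in which the null hypothesis is true (no real effect)
    rng = random.Random(seed)
    sample = []
    for _ in range(n_max):
        sample.append(rng.gauss(0, 1))
        # "Analyzing the data as you go": test after every subject (from 10 on)
        # and stop the study the moment it crosses the significance threshold
        if peek and len(sample) >= 10 and z_test_p(sample) < 0.05:
            return True
    return z_test_p(sample) < 0.05

def false_positive_rate(peek, trials=2000):
    return sum(run_study(peek, seed=i) for i in range(trials)) / trials

print("test once at n=100:", false_positive_rate(peek=False))  # close to the nominal 0.05
print("peek after every subject:", false_positive_rate(peek=True))  # substantially inflated
```

Each peek is another opportunity for random noise to wander across the threshold, so the false positives accumulate well beyond the advertised 5%.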

Here is another issue that is similar to what Allen is warning about. Let’s say a doctor notices an apparent pattern – my patients with disease X all seem to have red hair. So they review their patient records, and find that indeed there is an increased probability of having red hair if you have disease X. That all seems perfectly cromulent – but it isn’t. Or we could say that such a correlation is preliminary, and needs to be verified.

The reason for this is that the physician may have just noticed a random correlation in their patient population – a statistical fluke. Every data set, such as a patient population, will have many such spurious correlations by chance alone. If you notice such a random correlation, that doesn’t make it a real phenomenon, even if you then count the numbers and do the math. You haven’t tested the hypothesis that the correlation is real; you have just confirmed an observation of a chance clumpiness in the data.
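A simple simulation shows just how much chance clumpiness a data set can contain (a hypothetical sketch in Python; the patient counts, trait counts, and "striking gap" cutoff are all made up for illustration): generate patients with a random disease status and a few dozen random traits that have nothing to do with it, and several traits will still show apparent associations.

```python
import random

# Hypothetical patient registry: 200 patients, 50 random yes/no traits
# (red hair, blood type, left-handedness...), none actually related to the disease
rng = random.Random(42)
n_patients, n_traits = 200, 50
disease = [rng.random() < 0.5 for _ in range(n_patients)]
traits = [[rng.random() < 0.5 for _ in range(n_patients)] for _ in range(n_traits)]

def frequency_gap(trait):
    # Difference in trait frequency between patients with and without the disease
    with_d = [t for t, d in zip(trait, disease) if d]
    without = [t for t, d in zip(trait, disease) if not d]
    return abs(sum(with_d) / len(with_d) - sum(without) / len(without))

# Count traits whose frequency differs by 10+ percentage points by chance alone
striking = [i for i in range(n_traits) if frequency_gap(traits[i]) > 0.10]
print(f"{len(striking)} of {n_traits} random traits show a striking gap by chance alone")
```

Notice a gap like this first and then "confirm" it by counting the same records, and you have done nothing but re-measure the fluke; the test has to be run on fresh data.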

Continue Reading »

Feb 07 2019

Can We All Agree the Earth Is Warming Yet?

The last four years were the four warmest years on record (in order: 2016, 2017, 2015, and 2018). Since 1880 the average surface temperature on Earth has risen by about 1 degree C, putting it 0.79 degrees above the 20th-century average. At the same time global ice is decreasing, especially in the Arctic, where sea ice is declining by 12.8% per decade. Sea level is also rising – in 2014 the average sea level was 2.6 inches above the 1993 average.

These numbers are all clear, but are abstract for most people. If scientists didn’t tell us the planet was warming on average, we wouldn’t see it in our daily lives. This made it easier to engage in politically motivated denial. The science is also complex, which leaves a lot of room for rationalization. You can focus on different types of measurement (surface temperatures vs atmospheric, for example) or on the need to adjust the raw data to account for historical changes in measurements. You could play games with the statistics to manufacture an illusory “pause” or focus on the uncertainty.

To be clear, carefully examining the details is critically important. The problem is when you do so with an agenda other than objectively describing reality. There is enough wiggle room to convince yourself of whatever it is you want to believe.

The needle appears to be moving, however, not because the science has become more solid, but because the effects have become more obvious. Extreme weather events are also significantly increasing, by about 40% since 1950. Just in the past year we had record-breaking fires on the West Coast, record-breaking tornadoes in my home state of CT, and now record-breaking polar-vortex-driven cold in the Midwest. We are seeing more powerful hurricanes, like Hurricanes Florence and Michael. There are record-breaking heat waves around the world, with Death Valley last July having the warmest month ever recorded at a single location on Earth.

Continue Reading »

Jan 28 2019

Climate Change Survey

Climate change has certainly been a hot topic over the last year, so where do we stand in terms of public perception? Well – “The Energy Policy Institute at the University of Chicago and The AP-NORC Center conducted a national survey of 1,202 adults in November 2018 to explore Americans’ views on climate change, carbon tax and fuel efficiency standards.” Overall the survey suggests the public is inching toward greater acceptance of man-made climate change, but let’s delve into the numbers.

First, the majority of Americans, 71%, think that global warming is happening. Of those who think it is happening, 60% say it is mostly due to human activity, 12% that it is mostly natural, and 28% that it is evenly mixed. So that means that 62% (88% of 71%) of Americans accept anthropogenic climate change to some degree. That is a solid majority, but not as solid as the science or the consensus of scientists.
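For clarity, that 62% figure is just the survey percentages multiplied together (a quick arithmetic check in Python, using the numbers above; the variable names are mine):

```python
# Survey figures quoted above
believe_warming = 0.71   # think global warming is happening
mostly_human = 0.60      # of those, say it is mostly due to human activity
evenly_mixed = 0.28      # of those, say it is an even mix of natural and human

# Share of all Americans accepting human causation at least in part
accept_anthropogenic = believe_warming * (mostly_human + evenly_mixed)
print(f"{accept_anthropogenic:.0%}")  # 62%
```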

Predictably, these numbers vary dramatically according to political affiliation. For Democrats, the percentages who believe global warming is happening / think that it is mostly or partly human-caused are 86/82. For independents the numbers are 70/64, and for Republicans it’s 52/37. So it seems that part of the problem is a knowledge deficit, for even among Democrats the rate of acceptance is lower than for the scientific community. But a larger part of this resistance is ideological.

For those who have recently changed their minds and now accept the reality of climate change (regardless of cause), 76% say it is because of recent extreme weather events, and 57% because of personal observations of local weather. Meanwhile, 63% say it is because of arguments in favor of climate change (they did not ask specifically about scientists or the scientific consensus). So essentially recent converts are relying more on anecdote than on the science.

Things get more interesting when we get to questions about possible solutions to climate change. Previous research found that people who deny anthropogenic climate change were motivated primarily by “solution aversion” – they deny the problem because they don’t like the proposed solutions. This fits with much far right rhetoric, that climate change is a conspiracy by which liberals seek to expand the government and take control of the energy industry.

Continue Reading »

Dec 14 2018

More Evidence Organic Farming is Bad

I know I have been hitting this topic frequently recently, but I can’t ignore a major study published in Nature. The study is not just about organic farming, but about how we use land and implications for climate change, specifically carbon sequestration. The core idea is this – when we consider land use and its impact on the climate, we also have to consider the opportunity cost of not using the land in a more useful way. This echoes a previous study by different authors I discussed five months ago, and a review article by still different authors I discussed three months ago.

There certainly does now seem to be a growing consensus that we have to think very carefully about how we use land in order to minimize any negative impact on the environment, and specifically limit carbon in the atmosphere driving climate change.

The new study essentially argues that we need to use land optimally. If land is well suited to growing corn, then we should grow corn. If it is better suited for forestation, then we should allow forests to grow there and not convert it to farmland. Forests sequester a lot more carbon than farmland, and this is a critical component of any overall strategy to mitigate climate change. The authors calculate that land use contributes “about 20 to 25 per cent of greenhouse gas emissions.”

If we put the various studies I have been discussing together, a compelling image emerges. First, we need to consider that we are already using all the best farmland to grow crops. Any expansion of our farmland will by necessity be using less and less optimal land for farming. This translates to a greater negative impact on the climate. However, our food production needs will grow by about 50% by 2050.

This is a strong argument, in my opinion, against biofuels. We need that land to grow food, not fuel – unless we can source biofuels from the ocean or industrial vats without increasing land use.

Continue Reading »

Nov 26 2018

New Pew Survey About GMOs

The Pew Research Center has recently published a large survey regarding Americans’ attitudes toward food, including genetic modification, food additives, and organics. There are some interesting findings buried in the data that are worth teasing out.

First, some of the top-line results. They found that 49% of Americans feel that genetically modified organisms (GMOs) are worse for health, while 44% said they were neither better nor worse, and 5% said they were better. So the public is split right down the middle over the health effects of GMOs. The 49% who feel that GMOs are bad for health is up from 39% when they gave the same survey in 2016 – so unfortunately, we have lost ground on this issue.

Breaking these numbers down, we find that women are a little more likely to fear GMOs as a health risk than men, 56% compared to 43%. I suspect this is due primarily to differences in how anti-GMO messages are marketed, and the general marketing of pseudoscience to women (the Goop effect). This is also significant because women are more likely to make food purchasing decisions for their families.

Even more interesting is the relationship between science knowledge and fear of GMOs – among those with a high degree of science knowledge, 38% thought GMOs had health risks, while 52% of those with a low degree of science knowledge thought so. The same pattern is seen through all the subquestions about GMOs. For example, 49% of those with a high degree of science knowledge believe GMOs have the potential to increase the global food supply, while only 20% of those with a low degree of science knowledge believe this.

Continue Reading »

Nov 20 2018

Possible New Branch of Eukaryotes Defined

Scientists report in Nature the identification of two new species of Hemimastigophora, a group of predatory protists. What makes the paper newsworthy is that the authors argue that their genetic analysis suggests Hemimastigophora, currently categorized as a phylum, should instead be its own suprakingdom.

To make sense of this let’s review the basic structure of taxonomy, the system we use to categorize all life. All known life is divided first into three domains, the bacteria, archaea, and eukaryotes. Bacteria and archaea do not have a nucleus, while eukaryotes are larger and have a nucleus which contains most of their DNA.

Eukaryotes are divided into kingdoms, including plants, animals, fungi, protozoa, and chromista (algae with a certain kind of chlorophyll). Kingdoms are then divided into phyla, which are essentially major body plans within that group.

This is a simplified overview, because there is a lot of complexity here, with suprakingdoms, subkingdoms, and further breakdowns. Further, there is a lack of consensus on how to exactly divide up these major groups. Even in the cited paper, the authors say there are 5-8 “suprakingdom level groups” within the domain eukaryotes. The number of kingdoms depends on which scheme you use, and how you interpret the existing evidence.

The reason for uncertainty is that we have not yet done a full genetic analysis on every known group. Further, when we discover new species that lie outside of the existing scheme, we have to rethink how different groups are actually related.

Continue Reading »

Nov 16 2018

Changing the Kilogram

This is one of those items that at first does not seem like a big deal, and probably won’t get much play in the mainstream media, but is actually a significant milestone. Today the General Conference on Weights and Measures will meet in Versailles, France, to vote on whether or not to adopt a new standard for the kilogram. This is a formality, because this change has been worked on for years and the standard is now all set to change.

I have been reading a lot recently about the history of science and technology, and one common theme is that an important core feature of our modern society is infrastructure. If, for example, there were some sort of apocalypse, what would it take to reboot society? Theoretically, we would preserve much of our knowledge in books and would not have to start from scratch. The limiting factor would likely be infrastructure. Gasoline engines won out over electric engines for cars partly (and some believe primarily) because the infrastructure for distributing gasoline was put in place before the electrical infrastructure.

Science itself also has an infrastructure, which includes standard weights and measures. This sounds boring, but being able to precisely measure something, using standardized units that every scientist around the world can use, is critically important to both science and technology. Anything that makes doing science easier reduces the cost and increases the pace of science, with incredible downstream benefits.

In 1879 Le Grand K (or the International Prototype Kilogram – IPK) was created – a cylinder of platinum and iridium that is the ultimate reference for 1 kilogram. This hunk of metal is kept under a double bell jar and never touched. Even a slight fingerprint would change how much it weighs. From this original kilogram, exact copies were made and distributed to countries to serve as their national standards. Occasionally these copies are sent back to France to be compared to the original. The copies are then used to calibrate equipment used for precise measurement.

Continue Reading »

Nov 13 2018

Engineering Photosynthesis

There’s some bad news, followed by good news, but partially countered by further bad news. The bad news is that our population is growing, and therefore so are our food requirements, and yet we are approaching the limits of our ability to increase crop yield through cultivation alone. Experts can quibble about whether or not we are at or near the limit, but it’s pretty clear that we are not going to double crop efficiency in the next 50 years through cultivation.

That, however, is pretty much what we need to do if we are going to meet humanity’s caloric needs. By 2050 yields will need to be 60% higher than in 2005, and needs will likely continue to rise before they stabilize. Sure, there are some gains to be made in reducing waste, but not nearly enough. And sure, we need to take steps to stabilize our population more quickly, like fighting poverty and promoting the rights of women in developing countries.

But even under optimistic conditions – we simply need to grow more food. Further, as I recently reviewed, we are pretty much using all the good arable land available. Expanding into more land for growing food is not a good option.

So really we have one viable option if we are going to meet our food needs – genetically modifying crops. That is the good news – GMOs actually have the potential to significantly increase crop yields. One way to do that is through making photosynthesis more efficient. It turns out, there are several ways to do this.

First, there is a difference between C3 and C4 photosynthetic pathways. The C4 pathway is more efficient, and increases biomass production. Part of the efficiency is through better carbon concentration mechanisms. This pathway has independently evolved in many plants, and there are others that are part way between C3 and C4, but our major food crops all use C3.

Continue Reading »

Oct 23 2018

Problems With That Organic Food and Cancer Study

One of the frustrating aspects of how science is reported in the mainstream media is when a complex study with very unclear results is presented with a misleading bottom line. Most people read only the headline, or perhaps the first paragraph, in order to glean the essence of a scientific study. They don’t read deep into the reporting to find the important details, or go to the study itself.

This is especially problematic when the study is of a preliminary design, or when the authors’ conclusions are biased or misleading.

The most recent example of these issues is a study looking at the consumption of organic food and the risk of cancer. CNN reported the study as showing: “You can cut your cancer risk by eating organic, a new study says.” No – that is not what the study shows.

The study itself is not bad, for what it is, but it has some serious limitations, and the conclusions that can be drawn from it are narrow. The researchers looked at a French database of food consumption, the NutriNet-Santé Prospective Cohort Study. They had volunteers fill out food diaries for three days and report their organic food consumption, gathered demographic and other lifestyle data, and then followed them for over four years, using various methods to track the incidence of cancer.

Continue Reading »

Oct 05 2018

Neanderthal Healthcare

Neanderthals were our close cousins – the closest species to modern humans that we know of. There are also the Denisovans, which are currently classified as a subspecies of Homo sapiens, but may eventually be classified as their own species.

Neanderthals lived from 400,000 to 40,000 years ago. They spread out of Africa, and throughout Europe and Asia. When modern humans arrived later, there was some clear interbreeding going on – Europeans and Asians have about 2% Neanderthal DNA. In fact a recent study suggests that modern humans specifically retained Neanderthal genes that conveyed improved resistance to European viruses.

The first fossil specimen of Neanderthal was discovered in 1829, although this was not recognized until later. The first recognized specimen was collected in 1856 in the Neander valley in Germany. This was the first early hominid specimen found. Perhaps because of the time it was discovered, our image of Neanderthal is still colored by the notions of the day. “Primitive” was synonymous with brutish and animalistic.

Continue Reading »
