Category Archives: Fitness & Health

New Year Resolutions 2019: Health


In the New Year, target your own health and fitness.

Buy a gym membership. Then use it.

Hit the mark.


Fitness @ItsJessicaLaine

Q&A

All the best questions come from Liberals.

All the best answers come from Conservatives.

Give Thanks That We No Longer Live on the Precipice

Fossil fuels helped humanity improve our health, living standards and longevity in just 200 years.

Thanksgiving is a good time to express our sincere gratitude that we no longer “enjoy” the “simpler life of yesteryear.” As my grandmother said, “The only good thing about the good old days is that they’re gone.”

For countless millennia, mankind lived on a precipice, in hunter-gatherer, subsistence farmer and primitive urban industrial societies powered by human and animal muscle, wood, charcoal, animal dung, water wheels and windmills. Despite backbreaking dawn-to-dusk labor, wretched poverty was the norm; starvation was a drought, war or long winter away; rampant diseases and infections were addressed by herbs, primitive medicine and superstition. Life was “eco-friendly,” but life spans averaged 35 to 40 years.

Then, suddenly, a great miracle happened! Beginning around 1800, health, prosperity and life expectancy began to climb … slowly but inexorably at first, then more rapidly and dramatically. Today, the average American lives longer, healthier and better than even royalty did a mere century ago.

How did this happen? What was suddenly present that had been absent before, to cause this incredible transformation?

Humanity already possessed the basic scientific method (1250), printing press (1450), corporation (1600) and early steam engine (1770). So what inventions, discoveries and practices arrived after 1800 to propel us forward over this short time span?

Ideals of liberty and equality took root, says economics historian Deirdre McCloskey. Liberated people are more ingenious: free to pursue happiness and ideas; free to try, fail and try again; free to pursue their self-interests and thereby, intentionally or not, to better mankind – just as Adam Smith described.

Equality (of social dignity and before the law) emboldened otherwise ordinary people to invest, invent and take risks. Once accidents of parentage, titles, inherited wealth or formal education no longer controlled destinies, humanity increasingly benefited from the innate inspiration, perspiration and perseverance of inventors like American Charles Newbold, who patented the first cast-iron plow in 1797.

Ideas suddenly started having sex, say McCloskey and United Kingdom parliamentarian and science writer Matt Ridley. Free enterprise capitalism and entrepreneurship took off, as did commercial and international banking, risk management and stock markets.

Legal and regulatory systems expanded to express societal expectations, coordinate growth and activities, and punish bad actors. Instead of growing, making and buying locally, we did so internationally – enabling families, communities and countries to specialize, and buy affordable products from afar.

The scientific method began to flourish, unleashing wondrous advances at an increasingly frenzied pace. Not just inventions like steam-powered refrigeration (1834) but, often amid heated debate, discoveries like the germ theory of disease that finally bested the miasma theory around 1870.

All this and more were literally fueled by another absolutely vital, fundamental advance that is too often overlooked or only grudgingly recognized: abundant, reliable, affordable energy – the vast majority of it fossil fuels. Coal and coal gas, then also oil, then natural gas as well, replaced primitive fuels with densely packed energy that could power engines, trains, farms, factories, laboratories, schools, hospitals, offices, homes, road building and more, 24 hours a day, seven days a week, 365 days a year.

The fuels also ended our unsustainable reliance on whale oil, saving those magnificent creatures from extinction. Eventually, they powered equipment that removes harmful pollutants from our air and water.

Today, coal, oil and natural gas still provide 80% of America’s and the world’s energy for heat, lights, manufacturing, transportation, communication, refrigeration, entertainment and every other aspect of modern life. Equally important, they supported and still support the infrastructure and vibrant societies, economies and institutions that enable the human mind (what economist Julian Simon called our Ultimate Resource) to create seemingly endless new ideas and technologies.

Electricity plays an increasingly prominent and indispensable role in modern life. Indeed, it is impossible to imagine life without this infinitely adaptable energy form. By 1925, half of all U.S. homes had electricity; a half century later, all did – from coal, hydroelectric, natural gas or nuclear plants.

Medical discoveries and practices followed a similar trajectory, as millions of “invisible hands” worked together across buildings, cities, countries and continents – without most of them ever even knowing the others existed. They shared and combined ideas and technologies, generating new products and practices that improved and saved billions of lives.

Medical research discovered why people died from minor wounds, and what really caused malaria (1898), smallpox and cholera. Antibiotics (the most vital advance in centuries), vaccinations and new drugs began to combat disease and infection. X-rays, anesthesia, improved surgical techniques, sanitation and pain killers (beginning with Bayer Aspirin in 1899) permitted life-saving operations. Indoor plumbing, electric stoves (1896) and refrigerators (1913), trash removal, and countless other advances also helped raise average American life expectancy from 46 in 1900 to 76 (men) and 81 (women) in 2017.

Washing visible hands with soap (1850) further reduced infections and disease. Wearing shoes in southern U.S. states (1910) all but eliminated waterborne hookworm, while the growing use of window screens (1887) kept hosts of disease-carrying insects out of homes. Petrochemicals increasingly provided countless pharmaceuticals, plastics and other products that enhance and safeguard lives.

Safe water and wastewater treatment – also made possible by fossil fuels, electricity and the infrastructure they support – supported still healthier societies that created still more prosperity, by eliminating the bacteria, parasites and other waterborne pathogens that made people too sick to work and killed millions, especially children. They all but eradicated cholera, one of history’s greatest killers.

Insecticides and other chemicals control disease-carrying and crop-destroying insects and pathogens. Ammonia-based fertilizers arrived in 1910; tractors and combines became common in the 1920s. Today, modern mechanized agriculture, fertilizers, hybrid and biotech seeds, drip irrigation and other advances combine to produce bumper crops that feed billions, using less land, water and insecticides.

The internal combustion engine (Carl Benz, 1886) gradually replaced horses for farming and transportation, rid cities of equine pollution (feces, urine and corpses), and enabled forage cropland to become forests. Today we can travel states, nations and the world in mere hours, instead of weeks – and ship food, clothing and other products to the globe’s farthest corners. Catalytic converters and other technologies mean today’s cars emit less than 2% of the pollutants that came out of tailpipes in 1970.

Power equipment erects better and stronger houses and other buildings that keep out winter cold and summer heat, better survive hurricanes and earthquakes, and connect occupants with entertainment and information centers from all over the planet. Radios, telephones, televisions and text messages warn of impending dangers, while fire trucks and ambulances rush accident and disaster victims to hospitals.

Today, modern drilling and mining techniques and technologies find, extract and process the incredible variety of fuels, metals and other raw materials required to manufacture and operate factories and equipment, to produce the energy and materials we need to grow or make everything we eat, wear or use.

Modern communication technologies combine cable and wireless connections, computers, cell phones, televisions, radio, the internet and other devices to connect people and businesses, operate cars and equipment, and make once time-consuming operations happen in nanoseconds. In the invention and discovery arena, Cosmopolitan magazine might call it the best idea-sex ever.

So, this holiday season, give thanks for all these blessings – while praying and doing everything you can to help bring the same blessings to billions of people worldwide who still do not enjoy them.

Original article: Give Thanks That We No Longer Live on the Precipice

Did Neanderthals Start Fires?

It is one of the most iconic Christmas songs of all time.

Written by Bob Wells and Mel Torme in the summer of 1945, “The Christmas Song” (subtitled “Chestnuts Roasting on an Open Fire”) was crafted in less than an hour. As the story goes, Wells and Torme were trying to stay cool during the blistering summer heat by thinking cool thoughts and then jotting them down on paper. And, in the process, “The Christmas Song” was born.

Many of the song’s lyrics evoke images of winter, particularly around Christmastime. But none has come to exemplify the quiet peace of a Christmas evening more than the song’s first line, “Chestnuts roasting on an open fire . . . ”

Gathering around the fire to stay warm, to cook food, and to share in a community has been an integral part of the human experience throughout history—including human prehistory. Most certainly our ability to master fire played a role in our survival as a species and in our ability as human beings to occupy and thrive in some of the world’s coldest, harshest climates.

But fire use is not limited only to modern humans. There is strong evidence that Neanderthals made use of fire. But, did these creatures have control over fire in the same way we do? In other words, did Neanderthals master fire? Or, did they merely make opportunistic use of natural fires? These questions are hotly debated by anthropologists today and they contribute to a broader discussion about the cognitive capacity of Neanderthals. Part of that discussion includes whether these creatures were cognitively inferior to us or whether they were our intellectual equals.

In an attempt to answer these questions, a team of researchers from the Netherlands and France characterized the microwear patterns on bifacial (having opposite sides that have been worked on to form an edge) tools made from flint recovered from Neanderthal sites, and concluded that the wear patterns suggest that these hominins used pyrite to repeatedly strike the flint. This process generates sparks that can be used to start fires.1 To put it another way, the researchers concluded that Neanderthals had mastery over fire because they knew how to start fires.

However, a closer examination of the evidence along with results of other studies, including recent insight into the cause of Neanderthal extinction, raises significant doubts about this conclusion.

What Do the Microwear Patterns on Flint Say?

The investigators focused on the microwear patterns of flint bifaces recovered from Neanderthal sites as a marker for fire mastery because of the well-known practice among hunter-gatherers and pastoralists of striking flint with pyrite (an iron disulfide mineral) to generate sparks to start fires. Presumably, the first modern humans also used this technique to start fires.

The research team reasoned that if Neanderthals started fires, they would use a similar tactic. Careful examination of the microwear patterns on the bifaces led the research team to conclude that these tools were repeatedly struck by hard materials, with the strikes all occurring in the same direction along the bifaces’ long axis.

The researchers then tried to experimentally recreate the microwear pattern in a laboratory setting. To do so, they struck biface replicas with a number of different types of materials, including pyrites, and concluded that the patterns produced by the pyrite strikes most closely matched the patterns on the bifaces recovered from Neanderthal sites. On this basis, the researchers claim that they have found evidence that Neanderthals deliberately started fires.

Did Neanderthals Master Fire?

While this conclusion is possible, at best this study provides circumstantial, not direct, evidence for Neanderthal mastery of fire. In fact, other evidence counts against this conclusion. For example, bifaces with the same type of microwear patterns have been found at other Neanderthal sites, locales that show no evidence of fire use. These bifaces would have had a range of uses, including butchering the remains of dead animals. So, it is possible that these tools were never used to start fires—even at sites with evidence for fire usage.

Another challenge to the conclusion comes from the failure to detect any pyrite on the bifaces recovered from the Neanderthal sites. Flint recovered from modern human sites shows visible evidence of pyrite. And yet the research team failed to detect even trace amounts of pyrite on the Neanderthal bifaces during the course of their microanalysis.

This observation raises further doubt about whether the flint from the Neanderthal sites was used as a fire starter tool. Rather, it points to the possibility that Neanderthals struck the bifaces with materials other than pyrite for reasons not yet understood.

The conclusion that Neanderthals mastered fire also does not square with results from other studies. For example, a careful assessment of archaeological sites in southern France occupied by Neanderthals from about 100,000 to 40,000 years ago indicates that Neanderthals could not create fire. Instead, these hominins made opportunistic use of natural fire when it was available to them.2

These French sites do show clear evidence of Neanderthal fire use, but when researchers correlated the archaeological layers displaying evidence for fire use with the paleoclimate data, they found an unexpected pattern. Neanderthals used fire during warm climate conditions and failed to use fire during cold periods—the opposite of what would be predicted if Neanderthals had mastered fire.

Lightning strikes that would generate natural fires are much more likely to occur during warm periods. Instead of creating fire, Neanderthals most likely harnessed natural fire and cultivated it as long as they could before it extinguished.

Another study also raises questions about the ability of Neanderthals to start fires.3 This research indicates that cold climates triggered Neanderthal extinctions. By studying the chemical composition of stalagmites in two Romanian caves, an international research team concluded that there were two prolonged and extremely cold periods between 44,000 and 40,000 years ago. (The chemical composition of stalagmites varies with temperature.)

The researchers also noted that during these cold periods, the archaeological record for Neanderthals disappears. They interpret this disappearance to reflect a dramatic reduction in Neanderthal population numbers. Researchers speculate that when this population downturn took place during the first cold period, modern humans made their way into Europe. Being better suited for survival in the cold climate, modern human numbers increased. When the cold climate moderated, Neanderthals were unable to recover their numbers because of the growing populations of modern humans in Europe. Presumably, after the second cold period, Neanderthal numbers dropped to the point that they couldn’t recover, and hence became extinct.

But why would modern humans be more capable than Neanderthals of surviving under extremely cold conditions? It seems as if it should be the other way around. Neanderthals had a hyper-polar body design that made them ideally suited to withstand cold conditions. Neanderthal bodies were stout and compact, with barrel-shaped torsos and shorter limbs, which helped them retain body heat. Their noses were long and their sinus cavities extensive, which helped them warm the cold air they breathed before it reached their lungs. But despite this advantage, Neanderthals died out and modern humans thrived.

Some anthropologists believe that the survival discrepancy could be due to dietary differences. Some data indicates that modern humans had a more varied diet than Neanderthals. Presumably, these creatures primarily consumed large herbivores—animals that disappeared when the climatic conditions turned cold, thereby threatening Neanderthal survival. On the other hand, modern humans were able to adjust to the cold conditions by shifting their diets.

But could there be a different explanation? Could it be that with their mastery of fire, modern humans were able to survive cold conditions? And did Neanderthals die out because they could not start fires?

Taken in its entirety, the data seems to indicate that Neanderthals lacked mastery of fire but could use it opportunistically. And, in a broader context, the data indicates that Neanderthals were cognitively inferior to humans.

What Difference Does It Make?

One of the most important ideas taught in Scripture is that human beings uniquely bear God’s image. As such, every human being has immeasurable worth and value. And because we bear God’s image, we can enter into a relationship with our Maker.

However, if Neanderthals possessed advanced cognitive ability just like that of modern humans, then it becomes difficult to maintain the view that modern humans are unique and exceptional. If human beings aren’t exceptional, then it becomes a challenge to defend the idea that human beings are made in God’s image.

Yet claims that Neanderthals were cognitive equals to modern humans fail to withstand scientific scrutiny, time and time again. Now it’s time to light a fire in my fireplace and enjoy a few contemplative moments thinking about the real meaning of Christmas.

From Reasons to Believe: Did Neanderthals Start Fires?

The Origin of Human Chromosome 2: Another Look

In a September 2018 article, Fazale Rana discredited what is purported to be evidence for an evolutionary theory that humans descended from an “ancestral ape.”

The evolutionary argument posits that “two ancestral ape chromosomes [chimpanzee chromosomes 2A and 2B] fused to give rise to human chromosome 2” in what is referred to as an “ancient telomere-telomere fusion.”1 Proponents cite two recent articles in Nature as evidence for this hypothesis. The articles report how two teams of synthetic biologists successfully performed multiple telomere-telomere fusions and produced functional organisms in brewer’s yeast.2

Rana argues these articles actually undermine the chimpanzee-to-human hypothesis. Because the synthetic biologists had to undertake such extensive and precise gene editing, Rana contends that there is no way such fusions could have occurred via undirected random processes. Nevertheless, some might challenge Rana’s argument by claiming “anything can happen, given enough time.” We propose to address such claims through a series of “what if” scenarios.

What If Spontaneous Fusion Happened?

Let’s suppose chimpanzee chromosomes 2A and 2B truly did fuse spontaneously into human chromosome 2 in the distant past. Would the fusion have resulted in viable humans that outcompeted “ancestral apes” to become a dominant species?

First, it should be understood that telomeres are DNA sequences that cap the ends of chromosomes, protecting and stabilizing them. Chromosomes occasionally break, and a broken chromosome can readily combine with another broken chromosome. One function of telomeres is to prevent intact chromosomes from fusing with either broken chromosome fragments or other intact chromosomes.

Yet the scientific literature reports that telomeres occasionally fail in this function, and fusion of intact chromosomes can occur. A protein complex known as shelterin appears to be critical to the protection process because when shelterin is absent telomeres can undergo end-to-end fusion. The result is genomic instability and possibly carcinogenesis.3 Furthermore, a 2013 article in Nucleic Acids Research reports, “Functional telomeres can be involved in spontaneous telomere fusions [in yeast, . . . resulting in] chromosome instability and may underlie early tumourigenesis” (emphasis added).4 The authors also reported that such spontaneous fusion events are rare.

This research suggests that if ancient chimpanzee chromosomes 2A and 2B did indeed undergo a telomere-telomere fusion into human chromosome 2, the likely outcome would have been genomic instability, possibly leading to cancer. We could argue, then, that it is extremely unlikely that viable humans would have been the result.

Yet this argument might also not be sufficient for some. So suppose such a spontaneous telomere-telomere fusion did occur and produced a viable human chromosome 2. What would be the outcome?

What If Fusion Produced Viable Humans?

Humans and chimpanzees have a common reproductive process. Offspring receive two paired versions of each chromosome: one from the father and one from the mother. During reproduction (meiosis), the pairs split into sperm (male) or ovum (female) “gametes,” each having half the number of chromosomes. Sperm and ovum then come together, with each gamete contributing half the chromosomes to form a new individual. However, whereas human gametes each have 23 chromosomes, chimpanzee gametes have 24.

Suppose an ancient chimpanzee experienced a telomere-telomere fusion of chromosomes 2A and 2B into a viable human chromosome 2. In the reproductive process, it would generate a sperm or ovum gamete with 23 chromosomes—but what would it “mate” with? What is the likelihood that the same thing happened spontaneously with another chimpanzee of the opposite sex and that these two creatures with human chromosome 2 just happened to reproduce with each other? That seems intuitively improbable! Thus it would be more likely that the hypothetical 23-chromosome gamete would have had to join with a standard 24-chromosome chimpanzee gamete. There are historical experiments and other scenarios that illustrate the probable outcome of such a mating.
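The chromosome arithmetic behind this pairing problem can be made explicit with a toy calculation. This is a sketch for illustration only: the `gamete_count` helper is hypothetical, and the counts are simply those cited in the article (human 46, chimpanzee 48, donkey 62, horse 64).

```python
def gamete_count(diploid_count):
    # Meiosis halves the chromosome set: each gamete carries one of each pair.
    assert diploid_count % 2 == 0, "unpaired chromosomes cannot split evenly"
    return diploid_count // 2

human_gamete = gamete_count(46)   # 23 chromosomes per gamete
chimp_gamete = gamete_count(48)   # 24 chromosomes per gamete

# Hypothetical scenario from the article: a fused-chromosome gamete (23)
# joins a normal chimpanzee gamete (24).
hybrid_offspring = human_gamete + chimp_gamete  # 47 -- an odd total,
# so the chromosomes cannot all form pairs at the offspring's own meiosis.

# The same arithmetic underlies mule sterility:
mule = gamete_count(62) + gamete_count(64)      # donkey 31 + horse 32 = 63
```

The point of the toy model is simply that any mating across a fusion boundary yields an odd chromosome count, which is the pairing obstacle the historical examples below illustrate.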

In the 1920s, Russian zoologist Ilia Ivanov, an expert in artificial insemination, sought to create an ape-human hybrid with the financial backing of the Russian Bolshevik government and the American Association for the Advancement of Atheism.5 He succeeded in inseminating three female chimpanzees with human sperm, but no successful fertilization occurred.6 His efforts went no further, as he became the victim of a Stalinist purge.

Ivanov’s three unsuccessful efforts do not prove that it is impossible to unite a 23-chromosome gamete produced by telomere-telomere fusion with a normal 24-chromosome chimpanzee gamete. Nevertheless, the notion resembles the well-known case of mating a male donkey (31 pairs of chromosomes) with a female horse (32 pairs) to produce a mule. Mules cannot reproduce—or at least they reproduce so infrequently that it is big news when it does occur. In humans, an extra copy of chromosome 21 results in Down syndrome. Hence, it is unlikely that such an ancient chimpanzee union could have produced a viable human.

This unlikelihood is borne out by the synthetic biologists who reduced yeast chromosomes. They found a “reduction in gamete production and viability in meiosis”7 in the yeast cells with reduced numbers of chromosomes. They also found that these modified yeast cells were unlikely to reproduce with “wild,” unmodified yeast.

Rana concludes that “the fusion of yeast chromosomes in the lab makes it hard to think that unguided evolutionary processes could ever successfully fuse two chromosomes, including human chromosome 2, end on end.” To this we add that even if such a fusion event did occur in the distant past, it is hard to think that the result would have been viable, reproducing humans. Creation appears to make even more sense.

From Reasons to Believe: The Origin of Human Chromosome 2: Another Look