A Nuptial Head Reading

In 1844, Lydia Folger married Lorenzo Niles Fowler in Nantucket. Lorenzo and his brother Orson Squire were the foremost proponents of phrenology in the United States, so it should probably come as no surprise that some head reading occurred at the ceremony. Lorenzo read the bumps on the head of Lydia’s uncle Walter, declaring, so the story goes, that his ego was nearly as large as his genius.

I ran across this story on the website of the Nantucket Historical Society while looking up more about Lydia Folger Fowler. Lydia was a remarkable woman. The second woman to graduate from medical school in the United States (and the first American-born one–Elizabeth Blackwell was English), she lectured extensively on health, anatomy, physiology, and hygiene in addition to practicing medicine. She wrote books and taught courses to women.

In her lectures to women, Lydia praised their roles as mothers but also urged them to think about the other years of their lives, those not tied to bearing and raising children. She told them that an educated mother made the best mother. She constantly emphasized how important it was for individuals to study, practice, and perfect themselves. This was especially important for mothers, Lydia said, who had a responsibility to be more than merely caretakers of their children and husbands.

Lydia’s story is little known, in part because of her gender but also because of her marriage to one of the famous Fowler brothers. She was also the cousin of a much more famous woman: Lucretia Mott, the famed Quaker women’s rights advocate. But she made important contributions to the history of women and medicine. And who could forget a wedding that involved head bump reading?

Healing the Spiritual and Physical

In colonial America, your minister often provided succor to those afflicted in faith and fever. Ministers were usually the most educated people in a community, so it made sense that they would be entrusted with medical care as well. Not to mention that so little was known about disease and its causes that fusing religion and medicine made as much sense as any other theory (and probably more so to the deeply religious American colonists). As historian Richard D. Brown put it, “even the most rational and learned individual… saw no clear boundary between physical and spiritual phenomena.” The joining of these two roles, however, was not without its complications.


Cotton Mather, one of the most prominent Puritan ministers in New England, introduced smallpox inoculation in 1721, a move that ignited a fierce debate in the Puritan community. Puritans believed that every affliction was proof of God’s special interest in their affairs. Some suggested that smallpox (a serious and deadly threat in colonial America) was perhaps God’s punishment for sin and that to interfere with that through inoculation or anything else would only anger God and provoke more punishment. Mather did not waver, though. He wrote that “whether a Christian may not employ this Medicine (let the matter of it be what it will) and humbly give Thanks to God’s good Providence in discovering of it to a miserable World; and humbly look up to His Good Providence (as we do in the use of any other Medicine) It may seem strange, that any wise Christian cannot answer it. And how strangely do Men that call themselves Physicians betray their Anatomy, and their Philosophy, as well as their Divinity in their invectives against this Practice?” Eventually, inoculation did gain widespread acceptance.


The spread of print and increased literacy helped bring elite medicine to the masses through home medical manuals. Many ministers as well as their followers had copies of some of the most popular manuals, including William Buchan’s Domestic Medicine (1769) and John Wesley’s Primitive Physick (1747). 


Wesley, better known as the founder of Methodism, was extremely interested in the democratization of medicine. Believing that medicine and doctors were often only accessible to the wealthy, he wrote his own medical manual to allow anyone to easily treat themselves. Wesley considered making medical knowledge and treatments available and comprehensible to the public to be part and parcel of his pastoral duties.


The minister-doctor predominated until at least 1750 and continued into the early 19th century in some areas of the country. It was only with the rise of medical education, apprenticeships, and licensing laws that the two forms of healing separated. But ministers continued to play an integral role in healing, particularly in many of the alternative health movements (homeopathy, hydropathy, osteopathy, etc.) that flourished in the 19th century.

What’s so heroic about “heroic medicine”?

Heroes don’t usually make you bleed or vomit. You’ll never see Superman fight evildoers by lancing someone or forcing large doses of calomel down Lex Luthor’s throat.

And yet “heroic” was what medicine in the 18th and 19th centuries was called–at least the medicine practiced by so-called “regular” doctors through the mid-19th century (those who didn’t practice heroic medicine were known as alternative or sectarian practitioners–or, in unkinder moments, quacks).

Heroic medicine consisted of bleeding, purging, leeching, blistering, and sweating patients to release disease. Calomel, mercurous chloride, was one of the most commonly used mineral concoctions, a harsh treatment that induced vomiting and purging. At the time, most diseases were seen as systemic imbalances caused by something being either over- or under-stimulated in the body. Until the 1840s, most doctors believed that diseases overstimulated the body, so most treatments involved lowering the overexcited patient back to a normal, healthy state. Bleeding often became the first “therapeutic” line of attack, a seemingly easy way to drain out whatever was poisoning the system and restore its balance.

Heroic medicine gave clear evidence that the treatment was doing something–and that something was often healing in and of itself. People felt better just knowing that they were being treated, even if that treatment could sometimes kill them. It also gave the doctor the appearance of being in control of the situation. There’s just something about doing that feels a whole lot better than waiting and watching for nature to run its course, as it often does in disease. The body is an amazing healing machine.

So why heroic? The word apparently comes from the large dosage size and effect of the therapies. They didn’t just give you a bit of calomel. They gave you A LOT to produce a near instantaneous effect–which, in this instance, was mostly a lot of vomiting. According to the dictionary, “heroic” is “behavior that is bold and dramatic.” These treatments certainly were bold and the results often dramatic if not always healing.

But they were also rather harsh and public outcry against them helped lead to their demise in the 19th century.

Creating Frankenstein

Mary Shelley’s 1818 novel Frankenstein is more than a literary work of the early 19th century–it also reflects the scientific discoveries of her time and its enthusiasm for electricity.

“I succeeded in discovering the cause of generation and life; nay, more, I became myself capable of bestowing animation upon lifeless matter.” –Victor Frankenstein

In Frankenstein, electricity is seen as the secret of life, able to give life to the lifeless. Shelley merely reflected a belief that was becoming increasingly popular in American and European culture. In her novel, Victor Frankenstein alludes to lightning and to Galvanism as the basis for reanimating a lifeless cadaver. Luigi Galvani had popularized the idea of electricity as an innate force of life, what he called animal electricity. Galvani’s ideas had largely been supplanted in the scientific community by the time of Shelley, but the idea of an internal electrical fire and particularly reanimation remained strong in the public imagination.

Shelley herself mentioned discussing many of the electrical experiments going on in Europe and the United States with her husband, Percy Shelley, and Lord Byron. They, like the rest of the public, were especially intrigued by the idea of reanimating the dead. These discussions led Shelley to explore the moral and personal responsibilities of scientific advances in her own writing. She recognized science as a powerful force, but one capable of great harm if left uncontrolled. Victor Frankenstein uses science to create his monster, yet it ultimately leads to his demise.

Interestingly, Shelley does not provide much description of the laboratory or the way in which Frankenstein’s creature is created. Only two sentences in the book mention lightning and Galvanism, though spectacular electrical displays with shooting lightning bolts became the standard means for depicting the act of creation in movies.

Recharging my batteries

Why do we say we are “recharging our batteries” when we take a break or do something for ourselves? Or that we “short-circuited” if we can’t remember something?

It turns out that these phrases are directly tied to our enthusiasm for electricity in the 19th century. As I’ve written elsewhere, many scientists, doctors, and the general public came to believe that we all had a set amount of electricity in our bodies that made everything run–our “vital fluid.” According to some leading doctors and scientists, modern urban living tended to deplete this energy source–our “internal battery,” as the analogy went–and we therefore needed a “recharge” from a jolt of electricity. Our bodies were essentially electrical machines that could short-circuit and burn out just like any other machine.

Public enthusiasm for electricity by the late 19th century was so great that many came to believe that electricity could fix anything! And waiting in the wings to take advantage of that deep desire were any number of doctors and entrepreneurs promising electrotherapy treatments for every dysfunction or ill feeling you could imagine. Few in the general public completely understood electricity, so they relied on manufacturers of electrical devices to educate them. In a world changing so fast with new inventions and technology, it was hard for anyone to know what was possible and probable. Advertisements for electrical devices made boisterous cure-all promises, and people richly rewarded those manufacturers for giving them what they wanted.

Books, entertainment, and even food of the late 19th century showed that the image of the “electric body” wasn’t just a metaphor–people willingly imbibed electricity directly in an attempt to receive all of its benefits. One scientist even compared its effects to that of the sun on the leaves of a plant.

In Europe, researchers studied electricity’s effect on school children. They outfitted a classroom with a high-frequency electrical current that ran for six months. At the end of the experiment, the researchers found that the children had grown an average of 20mm more than those not exposed to the continuous current. Their teachers also reported that the children had grown smarter during the experiment due to the “quickening” of their faculties by electrical stimulation.

Popular culture teemed with electrical fads and follies, providing both tangible and intangible signs that linked electricity, and especially the electrified human body, with ideas of progress.

While electricity remains part of the treatment regimen for some diseases today, the idea of a vital fluid made of electricity that needs recharging has since passed out of popular and scientific medical theories. But its mark on our language remains.

Just Humor Me

The idea of staying in good humor or humoring those around us has an ancient lineage. It actually traces back to Greece and the humoral theory of medicine.

“To begin at the beginning: the elements from which the world is made are air, fire, water, and earth; the seasons from which the year is composed are spring, summer, winter and autumn; the humours from which animals and humans are composed are yellow bile, blood, phlegm, and black bile.” –Galen

For centuries, the idea that an excess of phlegm or of yellow bile could cause illness was an accepted medical diagnosis. The four humours–yellow and black bile, phlegm, and blood–circulated throughout the body, and an imbalance in one or more was believed to be the cause of illness. The theory began in the 5th century BCE with work attributed to Greek physician Hippocrates (though it was his son-in-law and disciple Polybus who wrote the first treatise that clearly explained the whole idea of the humours) and continued with Roman doctor Galen, who adopted the theory in the 2nd century CE. For the next two thousand years (give or take some disruptions and such, like the sacking of Rome), humoral theory explained most things about a person’s character, medical history, taste, appearance, and behavior.

Why? What was so compelling about this theory?

Well, for one, it seemed to unify passions and cognition, physiology and psychology, and the individual and his/her environment. Various parts of the body and the environment caused disease and stirred various emotions and passions. The humours also made sense to the many cultures whose earliest stories of creation were based on four elements: air, fire, water, and earth. Each of the four humours was tied to one of these elements, so the theory seemed a natural extension of what people already knew about the world. Our human need to understand what we are made of, where we came from, and how we work often causes us to resort to structures and traditions that match our intuitions. The theory offered a potent image of substances, particles, or currents traveling through the body from the limbs to the organs to the brain and heart and back. It seemed to explain how the sight of an attractive person could trigger desire, induce a rush of blood in the veins, and increase the heartbeat.

The fall of Rome wasn’t the end of the humoral theory, though. It was more of a shift–to the east, to Islam, where the knowledge of the Greeks and Romans was saved and expanded upon, and to the abbeys where monks preserved ancient texts. (sidenote: In researching my book on apples, I discovered how much ancient knowledge on apple orcharding and fruit growing in general was preserved in the Islamic and Christian monastic traditions. So it wasn’t too surprising to find that medical knowledge, too, lived on in these same places.)

And so the theory lived on and on, taking on various forms to fit the times but always coming back to the idea of balance and imbalance in the body as the source of illness. The theory only really died with the discovery of germs in the late 19th century. Not everyone bought the germ theory right away, however, so books based on humoral theory continued into the early 20th century.

Humours now remain mostly familiar in our expressions about keeping balanced and experiencing something with ill-humor. In French, the word for mood is humeur. Many Asian medical traditions are also humoral, based on the idea of energy flows, mind-body connections, and balances between hot and cold, moist and dry. So even if the theory is no longer used to describe disease (at least in the West), the humours still serve as useful and suggestive images in our culture.

Who’s a Quack?


What makes someone a quack? Is he or she actually doing something nefarious or just doing something you don’t agree with?

When I first started reading medical history, I (foolishly) thought the line between a quack and a legitimate doctor was easily drawn. A quack is selling ridiculous medicines claiming to cure everything and bilking gullible people out of money, right? The real story isn’t nearly so simple.

Quack or man with a different idea?

Before the 20th century, medical knowledge was very limited. Those proclaiming themselves legitimate doctors rarely knew anything more than those hawking patent medicines and traveling from town to town. Many doctors engaged in what was thought of as “quackish” behavior, including advertising and putting their names on proprietary remedies. Some quacks even trained at celebrated medical schools or had medical licenses. There really was little scientific evidence separating the two, so calling someone a “quack” became an easy way of targeting those you didn’t agree with for one reason or another. So many people stood accused of quackery that the term lost any real meaning, though not its sting of opprobrium.

Everyone felt okay excoriating quacks because all were sure they weren’t one. Most of the time, those calling out quacks were those in the medical establishment who belonged to some organization or institution or who had trained in Europe. But sometimes, so-called quacks called out other quacks.
Many of those the medical establishment labeled quacks simply disagreed with the medical therapies that had been practiced for centuries, including bloodletting. And they had good reason to do so, as many of these traditional practices had hurt and even killed people rather than helped them.

As doctors began to organize into professional organizations in the mid-19th century, one of the motivating factors was to protect people from quacks. These organizations created sharp divisions between “insiders” and “outsiders.” But the ethical and moral grounds for this distinction weren’t nearly so clear, despite claims to the contrary. The medical marketplace was competitive, and what these organizations did do was give member doctors a competitive advantage through illusory claims to standards. Many people found these organizations elitist and, obviously, exclusionary–but that was the point.

So maybe the better way to think of quacks, doctors, and medical history more generally is to see the development of the profession as one with many paths to prosperity. Medical men of all kinds were competing for custom, recognition, and financial reward, each in his own way, each straining to seize the high moral ground in a vicious arena. Some opted for the individualism of the entrepreneur; others opted for the safety and security of the establishment. Neither was better than the other given the scientific evidence available at the time. Surely a few people knew they were peddling nothing more than alcohol and herbs in a jar and wanted to make as much money as possible, but there may not have been as many as is commonly portrayed in the literature of the heroic doctor.

Genius or Madness?

They say there’s a fine line between genius and madness. Some of our foremost thinkers and artists have also suffered from mental illness and/or lived really tumultuous, troubling lives. A recent New Yorker cartoon featured a teenage girl blaming her parents for sinking her writing career by giving her a stable, happy childhood.

It turns out, we’ve been thinking this way for a long time.

In the mid-19th century, French doctor Jacques-Joseph Moreau attributed genius and madness to an overexcitation of certain parts of the brain. Moreau was a follower of phrenology, a system developed by Franz Josef Gall (1758–1828) that attributed various human faculties to specific areas of the brain. These faculties were mapped on the brain and could be measured to determine certain things about a person. An overly emotional person probably had a larger emotion section of the brain, for example.

A phrenology map of the brain

Moreau took this idea a step farther and applied it to nervous disorders. He believed that nervous energy could become more concentrated and active in certain people, causing an overexcitation in one part of the brain that could result in either insanity or genius. A buildup of energy in the thinking part of the brain could lead to raving madness, or it could lead to a great work of literature or a whole new philosophical system. Moreau believed that an exalted state of mind could allow genius to spring forth! But it didn’t work for everyone.

In his book Morbid Psychology, Moreau wrote that “the virtue and the vices [of overexcitation] can come from the same foyer, the virtue being the genius, the vice, idiocy.” That is, the genius was in constant danger of crossing the line because, according to Moreau, creative energy exhibits all of the reveries, trances, and exalted moments of inspiration that madness often has.

Moreau also did a lot of work with the effects of drugs on the central nervous system, writing a book called Hashish and Mental Alienation that has made him a hero of sorts for the marijuana crowd.

Moreau’s work, along with that of several other doctors and scientists, greatly influenced late 19th-century interpretations of neuroses and their causes. It may also have helped popularize the notion of madness as an antecedent to the creative process.

The Hysterical Woman

Reading 19th century literature, you might start to think that every woman was a swooning, mad hysteric. Women seemed forever prone to fainting or madness–few were ever fully in their right minds. Most acted somewhat bizarrely and with high theatricality, not unlike the clothes that fashion dictated women should wear.



Hysteria was a purely female problem. Men suffered from their own version of nervous disease known as neurasthenia. The supposed problem? Women’s small size and their governance by their reproductive systems. That’s right. Women acted crazy because their wombs overrode the power of their brains. The word even comes from the Greek hystera, for womb.


So little was known about physical illnesses, such as epilepsy and neurological diseases, that the “nervous” illnesses were generally lumped in with them. The idea of neuroses as distinctly and purely in the mind was a minority view. More popular were a whole series of medical models for nervous illness and the psychological and physical options for treating them. Nerves were commonly ascribed a “force” that gave vitality to organs. Hysteria and neurasthenia were said to be caused by a weakness in this force, though evidence for this “force” and its loss was never found. Anecdotes provided more powerful proof… at least for a time.


Hysteria helped to reinforce existing gender and class attitudes. Women couldn’t hold positions of power or be trusted to vote if they were not rational beings! Hysteria worked very well at keeping women in the home and out of the public space, just what many men wanted. 


Hysteria could also work in a woman’s favor, though. Being classed as “sick” or “infirm” was one way to get out of performing your expected duties as a woman. In the book The Peabody Sisters by Megan Marshall, an amazing biography of three women in the 19th century, one of the sisters, Sophia, is often ill and seems to use her illness to forge an art career for herself. 


Female hysteria had mostly run its course by the 20th century. Doctors stopped recognizing it as a legitimate medical diagnosis and catch-all disease for any number of symptoms. 

The Flow of Electricity

Did you know that we used to think electricity was a fluid? That’s why we use words like “current” and “flow” to describe what is clearly not a liquid. But in the 19th century, electricity was a force that many believed had the power to heal and recharge humans.

In the early years of scientific medicine, many doctors came to believe that nervous diseases had physical causes and could only be cured by physical means. Some compared humans to batteries, saying that we sometimes suffered from a low charge that needed a jolt of energy to recharge. This idea seemed to offer a legitimate explanation for the worn-out businessman, the languishing youth, and the excitable, swooning woman of the 19th century. This “disease” came to be called “neurasthenia,” a name popularized by physician George M. Beard, who combined all of the so-called symptoms of the disease under one named cause.

The solution? Electricity (it also probably didn’t hurt that Beard happened to be friends with Thomas Edison, the electric wizard of Menlo Park).  Many doctors came to believe that we could recharge our “neural batteries” with electricity. And so various devices purporting to soothe weary brains began to appear with increasing frequency. My favorite is Dr. Scott’s Electric Hair-Brush, which claimed to both make hair more glossy and calm worn-out brains.

The era of electric medicine began to come to an end by the turn of the 20th century. Because neither neurasthenia nor electricity actually exhibited physical changes–only proclamations of change (suggestion is powerful)–skepticism about electric medicine’s power began to emerge in the medical community. New theories about mental illness and the mind also developed as scientific research advanced. Electricity remained a magical force, though, its enigmatic nature and invisible power provoking the imagination to ever more fantastical heights.