Creating Frankenstein

Mary Shelley’s 1818 novel Frankenstein is more than a literary work of the early 19th century; it also reflects her era’s scientific discoveries and its enthusiasm for electricity.

“I succeeded in discovering the cause of generation and life; nay, more, I became myself capable of bestowing animation upon lifeless matter.” –Victor Frankenstein

In Frankenstein, electricity is seen as the secret of life, able to give life to the lifeless. Shelley merely reflected a belief that was becoming increasingly popular in American and European culture. In her novel, Victor Frankenstein alludes to lightning and to Galvanism as the basis for reanimating a lifeless cadaver. Luigi Galvani had popularized the idea of electricity as an innate force of life, what he called animal electricity. Galvani’s ideas had largely been supplanted in the scientific community by Shelley’s time, but the idea of an internal electrical fire, and particularly of reanimation, remained strong in the public imagination.

Shelley herself mentioned discussing many of the electrical experiments going on in Europe and the United States with her husband, Percy Shelley, and Lord Byron. They, like the rest of the public, were especially intrigued by the idea of reanimating the dead. These discussions led Shelley to explore the moral and personal responsibilities of scientific advances in her own writing. She recognized science as a powerful force, but one capable of great harm if left uncontrolled. Victor Frankenstein uses science to create his monster, yet it ultimately leads to his demise.

Interestingly, Shelley does not provide much description of the laboratory or the way in which the monster is created. Only two sentences in the book mention lightning and Galvanism, yet spectacular electrical displays with shooting lightning bolts became the standard means for depicting the act of creation in the movies.

Recharging my batteries

Why do we say we are “recharging our batteries” when we take a break or do something for ourselves? Or that we “short-circuited” if we can’t remember something?

It turns out that these phrases are directly tied to our enthusiasm for electricity in the 19th century. As I’ve written elsewhere, many scientists, doctors, and members of the general public came to believe that we all had a set amount of electricity in our bodies that made everything run–called our “vital fluid.” According to some leading doctors and scientists, modern urban living tended to deplete this energy source, our “internal battery” as the analogy went, and we therefore needed a “recharge” from a jolt of electricity. Our bodies were essentially electrical machines that could short-circuit and burn out just like any other machine.

Public enthusiasm for electricity by the late 19th century was so great that many came to believe that electricity could fix anything! And waiting in the wings to take advantage of that deep desire were any number of doctors and entrepreneurs promising electrotherapy treatments for every dysfunction or ill feeling you could imagine. Few in the general public completely understood electricity, so they relied on manufacturers of electrical devices to educate them. In a world changing so fast with new inventions and technology, it was hard for anyone to know what was possible and probable. Advertisements for electrical devices made boisterous cure-all promises, and people richly rewarded those manufacturers for giving them what they wanted.

Books, entertainment, and even food of the late 19th century showed that the image of the “electric body” wasn’t just a metaphor–people willingly imbibed electricity directly in an attempt to receive all of its benefits. One scientist even compared its effects to those of the sun on the leaves of a plant.

In Europe, researchers studied electricity’s effect on schoolchildren. They outfitted a classroom with a high-frequency electrical current that ran for six months. At the end of the experiment, the researchers found that the children had grown an average of 20 mm more than those not exposed to the continuous current. Their teachers also reported that the children had grown smarter during the experiment due to the “quickening” of their faculties by electrical stimulation.

Popular culture teemed with electrical fads and follies, providing both tangible and intangible signs that linked electricity, and especially the electrified human body, with ideas of progress.

While electricity remains part of the treatment regimen for some diseases today, the idea of a vital fluid made of electricity that needs recharging has since passed out of popular and scientific medical theories. But its mark on our language remains.

Anglophilia

There’s just something about the United Kingdom that I can’t get enough of. I’m not sure when it began. In sixth grade, I did my country report on England. I’m not sure any other country was even a contender. I spent months making poster after poster, until my English homage covered the entire front of the classroom as well as the fronts of the desks, where I had carefully taped the 3-D map and a word game that asked you to guess the American equivalent of each Britishism (e.g., lorry, boot, loo).

In college, I claimed to be a political science major for a month while I filled out the application for a month-long program studying British politics in London. As soon as we got back, I dropped the double major but not my fascination with that part of the world. My first trip with my now-husband was to England and Wales. An accident? I think not. And where did we go for our honeymoon? That’s right. Ireland (yes, not part of the UK now but still in that magic realm of those isles), Northern Ireland, and Scotland.

Where does this anglophilia come from?

It turns out that anglophilia has a long history in the United States–it’s not, in fact, merely confined to bedrooms in Redmond, Washington. Anglophilia is about admiring England, its people, and its culture. The Federalists (Alexander Hamilton and John Adams, kind of), one of the first proto-political parties in our nascent nation, were generally anglophiles, while their rivals, the Democratic Republicans (James Madison and Thomas Jefferson), admired the French. Even as we threw off English authority, England itself retained symbolic value and remained a compelling object of attention throughout the 19th century and into the 20th.

Affinity with another nation allows people to feel some release from the burdens of their own nationality. Personally, whenever I feel fed up with American politics, I only need to pull up the BBC or the Guardian on my computer to happily immerse myself in David Cameron’s latest idea. This England in my mind and that of other anglophiles isn’t necessarily a true image of the country, however. Our anxieties and wishes are often imposed on our image of England. Anglophilia owes much of its energy to a backward belief in the aura of the British. The Englishness that Americans love may not exist at all.

The United Kingdom played an integral part in the United States’ history as well as in the way we defined ourselves after breaking free. Benedict Anderson once wrote that “it is useful to remind ourselves that nations inspire love.” Even nations we didn’t want to be a part of anymore, yet can’t seem to completely pull ourselves away from.

Just Humor Me

The idea of staying in good humor or humoring those around us has an ancient lineage. It actually traces back to Greece and the humoral theory of medicine.

“To begin at the beginning: the elements from which the world is made are air, fire, water, and earth; the seasons from which the year is composed are spring, summer, winter and autumn; the humours from which animals and humans are composed are yellow bile, blood, phlegm, and black bile.” –Galen

For centuries, the idea that an excess of phlegm or of yellow bile could cause illness was an accepted medical diagnosis. The four humours–yellow and black bile, phlegm, and blood–circulated throughout the body, and an imbalance in one or more of them was believed to be the cause of illness. The theory began in the 5th century BCE with work attributed to the Greek physician Hippocrates (though it was his son-in-law and disciple Polybus who wrote the first treatise that clearly explained the whole idea of the humours) and continued with the Roman doctor Galen, who adopted the theory in the 2nd century CE. For the next two thousand years (give or take some disruptions, like the sacking of Rome), humoral theory explained most things about a person’s character, medical history, taste, appearance, and behavior.

Why? What was so compelling about this theory?

Well, for one, it seemed to unify passions and cognition, physiology and psychology, and the individual and his or her environment. Different parts of the body and the environment caused disease and stirred various emotions and passions. The humours also made sense to the many cultures that based their earliest stories of creation on four elements: air, fire, water, and earth. Each of the four humours was tied to one of these elements, so the theory seemed a natural extension of what people already knew about the world. Our human need to understand what we are made of, where we came from, and how we work often causes us to resort to structures and traditions that match our intuitions. The theory offered a potent image of substances, particles, or currents traveling through the body from the limbs to the organs to the brain and heart and back. It seemed to explain how the sight of an attractive person could trigger desire, induce a rush of blood in the veins, and increase the heartbeat.

The fall of Rome wasn’t the end of the humoral theory, though. It was more of a shift–to the east, to the Islamic world, where the knowledge of the Greeks and Romans was saved and expanded upon, and to the abbeys, where monks preserved ancient texts. (Sidenote: In researching my book on apples, I discovered how much ancient knowledge of apple orcharding and fruit growing in general was preserved in the Islamic and Christian monastic traditions. So it wasn’t too surprising to find that medical knowledge, too, lived on in these same places.)

And so the theory lived on and on, taking on various forms to fit the times but always coming back to the idea of balance and imbalance in the body as the source of illness. The theory only really died with the discovery of germs in the late 19th century. Not everyone bought the germ theory right away, however, so books based on humoral theory continued to appear into the early 20th century.

Humours now remain mostly familiar in our expressions about keeping balanced or being in ill humor. In French, the word for mood is humeur. Many Asian medical traditions are also humoral, based on the idea of energy flows, mind-body connections, and balances between hot and cold, moist and dry. So even if the theory is no longer used to describe disease (at least in the West), the idea of humours still serves as a useful and suggestive image in our culture.

Who’s a Quack?

What makes someone a quack? Is he or she actually doing something nefarious or just doing something you don’t agree with?
When I first started reading medical history, I (foolishly) thought the line between a quack and a legitimate doctor was easily drawn. A quack is someone selling ridiculous medicines claiming to cure everything and bilking gullible people out of their money, right? The real story isn’t nearly so simple.
Quack or man with a different idea?

Before the 20th century, medical knowledge was very limited. Those proclaiming themselves legitimate doctors rarely knew anything more than those hawking patent medicines and traveling from town to town. Many doctors engaged in what was thought of as “quackish” behavior, including advertising and putting their name on proprietary remedies. Some quacks even trained at celebrated medical schools or had medical licenses. There really was little scientific evidence separating the two, so calling someone a “quack” became an easy way of targeting those you didn’t agree with for one reason or another. So many people stood accused of quackery that the term lost any real meaning, though not its sting of opprobrium.  
Everyone felt okay excoriating quacks because all were sure they weren’t one. Most of the time, those calling out quacks were those in the medical establishment who belonged to some organization or institution or who had trained in Europe. But sometimes, so-called quacks called out other quacks.
Many of those that the medical establishment labeled as quacks simply disagreed with medical therapies that had been practiced for centuries, including bloodletting. And they had good reason to do so, as many of these traditional practices had hurt and even killed people rather than helped them.
As doctors began to organize into professional organizations in the mid-19th century, one of the motivating factors was to protect people from quacks. These organizations created sharp divisions between “insiders” and “outsiders.” But the ethical and moral grounds for this distinction weren’t nearly so clear, despite claims to the contrary. The medical marketplace was competitive, and what these organizations did do was give some doctors a competitive advantage through membership and illusory claims to standards. Many people found these organizations elitist and, obviously, exclusionary, but that was the point.
So maybe the better way to think about quacks, doctors, and medical history more generally is to see the development of the profession as one with many paths to prosperity. Medical men of all kinds competed for custom, recognition, and financial reward, each in his own way, each straining to seize the high moral ground in a vicious arena. Some opted for the individualism of the entrepreneur and others for the safety and security of the establishment. Neither choice was better than the other given the scientific evidence available at the time. Surely, a few people did know that they were peddling nothing more than alcohol and herbs in a jar and wanted to make as much money as possible, but there may not have been as many of them as the literature of the heroic doctor commonly suggests.

Genius or Madness?

They say there’s a fine line between genius and madness. Some of our foremost thinkers and artists have also suffered from mental illness and/or lived really tumultuous, troubling lives. A recent New Yorker cartoon featured a teenage girl blaming her parents for sinking her writing career by giving her a stable, happy childhood.

It turns out, we’ve been thinking this way for a long time.

In the mid-19th century, the French doctor Jacques-Joseph Moreau attributed both genius and madness to an overexcitation of certain parts of the brain. Moreau was a follower of phrenology, a system developed by Franz Josef Gall (1758-1828) that attributed various human faculties to specific areas of the brain. These faculties were mapped onto the brain and could be measured to determine certain things about a person. An overly emotional person probably had a larger emotion section of the brain, for example.

A phrenology map of the brain

Moreau took this idea a step further and applied it to nervous disorders. He believed that nervous energy could become more concentrated and active in certain people, causing an overexcitation in one part of the brain that could result in either insanity or genius. A build-up of energy in the thinking part of the brain could lead to raving madness, or it could lead to a great work of literature or a whole new philosophical system. Moreau believed that an exalted state of mind could allow genius to spring forth! But it didn’t work for everyone.

In his book Morbid Psychology, Moreau wrote that “the virtue and the vices [of overexcitation] can come from the same foyer, the virtue being the genius, the vice, idiocy.” That is, the genius was in constant danger of crossing the line because, according to Moreau, creative energy exhibits all of the reveries, trances, and exalted moments of inspiration that madness often has.

Moreau also did a lot of work on the effects of drugs on the central nervous system, writing a book called Hashish and Mental Alienation that has made him a hero of sorts for the marijuana crowd.

Moreau’s work, along with that of several other doctors and scientists, greatly influenced late 19th century interpretations of neurosis and its causes. It may also have helped popularize the notion of madness as an antecedent to the creative process.

Living and Loving Lefse

To those without Viking blood coursing through their veins, the fight that ensues over the last piece of lefse at my grandmother’s house likely makes no sense.  How could an admittedly bland potato flatbread tear a family apart? But lefse is no ordinary food; it’s an edible legend.  And it’s a minor miracle that the pointy stick used to turn lefse on the grill has not resulted in injury… yet. 
Although recipes vary, mashed or riced potatoes, flour, sugar, and salt, along with some combination of milk, cream, shortening, or butter, are the simple ingredients that turn to magic when rolled flat and grilled. Butter and sugar are the traditional lefse fillings, but in my family, anything from mashed potatoes and salad greens to green bean casserole gets rolled inside like a Norwegian tortilla. The recent acquisition of a small cookbook called 99 Ways with Lefse was nothing too revolutionary for my family.
The pale, golden-brown flecked rounds are a staple of Norwegian grandmothers everywhere, especially in the Midwest, where lefse can be found in restaurants and in the refrigerated case at the grocery store. But the best lefse is homemade and I was fortunate to make lefse many times under the expert guidance of my grandmother.
More than 850,000 Norwegians came to the United States between 1820 and 1875, most to the Upper Midwest. Many left because Norway’s mountainous, rocky terrain was nearly impossible to farm, leaving little opportunity for poorer families in a highly stratified society. So the Norwegians came to America, bringing with them the poverty foods, lefse and lutefisk, that had sustained them through arctic winters and poor harvests.
In Norway, lefse virtually disappeared from the culinary landscape, but not in America, where Norwegian women would get together to make enough lefse to last the year. In many communities, a woman’s worth was measured by the thinness and lightness of her lefse.
The arsenal of the lefse maker
Lefse appears on the table of nearly every Norwegian-American family during the holidays. Or anytime it is available in mine, holiday or special occasion be damned. Everything seems more special when lefse is in the mix.
Like her Norwegian ancestors, my grandmother usually makes lefse once, maybe twice, a year, freezing small packets to last throughout the year. Truth be told, lefse making is chaos incarnate, which perhaps explains why a year’s supply is made in one fell swoop. The gummy dough sticks like library paste to the grooved rolling pin and countertops, while a thin layer of flour covers every horizontal surface in addition to your face, hair, clothes, and the inside of your eyelids.
The dough is no match for my grandmother, though, whose slight frame masks a fierce rolling skill. The dough quickly becomes thin enough to “read the newspaper through,” her constant refrain as she rolls and watches my feeble attempts to match her dexterity. Good lefse requires careful discernment of the right amount of flour, the proper temperature of the griddle or pan, and the perfect temperature of the dough, neither too warm nor too cold. What’s right depends on the cook, however, as a discussion between relatives on whether to place the dough outside overnight before baking quickly grew heated; Garrison Keillor would have had a field day with this material. Skill and know-how are the badge of the good lefse maker, skills that can’t be learned from a cookbook.
The grilling is one area that I’ve managed to master with aplomb, lifting the dough in one swift swoop of my sword-shaped stick, laying down the edge, and rolling it out quickly so it lies flat on the round lefse griddle. Thirty seconds or so later, the lefse needs to be flipped. Timing is everything in achieving the perfect balance of knobby brown flecks and bubbles on the pale rounds. A whole batch can take all day.
The mess, frustration, and hard work are worth it. The foil-wrapped packets that emerge from my grandmother’s freezer—and often arrive in the mail if we miss a holiday—taste all the better for the struggle that went into their creation. Lefse ties us to the past while carrying our family’s cultural marker forward into the future.
These days, good commercial lefse is available in many places, making it a rare family that continues to make their own. But learning from a master like my grandmother, standing shoulder-to-shoulder in the kitchen with floured arms and hands, hot griddles, and melted butter, is the only way to keep the tradition alive.  

All Leaders Are Men… or so I hear

One of the best parts of my job at the radio station is receiving books in the mail. It’s like Christmas every day when I go in and find a big stack of books from publishers waiting for me below my mailbox. Even the bad ones–and many of them are really bad–are still fun to open because you just never know what will be inside.

The other day I got a book called Profiles in Leadership: Historians on the Elusive Quality of Greatness edited by Walter Isaacson. Being of a historical mind, I thought “great!” And then I opened it and looked at the table of contents. Ugh. Seriously?

George Washington…
Charles Finney…
Ulysses S. Grant…
Herbert Hoover and FDR…
Wendell Willkie…
Robert Kennedy…

And finally the one that I was looking for: Pauli Murray. The ONE token woman who counts as great (not to diminish Murray in any way; she was a champion for civil and human rights and deserves our attention). I couldn’t believe it. I mean, I could believe it, because it isn’t uncommon to find books heralding our nation’s great men with nary a mention of the other half of the population. But it makes me sad that these books continue to be published, especially from Isaacson, whose previous work I’ve enjoyed.

It’s easy to find male leaders. The position of “leader” has throughout much of our history only been open to men. But that’s taking a very narrow view of leadership. If leadership is only open to those holding high political office or leading big companies, then women and people of color can be hard to find in history. But why is that our definition of leadership?

Where’s Elizabeth Cady Stanton and Susan B. Anthony? Stanton helped organize the world’s first women’s rights convention. The first convention in the world proclaiming that half of the human race had rights. Anthony worked tirelessly until the day she died, speaking and writing and traveling for women’s rights, particularly suffrage. She spoke on stages with armed guards to protect her. Can you imagine? Is that not greatness and leadership?

How about Sojourner Truth? Or Wilma Pearl Mankiller? Dorothea Dix? Frances Perkins? Fannie Lou Hamer? Ida B. Wells? The list goes on and on (and if you want to see more, look at Equal Visibility Everywhere’s list of 100 Great American Women).

I’m not saying the men included in the book are not worthy of attention and consideration for their leadership skills and greatness. I just don’t think another book profiling mostly men–and nearly all white men at that–is the real story of leadership and greatness in this country or any country.

Songcatchers

Did you know that the government once paid people to record our nation’s musical heritage? In the years before World War II, fieldworkers, evocatively known as “songcatchers,” traveled around the country recording, collecting, and transcribing folk music from everyone from lumberjacks to American Indians to recent immigrants. Many of the recordings ended up in the Library of Congress in its folk music collection, along with photographs and other ephemera from our nation’s singers and musicians.

A great picture of some Wisconsin women playing the Swiss bells.  WHI-25191

The urge to collect this music came from many sources. One was technology. As electricity spread and more people bought radios, many folklorists and other songcatchers worried that people would sing along with the radio rather than their traditional music, spelling the end to the rich and vital music of our nation’s ethnic heritage.

Another was employment. During the Depression, several New Deal programs, including the Federal Music Project, the California Folk Music Project, the Wisconsin Folk Music Project, and the Resettlement Administration, gave unemployed men and women jobs collecting music. One goal of the Federal Music Project was to record and define the American musical scene in all its variety.

Interestingly, many of the people given the task of collecting folk music were women. Frances Densmore, for instance, devoted her life to the study of American Indian music, visiting Indian communities across the Upper Midwest to study and transcribe their music. Another woman, Helen Heffron Roberts, traveled to Jamaica, Hawaii, California, and the American Southwest collecting music and other ethnographic materials.

At the time, music, like many other professional fields, was largely closed to women. Many did not believe women possessed the bodily strength or presence of mind to play music in professional orchestras, conduct, or compose complete pieces. Yet these women became the first to go out in the field and live among their informants, studying and recording music. They truly were pioneers in American ethnomusicology and in pushing women forward into new careers.

This music truly is great stuff. It’s nice to think that we once thought it was so important to invest in our culture.

Here’s a verse from a song collected in Wisconsin called “Fond du Lac Jail:”
“In the morning you receive a dry loaf of bread
That’s hard as stone and heavy as lead
It’s thrown from the ceiling down into your cell,
Like coming from Heaven popped down into Hell.”

My story on Wisconsin songcatcher Helene Stratman-Thomas will appear in the winter issue of the Wisconsin Magazine of History.

The Hysterical Woman

Reading 19th century literature, you might start to think that every woman was a swooning, mad hysteric. Women seemed forever prone to fainting or madness–few were ever fully in their right minds. Most acted somewhat bizarrely and with high theatricality, not unlike the clothes that fashion dictated women should wear.

Hysteria was a purely female problem. Men suffered from their own version of nervous disease, known as neurasthenia. The supposed cause? Women’s small size and their supposed governance by their reproductive systems. That’s right. Women acted crazy because their wombs overrode the power of their brains. The word even comes from the Greek word hystera, for womb.

So little was known about physical illnesses, such as epilepsy and other neurological diseases, that the “nervous” illnesses were generally lumped in with them. The idea of neuroses as distinctly and purely in the mind was a minority view. More popular was a whole series of medical models for nervous illness and the psychological and physical options for treating them. Nerves were commonly ascribed a “force” that gave vitality to the organs. Hysteria and neurasthenia were said to be caused by a weakness in this force, though evidence for this “force” and its loss was never found. Anecdotes provided more powerful proof… at least for a time.

Hysteria helped to reinforce existing gender and class attitudes. Women couldn’t hold positions of power or be trusted to vote if they were not rational beings! Hysteria worked very well at keeping women in the home and out of the public space, just what many men wanted. 

Hysteria could also work in a woman’s favor, though. Being classed as “sick” or “infirm” was one way to get out of performing your expected duties as a woman. In the book The Peabody Sisters by Megan Marshall, an amazing biography of three women in the 19th century, one of the sisters, Sophia, is often ill and seems to use her illness to forge an art career for herself. 

Female hysteria had mostly run its course by the 20th century, when doctors stopped recognizing it as a legitimate medical diagnosis and a catch-all for any number of symptoms.