Wisconsin’s Hardy Early Settlers

I just recorded an audio piece for WPR’s Wisconsin Life series on the state’s settlers and why they stayed in such a cold place. Take a listen.

That’s some deep snow. Hurley, Wisconsin, 1899.

Get your head examined

Today, when you tell someone to “get their head examined,” you are usually implying that they are crazy. But the phrase had real currency in the 19th century. People really did get their heads examined–though not to find out whether they were crazy.

The phrase actually comes from the antebellum phrenology fad, when people–all kinds of people, from President James Garfield to Walt Whitman–got their heads “read.” Phrenologists could read your character, including what you were good at and what you weren’t, by looking at the bumps on your head. Supposedly, Clara Barton, founder of the Red Cross, and Ulysses S. Grant even picked their careers based on a head reading. Forget What Color Is Your Parachute? In the 19th century, it was What Bumps Are on My Brain?

Phrenology offered physical “proof” of your internal self. That was part of its appeal in an age when everyone was driven to “know thyself.” Your whole self could be understood by the landscape of your scalp, a powerful idea with incredible potential for making the world work better.

Not everyone thought phrenology was a great idea, though. Lots of people thought it was a crazy one, which is likely where the phrase “getting your head examined” got its uncomplimentary overtones.

Cold Climates

When it’s 0 degrees outside, do you ever wonder why your ancestors decided to stick it out rather than move somewhere warmer? Why stay in the upper Midwest or New England when California, the Carolinas, or New Orleans beckoned?

In Wisconsin, the reasons people came and stayed had a lot to do with the kinds of people they were. Many of the state’s first immigrants came from cold places originally (Norway, Germany, Finland, Canada), so a cold winter was nothing they didn’t already know.

Warmer climates were also more susceptible to devastating outbreaks of disease, particularly yellow fever and malaria. Mosquitoes, the main carriers of these diseases, couldn’t survive our cold winters, so the outbreaks here were never as severe or as long-lasting as in the South.

Wisconsin also looked like home to many immigrants. Something about the lay of the hills and fields reminded many of them of Norway, Germany, or Switzerland. Sure, they’d been on a boat for a while, and maybe the time away and the delirium of travel had twisted their memories–after weeks crammed in the lower decks of a ship, anything might seem inviting–but countless letters home described a new place that recalled a beloved homeland. Norwegians wrote glowing letters about the area just west of Madison near Blue Mounds, Mt. Horeb, and the town of Vermont. The Swiss loved the green hills of today’s Green County.

It also helped that many of the warmer places were not yet part of the United States in the 19th century, or at least not yet secure from potential Spanish takeover or other threats. Arizona didn’t become a state until 1912. Texas was still an independent republic until the mid-1840s. Things were more settled in the north, for the most part.

So thank your ancestors for settling somewhere cold. They may have kept your bloodline safe from yellow fever and found an easier new start in a place that seemed a lot like home.

A Nuptial Head Reading

In 1844, Lydia Folger married Lorenzo Niles Fowler in Nantucket. Fowler and his brother Orson Squire were the foremost proponents of phrenology in the United States, so it should probably come as no surprise that some head reading occurred at the ceremony. Lorenzo read the bumps on the head of Lydia’s uncle Walter, declaring, so the story goes, that his ego was nearly as large as his genius.

I ran across this story on the website of the Nantucket Historical Association while looking up more about Lydia Folger Fowler. Lydia was a remarkable woman. The second woman to graduate from medical school in the United States (and the first American-born–Elizabeth Blackwell was English), she lectured extensively on health, anatomy, physiology, and hygiene in addition to practicing medicine. She wrote books and taught courses to women.

In her lectures to women, Lydia praised their roles as mothers but also urged them to think about the other years of their lives, those not tied to childbearing and child rearing. She told them that an educated mother made the best mother. She constantly emphasized how important it was for individuals to study, practice, and perfect themselves. This was especially important for mothers, Lydia said, who had a responsibility to be more than mere caretakers of their children and husbands.

Lydia’s story is little known, in part because of her gender but also because of her marriage to one of the famous Fowler brothers. She was also the cousin of a much more famous woman: Lucretia Mott, the famed Quaker women’s rights advocate. But she made important contributions to the history of women and medicine. And who could forget a wedding that involved head bump reading?

New Domesticity

When you dive into the past, especially the history of women, you won’t go far before you run smack into the idea of domesticity. Domesticity belonged to women–the word encapsulated both what duties women had and the ideal of womanhood in the 19th century. Women were to be pious, pure, domestic, and submissive. A woman’s place was in the home, taking part in tasks and chores that maintained and fulfilled her piety and purity. Housework was one such “uplifting” task.

The idea of domesticity arose in the early 19th century, when the growth of new industries, businesses, and professions created a new class of Americans: the middle class. This new middle class did not have to make what it needed to survive. Men produced goods and performed services outside the home while women and children stayed home. Men going off to work in the rough public world fostered the view that a man alone could support his family. Women were considered far too delicate to be out in the world. They needed to stay home and make the home a refuge for men from the unstable, immoral business world.
Even as more women moved out of the home and into the workplace in the 20th century, many of the ideas of domesticity and the equation of women with domestic work remained.

All of this was on my mind recently when I read a piece by Steph Larsen on Grist about the links between today’s DIY lifestyle (sewing and preserving food, for instance) and the domesticity of the past. Larsen recounts chafing at her mom’s declaration of how domestic she’d become after serving a meal made up of foods she’d grown, harvested, preserved, and cooked. Many of my female friends preserve and cook for their families. And I occasionally feel the same sense of unease that Larsen recounts as I happily make dinner for my husband many evenings and pack his lunch in the morning. Am I betraying my feminist forebears? Or does the fact that this is a choice, rather than something women must do, somehow make it okay?

My desire to cook comes from a place of real enjoyment. When I was a kid, my mom hated to cook, so we ate many meals out in restaurants or from a box in the freezer. To my mom, cooking was drudgery. I feel the opposite, but not because I feel any pressure to put food on the table. Cooking for me is a reprieve, one of the few things I do in my life that yields immediate results. Writing means waiting months if not years to see your efforts in their final form. Cooking and food are also, for me, a way of supporting local farmers and combating an agricultural system I think is broken.

So while domesticity continues to include a body of home tasks associated with women and women alone, maybe the doors on the cage are more open now.

Healing the Spiritual and Physical

In colonial America, ministers often provided succor to those afflicted in faith and fever. Ministers were usually the most educated people in a community, so it made sense that they would be entrusted with medical care as well. Not to mention that so little was known about disease and its causes that fusing religion and medicine made as much sense as any other theory (and probably more so to the deeply religious American colonists). As historian Richard D. Brown put it, “even the most rational and learned individual...saw no clear boundary between physical and spiritual phenomena.” The joining of these two roles, however, was not without its complications.

Cotton Mather, one of the most prominent Puritan ministers in New England, introduced smallpox inoculation in 1721, a move that ignited a fierce debate in the Puritan community. Puritans believed that every affliction was proof of God’s special interest in their affairs. Some suggested that smallpox (a serious and deadly threat in colonial America) was perhaps God’s punishment for sin and that to interfere with that through inoculation or anything else would only anger God and provoke more punishment. Mather did not waver, though. He wrote that “whether a Christian may not employ this Medicine (let the matter of it be what it will) and humbly give Thanks to God’s good Providence in discovering of it to a miserable World; and humbly look up to His Good Providence (as we do in the use of any other Medicine) It may seem strange, that any wise Christian cannot answer it. And how strangely do Men that call themselves Physicians betray their Anatomy, and their Philosophy, as well as their Divinity in their invectives against this Practice?” Eventually, inoculation did gain widespread acceptance.

The spread of print and increased literacy helped bring elite medicine to the masses through home medical manuals. Many ministers as well as their followers had copies of some of the most popular manuals, including William Buchan’s Domestic Medicine (1769) and John Wesley’s Primitive Physick (1747). 

Wesley, better known as the founder of Methodism, was extremely interested in the democratization of medicine. Believing that medicine and doctors were often accessible only to the wealthy, he wrote his own medical manual to allow anyone to easily treat themselves. Wesley considered making medical knowledge and treatments available and comprehensible to the public part and parcel of his pastoral duties.

The minister-doctor predominated until at least 1750 and continued into the early 19th century in some areas of the country. It was only with the rise of medical education, apprenticeships, and licensing laws that the two forms of healing separated. But ministers continued to play an integral role in healing, particularly in many of the alternative health movements (homeopathy, hydropathy, osteopathy, etc.) that flourished in the 19th century.

Poland Rising

How did the Irish get so popular? In the 19th century, and even into the early 20th, the Irish were decidedly not a favorite immigrant group. They were often poor and, more damning, Catholic, a major strike against them in a very Protestant United States. The Irish were stereotyped as hot-headed drunkards, uncivilized and unskilled, and political cartoons routinely caricatured them as such. And yet something changed…

Today, everyone wants to be Irish. There are stores all over the country selling Celtic this and that, Irish bars, and Irish music festivals. Maybe it was sheer numbers. Millions of Irish came to the United States, so their influence became impossible to ignore. And they didn’t remain poor and unskilled. Many became politically, economically, and religiously powerful in their new country. John F. Kennedy–Irish and Catholic–probably had something to do with it, too.

All of this leads me to the Poles (and to myself, as I’m a quarter Polish). Why haven’t the Polish experienced this renaissance of opinion? Poles were often poor and Catholic, too. They didn’t come in quite the same numbers as the Irish, but like the Irish, most settled in urban areas, especially in the Midwest. Chicago today bills itself as the largest Polish city outside of Poland. And yet Poles are often still the butt of jokes rather than a beloved culture. Everyone I know who has been to Poland raves about its beauty and culture, but rarely do you see a special issue of Conde Nast Traveler or Budget Travel telling you where to go now in Poland.

Southwest Poland, near Jelenia Gora

It all makes me want to start a Polish rehabilitation project. Someday, maybe we’ll all start wearing purple on Polish Independence Day (November 11th, for those who haven’t already marked their calendars).

History = War

What counts as history? And why is the answer for so many people one war or another?

As someone who writes about history, I often hear from people how much they love history (and to clarify, it’s almost always an adult, usually male, aged 50+), especially the Civil War or World War II. Those are important and fascinating events in history, but I often wonder why that’s the only history that people seem to know and care about–why it’s the only history that seems to matter.

In part, I blame The History Channel. Or as I like to call it, the channel of never-ending war. It’s battles and technologies of war (how guns work and who invented the cannon) or stories of great generals and war heroes. Reading a review of a new history book in the New York Times this weekend, I thought the reviewer hit on one of the main problems with the way history is depicted on TV:

This may be realism, but it is History Channel realism, where the rawest facts of combat on the ground become the only facts that really count. Entertaining anecdotes abound, but there are more descriptions of mangled bodies than information or insights about strategy.

That’s it exactly. The spectacle overshadows the actual context and motivations behind it. I get it, I really do. I know that battles are exciting and that draws people in. But that’s not what history is ALL about. And I think that’s why I grow weary of hearing about the Civil War or World War II–not because they aren’t pivotal historical events, but because the battle scenes, often the only glimpses we get and often the only ones we seek, become the dominant (and only) story. Ideas are exciting, too. Ideas are what led to the bloodshed, yet that piece doesn’t get the same attention even though it is the explanation for it all.

And much of history has nothing to do with war at all. Most of history, as Bill Bryson said in his new book At Home, consists of people going about their daily lives, cooking, sleeping, and bathing. I realize that on the surface, our domestic lives are nowhere near as exciting as a pitched battle between heroes and villains. But really, it’s all a matter of presentation. War may come in an inherently interesting package that’s easy to present, though I might argue that the package often trades the ideas and critical thought behind the fighting for explosions, confusing the real importance with the flashy front.

And maybe the reason you don’t hear as many young people proclaiming their love of history, and the reason history books are marketed as Father’s Day gifts but rarely as Mother’s Day gifts, is that the history of battles and war isn’t as relatable to these groups as the story of how people like you and me went about their daily lives. That’s not the whole reason, but it’s one element that I think plays a significant role.

I continue to wait for the day that someone tells me they love history and follows it up by saying, “especially the utopian movements in the 19th century.”

Giving Fruitcake a Good Name

There are few gifts more vilified, more dreaded or ridiculed than fruitcake, which is too often mass produced with cheap ingredients. Who doesn’t join the laughter when a coworker opens the gleaming, bejeweled brick of cake at the office gift exchange?

It’s a shame, really, because fruitcake, at its best, is a delicious mix of dried fruits and nuts, bound together by sugar, flour, eggs, and spices. Most of us only know the cake at its worst: rock hard, laced with day-glo candied fruit and bitter citron. Liberally bathed in alcohol, a fruitcake can last more than ten years, a fact that only adds to its supernatural horror. No wonder people in Manitou Springs, Colorado, toss them every winter during the Great Fruitcake Toss.

The idea of making cakes with dried fruits and honey dates back to ancient times. Fruitcakes were a means of food preservation. Not only could fruits be conserved, but they could be served out of season, when fresh fruit was unavailable. Egyptians considered fruitcake an essential food for the afterlife (and some of the cakes could outlast you), while the conquering force of the Roman legions was fruitcake-powered.

The fruitcakes we know and… well, love… came from the Middle Ages, when sweet ingredients like honey and spices became more widely available. The arrival of cheap sugar in Europe from the colonies, beginning in the 16th century, resulted in a flourishing of sweet, fruitcake-like breads, including Italian panettone, black cake (common in Jamaica and Trinidad), Dreikönigsbrot, king cake, babka, and my personal favorite, stollen.

Stollen… yum

So what makes something a fruitcake? The fruit-to-cake ratio is pivotal. Anything less than 50% fruit is not really a fruitcake. The fruitcakes from Swiss Colony in Monroe, Wisconsin, contain around 75% fruit and nuts.
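For the pedantic, that rule reduces to simple arithmetic. Here’s a toy sketch in Python–the 50% threshold is the rule of thumb above, and the ingredient weights are invented for illustration:

```python
# Toy fruitcake test: is at least half the cake's weight fruit and nuts?
# The 50% threshold is the rule of thumb above; the weights are made up.

def fruit_fraction(fruit_and_nuts_g: float, batter_g: float) -> float:
    """Fraction of the cake's total weight contributed by fruit and nuts."""
    return fruit_and_nuts_g / (fruit_and_nuts_g + batter_g)

def is_fruitcake(fruit_and_nuts_g: float, batter_g: float) -> bool:
    """Anything under 50% fruit and nuts is not really a fruitcake."""
    return fruit_fraction(fruit_and_nuts_g, batter_g) >= 0.5

# A Swiss Colony-style cake at roughly 75% fruit and nuts qualifies easily:
print(is_fruitcake(fruit_and_nuts_g=750, batter_g=250))  # True
# A grocery-store brick that skimps on fruit does not:
print(is_fruitcake(fruit_and_nuts_g=400, batter_g=600))  # False
```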

And despite what you commonly see in grocery stores, candied fruits in colors that suggest some kind of nuclear disaster are not obligatory and should be avoided. Naturally sweet, dried fruits are the key to turning fruitcake hate into love.

The fruit-and-nut-to-cake ratio appears right, but those colors only reinforce fruitcake’s poor reputation.

Alcohol allows for long-term storage and also helps to mellow the sweetness of the ingredients. Fruitcakes actually do taste better with age because the dried fruit contains tannins, as wine does, that are released over time to create complex flavors and aromas.

Like all things, fruitcake can be great, amazing even, when done right. Don’t let the imitations fool you.

What’s so heroic about “heroic medicine”?

Heroes don’t usually make you bleed or vomit. You’ll never see Superman fight evildoers by lancing someone or forcing large doses of calomel down Lex Luthor’s throat.

And yet “heroic” is what medicine of the 18th and 19th centuries was called–at least the medicine practiced by so-called “regular” doctors through the mid-19th century (those who didn’t practice heroic medicine were known as alternative or sectarian practitioners–or, in unkinder moments, quacks).

Heroic medicine consisted of bleeding, purging, leeching, blistering, and sweating patients to release disease. Calomel, mercurous chloride, was one of the most commonly used mineral concoctions, a harsh treatment that would induce vomiting and purging. At the time, most diseases were seen as systemic imbalances caused by something being either over- or under-stimulated in the body. Until the 1840s, most doctors believed that diseases overstimulated the body, so most treatments involved lowering the overexcited patient back to a normal, healthy state. Bleeding often became the first “therapeutic” line of attack, a seemingly easy way to drain off whatever was poisoning the system and knocking it out of balance.

Heroic medicine gave clear evidence that it was doing something–and that sense of something being done was often healing in and of itself. People felt better just knowing that they were being treated, even if that treatment could sometimes kill them. It also gave the doctor the appearance of being in control of the situation. There’s just something about doing that feels a whole lot better than waiting and watching for nature to run its course, as it often does in disease. The body is an amazing healing machine.

So why heroic? The word apparently comes from the large dosage size and effect of the therapies. They didn’t just give you a bit of calomel. They gave you A LOT to produce a near-instantaneous effect–which, in this instance, was mostly a lot of vomiting. According to the dictionary, “heroic” describes behavior that is bold and dramatic. These treatments certainly were bold, and the results often dramatic, if not always healing.

But they were also rather harsh, and public outcry against them helped lead to their demise in the 19th century.