How to Keep Your Sunny Side Up During the Darkest Times

Kids onstage at the Orpheum, a few years before Mike Sanchelli made his debut

Ninety years ago, a few kids from St. Paul’s East Side got up on a downtown stage and brightened the bleakness of the Great Depression.

Every year, during the week leading up to Easter, the RKO Orpheum Theater on Seventh Street near Wabasha in St. Paul invited the kids of the city to come by and show off a little. Dozens of pint-sized performers crammed into the theater’s backstage area dreaming of showbiz glory. They hugged the walls, awestruck. They whispered to each other and stole furtive glances. The Orpheum’s backstage was a strange and wonderful place.  Professional entertainers—“showgirls in skimpy outfits and men all painted up with makeup,” recalled one young performer—smoked cigarettes, drank questionable liquids, and played cards between shows. For some kids, amateur hour at the Orpheum provided a rare glimpse of grown-up decadence. For others, it was a refuge of sorts, a welcome respite from the daily drudgery of the tough times known as the Great Depression.

Mike Sanchelli was fourteen years old when he, his brother, and three friends showed up at the Orpheum on Good Friday evening, 1931. The boys comprised one of three bands invited back for the finals of the annual group harmonica competition. They hadn’t practiced together long. Their group was much smaller than those of the other two finalists. But that didn’t matter much—at least not to the Sanchellis. They were from Swede Hollow, an uncommonly tight-knit community of ramshackle dwellings clustered along Phalen Creek on St. Paul’s east side. Like most of the kids who lived in the hollow, the Sanchellis were getting by as best they could during the early years of the depression. Their father had epilepsy. Their mother had tuberculosis. Neither could work. The boys were the family’s breadwinners, scrounging for odd jobs just to put food on the table. Their dire straits gave them something of an advantage in the harmonica competition: they had nothing to lose.

The Sanchellis and their friends had worked up two songs for the occasion. The first one was a love ballad called “Should I.” The second tune was an upbeat little number titled “Sunny Side Up.” Like the more popular “Happy Days Are Here Again,” it served as a kind of optimistic anthem for hard times. The boys didn’t sing the lyrics—they were harmonica players, after all—but many members of the audience already knew the words.

Keep your sunny side up, up!
Hide the side that gets blue.
If you have nine sons in a row,
Baseball teams make money, you know!
Keep your funny side up, up!
Let your laughter come through, do!
Stand up on your legs;
Be like two fried eggs;
Keep your sunny side up.

When the song was over, a judge asked the audience to vote for a winner by applauding separately for each band. When he came to Mike Sanchelli and his harmonica-blowing cohorts, the Orpheum erupted in clapping, cheers, and whistles. “The three kids from Swede Hollow and the two from up on the street were a winning combination,” Mike recalled with pride years later. No one knew whether the decision to play “Sunny Side Up” had anything to do with the victory. But one thing was for sure: it didn’t hurt.

For many Minnesotans who grew up during the Great Depression, the ability to dwell on the sunny side of life, even when conditions seemed darkest, was a talent that made daily existence just a little more bearable. But optimism did not come easily. The depression grabbed hold of Minnesota—and the rest of the country—during the earliest months of the 1930s, and did not let go for nearly a decade. The children who lived through those years never forgot what it was like. The experience shaped their lives. The joys and heartaches of growing up in Minnesota during the 1930s stayed with them over the decades as they fought a world war, built their careers, and raised families of their own.

Bowling for Equality

Bill Rhodman, Maurice Kilgore, C.W. Williams, Lafayette Allen, Len Griffin, and George Williams of the Allen Supermarket Team at the American Bowling Congress tournament in St. Paul, 1951

St. Paul played a role in desegregating organized bowling in the mid-twentieth century.

About 20 years ago, while researching Hubert Humphrey’s early civil rights advocacy, I came across a reference to his participation in a group called the National Committee for Fair Play in Bowling (NCFPB). Intrigued, I set off on a long journey of historical investigation that eventually uncovered this photograph, taken in St. Paul in 1951. It shows the members of the Allen Supermarket Team, six bowlers from Detroit who were just as much trailblazers in their sport as Jackie Robinson was in baseball.

In 1947, just four years before this photo was taken, the American Bowling Congress—the organization that controlled nearly every aspect of the country’s most popular amateur sport—voted to uphold its long-held “Caucasians-only” membership policy. Amid the stirrings of an incipient civil rights movement, the ABC was determined to show that African Americans and all other people of color were still, and always would be, unwelcome in the nation’s bowling alleys.

Opposition to the ABC and its racist policies burned brightest in Detroit, which was home to an impressive array of black-owned and black-welcoming bowling alleys unaffiliated with the ABC. With support from the NCFPB—an initiative of the United Auto Workers—African American leaders from Detroit and elsewhere began pressuring the ABC to change its ways. The organization finally capitulated and removed the racial ban from its constitution in 1950. Its first integrated tournament took place in St. Paul.

The Allen team finished in the middle of the pack in the St. Paul tournament, but their scores were almost beside the point. They had become, in the words of the team’s sponsor and captain, Lafayette Allen, “the first to be admitted as competitive individuals in the sport we love, for participation, regardless of the color of our skin.”

Minnesota’s role in desegregating bowling was mostly incidental, but Minnesotans can still take some pride in knowing that a significant moment in civil rights and sports history happened here. “In my book, the Twin Cities are the ‘Queen Cities’ of America,” Lafayette Allen wrote of his team’s experience in St. Paul. “Congeniality and fair play are the passwords in those wonderful cities.”

The Roots of Vaccine Skepticism

Smallpox vaccinations in Minneapolis, 1904. Via the Minnesota Historical Society

Some reluctance is historically understandable, even if it’s currently misguided.

One day in early 1901, a Minneapolis health inspector named Luxton visited the city workhouse to administer smallpox vaccine to all inmates who had not previously been vaccinated. Deadly smallpox was spreading rapidly across the nation, and had recently taken hold in Minnesota. The race was on to prevent the epidemic from getting out of hand. Public health officials in the state had no legal authority to force vaccinations on those who objected, but that didn’t stop them from occasionally wielding what were supposed to be prohibited powers against the powerless. The inmates at the workhouse fell into the powerless category.

While most of the men Luxton encountered that day submitted, one did not. “A muscular fellow with a determined look in his eye” declared that he was not about to let the vaccinator touch his arm. Luxton offered to administer the vaccine to the man’s leg instead. The man agreed. But it turned out the man was a trickster. He had a wooden leg. He insisted that Luxton do as promised and perform the vaccination on the artificial limb. Luxton was not amused. He called in a couple of jailkeepers and had them strip the man down and subdue him while the vaccine was administered against his will. The story, which ran in the Minneapolis Tribune, served the health department’s purpose. It reassured readers that the city would do whatever it took to protect good people from the less good—those on the margins—who, by implication, could not be trusted to do the right thing.

It’s one of many such stories from our past that help explain why some people might be reluctant to get vaccinated against COVID-19.

A recent study conducted by the Pew Research Center shows that only 42 percent of African American adults plan to get the coronavirus vaccine. That compares with 60 percent of Americans overall who are inclined to do so. Black people have plenty of reasons to be skeptical about the government’s public health intentions. Many are well aware of the Tuskegee syphilis study and other racial outrages perpetrated over the years in the name of medical advancement. And they are not alone. Other Americans marginalized due to race, ethnicity, and economic circumstances have reason to be wary as well. The forced vaccination of the man with the wooden leg is just one example from Minnesota’s early 1900s smallpox epidemic of how social inequities can taint efforts to protect public health.

The homeless and transient—those described by one St. Paul health official as belonging to the “lower world”—were among the first to be singled out for coerced vaccination. Raids on Twin Cities lodging houses, dingy buildings packed with low-wage working men, became increasingly common. When a smallpox case was discovered at a lodging house on South Washington Avenue in Minneapolis, a phalanx of city health workers moved in and “thoroughly fumigated and vaccinated” all lodgers, while police guarded the doors “to prevent exit or egress.” In this and other similar cases, health authorities got around restrictions on forced vaccination by threatening forced quarantines.

Recent immigrants were also subjected to early and uncompromising vaccination. In early 1901, the neighborhood known as Little Italy on St. Paul’s upper levee found itself under a “state of siege” when a child there was diagnosed with the most virulent strain of smallpox. In justifying the need to strictly enforce vaccination in the community, the St. Paul Globe explained that the residents of Little Italy were “clannish” and “little given to the most ordinary rules of cleanliness.”

Communities of color were likewise targeted with heavy-handed treatment during the epidemic. In northern Minnesota, federal authorities threatened to withhold treaty payments to members of the White Earth Nation of Ojibwe if they did not get vaccinated. In Minneapolis, health officials raided an African American social club on Hennepin Avenue and prevented all members from leaving until they submitted to vaccination. These were not isolated incidents. Although authorities in Minneapolis acknowledged that smallpox made no distinction between toney Lowry Hill and downtrodden Bohemian Flats, they insisted it made sense to focus their attention on “poorer quarters” where the disease was most likely to spread. Such explanations provided little comfort to those who felt besieged by both disease and unequal treatment by the government.

This is not to say the inequities that surfaced during Minnesota’s mostly forgotten turn-of-the-century smallpox epidemic can explain away the wariness that some people now have about the COVID-19 vaccine. Too many things have changed since the early 1900s to draw clear parallels. For one thing, there was no formal process to ensure the safety and efficacy of vaccines back then. Now there is. Beyond that, we have little reason to suspect that our current local, state, or federal authorities will stoop to forcing or coercing anyone to take the coronavirus vaccine against their will. But public health missteps, no matter how infrequent or dated, add up in the collective memory. We should keep that in mind. It’s no surprise that some people think twice before saying yes to the latest vaccine.

On This Day in 1974: Stud or Dud?

Advertisement Featuring Secretariat’s First-born Foal, First Secretary.
Photo via Equiery, December 2003

Secretariat didn’t have much luck in the paternity business until a Minnesotan took a chance on him.

1973 Triple Crown champion Secretariat was one of the greatest racehorses of all time, but his performance in the breeding shed didn’t live up to his reputation on the track—at least not at first. His initial failure as a stud inspired all sorts of jokes (President Gerald Ford even got in on the act, cracking that his critics, like Secretariat, were “fast on their feet, but not producing much”) and threatened to reduce his value. But Secretariat’s reproductive struggles weren’t enough to stop Jack Nankivil, a breeder from Winona, Minnesota, from taking a chance on him. Secretariat had recently been bred to a Kentucky Appaloosa named Leola, and Nankivil purchased the pregnant mare in hopes that she would successfully carry her foal to term. He brought her to Winona, and on the night of November 15, 1974, with television news crews from NBC and CBS waiting in a nearby tack room (Nankivil had banned ABC in reaction to sportscaster Howard Cosell’s disparaging remarks about the Minnesota Vikings), Leola gave birth to a chestnut colt with a white blanket and three white stockings. Nankivil named Secretariat’s first foal—surprise—First Secretary.

The DFL’s Shot at Something Really Big

DFL leaders including Lt. Gov.-elect Rudy Perpich and Gov.-elect Wendell Anderson (seated, facing camera) celebrate victory in 1970—the last time a Minnesota election produced a government capable of redistricting.

Forget Trump and Biden for a moment, and consider what the election means for Minnesota’s congressional map.

I realize that most Minnesotans, when they ponder the upcoming election—if they ponder it at all—think in terms of the White House and Congress, but there’s something else at stake on November third that has been mostly overlooked. For the first time in a long time, the DFL has a realistic chance of gaining full control of state government going into a redistricting year. To do so, the Democrats need only flip a single legislative chamber—the Senate—since they already control the governorship and are likely to hold onto the House. If they succeed, they will be able to draw new congressional and state legislative boundaries however they choose and, if they’re so inclined, do it in a way that locks in their electoral advantage for the rest of the decade. Such a scenario would also ensure—well, maybe—that the courts won’t have to step in and do the map drawing.

The last time Minnesota’s legislature and governor agreed on a redistricting plan was 1971. They managed to do so despite the fact that Republicans (then known as Conservatives) controlled both legislative chambers and a DFLer, Wendell Anderson, was governor. The two parties, for some reason, saw fit to draw up maps that were acceptable to both sides.

Since then, divided government has consistently led to redistricting gridlock.

In 1980, Democrats retained control of the state House and Senate in the November election, but still had to work with the incumbent Republican governor, Al Quie. With government still divided and the Democrats unable to agree among themselves on a new plan, the job of redistricting eventually fell to the courts. The new, court-ordered maps took effect in March 1982.

In 1990, the Democrats controlled state government across the board and hoped to retain control going into redistricting, but incumbent Governor Rudy Perpich lost to Republican Arne Carlson in the November election. After a year of failed redistricting efforts, the House and Senate finally passed a plan in January 1992, but Carlson promptly vetoed it. State and federal courts (including, eventually, the U.S. Supreme Court) stepped in. The maps weren’t finalized until February 1993, a few months after the 1992 elections, during which a federal court’s temporary plan was in place.

In 2000, Minnesota government was a partisan muddle, and it stayed that way after the election. Republicans controlled the House. Democrats controlled the Senate. And an independent iconoclast, Jesse Ventura, was governor. Not surprisingly, redistricting did not go well. The House and Senate never even produced a plan for Ventura to sign. A five-judge panel issued a new map in March 2002.

In 2010, state government did a complete flip after the election. The House and Senate went from Democratic to Republican. The governorship went from Republican to Democratic, with Mark Dayton defeating Tom Emmer. The legislature passed a redistricting plan during the 2011 session, but Dayton vetoed it. Once again, the courts stepped in. A special redistricting panel adopted a new plan in February 2012.

The lesson here: If you’re a Minnesotan and you want to avoid another redistricting debacle, find out who’s running for state senate in your district, and vote for the Democrat. You might actually get your wish.

A Pretty Nice Rich Guy

Bill Gates Sr. has died. Seventeen years ago, I profiled him for Equal Justice Magazine. He obviously had his head on straight.

It’s not always easy heading up the largest philanthropic organization the world has ever seen. People tend to think of you first when they need money, and these days, that covers just about anyone—especially anyone who works for a cause. They know you have billions of dollars at your disposal and figure: The worst you can do is say no, right?

William H. Gates Sr. has learned to say no. He has had to. As co-chair of the $24 billion Bill & Melinda Gates Foundation and father of the world’s richest man, Bill Gates Sr. fields a never-ending stream of funding requests from friends, acquaintances, and strangers. Most of the requests address important societal needs, yet not even the Gates Foundation can cure all that ails the world. Priorities, therefore, must be set.

In general, Gates Foundation grants are earmarked to address the foundation’s carefully conceived priorities (global health, education, the digital divide, and projects based in the Pacific Northwest). As a consequence, Gates finds himself saying “no” more often than he’d like. It’s a sobering task, he says, especially when the request involves an issue that’s close to his heart, like the law.

Bill Gates Sr. is an attorney and has been for more than half a century. He graduated from the University of Washington School of Law in 1950 and worked his way up through two law firms before heading out on his own in 1964 to become a founding partner of Seattle’s Preston Gates & Ellis. Over the years, Gates has demonstrated an uncommon devotion to the institutions that govern his profession, building a resume that’s a testament to the life’s work of a self-described “typical bar association do-gooder.” Gates has served as President of the Washington State Bar Association, President of the Seattle/King County Bar Association, Chairman of the Commission on Washington State Courts, a member of the American Bar Association’s Standing Committee on the Federal Judiciary, and a trustee at the National Center for State Courts.

While Gates’ devotion to the law spans decades, it is no longer his primary allegiance. Despite spending a long career as a champion of justice, he has been charged with embracing a different agenda as head of his son and daughter-in-law’s foundation. His new priorities are no less important, particularly in the area of global health; the Foundation’s No. 1 goal is curbing the transmission of HIV, which infects 14,000 new people across the globe every day. It also supports the development of vaccines for the 11 million children around the world who die every year of preventable disease, as well as efforts to address disparities in reproductive health (an African mother faces 200 times the risk of death during pregnancy that her U.S. counterpart does). The Foundation also directs significant resources to education initiatives—reducing financial barriers to higher education, creating high schools with smaller classes, and bridging the digital divide by putting computers in libraries.

Do these priorities mean that organizations interested in the administration of justice need to accept the fact that they’ll never crack the coffers of the world’s largest foundation? Gates Sr. admits that he struggles over the right thing to do when legal organizations come to him with well-conceived proposals for help that do not fall within the Gates Foundation’s funding priorities. With a heavy heart this May, for example, Gates dictated a letter rejecting a funding proposal from a national legal organization that he had at one time personally supported. “I’m still feeling a little bad about that one,” he says.

Yet Gates has also shown his ability to say “yes” to worthy legal causes, like civil justice for all, proving you can’t take the lifetime commitment to the law out of the lawyer. In October 2000, the Foundation awarded a $4 million “special projects” grant to Friends of Legal Services Corporation to purchase a new headquarters building for LSC.

“After more than a quarter-century of leasing expensive Washington, D.C., office space, we’re thrilled that we now have a permanent home to match America’s permanent commitment to equal justice,” LSC President John Erlenborn says. “This would not have been possible without the support of the Gates Foundation.”

In Gates Foundation parlance, “special projects” tend to be those that someone in the foundation leadership wants to support, even though the cause doesn’t fall neatly within its funding priorities. In this case, LSC’s champion on the inside was the elder Gates himself.

Gates acknowledges that the LSC grant was a rare exception to the foundation’s strict funding rules. “One of the nice things about having a foundation with a sole trustee is that there’s a degree of flexibility that’s possible to bring to bear on a situation,” he says. “This was an instance of that.” Gates is also quick to point out that the grant would never have been made if he and his son were not strong believers in LSC’s primary mission—to ensure equal access to the nation’s civil justice system. “Obviously [the building grant] arises, in part, from my fairly long interest in this subject,” he says. “And while those interests have not been articulated in the Foundation, when this opportunity came along, it seemed like something that the Foundation should do. It was kind of a special opportunity.”

In retrospect, it’s not surprising that Bill Gates Sr. decided to support a project that didn’t perfectly match the Foundation’s stated priorities. It turns out he has a long-running interest in promoting equal justice. In 1990, during his tenure as chairman of the Washington Bar’s Long Range Planning Committee, Gates led a move to declare “access to justice” the most pressing issue facing the state bar. In its report, the committee warned that the state’s justice system would eventually break down if something wasn’t done to make sure that the legal needs of all Washington residents were met. “I had an opportunity to put it at the top of the pile in terms of what the bar’s objectives ought to be, and I seized it,” remembers Gates.

In the wake of the Gates Committee report, the bar created a special task force to develop solutions to the problem of unequal justice. In 1994, the state supreme court—acting on the task force’s recommendations—established the Access to Justice (ATJ) Board to facilitate the delivery of high-quality legal services to low- and moderate-income people. The ATJ Board now serves as a model for similar state programs nationwide.

Gates’ strong feelings about equal justice also show up in his approach to pro bono work. “It’s a simple question of equity,” he says of attorneys’ responsibility to serve the public. “To whom much is given, much is expected. And much is given to us who have had the opportunity to get a good education, get a law-school education, work in a law firm, make an adequate or better living, and enjoy the fruits of the great education …. You absolutely owe something back to the society that made it possible.” Gates helped instill what he calls an “ethos” of public service at his old firm that is still evident today. Preston Gates & Ellis attorneys handle more than 200 pro bono cases a year. “They’re one of the better pro bono firms in Seattle,” says Patrick McIntyre, executive director of the LSC-funded Northwest Justice Project.

Equal access to justice is a concept that dovetails nicely with Gates’ larger worldview—a philosophy that seeks to bridge the gaps between the poor and the privileged. At the root of Gates’ philosophy is a faith in the power of the public sector and his belief that many Americans have developed an unhealthy disdain for their government.

Even with $24 billion in the till, the Bill & Melinda Gates Foundation can focus its attention on only a handful of the world’s most pressing problems, Gates says. In a Washington Post piece, Gates once called the U.S. government “the greatest venture capitalist in the history of the world,” hammering home the point with an example that—as father of the founder of Microsoft—he knows quite well. “There’d be no Internet today but for the federal government,” he claims. “Zero. The software industry [was] to a large extent dependent on things that happened on college campuses, where research by smart people was being supported by the federal government.”

As Gates sees it, many vital missions, including the pursuit of equal justice, must be left for someone else to address—and that someone is often the government. “I think there are among our people some who think of philanthropy as doing so much good stuff that it really is the answer to mankind’s travails,” he says. “That’s a totally unrealistic measurement of the power of philanthropy. The budget of anybody’s city or county or state—and certainly of the United States—when stood alongside all the philanthropy in the world makes philanthropy look like a peanut. And that’s exactly the relative power of philanthropy.”

Ignoring a Big Birthday

Fort Snelling, about 1855, artist unknown. Via the Minnesota Historical Society

Shhh. Fort Snelling turns 200 years old this week!

This week marks the 200th anniversary of the laying of the first cornerstone at Minnesota’s oldest and most famous military installation, Fort Snelling, at the confluence of the Minnesota and Mississippi rivers. The fact that barely anyone today is commemorating the bicentennial proves that celebratory history often loses its appeal as our understanding of the past—and all its complexities and difficulties—evolves over time.

A couple of weeks ago, the Minneapolis Star Tribune published an essay by Stephen E. Osman, Fort Snelling’s preeminent historian. In it, Osman bemoaned the bicentennial’s relative anonymity. He recalled skipping class in 1970 to help celebrate the fort’s 150th anniversary, made note of an earlier centennial celebration in 1920, and called out the politicians, historians, and others who willfully chose to “silently ignore this anniversary.” He recounted the fort’s crucial role in opening the land that became Minnesota to “a flood of immigrants.” And he wished that a more widespread recognition of “yesterday’s hard-won achievements” at Fort Snelling would “inspire us today to draw together and to dream of the next 200 years in this shared garden called Minnesota.”

I have drawn on Osman’s scholarship in my own work. He is among those rare and essential historians who devote much of their lives to pursuing their singular passions, and sharing what they learn with the rest of us. No one is worthier of advocating for Fort Snelling’s historical significance. But in his essay, Osman only alludes to the uncomfortable truths behind the ignoring of the fort’s bicentennial. He mentions the fort’s role in enforcing treaties between indigenous people and the U.S. government, but does not acknowledge the injustices subsequently perpetrated on the Dakota and Ojibwe through government coercion, lying, neglect, and violence. He does not refer to the hundreds of Dakota refugees who were forced to live and die in a concentration camp below the fort during the winter after the U.S.-Dakota War of 1862. Fifty years ago, when Osman participated in Fort Snelling’s sesquicentennial celebration, those troubling facts were rarely acknowledged. Now they are. And that’s why we’re not celebrating this time around: It’s hard to throw a birthday party for a place that we have come to know as a site of a painful—not just a proud—past.

Osman knows all this. But I’m sure it must hurt to see people ignore a major anniversary of the place he’s spent so much time researching and writing about. The fact is, we seem to have moved past celebrating many of the people, places, and moments that we once were eager to remember. Twelve years ago, I figured I would be busy working on all sorts of projects to commemorate Minnesota’s 150th year of statehood. Instead, the anniversary passed with hardly a mention. There were many reasons for the lack of commemoration, including state budget considerations, but our evolving understanding of history’s complexities figured prominently. Minnesota, like Fort Snelling, isn’t as easy to celebrate as it used to be.

The War Over Pandemic School Closings in Minneapolis (1918 Version)

Minneapolis Tribune, October 21, 1918

At least no one’s threatening to throw school board members in jail—yet.

As we in Minnesota get ready to send our kids back to school—whether in person, virtually, or through some combination of the two—we should try not to let our hopes get the best of us. If history is any guide, we will eventually be forced to acknowledge what we already know: that pathogens do not respect school calendars. The experience of the Minneapolis public school system during the height of the 1918 “Spanish flu” provides a sobering reminder of how reality tends to stomp on our best intentions.

The influenza pandemic that arrived in the United States in the spring of 1918 did not hit Minneapolis until the fall of that year. But when it did arrive, it spread quickly, and it convinced public health officials to act. On October 11, 1918, the city health department ordered all places of public gathering—churches, theaters, dance halls, pool halls, and schools—closed indefinitely, until the pandemic was under control. The shut-down put Minneapolis’s top health official, Dr. H.M. Guilford, in direct conflict with the city’s school board, whose members felt they knew what was best for children, teachers, and staff. And what was best, they insisted, was that kids keep going to school, the flu be damned.

Many of the school board’s arguments for keeping students in school—some of which later proved dubious—sound familiar today: the disease did not seem to be nearly as dangerous to children as it was to adults; kids were actually safer in school than they were playing on their own in “streets and alleys;” schools should not be forced to close while some businesses, including bars, remained open. About a week into the shut-down, the school board voted to defy the health department and reopen the schools, but Guilford refused to budge. The board’s members backed down two days later when faced with possible arrest.

But as the schools remained closed, public sentiment against the shut-down grew. Guilford finally relented. Minneapolis’s public schools reopened on November 18, five and a half weeks after they were ordered to close. While celebrating the reopening, school superintendent B.B. Jackson sought to preemptively shift the blame for any influenza outbreak that might occur in the coming weeks. If the flu took hold in the schools, he said, it would be the fault of the “questionable activities” of the children themselves. Jackson also acknowledged that Guilford had warned the board that reopening “would not be wise.”

And, as it turned out, Guilford was right.

Two weeks later, schools throughout the district were reporting an alarming number of influenza absences. In one school alone, Fulton Elementary in North Minneapolis, 126 students were out with the disease. The latest spike seemed to confirm that the “Spanish flu” was not, as many people believed, an illness that skipped over children. Guilford responded by re-closing the schools until the end of the year. This time the school board raised hardly any objections. When the schools reopened at the start of the new year, the 1918 wave had effectively dissipated. Students resumed their education with few disruptions.

Comparisons between the 1918 influenza and the COVID-19 pandemic are admittedly imperfect. The “Spanish flu” was deadlier than the disease caused by the new coronavirus. Minneapolis’s schools had to shut down suddenly, while classes were underway, and did not have a chance to institute the kind of safety measures that our schools will use this fall. But the events of the fall of 1918 still carry an important lesson for those of us hoping that this coming school year will move forward without major disruptions. Unfortunately, it probably won’t.

Posing with Stooges

Minneapolis, 1949: (back) Larry Fine, Moe Howard, and Shemp Howard; (front) Bob Crosby and Penny Edwards (MINNEAPOLIS STAR/MNHS COLLECTIONS)

Every weird photo has a story behind it.

Originally published in the Summer 2015 edition of Minnesota History

I was never much of a fan of the Three Stooges and their eye-gouging brand of slapstick comedy. Growing up in the Twin Cities during the 1960s and 1970s, I much preferred the more controlled humor of the old Laurel and Hardy features I watched every Sunday morning on WCCO television. The Stooges were too unpredictable, too dangerous. I never understood them.

And now, as I consider this photograph, I realize I still don’t understand them.

The Three Stooges were past their prime by the time this publicity still was taken in late August 1949. The most popular Stooge, Curly Howard (the bald one), had suffered a debilitating stroke three years before and had been replaced by his brother, Shemp. Moe Howard, Shemp Howard, and Larry Fine were still making enjoyable two-reel shorts during the late 1940s, but they were no longer headliners. They had come to Minneapolis as part of an eclectic lineup of second-tier live performers playing the palatial Radio City Theatre.

Top billing on this weekend went to the seemingly oblivious gentleman in the photo’s foreground, the bandleader and radio personality Bob Crosby. Crosby, the youngest sibling of crooner Bing Crosby, served as the master of ceremonies of what the Radio City’s proprietors touted as a “Giant State Fair Week Stage-Screen Show!” During the stage portion of the two-hour extravaganza, Crosby and fellow bandleader Ted Weems handled the music. Penny Edwards, the young starlet shown here nuzzling up to Crosby, danced. The Stooges provided comic relief. The screen offering—almost an afterthought—was Africa Screams, Abbott and Costello’s latest romp.

I wish I could tell you what kind of comic message Larry, Moe, and Shemp are trying to convey here. The scene probably came to be only after the photographer, sent to the Radio City by his bosses at the Minneapolis Star, asked the performers to strike a memorable pose. I like to think the Stooges are feigning homicidal intent because they’ve finally grown weary of playing second fiddle to “Der Bingle’s” little brother.

Stories of Resilience from Disasters Past

Minneapolis’s I-35W bridge collapsed into the Mississippi River on August 1, 2007 (photo via MPR News)

Thirteen years ago, I took my daughter to see the collapsed I-35W bridge.

My eleven-year-old daughter, Grace, wanted to see the fallen bridge. When I asked her why, she told me that the calamity just “didn’t seem real.” The pictures on television and in the newspaper were unfathomable, insufficient. I understood. We drove down to Minneapolis together. I knew exactly where I wanted to take her.

We parked on South Second Street and walked toward the river, skirting the Guthrie, and onto the timber footpath of West River Parkway. A half block down, we veered onto a scrubby patch of grass and came to a stop. In front of us the Stone Arch Bridge swept across the Mississippi. To our right, in the distance, the crumpled, greenish remains of the 35W bridge slouched into the river’s north bank. It was a soggy afternoon. We stood wordless, under umbrellas, looking. After a minute or so, I broke the silence. “Come over here,” I said. “I want to show you something.”

We walked across the parkway to the arched rear entrance of the Mill City Museum. I pointed to an old marble slab, embedded in the stone wall above our heads. It was engraved with words, some nearly obliterated by time and the elements. It told the story of another disaster—the explosion and fire that destroyed the great flourmill known as Washburn “A” on May 2, 1878. “Not one stone was left upon another,” it explained, “and every person engaged in the mill instantly lost his life.” It ended with a list of the fourteen “A” Mill employees who died that day:

            E.W. Burbank.

            Cyrus W. Ewing.

            E.H. Grundman.

            Henry Hicks.

            Charles Henning.

            Patrick Judd.

            Charles Kimball.

            William Leslie.

            Fred A. Merrill.

            Edward E. Merrill.

            Walter Savage.

            Ole Shie.

            August Smith.

            Clark Wilber.

This was why, among all the places I could have brought my daughter, I brought her to this place. I knew that the fading marble memorial was here, and I found it comforting. Eighteen people, including the fourteen “A” Mill employees, had died in what was then the worst disaster in Minneapolis’s short history. Several other buildings had been destroyed. The city had lost about a third of its milling capacity. Today, nearly 130 years later, most of the victims’ names can still be remembered and honored within sight of the collapsed 35W bridge. The names are hard to read, but they manage to stave off the troubling tendency to forget.

They also serve as a reminder of the city’s resilience. Within hours of the “A” Mill’s destruction, its owner, Cadwallader C. Washburn, declared that he would rebuild “without delay” and that his new mill would be bigger and better than the old one. Two months later, all the rubble was removed and reconstruction was set to begin. The new “A” Mill opened the following year. “The great explosion, although a terrible catastrophe, in no way disheartened or discouraged the capitalists and business men of the city,” the Minneapolis Tribune reported, “and they have shown an abiding faith in the future greatness of the metropolis and manufacturing center of the Northwest.”

I’m not sure I properly conveyed to Grace the connections I saw between the calamities of 1878 and 2007. She was most taken with the rough coincidence of the numbers: fourteen people killed in the “A” Mill; thirteen killed in the bridge collapse. But at least now she has a better understanding of how the people of Minneapolis—like most people in most places—can rebound from disaster. Perhaps years from now she’ll be able to take a child or grandchild down to this same place by the river and read the names of the people who died when the bridge fell down—along with the fourteen names on that old slab of marble.

The Lesson of St. Paul’s “N-Word” Lake

Loeb Lake, Marydale Park, St. Paul

Of course we have to rename things sometimes. We always have.

At the southeast corner of Dale Street and Maryland Avenue in St. Paul, a lovely little body of water called Loeb Lake is the centerpiece of a lovely little public space known as Marydale Park. You can take a leisurely stroll around the lake. You can fish in it. It and its surrounding park constitute a true urban oasis. But Loeb Lake hasn’t always been so lovely.

And it used to have a far uglier name.

During the late 1800s, the lake at Dale and Maryland was bigger than it is now. It extended as far as Farrington Street on the east and across Maryland to the north. It was one of three lakes—the others being McCarron and Sandy—that fed Trout Brook, which joined with Phalen Creek on its way to the Mississippi River. Young people gathered there for skating parties during the winter. The Northwest Ice Company and, later, People’s Coal & Ice Company ran big ice harvesting and storage operations there. But over time, the lake succumbed to misuse and neglect. Industrial development siphoned off much of its water and polluted what was left. By mid-century it was known mainly as a runoff pond. It has since been restored.

I’ve pinpointed the lake on many old maps of St. Paul, but I have yet to find it identified by name. When it appears—which is not always—it’s just a small, blue blob. But the little lake did have a name, even if cartographers didn’t like to make note of it. And although I know what it was, I can’t use it here. It was called “N-Word” Lake.

I haven’t been able to find any explanation for why the body of water at Dale and Maryland was known by that name. All I know is that St. Paulites continued to refer to it that way until at least the second decade of the 20th century—and probably much later. The fact that a geographic feature of Minnesota’s capital city was known for so long by such a repellent name might rate as a mere curiosity if we were all willing to acknowledge today that names periodically need to change. But as we reckon anew with other names and monuments that offend evolving sensibilities, it’s clear that some people need reminding. We rename things all the time for all sorts of reasons. Sometimes we just get tired of the old monikers. (Think of all the Mud Lakes that have been renamed over the years.) But sometimes we replace them because they’re ugly, embarrassing, and hurtful.

The lake in Marydale Park is not the only Minnesota body of water that, at one point in its history, was known by the “N-Word.” Burns Lake in Anoka County, for example, didn’t shed its old, offensive name until 1977. And other odious place names—especially those based on derogatory terms for indigenous people—have taken even longer to disappear. Earlier generations of white Minnesotans saw nothing wrong with using such names. We know better. And people who, today, demand that we leave well enough alone are simply ignorant of the history they claim to be protecting.

While We’re On the Subject of Racist Images…

Painting by Edward Brewer, 1921

It’s probably time to consider the mysterious longevity of this Minnesota-born corporate symbol.

Quaker Oats’ recent decision to abandon its Aunt Jemima brand and acknowledge its origin as a racial stereotype got me thinking about another well-known corporate mascot—and the concept of complicity.

For more than a century, the breakfast cereal Cream of Wheat (manufactured for many years in Minneapolis) has featured a fictional African American chef named Rastus on its packaging and in its advertisements. Like Aunt Jemima and Uncle Ben, Rastus has, since his birth in Minnesota, reflected and reinforced white stereotypes, rooted in slavery, of happy and docile black servitude. Confronted periodically by critics, Cream of Wheat’s corporate overseers have defended Rastus’s marketing endurance by touting his high recognition among generations of consumers. The product’s current owner, B&G Foods, has yet to encounter any intense pressure to retire Rastus, but that could easily change now that Aunt Jemima is leaving. But even if B&G ultimately accepts responsibility for perpetuating such a long-running racial stereotype, we shouldn’t forget that Cream of Wheat’s various owners are not the only ones who have made it possible for such a problematic brand to persist.

What about the ad agencies that, over the years, accepted and developed Cream of Wheat founder Emery Mapes’s vision of an African American chef mascot? Two firms—the J. Walter Thompson Advertising Company of New York and the Mac Martin Advertising Agency of Minneapolis—were most responsible for turning Rastus into a marketing phenomenon during the early 1900s. They eschewed what were called “reason-why” campaigns (those touting health benefits, for example) in favor of the kind of pure emotion that they felt Rastus conveyed. And their unconventional marketing approach was widely admired within the advertising industry. A 1917 article in the trade publication Printers’ Ink—written in jarringly racist language—described the Rastus ads as “delightfully tender and human reflections of real life.” Today we recognize them instead as gauzy representations of a servile black man doing the bidding of white children and adults.

What about the artists who used a photograph of a real person—an African American lunch counter attendant from Chicago—as the inspiration for dozens of Cream of Wheat paintings? Several famous American illustrators, including James Montgomery Flagg, created versions of Rastus, but none was as prolific as Minneapolis artist Edward Brewer, who painted his first one in 1911. Whatever misgivings Brewer and his fellow illustrators had about using a real-life black subject to sell breakfast cereal were apparently easily outweighed by the $500 commission each painting generated. The character they created still smiles out from boxes of Cream of Wheat today.

What about the journalists and historians who, in the past, wrote glowingly about the endurance of the Cream of Wheat brand without acknowledging the damaging stereotype it propagated? An admiring 1980 profile of Edward Brewer in the Minnesota Historical Society’s journal, Minnesota History, featured multiple reproductions of Brewer’s Rastus paintings without once mentioning their racial component. Such an omission would probably never make it past an editor today, but anyone who writes about history—myself included—knows how easily unexamined prejudices can make their way into copy. We need to acknowledge that our choices as writers and historians sometimes add to the problem.

And finally, what about those of us who actually like Cream of Wheat, who have happily purchased and eaten it without ever really thinking about the face on the box and what it represents? Perhaps we can forgive ourselves for not knowing the truth about Rastus’s origins, for not knowing that his creator, Emery Mapes, joked he could sell sawdust if his “n____s”—meaning dark-skinned mascots like Rastus—led the marketing campaign. But we could easily have figured it out if we had chosen to. The question is: If we had known, would it have mattered? Would it have convinced us to stop buying Cream of Wheat or to demand Rastus’s removal? Consider me skeptical.

It’s easy to dismiss such questions as distractions. (Does anybody even eat Cream of Wheat anymore?) But Rastus and Aunt Jemima and Uncle Ben are, for all their marketing stamina, cultural relics. Whatever fondness we might have for them, whatever value they might have as brands, they’re not worth the trouble anymore. It’s time for Minnesota-born Rastus to take off his chef’s hat and retire.

Police and the Weapons of War: A Love Story

The original AR-15

Police militarization goes back at least to the 1960s.

The militarization of police departments may have accelerated over the past two decades thanks to the federal government’s “1033 Program”—through which law enforcement agencies can obtain “demilitarized” fighting vehicles, aircraft, and other accoutrements of war from the U.S. Defense Department—but the process has been going on much longer than that. In early 1968, the police departments in Minneapolis and St. Paul both decided to arm themselves with AR-15 semi-automatic rifles. The AR-15 was, at the time, considered the twin of the M16, a military weapon used heavily by U.S. troops in Vietnam. The police said they needed such weapons to maintain law and order under certain circumstances, including “civil disturbances.” These were times of unprecedented civil unrest in America (the Martin Luther King, Jr. assassination and its immediate aftermath occurred as this issue was coming to the fore), and for many African Americans in the Twin Cities, the decision to arm police with weapons typically used during war smacked of deliberate intimidation. Minneapolis NAACP President Sam Richardson said the acquisition of AR-15s showed that the city was interested only in putting down unrest—not in eliminating its causes.

Maybe the most interesting thing about all this is how the two cities resolved the controversy.

In Minneapolis, Mayor Art Naftalin, a liberal DFLer, quickly sided with the Black community and ordered the police department to return the rifles. The head of the police union, future mayor Charles Stenvig, huffed that officers might have to buy their own.

In St. Paul, Mayor Thomas Byrne, a DFLer up for reelection, chose to back the police. Even a sit-in by protesters that shut down his office could not make him budge. Byrne pushed the controversy to the city’s civil rights commission, which held a few hearings before letting the issue die.

The controversy over the AR-15 purchases seems almost quaint today, given that AR-15-style rifles have since become nearly ubiquitous among gun-loving civilians. But it’s a reminder that our police have a long history of trying to arm themselves with the most powerful weapons they can find.

50 Years Ago: David Crosby at the Minneapolis Radisson

David Crosby, Minneapolis, 1970 (by Henry Diltz)

From the Minnesota trivia files

Crosby, Stills, Nash & Young supposedly split up for good after playing a final concert at the Met Sports Center in Bloomington on July 9, 1970. (The divorce actually lasted only four years.) While the concert at the Met will always be remembered as CSN&Y’s temporary swan song, it was also a subject of controversy.  Some of the band’s Twin Cities fans were outraged at the concert’s $10 top ticket price (Egad!) and they expressed their displeasure by organizing a boycott. The promoters responded by dropping the price for the best seats to $7.50. Photographer Henry Diltz snapped this classic portrait of David Crosby during after-show partying at the Minneapolis Radisson.

Photo via Morrison Hotel Gallery

The Power of Pressure Campaigns

Pillsbury thought these Funny Face flavors were a good idea—until activists spoke up.

As pro teams in DC and Cleveland consider dumping names offensive to indigenous people, here’s a reminder to keep up the pressure.

Originally published in the Minneapolis Star Tribune, during a previous, unsuccessful attempt to force the “Redskins” into retirement

Sometimes it seems the long struggle to force the NFL’s Washington Redskins franchise to change its name is destined to fail. Many Minnesotans remember the scene in 1992, when the American Indian Movement and its supporters organized symposiums and protests to coincide with the Washington team’s Super Bowl appearance at the Metrodome. Nothing much changed then. The Redskins remained the Redskins. Today, the team and its fans continue to embrace a name and logo that many others consider blatantly offensive. It would be easy to conclude that protest against corporate misappropriation of American Indian culture is futile.

But it’s not. It’s worked before. And a few of the most memorable success stories played out here in Minnesota.

In 1964, Minneapolis-based Pillsbury introduced a new line of powdered soft drink mixes to compete with Kool-Aid. Pillsbury called its new, sugar-free product Funny Face. (Its artificial sweetener, sodium cyclamate, was later linked to cancer and banned by the FDA, but that’s a different story.) Each Funny Face flavor was named for a silly character: Goofy Grape, Loud-Mouth Lime, Freckle-Face Strawberry, Rootin’-Tootin’ Raspberry, and two others that soon created major public relations headaches at Pillsbury—Chinese Cherry and Injun Orange.

The Chinese Cherry character was a slant-eyed red cherry with buckteeth and a pigtail. Injun Orange, with his crossed eyes, skewed war paint, and limp feathers, smiled insipidly in a nearly perfect distillation of negative Native American stereotypes. Both characters, along with their less offensive product line-mates, were big hits with kids and parents alike.

But this was the height of the modern Civil Rights era. Minority groups were finding their voice, and they were not inclined to let corporate America get away with insensitive and insulting marketing campaigns. In early 1966, the Association on American Indian Affairs (AAIA) called on Pillsbury to dump both Injun Orange and Chinese Cherry. The group claimed that the two characters’ “derogatory nature” was “highly objectionable.”

It was only after the AAIA went public with its anti-Funny Face campaign that Pillsbury announced it was already phasing out the two characters. It turned out that several other groups, including Chinese-American grocers in Sacramento, had filed similar complaints. “We admit guilt all over the lot,” a Pillsbury spokesman said. “It was in poor taste. We quickly saw our fault and as early as last July we decided to change the names.” Chinese Cherry became Choo Choo Cherry. Injun Orange turned into Jolly Olly Orange.

If the campaign to dispatch Injun Orange had constituted the only successful protest against corporate-perpetuated Native American stereotypes, then perhaps it would make sense to write off as unwinnable the current campaign against the Washington Redskins name. But Pillsbury was just one in a string of companies that were convinced to change their stereotyping ways in the 1960s and beyond. In 1965, Calvert Distillers, under pressure from the AAIA and other American Indian groups, dropped a “soft whisky” ad that read in part: “The Indians didn’t call whisky firewater for nothing. Why do you think they were yelping all the time?” A year later, the AAIA scored a similar victory against the Marx Toys Company, manufacturer of a vulgar children’s doll called “Nutty Mad Indian.” And in 1967, Roger Jourdain, chairman of Minnesota’s Red Lake Band of Chippewa, helped convince General Electric to pull an ad featuring rambunctious white youngsters dressed in Indian costume. The copy for the ad read, “When you decide to shoot wild Indians, you can’t afford to miss.”

Of course Pillsbury, Calvert, Marx, and General Electric were not football franchises. They did not have rabid fans, steeped in the “traditions” that a team name and logo can come to represent. But those companies—and many others over the years—have succumbed to pressure after realizing they failed to consider the damage their marketing might do. It’s not hard to imagine that the NFL team from Washington will eventually do the same.

Indigenous Rejection of Mount Rushmore Goes Way Back

AIM’s Dennis Banks at Mount Rushmore, 1970

Backlash against Trump’s Black Hills stunt brings back memories of an early American Indian Movement protest.

Excerpted from Minnesota in the ’70s, which I co-authored with Thomas Saylor

In the late summer of 1970, a contingent of AIM activists including Dennis Banks and a new member of the leadership group, Russell Means, headed to South Dakota to join a protest at Mount Rushmore. The protest, originally started by a determined group of Lakota women, had begun small, with a simple goal: to demand that the U.S. government honor its commitments under the 1868 Fort Laramie Treaty…and acknowledge that the Black Hills belonged to the Lakota people. But the Mount Rushmore demonstration began attracting sympathizers from far beyond South Dakota, including the young activists from AIM. At first, the newcomers were content to support the local protesters by holding signs and making speeches. But soon they began itching to do more.

On the evening of August 24, Russell Means and a few other protesters broke away from the main group and headed up the mountain, determined to make a statement that no one could ignore. Chased by forest rangers, they made it to the summit and set up camp behind Teddy Roosevelt’s head. They painted a homemade banner declaring “Sioux Indian Power” and unfurled it over George Washington’s face. Means, standing on a ledge high above the monument’s amphitheater, delivered voice-of-God lectures on U.S. government perfidy to confused tourists below.

The government…opted to wait out the militants rather than risk a violent confrontation. The protesters, for their part, were in no rush to leave. They remained on the monument for several months until cold weather forced them to abandon camp. The occupation of Mount Rushmore failed to gain any concessions from the government, but it did generate a good bit of national publicity. It also built a foundation of camaraderie that would nurture the movement in the months to come. “Some folks fell in love up there, and a few babies were made on that mountain,” Means later wrote. “Occupying Mount Rushmore was fun.”

Freedom’s NOT Just Another Word for “F#@% You”

Brainerd fluoride opponents, 1979. Photo via Minnesota Historical Society

Self-proclaimed liberty lovers have always had a problem with public health.

If you want to sample COVID-19-era grievances in Minnesota, just make your way to the Twitter accounts of some of the state’s most prolific social media “freedom fighters.” (I’m tempted to identify a few for your reading enjoyment, but I’m no masochist.) These days, Minnesota’s self-proclaimed liberty lovers seem most exercised about Gov. Tim Walz and his brandishing of dictatorial powers. A few samples:

If @GovTimWalz tries to enforce mandatory mask…there’s going to be push back…go ahead arrest me… fine me…put me in jail…but there will be a response for draconian rule and violating the Constitution.

The Sheepeople of America have dropped to there [sic] knees over the Corona Virus which has a death rate of the Flu…If you’re to [sic] stupid to believe this [is] not a plan for destroying American capitalism then you deserve to be exploited and abused in Socialism….I will never kneel.

Liberals get so pissy when people won’t buy their crap. It’s why they need the power of the government to make you buy it. They already run the schools, Hollywood, sports, the culture….it’s not enough until YOU comply. Well, fuck off, I’m not complying.

For these people, freedom, not public health, is the primary concern. They believe personal liberty should supersede all efforts to contain the coronavirus. And while they may feel oppressed by liberal sheepeople intent on muzzling them with masks, they can at least take solace in the knowledge that resistance to public health initiatives—even those judged now to be huge successes—has always existed in Minnesota.

But unfortunately for them, their side usually loses—at least in the long term.

For decades after the development of vaccines, a persistent minority of Minnesotans resisted any efforts by authorities to institute mass vaccinations. As the anti-vaccination St. Paul Globe saw it, too many liberty-loving Minnesotans were happy to “assert the right to jab a virulent virus into the person of his neighbor, willy nilly.” In 1903, anti-vaccination activists used the language of personal freedom to win the repeal of an existing state law requiring children be vaccinated against smallpox. But that early anti-vaccination victory was short-lived. Twenty years later, a severe smallpox outbreak hit the Twin Cities, and suddenly vaccines didn’t seem so bad.

Around that same time, opponents of another public health measure—the chlorination of Minneapolis’s water supply—were making similar arguments. Over the years, Minneapolis had experienced multiple typhoid epidemics caused by polluted Mississippi River water. But in 1910, the city started treating its drinking water with a disinfectant, hypochlorite of lime. Libertarians, more tolerant than others of fecal contamination, argued against the treatments, but the Minneapolis Tribune dismissed them as “well poisoners” and “child murderers.” It didn’t take long for even the most adamant chlorination skeptics to admit that non-lethal drinking water was probably in everyone’s best interest.

The pasteurization of milk was another public health priority that Minnesota’s leave-us-alone contingent attempted to undermine. When several Minnesota communities, including Minneapolis, instituted compulsory dairy pasteurization to control communicable diseases, opponents mobilized. One of their most common arguments was that housewives should be free to purchase unpasteurized dairy products. The inability to buy raw cream struck one Minneapolis Star reader as especially “undemocratic.” Although opponents succeeded in stalling the movement for a few decades, “universal pasteurization” became state law in 1949.

But libertarian opposition to public health efforts may have reached its apex in the early 1950s, when communities throughout Minnesota started adding cavity-fighting fluoride to their water supplies. Prominent conservative activist Donald F. Raihle spoke for many like-minded Minnesotans when he called community fluoridation a violation of “the fundamental principle that no person or agency shall have authority over the body of a human other than himself.” Raihle and his fellow fluoride fighters continued to agitate throughout the 1950s, but their efforts ultimately proved futile. A state law passed in 1967 mandated fluoridation in most Minnesota cities. Brainerd, the state’s last prominent fluoride holdout, started treating its water in 1980 under court order.

Of course, libertarian opposition to public health measures never completely vanished. Anti-vaxxers, raw milk-drinking “food freedom” fighters, and fluoride conspiracy theorists still make their presence felt with various degrees of effectiveness. But they remain, at best, minority voices. As polls taken during our current public health crisis suggest, most Minnesotans continue to believe that tempering personal liberty makes sense if the goal is an undeniable public good—like, I don’t know, saving lives? It’s possible that the Minnesotans who believe Tim Walz is a tyrant shepherd leading his flock into liberal captivity will ultimately win the battle of public opinion, but the historical record suggests they won’t.

Mississippi Bests Minnesota

Mississippi just got rid of its state flag. Why can’t Minnesota?

There’s a saying, especially common in the Deep South—“Thank God for Mississippi!”—that suggests that, no matter how bad things are where you live, they’re probably worse in the Magnolia State. It’s an all-purpose put-down that applies to everything from educational achievement scores to infant mortality rates. Here in Minnesota, we rarely stoop to such belittlements because we automatically assume our state’s superiority. But perhaps we should reconsider our arrogance. After all, Mississippi just got rid of its racist state flag. We’re still stuck with ours.

Minnesota’s state flag has never been particularly beloved. It features a cluttered design that’s impossible to make out from a distance. In fact, 19 years ago, a collection of flag experts (yes, there are such people) voted Minnesota’s flag the fifth worst in the nation, design-wise, behind only New Hampshire, Idaho, Wisconsin, and Kentucky. Like those other loser states, Minnesota chose to incorporate its state seal in its flag’s design. And it’s the seal that poses the problem.

The state seal, based on a sketch by soldier and artist Seth Eastman, dates back to the late 1840s, before Minnesota was even a state. Its original design featured, in the foreground, a farmer plowing a field, with a rifle and ax leaning against a tree stump. In the background, an Indian man on horseback rode into a sunset. There was nothing subtle about the message: Minnesota was to be a land of white people and its indigenous residents were to go away. A poem by Eastman’s wife, Mary, left little doubt about the seal’s prevailing theme of white people’s Manifest Destiny:

Give way, give way, young warrior,
Thou and thy steed give way;
Rest not, though lingers on the hills
The red sun’s parting ray.
The rock bluff and prairie land
The white man claims them now,
The symbols of his course are here,
The rifle, axe, and plough.

The seal appeared on Minnesota’s first official flag in 1893, and has remained its central design element ever since. There have been a few changes, including, most significantly, a course correction for the Indian rider instituted in 1983. He now rides to the south instead of the west, which is apparently meant to neutralize objections to the “riding into the sunset” message.

Critics have periodically tried—and failed—to change Minnesota’s flag and remove its racist imagery. This year, a bill calling for a study of its design died in legislative committee. When it comes to its flag, the North Star State remains a state of inertia. But at least we’ve given the good folk of Mississippi an excuse to say, “Thank God for Minnesota!”

What We’ve Lost: Grandma’s

While we’ve lost way too many lives to COVID-19, we also have lost many things that we previously took for granted.

The inaugural running of Grandma’s Marathon between Two Harbors and Duluth took place 43 years ago today, on June 25, 1977. Only 150 runners participated in that first race. The winner was Olympian and Duluth native Garry Bjorklund. The gentleman in this photo, Alex Ratelle, was, at 52, the oldest runner in the 1977 race. He finished fourth. He went on to compete in 21 straight Grandma’s.

Photo via Duluth News-Tribune

Minnesota’s Worst Monument

Minnesota has never had to deal with Confederate monuments on its soil, but it did have to deal with this.

Originally published by MINNPOST on August 25, 2017

It’s easy, as Minnesotans, to sit back and watch from a safe distance as Americans elsewhere struggle to decide what to do about the Confederate monuments in their midst. Minnesota, after all, is apparently among the minority of states without at least one such monument within its borders. But before we congratulate ourselves for our sophisticated and nuanced understanding of history and race, we would do well to remember what happened in our state just over a century ago.

On the day after Christmas, 1912, several hundred people gathered at the corner of what was then Front and Main streets in downtown Mankato to mark the fiftieth anniversary of the largest mass hanging in U.S. history. Thirty-eight Dakota men had been executed at that exact spot for crimes allegedly committed during the U.S.-Dakota War of 1862. The highlight of the fiftieth anniversary commemoration was the dedication of a granite monument inscribed with the words, “Here Were Hanged 38 Sioux Indians.”

In his dedication address, Judge Lorin Cray rejected any notion that the monument inappropriately glorified a mass killing. As the Mankato Free Press reported, he “wished to have it understood that the monument [had] not been erected to gloat over the deaths of the redmen,” but was instead meant “simply to record accurately an event in history.” The judge’s assessment held sway for more than four decades.

But slowly opinions changed.

In the late 1950s, a small group of Mankatoites began advocating for the monument’s removal. Its presence, they argued, gave the city an “unwholesome reputation.” Pharmacist Allen Mollison went so far as to ask Governor Orville Freeman to join the campaign. Although Freeman did not respond, the head of his human rights commission, Clifford Rucker, did. Rucker wrote that he and his fellow commissioners agreed the “eyesore” should be removed. He also complained that the Minnesota Historical Society felt otherwise and was determined to block any removal efforts.

The monument remained in place for another fifteen or so years while the arguments over its fate ebbed and flowed. In 1971, the city finally removed it—but only to make way for urban renewal. The contentious granite slab was placed in storage with plans to reinstall it at some point in the future.

By that time, however, many Mankato residents had come to believe that commemorating a mass execution was, at the very least, tacky and culturally insensitive—and possibly even an incitement to violence given the rise of militant organizations like the American Indian Movement. The monument never reappeared in public.

The current location of the Mankato monument remains unknown. About ten years ago, a history class at Mankato State University tried without success to track down the slab. The students uncovered some evidence that a city employee had given the monument to Dakota elders, but the investigative trail ended there.

Now a new memorial exists near the place where the vanished monument once stood. Reconciliation Park features a statue of a buffalo and a large “scroll” inscribed with the names of the thirty-eight men who were executed there.

It’s hard to imagine what it must have felt like for an American Indian—especially a Dakota—to walk past that granite slab at Front and Main, knowing that many Minnesotans considered it simply a piece of history, etched in stone. Thankfully it’s gone now, even if it was meant to be removed only temporarily. Its disappearance has not erased history. But the fact that it existed at all should serve as a reminder that we Minnesotans have always struggled, and often failed, to live up to our ideals of racial and cultural justice. It’s not just southerners. We northerners are capable of erecting “eyesores,” too.

Riding the Waves

Are we in COVID-19’s first wave? Second wave? From 1918 to 1920, Minnesotans endured four waves of the “Spanish flu.”

On the Monday before Thanksgiving 1918, business leaders in Albert Lea, Minnesota, declared victory of sorts on the front pages of two local newspapers. “Good cheer is in order,” they proclaimed. “Smile and the world will smile with you.” Indeed, they and nearly everyone else in Albert Lea had legitimate reasons to give thanks. For one thing, the Great War in Europe had recently come to an end. But Albert Lea’s businessmen seemed especially giddy about something else: the apparent defeat of what had become known as the “Spanish flu.” Over the previous two months, the influenza virus responsible for a global pandemic had sickened hundreds of people in the city and killed at least 14. But now the disease seemed to be retreating, and the businessmen were in the mood to celebrate. “Life is coming back to what it used to be,” they asserted. “We shall soon forget there ever was an influenza epidemic. Why shouldn’t we? Restrictions that were placed upon communities several weeks ago have been removed. In other words: the ban has been lifted.”

But the people of Albert Lea, like people everywhere—in Minnesota, the United States, and the rest of the world—were soon to learn that there was no way to truly defeat a virus to which few people were immune and for which there was no vaccine. The “Spanish flu” ebbed and flowed, but it never really went away.

The pandemic’s first wave hit the United States in the spring of 1918, but it was mild, and pretty much skipped over Minnesota. The second wave arrived in September 1918, and it was that one that walloped Albert Lea and so many other communities in Minnesota and the rest of the United States. Municipalities responded by setting up emergency hospitals, quarantining households where the disease appeared, closing schools, shutting down certain businesses, and banning public gatherings. Citizens chafed at the restrictions, but they generally recognized the need for patience. After two weeks of restrictions in Albert Lea, the Freeborn Standard summed up the city’s glum mood. “Yesterday was probably the longest Sunday people here ever spent,” it lamented. “There was no church nor moving pictures to pass the time, and added to that it rained all day.” Still, the newspaper cautioned its readers that “every care must be exercised until the malady is entirely stamped out.” No wonder then that, a few weeks later, the city’s business leaders were eagerly declaring that the disease was vanquished.

But they were wrong. There were more waves to come.

The first sign that the influenza would persist came in mid-December, when an outbreak occurred at a local boarding school, Luther Academy. The city responded by briefly banning dances and imposing a few other restrictions, but the spike in cases was short-lived. The December uptick, which also occurred in other Minnesota cities including Minneapolis, turned out to be the last gasp of the “Spanish flu’s” second wave.

The pandemic’s third wave hit the United States in March and April of 1919, but Albert Lea, like most communities in Minnesota, avoided the worst of it.

It wasn’t until January of 1920—15 months after the “Spanish flu” first emerged in Albert Lea—that the city experienced the fourth wave. This final surge, which affected communities throughout the United States, is often forgotten today, but it serves as a grim reminder that viruses, with little to stop them, can reemerge at any time. In Albert Lea, the fourth wave turned out to be less deadly but at least as pervasive as the second wave of autumn 1918. Once again, the disease emerged at Luther Academy and quickly spread into the community. The city cancelled its winter sports carnival, banned public gatherings, and shut down all dance halls, movie theaters, lodges, schools, and houses of worship. It set up a new emergency hospital in a Masonic hall, which quickly became what the Albert Lea Tribune called “one of the busiest places in the city.” Although the fourth wave, which continued through most of February, apparently killed only a small number of people in Albert Lea, it still did considerable damage statewide. Statistics compiled by the state board of health shortly after the fact showed that more than 2,100 Minnesotans died of influenza in January and February 1920, the two months that encompassed the fourth wave. Those numbers almost certainly represented an undercount.

As it turned out, the businessmen who predicted in the fall of 1918 that the people of Albert Lea would “soon forget there ever was an influenza epidemic” had engaged in wishful thinking. While life in the city did return to a certain level of normal once the “Spanish flu’s” second wave petered out, the deadly disease continued to lurk until it reemerged more than a year later.

Today, as Minnesota loosens restrictions imposed during a new pandemic, the century-old experience of Albert Lea—and the similar experiences of countless other communities in Minnesota and elsewhere—can instruct us. It’s natural to identify with those business leaders who in 1918 called on the people of Albert Lea, shell-shocked by the “Spanish flu,” to stop “moping around” and “look on the bright side of life.” After all, we want to get on with our lives, too. But we, unlike those southern Minnesota optimists, also need to temper our expectations. The new coronavirus is not done with us, and probably won’t be for a long while. It may not be as deadly as the H1N1 virus behind the “Spanish flu,” but it has a longer incubation period, which means it will probably burn through communities more slowly than the “Spanish flu” virus did. And once it completes its burn, it will almost certainly hang around until conditions are right for a resurgence. That’s how the influenza pandemic played out a century ago. We should be ready for the very real possibility that, at least in this respect, history will repeat itself.

The Killing of Tim Graham

In the wake of George Floyd’s murder, I went looking for the earliest case of an officer killing a Black person in Minnesota. This is what I found.

Published by MINNPOST on June 19, 2020

Distrust, fear, and loathing of law enforcement have permeated African American communities in the Twin Cities—with good reason—since long before the killing of George Floyd by Minneapolis police. But even those most familiar with local law enforcement’s well-documented record of violence against Black, Native, and other marginalized communities may not realize just how far back those sentiments reach. Although it’s impossible to say with any certainty, evidence suggests that the relationship between Black people and law enforcement began seriously deteriorating in the Twin Cities in the late 1880s, when an African-American man was shot to death by a cop. It may well have been the first case of a law enforcement officer killing a Black Minnesotan—and it’s remained largely forgotten.

On the evening of September 27, 1887, Ramsey County Sheriff Fred Richter shot and killed Tim Graham in the basement of the county jail in downtown St. Paul. Graham was 35 years old, married, and had worked until recently as the jail’s janitor. He had spent at least part of the evening he died drinking beer and playing cards in a nearby saloon. After leaving the bar, he had returned to his former place of employment and somehow gotten inside. It was there, in the jail’s basement, that he had his deadly encounter with Richter.

As the sheriff would later testify, he didn’t realize at the time that the man he discovered in the basement was Graham. He insisted he thought Graham was an inmate trying to escape:

I commanded him to come out and show himself. He made no effort to do so at first, and I pulled aside some of the clothing in order to obtain a look at his face. Just as I did so, the fellow bumped up against me, still endeavoring to screen his face, and pushed on out into the hallway leading to the street door. He did not heed my command to halt, but persisted in trying to reach the front door, as I supposed, whereupon I fired, and there is the result.

Most local newspapers readily accepted Richter’s version of events. They also played up circumstances that reflected poorly on Graham—especially his alleged fondness for liquor. The St. Paul Globe reported that “a telltale flask in the right pocket of [Graham’s] sack coat showed that he had stimulated himself for the accomplishment of some purpose, probably murder.” But St. Paul’s most influential African American newspaper, the Appeal, hinted that other, more nefarious motives on the sheriff’s part were at play. “There is something beneath the surface that has not been brought to light,” the newspaper’s editor, John Quincy Adams, wrote, “and therefore a full investigation of the affair should be had.”

A coroner’s jury met two days after the killing to examine its circumstances, but the outcome was never really in doubt. Five of the six jurors were white. The jury’s sixth member—African American businessman Thomas Lyles—was the only one who seemed interested in pursuing the case further. After an hour and a half of deliberations, the jury ruled 5-to-1 that Richter was justified in killing Graham. Lyles was the sole dissenter.

For many Black Twin Citians, the jury’s decision confirmed what they had always suspected—that justice was anything but colorblind in Minnesota. “Nothing that has transpired in this city since our advent seems to have so aroused the Colored citizens as the killing of Tim Graham by Sheriff Richter,” the Appeal declared. “The sentiment among them is almost unanimous, that the killing was unjustifiable, the verdict of the majority of the coroner’s jury to the contrary notwithstanding.” Compounding the injustice, the Appeal lamented, was the fact that outrage at the verdict was shared “by a large number of whites who are not so hide bound and affected with color prejudice, that they cannot render a just opinion where the two races are concerned.”

On the evening after the jury’s decision was announced, Thomas Lyles and Rev. William Gray of Pilgrim Baptist Church called a community meeting to discuss possible responses. About three dozen people showed up. The participants discussed ways to bring the case before a grand jury, but they knew chances of a reversal were slim. If anyone advocated a public protest, the suggestion was not recorded. With so few options for justice available, the Appeal expressed the helplessness that many of its readers felt. “We have come to the conclusion,” John Quincy Adams wrote, “that because a county officer, a white man, has killed only a Negro, nothing [can] be done about the matter.”

A week later, a grand jury affirmed the decision of the coroner’s panel and cleared Richter.

“And that settles it,” the Appeal concluded.

From our vantage today, with outrage over the death of George Floyd spreading across the globe from Minneapolis, it’s hard to imagine how another killing at the hands of law enforcement, 133 years ago, failed to generate more than a few days’ headlines. But 1887 was a different time. With the previous census showing fewer than 500 Black residents in both St. Paul and Minneapolis, the Twin Cities’ African American community did not possess the sheer numbers or political clout needed to effect major change. And despite the Appeal’s assertion of substantial support among white Minnesotans, there was little evidence to suggest that any of them were willing to wage a fight for racial justice in the name of Tim Graham or any other Black victim. Those were the realities that Black Twin Citians had to cope with in 1887. And while succeeding generations have despaired that the outrages perpetrated by our criminal justice system continued as decades passed, we can at least hope that the energy evident in today’s protests will bring the kind of justice that Tim Graham was denied.

Good Grief: Why Pandemics Aren’t Funny

Until COVID-19 got me thinking about previous pandemics, I didn’t realize there even was such a thing as the 1958 “Asian flu.” But Charlie Brown did.

Published by MINNPOST on May 1, 2020

On March 9, 1958, a global flu pandemic unofficially became a national joke. When Americans turned to the funny pages of their Sunday morning newspapers, they found Charlie Brown, the sad sack hero of Minnesotan Charles Schulz’s Peanuts comic strip, lamenting his shortcomings on the hockey rink.

“I don’t feel good,” Charlie Brown told his nemesis, Lucy. “I think maybe I’m getting the Asian flu.”

Lucy, as usual, showed no sympathy. She informed her downcast companion that the “Asian flu” was yesterday’s news. “What a guy!” she exclaimed as she walked away. “Everyone else got the Asian flu six months ago, and he’s just getting it now!”

Charlie Brown was left to deliver the punchline to himself. “Good grief!” he sighed. “I can’t even get sick right!”

It was a funny joke—at the time.

But decades later, with the benefit of hindsight, it doesn’t seem quite so funny. That’s because we know something now that Charles Schulz and his readers didn’t know back then. According to the Centers for Disease Control, the “Asian flu” pandemic of 1957 and 1958 killed an estimated 116,000 Americans and around 1.1 million people worldwide. It was one of the worst public health crises of the 20th century. So how did it become the butt of a joke in the Sunday morning comic section? The answer lies in what was, at the time, a widespread public misunderstanding about the scope of the pandemic—a disconnect between perception and reality. It was the kind of disconnect that we should keep in mind today, as we try, in real time, to make sense of the havoc the new coronavirus is causing.

In March 1958, when the Peanuts comic strip appeared, the “Asian flu”—an influenza caused by a new strain of the H2N2 virus—was a well-known fact of life in the United States. It had emerged about a year before in Singapore and Hong Kong, and had spread to U.S. coastal cities in the summer of 1957. Scientists determined early on that very few Americans were immune to the new virus, and they raised public alarms about its potential lethality. (They also rushed to create a vaccine that played a crucial role in limiting the virus’ spread.) When the “Asian flu” arrived in Minnesota in September 1957, people were ready for the worst.

But the “Asian flu” never seemed to live up to its advance billing. State and local health officials reported occasional outbreaks, including one that clobbered Minneapolis’s North High School in October, but the information they shared with the public inadvertently downplayed the virus’ severity. Laborious testing protocols made it difficult to confirm that a patient with flu-like symptoms had actually contracted the “Asian flu,” so mortality statistics released to the public included very few “Asian flu” deaths. At the end of 1957, when the pandemic’s first wave had subsided, Minneapolis’s health commissioner reported a total of just 18 confirmed deaths from the new H2N2 virus—a sobering number, certainly, but nothing close to the numbers that Minnesotans had been primed to expect. And the national death toll was similarly underwhelming: about 6,000 deaths attributed to the new H2N2 virus throughout the entire United States.

So how do we explain the discrepancy between the 6,000 fatalities ascribed to the “Asian flu” at the time of the pandemic, and the 116,000 deaths that the CDC estimates now? It all comes down to how fatalities are tabulated and estimated. In an ongoing pandemic, death tolls are based on reports of fatalities officially attributed to the pathogen in question. But those real-time numbers almost always turn out to be undercounts. The true scope of a pandemic’s deadly toll becomes clear only after the fact, when experts use statistical modeling to determine “excess mortality”—the number of deaths beyond what would normally be expected. There’s nothing particularly controversial about using excess mortality to estimate the number of people killed in a pandemic. It’s generally considered the most accurate way to express results of what is admittedly an inexact science. And if you compare at-the-time death tolls of other pandemics with after-the-fact estimates based on excess mortality, you see discrepancies similar to the ones that showed up after the 1957-58 “Asian flu” pandemic.

In November 1918, as the second wave of the H1N1 “Spanish flu” pandemic wound down, federal officials reported that the virus had killed about 82,000 Americans. Today, the CDC puts the death toll at 675,000.

In July 2009, toward the end of the first wave of the H1N1 “swine flu” pandemic in the U.S., the government reported 302 deaths attributed to the virus. The CDC now estimates that more than 12,000 Americans died in that pandemic.

The lesson is simple: It’s nearly impossible to know how deadly a pandemic really is while you’re living through it.

If you find yourself wondering whether Minnesota’s relatively low number of confirmed COVID-19 fatalities (so far) suggests the coronavirus has been overhyped, remember the mistake Charlie Brown and Lucy made back in 1958. They dismissed the “Asian flu” as a nuisance to be laughed off when it was actually killing more than 100,000 Americans. But Charles Schulz and his Peanuts characters didn’t know any better. Nobody did back then. The true scope of the 1957-58 pandemic became clear only with the passage of time and the application of statistical modeling. Now all of us find ourselves in a similar moment of muddled perceptions. Whatever the numbers tell us today about infections and fatalities, they almost certainly represent a significant undercount. It’s a sobering thought, but it’s also a reminder of why we continue to put up with this unprecedented disruption of our lives.