The Short Soldiers of WWI

I missed getting a post up for Remembrance/Veterans’ Day, but since I’m thankful for the service of our soldiers past and present, this will have to serve for both holidays. It will be an oddly specific post, though, because I’m writing about very short soldiers.

The minimum height for soldiers in the British army during WWI was five foot three (the average soldier was about five foot five), so many potential recruits were turned away for being too short. I’m five-four and often have to ask for help reaching things on the top shelf at the grocery store, so these fellows who were turned away were pretty short!

Unfortunately, WWI dragged on, and the war machine demanded more men to be fed to the trenches.

In Britain, this led to two things: first, the formation of “Bantam Battalions” (named for smaller breeds of roosters/chickens) for shorter soldiers, and second, a national push to improve the health care and nutrition of British children so they could grow up tall enough to fight. In fact, some young men grew as much as two inches in training once they had three square meals a day for perhaps the first time in their lives, which shows just how dire their nutritional situation had been.

I was curious if a similar situation existed in the United States, which entered the war late and never had to dig quite as deeply for recruits. Only about 25 percent of US men entered the military in WWI, and their average height was about 5’7″, which would have been tall for a British soldier. Was this because American men were taller, or because US military recruiters could afford to be more picky? I’m not sure. But it wasn’t until WWII, when a much higher proportion of the male population became involved in the military, that the US government realized that many Americans were suffering from malnutrition (especially following the Great Depression) and took an interest in improving the health of American children for the sake of national defense.

A white WWI soldier being measured by a white doctor.
A US WWI recruit being measured. Photo courtesy of the Library of Congress.

I wonder if this lag in interest or awareness on the US government’s part is also part of why our health care coverage and availability lag behind those of most other wealthy, industrialized nations.

If you’re curious, here are several other statistics about American soldiers in WWI versus WWII:

The average age was 25 versus 26 (the “average” WWII soldier was married with at least one child; I don’t think that was the case for most men serving in WWI).

The average height was 5’7.5″ versus 5’8″.

The average weight was 141 pounds versus 144 pounds (both groups tended to gain weight after enlistment and regular meals).

In WWI, 25 to 37 percent of recruits were rejected for being unable to read or write, while in WWII the illiteracy rate was lower, perhaps 5 to 10 percent; because soldiers were needed, the military instituted literacy training for men who couldn’t read.

39 percent of WWI soldiers were immigrants or the children of immigrants. Also, many of the Native Americans who served in WWI were not considered citizens and could not vote. I cannot find an equivalent statistic for WWII, but over a hundred thousand immigrants gained citizenship by serving in the military, and we cannot forget the amazing courage and loyalty of the Japanese Americans, most of them the American-born children of immigrants (Nisei), who enlisted to fight, sometimes from the confines of internment camps – the all-Japanese-American 442nd Infantry Regiment became one of the most decorated units in US military history.

The WWI armed forces were 10 percent African American versus 11 percent in WWII (Native American, Asian American, and other minority groups/people of color definitely played an important role in both wars, but I don’t have exact statistics).

The life expectancy for men in WWI was 47 years versus 63 years during WWII.

African American soldiers of WWI.
The Harlem Hellfighters of WWI. They served in a segregated unit, but unlike many Black soldiers who were stuck doing the most unpleasant menial labor, they saw combat and were highly decorated, though largely forgotten. Photo from census.gov

Daylight Savings Time

We all hate daylight savings time, right? The “fall back” one isn’t so bad because we get an extra hour of sleep, but we pay for it when we have to “spring forward.” Even my dog was cranky today because we wouldn’t feed her at what she knew to be dinner time, since we were all pretending it was an hour earlier. And Hawaii and Arizona don’t even bother with the time change, though the Navajo Nation lands within Arizona do, which just makes everything even more confusing.

I knew that daylight savings time started in World War I as a way to save fuel (an extra hour of daylight in the evening meant less fuel used to light homes). This happened in the US and in some European countries, many of which still practice daylight savings today. Only a few cities in Ontario, Canada, had experimented with it before WWI.

What I didn’t know was that we’ve been getting rid of daylight savings time and bringing it back on and off for the last 100+ years. The first round of daylight saving time ended with WWI. FDR brought it back for WWII and called it “war time.” When WWII ended in 1945, so did war time.

For a while, some parts of the US practiced daylight savings time, while others did not. So, a city might change its clocks while the surrounding countryside stayed on standard time. We can imagine the chaos this would have caused for businesses, travelers, and pretty much everyone.

It was in the 1960s that we got saddled with daylight savings time on a more permanent basis to settle the confusion. This was popular with sports equipment manufacturers, who hoped that people would play more sports if they had more daylight hours in the evening, and who continue to lobby to keep daylight savings time. Some workers liked having more daylight after work to spend outdoors or with their families, but for the most part, it remains unpopular with parents, teachers, farmers (who find that cows don’t adjust their milking schedule to daylight savings time), and pretty much everyone else.

During the energy crisis of the 1970s, the US and many other countries experimented with making daylight savings time permanent in the hopes of saving energy, but that caused problems for workers and schoolchildren who had to leave home in the dark on winter mornings. Also, though daylight savings time does yield a very small savings in lighting energy, it may actually increase fuel use because people drive more to evening activities. So, we moved back to the clock switching.

The days we spend on standard time are shrinking, though: the fall switch has moved late enough to let trick-or-treaters enjoy an extra hour of evening daylight, and the spring switch has moved earlier (perhaps to avoid major religious holidays like Easter?). Maybe we’re heading toward doing away with the time change once more – this time for good.

Photo courtesy of maxmann

Titanic and WWI-era Flashlights

I was going to write so many blog posts during the pandemic, and then it turned out that living in unprecedented times is stressful. Between school-at-home, quarantines, and the general brain fog of feeling uncertain and overwhelmed, I didn’t post much in the last year. But I have been doing a lot of research on all kinds of things, and I want to start sharing that again. Hopefully, people find it interesting and/or helpful, but at least it means I don’t have to do the research again if I forget what I learned (anyone else feel like their memory is worse after 2020?).

I’ve worked on two writing projects recently where my WWI-era-ish characters have needed flashlights to explore dark and spooky places. Since they’re Americans, it’s flashlights and not torches, as our friends across the pond call them – I love how evocative “torch” is versus “flashlight,” but here we are with American English. At our house, we recently had to turn the power off in one room while fixing a leak in the bathroom above (a whole ‘nother round of unneeded drama), and we learned all over again to appreciate portable light.

Prior to flashlights, of course, people had candles, oil-burning lanterns, and even old-fashioned torches made of wood and cloth or rushlights. But the inventions of batteries and lightbulbs meant that we could harness electricity and hold it in our hands. Pretty cool, really.

As I was researching what kinds of flashlights were available to my civilian characters between 1918 and 1920, I found one website claiming that the flashlights in the movie Titanic were an anachronism. The specific types of flashlights might have been, but handheld flashlights were certainly available by 1912. You can see below the 1899 patent and an ad for handheld, tube-shaped flashlights run by batteries (evidently with enough of a market that there were already multiple patents and manufacturers competing – images courtesy of Wikimedia Commons, public domain). The design isn’t too different from modern flashlights.

By 1918, the Germans had developed a dynamo flashlight – one powered by motion instead of batteries. Modern crank and shake flashlights are examples of this type. The German dynamo flashlight was worn on the chest of a soldier’s uniform and powered by pulling a cord that spun coils in a magnetic field, creating enough energy to run a light for five seconds per pull (see Popular Mechanics Magazine vol. 32, 1919, “German Pocket Flashlight Contains Own Dynamo”). I can’t find information on any earlier dynamo flashlights, and the article seems to suggest that this was a new innovation, saying that Americans first learned of the technology when their soldiers captured some Germans. Image below.

The most common type of flashlight used by soldiers in WWI, though, seems to have been the “upright” style in the image below, with a rectangular metal case and the bulb on the front. Some, like the one pictured, had a metal “lid” that closed like an eyelid to cover the lightbulb when needed. A similar upright pocket flashlight saved Winston Churchill’s life in the trenches when it blocked a piece of shrapnel that hit him in the chest. (image courtesy of the Imperial War Museums © IWM EPH 3684)

German battery operated electric torch associated with the First World War experiences of Captain E W Leggatt as a prisoner of war in Holzminden Camp, Germany. The torch was obtained from a German sentry. Captain Leggatt was captured on 9 August 1916, and subsequently was one of the ten men who made a successful escape through the tunnel at Holzminden POW camp, Germany, on 23 July 1918. Copyright: © IWM. Original Source: http://www.iwm.org.uk/collections/item/object/30083223

One of the things I find really interesting about the time period surrounding WWI is the contrasts. Big cities had electric lights, telephones, and other fairly modern technologies, while rural areas were still very isolated and more likely to use oil for heating and lighting. Of course, there are parts of rural Utah, especially on and around the Navajo reservation, that are still struggling to get water and electricity lines (though they can use solar power as an alternative now, where it’s financially viable). For a glimpse of post-WWI rural life, check out Blood in a Dry Town (formerly Home Again Blues) on Amazon.

The not-so-Spanish Flu of 1918

Since we’re all thinking about pandemics right now, I’ve been reading again about the 1918 Influenza, which may be the last time a pandemic caused this much global chaos. It’s too early to really compare Covid-19 and the 1918 Influenza, but knowing a little about what happened in 1918 might help us face 2020 with cooler heads.

First, the influenza wasn’t really Spanish. In fact, it probably started in the United States and spread overseas because of the movements of troops involved in World War I. The reason it was called the Spanish Flu was that Spain, which stayed neutral in the war, was one of the few countries that reported the truth about the devastation of the disease. The combatant countries, including the United States, tried to downplay the influenza outbreak to boost morale. Of course, newspapers’ declarations that the dangers were small or already past probably didn’t make people feel any better as they watched loved ones get ill and even die. If people trusted newspapers before, the Influenza pandemic made them much more skeptical about believing everything they read.

About 500 million people across every part of the world are estimated to have caught the 1918 Influenza (out of a population of close to 2 billion, so roughly a quarter of all humans), and 25 to 50 million to have died, making the death toll roughly 1 to 2.5% of the global population (in comparison, the Black Death of the Middle Ages killed 30 to 60% of the populations it infected). Young adults were hit the worst, perhaps because their robust immune systems violently overreacted to the virus. More US soldiers died of the Influenza than from the war itself, though wartime conditions may have weakened people’s resistance. With large numbers of unprepared people getting sick and dying at once, some large US cities had to bury the dead in mass graves.

Because so many doctors and nurses were serving in the war, cities in the United States saw a shortage of medical professionals to care for the sick, with stories circulating of nurses being kidnapped to care for ill families (though how these desperately sick people supposedly forced the nurses to stay is unclear). Regardless, as in 2020, the medical professionals and other essential service providers made heroic efforts to help communities overwhelmed by illness.

Below: Overflowing influenza hospital ward, courtesy of the Library of Congress


Wearing masks was a common way to try to prevent the flu from spreading. But the gauze masks of the day did little to stop germs (no one yet knew about “three layers of cotton”), and they were controversial then just as they are now.

Our ancestors from 1918 eventually practiced social distancing, too, and that did work to slow the spread of the disease. Public gatherings were banned, and schools and universities closed, some being converted to temporary hospitals. This was so successful in some places that by Christmas of 1918, officials decided to relax the rules for the holiday. This led to a new outbreak of the disease at the beginning of 1919. As an example of how social distancing could protect a population, remote Kane County in southern Utah did not see any deaths from the 1918 Influenza until February of 1920, just as the pandemic was winding down.

To learn more about the 1918 “Spanish Flu” pandemic, I recommend John Barry’s The Great Influenza.

Below: Image of a masked mail carrier during the 1918 Influenza pandemic, courtesy of US National Archives.


The Decline and Fall of the Utah Sugar Beet Empire

Where does your sugar come from?

Americans eat more sugar per person than almost any other nation, consuming close to 11 million metric tons of the sweet stuff annually (by some estimates, somewhere around 150 pounds of sugar per person per year – 100 years ago, we ate closer to 18 pounds per person per year). The US alone produces over 8 million metric tons of sugar each year, and the largest sugar producer is…

Minnesota, eh?

It’s not the place many of us might imagine our sugar coming from – tropical Florida and Louisiana are the top contenders in sugar cane production – but more than half of US sugar actually comes from sugar beets (pictured below).

A sugar beet before topping. Photo from the Library of Congress.

Utah is no longer even on the map for sugar production, but for much of the twentieth century, Utah was an important sugar beet producer.

Someone recently asked me, where did the sugar beets go?

After all, Idaho – literally within spitting distance of some of the Utah towns that once ran on the sugar beet industry – is still an important producer of sugar. Some parts of Utah, like the “Sugar House” neighborhood in Salt Lake City, still bear record of their association with the sugar industry. And the sugar beet is the official Utah state historic vegetable. Who knew that was a thing?

So, here’s a short-but-sweet 😉 history of sugar in Utah.

First of all, y’all know that sugar has an ugly history, right? European (and later American) sugar cravings drove perhaps the most brutal slavery-based industry from the 1500s to the 1800s, with generations of enslaved West Africans working and dying on sugar plantations, mostly in the Caribbean.

By the 1800s, with international wars and slave revolts disrupting the sugar industry, European scientists developed a process for extracting sugar from beets. Abolitionists and human rights advocates were quick to promote beet sugar (while pro-slavery factions shunned it). Beet sugar also had an economic advantage because sugar beets grow in a much wider variety of climates than sugar cane, meaning France, Russia, Germany, and the non-tropical portions of the United States could all develop their own sugar industries instead of relying on imports.

By 1850, Brigham Young had led the Mormon pioneers to the relative isolation of Utah and was interested in being as self-sufficient as possible. Thus, the Church of Jesus Christ of Latter-day Saints backed an experimental sugar beet factory in Sugar House. It failed miserably, producing a nasty syrup not even independent-minded Brigham Young thought edible.

It took a few decades before American factories got the hang of beet sugar, but by the end of the 1800s, with the social and financial backing of the Church of Jesus Christ of Latter-day Saints, the Utah-Idaho Sugar Company had factories throughout Utah, and sugar beets became an important part of Utah’s economy. Sugar beets required back-breaking manual labor, but large Utah families had plenty of children to work in the fields. Their work was supplemented by Native Americans displaced by pioneer settlements and later by Latino refugees fleeing the Mexican Revolution. The economic disruptions of World War I brought the Utah sugar industry to its peak, when the state was one of the country’s top sugar producers (the factory in Lewiston, Utah, is pictured below – that’s a big pile of beets!).

Sugar beets piled at the factory in Lewiston, Utah. Photo from the Library of Congress.

From there, it was a slow downhill slide. An agricultural depression followed WWI, with prices slumping once wartime demand dried up, and after that came the Great Depression. At the same time, beet leafhoppers spread a blight in Utah that damaged crops and led some factories to move out of the state.

The Utah sugar beet industry struggled on, eventually finding a blight-resistant strain of beets, and in World War II, Utah was still a major US sugar producer. Many detained Japanese Americans worked in the sugar beet fields of Utah to keep up with wartime production.

Following World War II, the sugar beet industry saw a number of technological changes that made sugar processing faster and more efficient. Ironically, this would lead to the end of large-scale commercial sugar beet farming in Utah. The main companies that bought Utah sugar beets, U&I (Utah-Idaho Sugar Company) and Amalgamated Sugar Company, had both started in Utah but expanded their production to Idaho and the Pacific Northwest and eventually moved their headquarters out of state. They faced several antitrust actions by the federal government, as well as competition from low-cost cane sugar imported from overseas during the mid-1900s, which strained their resources. Also during this time, the Church of Jesus Christ of Latter-day Saints began selling its interests in the sugar companies.

So, when it came time to update factories, the sugar companies invested in those in Idaho and Oregon. One by one, the outdated Utah factories closed, the last of them in Garland, Utah, in 1979. It was not cost-effective for Utah farmers to ship their sugar beets out of state, so most switched to other crops, and Utah fell off the sugar-producing map.

For most of Utah, sugar beets had never been an ideal crop. The soil is too alkaline, the growing season too short, and the labor too intensive. The fact that Utah enjoyed nearly a century of sweet success is a testament to the stubborn self-sufficiency of Utah’s farmers.

Utah’s official contemporary state vegetable, by the way, is the sweet onion. And that’s a thing, too, because when it came time to vote on a state vegetable, the onion won out as the more important crop today, but sugar beet proponents refused to back down until the beet received some kind of recognition for being so important to Utah’s economy for most of the 1900s.

Photos courtesy of the Library of Congress OWI-FSA collection (public domain).

Sources: U.S. Sugar Industry Association; American Sugar Alliance; New Hampshire Department of Health and Human Services; The Diabetes Council; Donald W. Meyers, “Rebirth of former sugar plant is sweet news for Toppenish,” Yakima Herald (Yakima, Washington); Leonard J. Arrington, Beet Sugar in the West; Twila Van Leer, “Sugar Becomes a Sweet Success,” Deseret News; “Sugar Beets!” Lewiston-North Cache Valley Historical Board; Leonard J. Arrington, “The Sugar Industry in Utah,” Utah History Encyclopedia.

Utah Women: Pioneers, Poets & Politicians

I LOVED working on this project – Utah history has so many inspiring stories and amazing women – and I’m excited to have a release date and cover! Utah Women: Pioneers, Poets & Politicians coming November 2019 from The History Press.

Trailblazing in Untamed Territory

Representing lawmakers and lawbreakers, artists and adventurers or scholars and activists, the women of Utah defied stereotypes. At the crossroads of the West, they found new challenges and opportunities to forge their own paths. Emma Dean explored the Rocky Mountains with her famous spouse John Wesley Powell. Martha Hughes Cannon defeated her husband to become the first female state senator. Maud Fitch drove an ambulance under German artillery fire to rescue downed pilots in World War I. Author Emily Brooksby Wheeler celebrates the remarkable Utah women who, whether racing into danger or nurturing those who fell behind, changed their world and ours.


The Eleventh Hour of the Eleventh Day of the Eleventh Month…

One hundred years ago – November 11, 1918 – an armistice brought a ceasefire to “the war to end all wars.” For the soldiers and other volunteers serving, and for nations across the world, it wasn’t really an end, but a chance to begin the healing process as best they were able. I can’t say it better than John McCrae, though I like to think now that the torch represents the freedoms bought for us by past generations:

In Flanders Fields (1915)

In Flanders fields the poppies blow
Between the crosses, row on row,
That mark our place; and in the sky
The larks, still bravely singing, fly
Scarce heard amid the guns below.

We are the Dead. Short days ago
We lived, felt dawn, saw sunset glow,
Loved and were loved, and now we lie
In Flanders fields.

Take up our quarrel with the foe:
To you from failing hands we throw
The torch; be yours to hold it high.
If ye break faith with us who die
We shall not sleep, though poppies grow
In Flanders fields.

Remembering the Lost Generation

I’m disappointed to see how little attention the media and local civic organizations are giving to the centennial of the US entering The Great War. The WWI generation has been called the Lost Generation, and with good reason. After enduring WWI and the Great Depression, this generation and “their” war were overshadowed by their children fighting in WWII and the horrors of that war.

Yet the WWI generation of Americans also answered the call to go to war, often as volunteers. Many of the women who volunteered had to pay their own way – sacrificing money, time, and sometimes even their lives to nurse, drive ambulances, entertain, and feed and care for soldiers. We might think of the men and women who volunteered as naive, but by the time America entered the war on April 6, 1917, the fighting had dragged on for almost three years, and many of the young people who served had at least an idea of the gruesome conditions that awaited them.

For a giveaway of No Peace with the Dawn, a novel about how World War I changed the lives of one group of young Americans, see my Facebook page.

I think this poem by British poet Rupert Brooke, who died in the war, is a fitting memorial to all those who lost their lives in the Great War:

The Dead (1914)

These hearts were woven of human joys and cares,
Washed marvellously with sorrow, swift to mirth.
The years had given them kindness. Dawn was theirs,
And sunset, and the colours of the earth.
These had seen movement, and heard music; known
Slumber and waking; loved; gone proudly friended;
Felt the quick stir of wonder; sat alone;
Touched flowers and furs and cheeks. All this is ended.

There are waters blown by changing winds to laughter
And lit by the rich skies, all day. And after,
Frost, with a gesture, stays the waves that dance
And wandering loveliness. He leaves a white
Unbroken glory, a gathered radiance,
A width, a shining peace, under the night.



World War I Centennial Commemoration

The United States doesn’t make as much fuss about World War I as most European countries, but the “Great War” still had a lasting impact on the United States. April 6th will mark one hundred years since the US entered the war. Some museums and historical societies will be holding events to commemorate the centennial, and my co-author Jeff Bateman and I will be at the Utah State University Museum of Anthropology April 1st at 12:30 to talk about the impact of the war on Utah and Cache Valley specifically.

Though April 1st isn’t the exact centennial of America’s entry into the war, it’s significant in Utah, at least, because it’s also General Conference weekend – when members of the LDS faith gather from across Utah and the world to listen to advice from their church leaders. April 6, 1917 also fell during General Conference. War was declared while LDS church leaders and members gathered in the historic Tabernacle at Temple Square. Though the speakers did not officially announce the war over the pulpit, they did talk about the conflict Christian soldiers would face in trying to fight while maintaining charity toward all men.

The President of the LDS Church, Joseph F. Smith, said: “…I exhort my friends, the people of our country, especially of this intermountain region, to maintain above all other things the spirit of humanity, of love, and of peace-making … I want to say to the Latter-Day Saints who may enlist, and whose services the country may require, that when they become soldiers of the State and of the Nation that they will not forget that they are also soldiers of the Cross, that they are ministers of life and not of death; and when they go forth, they may go forth in the spirit of defending the liberties of mankind rather than for the purpose of destroying the enemy.”

A lot has changed in the last 100 years, but that challenge – to stand up for causes we believe in without giving in to hate toward those who oppose us or hold a different view – is one we still struggle with today.

Makeshift music in World War I

The Great War epitomized the dark side of the modern, mechanical age, turning warfare into a grinding machine spitting out broken men and women in unprecedented numbers. A theme that emerges over and over from World War I is the attempt of individual soldiers, nurses, doctors, refugees, and others to keep their humanity intact in the face of such horror. One of the ways they did this was through music.

It’s hard to imagine many traditional instruments made it to the front or survived conditions there very long, but people are endlessly creative. The Museum of the Great War in Meaux, France, has these examples of homemade musical instruments used on the Western Front:

Below: Homemade stringed instruments on display at the Museum of the Great War.

They used helmets, canteens, and scrap wood – along with an impressive understanding of how to lay out the strings and frets – to make music in the midst of war. I like to think it helped them think of better times, past and future, and hold on to their humanity while the world around them fell apart.
