Hawthorn tree lore

Hawthorn trees are among the most magical in European folklore – and also the most contradictory. Good luck or bad luck? Friend or foe? It depends on who you ask and when. But they’re gorgeous trees either way, with their pink or white blossoms and their trunks that get furrowed and gnarled with age. The most famous hawthorn is probably the Glastonbury Thorn at Glastonbury Abbey. It is said to have sprung from Joseph of Arimathea’s staff when he arrived there bearing the Holy Grail. Interestingly, the Glastonbury Thorn (or its current descendants) blooms twice a year – once in spring, like most hawthorns, and once at Christmas.

A lovely hawthorn tree in the Logan Cemetery – really, my picture doesn’t do justice to its cascade of late-May blooms.

Hawthorn trees were supposed to be particular favorites of the Fair Folk, often guarding the entryways into Elfland as well as ancient holy wells. For this reason, it’s bad luck to cut them down. There are roads in the British Isles that were redirected to go around old hawthorn trees, and some buildings there are said to be cursed because a hawthorn was removed to make way for the building. The Fay are very jealous of their trees.

I was delighted to find that the Logan Cemetery hawthorn has its own gnome guardian among its branches. He looks like he’s seen a few battles, maybe protecting the entrance to the Otherworld?

On the other hand, it was traditional to cut branches of blooming hawthorn for May Day celebrations. So, perhaps this is the one time it’s permissible to disturb the tree? Maybe it allows the Fay to join the celebrations. One should never bring the hawthorn branches or flowers inside, though. That might invite the Fair Folk’s attention. (Branches from the Glastonbury Thorn supposedly decorate the queen’s table at Christmas, but maybe she gets a special dispensation.)

The Logan Cemetery hawthorn has two trees growing from the debris collected over the years between its trunk and branches: this little spruce and the sapling by the gnome, which might be some kind of plum. When rowan trees grew in the joints of other trees, they were called “flying rowans” and were thought to be especially potent magic. Maybe this “flying spruce” growing from the hawthorn also has some special power.

Some people felt it was good luck to have a hawthorn growing near (but not too near) their house, while others didn’t want the Fair Folk that close. If you decide to plant a hawthorn, it’s a hardy tree with spring flowers, an informal growth habit, and tiny red fruit in fall and winter (edible in most species, but not tasty). Paul’s Scarlet, the one in these pictures, was discovered in the mid-1800s and has double pink flowers (wild European hawthorns, including the Glastonbury Thorn, are usually white), but it often loses its leaves early in the summer. Crimson Cloud is a pink-flowered European hawthorn that keeps its leaves until fall, and American hawthorns or maythorns also make nice yard trees. I haven’t found any stories associating American hawthorns with the Fair Folk, and I guess you can decide if that’s a pro or a con.

Daylight Saving Time

We all hate daylight saving time, right? The “fall back” one isn’t so bad because we get an extra hour of sleep, but we pay for it when we have to “spring forward.” Even my dog was cranky today because we wouldn’t feed her at what she knew to be dinner time, since we were all pretending it was an hour earlier. And Hawaii and Arizona don’t even bother with the time change, though the Navajo Nation lands within Arizona do, which just makes everything even more confusing.

I knew that daylight saving time started in World War I as a way to save fuel (an extra hour of daylight in the evening meant less fuel used to light homes). This was in the US and also in some European countries, many of which still practice daylight saving time today. Only a few cities in Ontario, Canada, had experimented with it prior to WWI.

What I didn’t know was that we’ve been getting rid of daylight saving time and bringing it back, off and on, for the last 100+ years. The first round of daylight saving time ended with WWI. FDR brought it back for WWII and called it “war time.” When WWII ended in 1945, so did war time.

For a while, some parts of the US practiced daylight saving time, while others did not. So, a city might change its clocks while the surrounding countryside stayed on standard time. We can imagine the chaos this would have caused for businesses, travelers, and pretty much everyone.

It was the 1960s when we got saddled with daylight saving time on a more permanent basis to settle the confusion. This was popular with sports equipment manufacturers, who hoped that people would play more sports if they had more daylight hours in the evening, and who continue to lobby to keep daylight saving time. Some workers liked having more daylight after work to spend outdoors or with their families, but for the most part, it remains unpopular with parents, teachers, farmers (who find that cows don’t adjust their milking schedule to daylight saving time), and pretty much everyone else.

During the energy crisis of the 1970s, the US and many other countries experimented with making daylight saving time permanent in the hope of saving energy, but that caused problems for workers and schoolchildren who had to leave home in the dark on winter mornings. Also, though daylight saving time does yield a very small energy savings in lighting, it may actually increase fuel use because people drive more to evening activities. So, we moved back to the clock switching.

The period we spend on standard time is shrinking, though: the fall switch now comes late enough to let trick-or-treaters enjoy the extra hour of evening daylight, and the spring switch comes earlier (perhaps to avoid major religious holidays like Easter?). Maybe we’re heading toward doing away with the clock switching once more – this time for good.

Photo courtesy of maxmann

The Decline and Fall of the Utah Sugar Beet Empire

Where does your sugar come from?

Americans eat more sugar than any other nation, consuming close to 11 million metric tons of the sweet stuff annually (that’s somewhere around 150 pounds of sugar per person per year – 100 years ago, we ate closer to 18 pounds per person per year). The US alone produces over 8 million metric tons of sugar each year, and the largest sugar producer is…

Minnesota, eh?

Not the place many of us might imagine our sugar coming from. Tropical Florida and Louisiana are top contenders in sugar cane production, but more than half of US sugar actually comes from sugar beets (pictured below).

Sugar beet before topping (Library of Congress)

Utah is no longer even on the map for sugar production, but for much of the twentieth century, Utah was an important sugar beet producer.

Someone recently asked me, where did the sugar beets go?

After all, Idaho – literally within spitting distance of some of the Utah towns that once ran on the sugar beet industry – is still an important producer of sugar. Some parts of Utah, like the “Sugar House” neighborhood in Salt Lake City, still bear record of their association with the sugar industry. And the sugar beet is the official Utah state historic vegetable. Who knew that was a thing?

So, here’s a short-but-sweet 😉 history of sugar in Utah.

First of all, y’all know that sugar has an ugly history, right? European (and later American) sugar cravings drove perhaps the most brutal slavery-based industry from the 1500s to the 1800s, with generations of enslaved West Africans working and dying on sugar plantations, mostly in the Caribbean.

By the 1800s, with international wars and slave revolts disrupting the sugar industry, European scientists developed a process for extracting sugar from beets. Abolitionists and human rights advocates were quick to promote beet sugar (while pro-slavery factions shunned it). Beet sugar also had an economic advantage because sugar beets grow in a much wider variety of climates than sugar cane, meaning France, Russia, Germany, and the non-tropical portions of the United States could all develop their own sugar industries instead of relying on imports.

By 1850, Brigham Young had led the Mormon pioneers to the relative isolation of Utah and was interested in being as self-sufficient as possible. Thus, the Church of Jesus Christ of Latter-day Saints backed an experimental sugar beet factory in Sugar House. It failed miserably, producing a nasty syrup not even independent-minded Brigham Young thought edible.

It took a few decades before American factories got the hang of beet sugar, but by the end of the 1800s, with the social and financial backing of the Church of Jesus Christ of Latter-day Saints, the Utah-Idaho Sugar Company had factories throughout Utah, and sugar beets became an important part of Utah’s economy. Sugar beets required back-breaking manual labor, but large Utah families had plenty of children to work in the fields. Their work was supplemented by Native Americans displaced by pioneer settlements and later by Latino refugees fleeing the Mexican Revolution. The worldwide economic disruptions of World War I saw the peak of the sugar industry in Utah, when the state was one of the country’s top sugar producers (the factory in Lewiston, Utah, is pictured below – that’s a big pile of beets!).

Sugar beets in Lewiston (Library of Congress)

From there, it was a slow downhill slide. An agricultural depression followed WWI, with prices slumping once the wartime demand ended, and after that came the Great Depression. At the same time, beet leafhoppers spread a blight in Utah that damaged crops and led some factories to move out of the state.

The Utah sugar beet industry struggled on, eventually finding a blight-resistant strain of beets, and in World War II, Utah was still a major US sugar producer. Many detained Japanese Americans worked in the sugar beet fields of Utah to keep up with wartime production.

Following World War II, the sugar beet industry saw a number of technological changes that made sugar processing faster and more efficient. Ironically, this would lead to the end of large-scale commercial sugar beet production in Utah. The main companies that bought Utah sugar beets, U&I (Utah-Idaho Sugar Company) and the Amalgamated Sugar Company, had both started in Utah but expanded their production into Idaho and the Pacific Northwest and eventually moved their headquarters out of state. During the mid-1900s, they faced several federal antitrust actions as well as competition from low-cost cane sugar imported from overseas, which strained their resources. Also during this time, the Church of Jesus Christ of Latter-day Saints began selling its interests in the sugar companies.

So, when it came time to update factories, the sugar companies invested in those in Idaho and Oregon. One by one, the outdated Utah factories closed, the last of them in Garland, Utah, in 1979. It was not cost-effective for Utah farmers to ship their sugar beets out of state, so most switched to other crops, and Utah fell off the sugar-producing map.

For most of Utah, sugar beets had never been an ideal crop. The soil is too alkaline, the growing season too short, and the labor too intensive. The fact that Utah enjoyed nearly a century of sweet success is a testament to the stubborn self-sufficiency of Utah’s farmers.

Utah’s official contemporary state vegetable is the sweet onion. And that’s a thing, too, because when it came time to vote on a state vegetable, the onion won out as the important crop of today, but sugar beet proponents refused to back down until the crop that had been so important to Utah’s economy for most of the 1900s got some kind of recognition of its own.

Photos courtesy of the Library of Congress OWI-FSA collection (public domain).

Sources: U.S. Sugar Industry Association; American Sugar Alliance; New Hampshire Department of Health and Human Services; The Diabetes Council; Donald W. Meyers, “Rebirth of former sugar plant is sweet news for Toppenish,” Yakima Herald (Yakima, Washington); Leonard J. Arrington, Beet Sugar in the West;  Twila Van Leer, “Sugar Becomes a Sweet Success,” Deseret News; “Sugar Beets!” Lewiston-North Cache Valley Historical Board; Leonard J. Arrington, “The Sugar Industry in Utah,” Utah History Encyclopedia.

My favorite historical resources

I recently gave a couple of classes on researching historical fiction, and even though I’ve posted about (read: “totally geeked out about”) some of these resources before, I wanted to put them all in one place (especially if I missed emailing them to anyone who wanted them!). So, here are some of the sources I use when researching historical fiction:

Secondary sources (written after the fact, by someone who was not there, often a historian): “Daily Life in…” type books for an overview of the time period, to get a big picture understanding to help put primary sources in context, and to mine the bibliography for other books with more specific details, like ghost stories from rural Pennsylvania, food in Edo Japan, or early French heraldry. Interlibrary loan is your friend when looking for obscure secondary sources–for the cost of shipping the books via library mail (about $3.50 last time I used it), most public libraries in the U.S. will help you check out books from other libraries all over the country.

Primary sources (written by someone who was there–an eyewitness): Old diaries and letters, legal documents, newspapers, etc. Some of them are available for free through Kindle, Project Gutenberg, Google Books, etc., but there are also databases that will point you to primary sources online, such as:

Other cool, random history-related sites:

  • Google Ngram Viewer, where you can find out whether an English word was used (in print) and how common it was in a given historical time period: https://books.google.com/ngrams
  • Online Etymology Dictionary, where you can find out what a word actually meant historically (it can change a lot!), as well as when it was in use: http://www.etymonline.com/
  • Historical maps: http://www.oldmapsonline.org
  • The Met museum’s searchable database of their amazing collection of historical objects, including weapons, jewelry, musical instruments, and clothing (The dresses! The beautiful, beautiful dresses!): http://www.metmuseum.org/art/collection

That’s really just a start, but these are good places to begin. There are lots of web sites run by local history societies, re-enactors, and other authors/history buffs that are full of good information too, as long as you remember to read everything online with a skeptical eye. Happy researching!

How to make historians crazy in three easy steps

Pop quiz! What’s the oldest university in Europe? The official answer is the University of Bologna in Italy (dating to the 11th century), though the University of Paris and Oxford have reasonable claims to be as old or older. They pale in comparison to the Guinness Book of World Records’ pick for the oldest operating university: the University of Al-Karaouine in Morocco, founded in the ninth century by a woman, Fatima al-Fihri (how cool is that!?).

But one of the first centers for higher education in Europe was Cor Tewdws, or the College of Theodosius, founded in Roman Britain (modern Llantwit Major in Glamorgan, Wales) during the fourth century. It survived the collapse of Rome, and though it was destroyed in turn by the Irish (known then as Scots), the Vikings, and the Normans, it was rebuilt each time and functioned until the sixteenth century, when Henry VIII dissolved it because it was also a monastery.

What does this have to do with insanity in historians? I’ve been researching Cor Tewdws for my current work in progress, and I’m feeling like this:

sokkafacepalm

So, here are three things that will have historians crying, drinking, or pounding their head against the wall:

1-Tell them about some amazing historical event, but provide no details or proof. That’s why post-Roman Britain is a fascinating time period, and a completely frustrating one. It gave us none other than King Arthur–one of the most influential figures in Western legend and literature–but provides nothing more than rumors to suggest he actually lived. Part of the problem is that people don’t often bother to write things down when their society is literally burning to the ground, and what was written rarely survived. There’s just enough to tantalize–vague references, later legends–so historians fight like starving dogs over the little scraps of information and spin them to fit their pet theories. King Arthur was a Sarmatian war leader? A Celtic god? An alien? Sure, depending on how you look at it.

Stargate-FacePalm

2-The scarce evidence is a good start, but it may not be enough to bring on a full mental breakdown. The next thing you have to do is make sure the evidence is impossible to make sense of. Take the case of Cor Tewdws. We’re told that when the Irish first raided the college in the mid-400s, they captured none other than the illustrious Patrick, taking him back as a slave and thus beginning his epic journey to sainthood (you didn’t know St. Patrick was Welsh, did you?). That’s awesome. Then, a generation or so later, St. Illtud (who was one of Arthur’s knights before becoming a monk, but then, apparently everyone for a two-century span was one of King Arthur’s knights, which lends credence to the god/alien theories) re-established the college. His young students included notables like St. David (patron saint of Wales), Gildas the historian, and St. Patrick. Wait … what!?

Calvin Facepalm

3-Whew, okay, your historian survived that one too? He or she is crying a little, but pressing onward? Time for a spitball: forgery. For example, many modern interpretations of early Welsh history are based on the work of the nineteenth-century Welsh historian and bard Iolo Morganwg (AKA Edward Williams), who compiled ancient Welsh documents, inscriptions, and legends into one handy source. The problem? He made crap up. And because we’ve since lost some of the original documents he had access to, we have no idea which parts he made up and which he didn’t. Maybe we shouldn’t trust historians who use a bardic pseudonym (especially one so close to YOLO). Now, if it was just Iolo/Edward who did this, it might not be so crushing, but the kicker is that pretty much every medieval historian did the same thing. They were all big, fat liars. Oh, sure, they had good intentions, wanting to get people excited about their national heritage, but we all know where good intentions lead.

Triple-facepalm

So, the sum total is that we have no idea what really happened 1500 years ago. Honestly, there’s a lot we don’t know about 100 years ago, and it’s miraculous that we know anything about the “Dark Ages.” We’re basing our guesses on fragments of evidence written down after the fact, often contradicting each other, and in all likelihood at least partly made up. It would be like historians in the future trying to learn about World War II solely by watching Indiana Jones. You’d get that there were Nazis and that they lived in Germany and were bad, but that’s about it. And you’d go crazy looking for that warehouse with the Ark of the Covenant.

Indy facepalm

My new can’t-write-without-it writing tool

Historical fiction writers and word geeks, may I introduce you to your new best friend: the Online Etymology Dictionary. Etymology is the study of the history of words, or, as the Online Etymology Dictionary defines it, the “facts of the origin and development of a word.” This is not to be confused with entomology, the study of insects. 😉

Few things ruin historical fiction faster than words or phrases that don’t fit the time period (imagine a dashing Regency hero with an immaculately tied cravat telling someone to “chill out”–and the spell is broken), and that’s where this dictionary comes in handy. It draws on the immense and expensive Oxford English Dictionary but is much more accessible (and free!). For instance, it tells me that my characters could go on a picnic anytime after the year 1748 (the first time the word was used in English), but are not likely to do so until 1800 or after (when the word became common). They can collect knick-knacks starting in the 1570s, hobnob after 1763, and while they can “make over” an old dress starting in the 1590s, they cannot get a “makeover” until 1981.

Google Ngram Viewer also helps writers pinpoint historical word usage by telling us when and how frequently a word was used in historical books. The great thing about the Online Etymology Dictionary, though, is that it explains how a word evolved over time. For instance, take the word mess. We use it often in modern English. Its original meaning was “a portion of food” (as in the Biblical/proverbial “a mess of pottage”). It came to mean a group of people eating together, so that by the 1530s it was used in the sense of “mess hall” for a place to dine. By 1738, it could mean food mixed all together. Then by 1828, it evolved to mean any kind of a jumble, and by 1834 it also meant “a state of confusion.” Now it’s the word we recognize, but you still have to be careful about idioms. “Make a mess” was first used in 1853, “mess with” is from 1903, “make a mess of [something]” from 1909, and “mess up” from 1933. Also, several of these started as Americanisms, so they may not have caught on quickly across the pond.

This means my Elizabethan characters will probably not use the word “mess” (because the only way they would understand it is a meaning that would confuse my modern readers). My Victorian characters can say, “What a mess” or “messy,” and they can “make a mess,” but they won’t “mess up.”

I admit this can get nitpicky (c.1962), and even the really sharp, “voice-y” historical fiction writers like Georgette Heyer occasionally let a modernism slip into their books without crumbling the facade of the world they’ve created, but I work on the theory that the fewer modern allusions in a book to pull readers out of the story, the better. On the other hand, if we use too much archaic language, all but the most stalwart (late 14th century) or hardcore (1951) readers will probably have trouble getting immersed in the story and drift away, so it’s a tightrope walk and depends on our intended audience. Still, if nothing else, the Online Etymology Dictionary will satisfy word lovers’ urges to geek out for a while.

Using Google Ngram Viewer for historical fiction and historical fantasy

Google Ngram Viewer is one of the writing tools I turn to often when writing a story with a historical setting. This tool searches Google’s vast collection of digitized books (5 million plus) for the words or phrases you enter and graphs how frequently they appear in print over time. Since it’s searching printed books, it’s pretty sparse for the Renaissance or Early Modern era, but if you’re into the Regency and Victorian periods, it’s a great help. It also has some options for foreign-language books.

For instance, during the Victorian period, the cravat of Regency fame evolved into a close relative of our modern necktie. When I’m describing my male character’s clothing, though, I don’t want to say necktie, because that will give my modern readers the wrong mental image (especially if they picture colorful modern ties). On the other hand, I don’t want to call it a cravat if that’s not what the Victorians would have said. So, I went to Google’s Ngram Viewer:

https://books.google.com/ngrams

I entered “cravat” and “necktie,” separated by commas, so they would appear on the same graph. It told me the word “necktie” first appeared in print in the late 1850s, but “cravat” continued to dominate literature until after 1900.
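If you want to pull the same numbers into a spreadsheet or your notes instead of eyeballing the graph, here’s a minimal Python sketch. Fair warning: the JSON endpoint, the “en-2019” corpus name, and the response keys it relies on are my assumptions about how the viewer currently behaves behind the scenes; Google doesn’t document them, and they could change without notice.

    import requests

    # A sketch, not an official API: books.google.com/ngrams/json is an
    # undocumented endpoint (assumed here), and the corpus name and response
    # keys are assumptions based on how the web viewer behaves.
    params = {
        "content": "cravat,necktie",  # comma-separated terms, same as the web form
        "year_start": 1800,
        "year_end": 1920,
        "corpus": "en-2019",          # assumed name of the default English corpus
        "smoothing": 3,
    }
    resp = requests.get("https://books.google.com/ngrams/json", params=params)
    resp.raise_for_status()
    for series in resp.json():
        # Each entry is assumed to hold the term ("ngram") and one frequency
        # value per year ("timeseries"); find the year where each term peaks.
        values = series["timeseries"]
        peak_year = params["year_start"] + values.index(max(values))
        print(series["ngram"], "peaks around", peak_year)

If the endpoint ever stops cooperating, pasting the same comma-separated query into the web form gives you the identical graph to read by hand.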

There are some things to be aware of with Google Ngram Viewer. “Cravat” still appears frequently in modern books, almost as often as “necktie.” Why? Because we love Regency and Victorian novels. Not many people would say cravat now–most of us would even say tie instead of necktie–but the word still appears in print because of historical fiction. Still, given the dominance of “cravat” in literature through the end of the Victorian period, I feel pretty safe assuming it was still being used to refer to contemporary Victorian fashion. (Also, it was the word I wanted to use to keep my historical flavor, so I’m prejudiced in its favor.)

Another thing to remember with Google Ngram Viewer is it doesn’t understand the evolution of a word’s meaning; it just tells you if a word was used in print. So, the word “lover” appears more frequently in Victorian literature than in modern, according to the Ngram Viewer. Don’t let this overthrow your ideas of Victorian propriety–they used “lover” to mean a suitor or romantic interest, not necessarily to imply physical intimacy as it would today. In Pride and Prejudice, when Mr. Bennet says Wickham “… simpers, and smirks, and makes love to us all,” he certainly doesn’t mean it in the modern sense! This is where a good dictionary that includes archaic meanings comes in handy for the writer, and where readers might need context clues to make sure they understand how you’re using the word (and some words you just can’t use–they’ve acquired too much baggage over the years).

Google Ngram Viewer won’t solve all your historical word choice dilemmas, but it can help you determine if a word is appropriate for setting the right tone for your historical fiction or fantasy. It’s also a fun way to waste time when you’re supposed to be writing, and you can call it research.

Time to switch brains

I’m moving now from polishing my Victorian ghost story, Within the Sickle’s Compass: or, The Haunting of Springett Hall, to revising my still-unnamed Elizabethan novel. I feel like I need to whack my head a couple of times to clear out the Victorian stuff and make way for a totally different mindset. The Elizabethans were rougher around the edges, but I can’t help thinking they had a bit more fun. After all, kissing was totally verboten with proper Victorians, but the Elizabethans kissed people hello and goodbye and for everything in-between. Of course, the Elizabethans also had to deal with the plague. And maybe there’s a connection there.