On the National Day of the Horse, horse lovers like ourselves like to consider America’s relationship with horses and our shared history. It’s common knowledge that European colonists brought horses over to America during the 15th and 16th centuries to trade with the Native Americans, hence the Thanksgiving association. While this is true, the relationship isn’t as straightforward as that; it’s a complicated one.
A 2012 study found that the wild ancestor of the modern domestic horse likely originated around 160,000 years ago in Eurasia. The scientists determined that horses were first domesticated roughly 6,000 years ago somewhere in the Eurasian Steppe. Another study, published in 2017, found that all modern horses descend from two distinct lines: the Arabian horse and the now-extinct Turkoman horse (which was similar to the Akhal-Teke breed).
Classic Arabian
Turkoman/Akhal-Teke
Horses spread around the world via trade, war, gifting, theft, and more. People began to selectively breed for desirable characteristics, such as speed, strength, and stamina, to suit the work they required of their horses. While people kept track of their horses’ lineage and traits for centuries, studbooks to maintain an official pedigree record didn’t come about until the 1700s. From this arose the multitude of breeds and types of horses we know today.
Just 19 of the Breeds
The Breeds of Livestock resource from Oklahoma State University lists 217 separate breeds of horses from the Abyssinian to the Zhemaichu. Meanwhile, “The Encyclopedia of the Horse” by Elwyn Hartley Edwards lists just over 150 breeds of horses, including many ancient breeds that no longer exist but are the ancestors of many breeds today. A study by the Swedish University of Agricultural Sciences references 784 horse breeds in the United Nations Food and Agriculture database, but most equine experts recognize approximately 200 horse breeds.
Although horses hadn’t been roaming the American plains in the years leading up to their reintroduction by Europeans, horses have a much longer relationship with America than previously thought.
Paleontology
Think millions of years, overlapping in time with the mighty woolly mammoth. Around 10,000 years ago, some of these wild horses crossed over the Bering land bridge that connected early America and Asia. The earliest bridles for horses were found in Eastern Europe dating back to 4000 BC, showing that people there had begun to domesticate wild horses around this time, using them for hunting, carrying packs, and working the fields. The ancient wild horses that stayed in America became extinct, but their descendants were introduced back to the American land by the European colonists many years later.
Pleistocene Era Horse
Columbus’ second voyage was the starting point for the re-introduction, bringing Iberian horses to modern-day Mexico. Some of the Iberian horses escaped European control and became wild, the ancestors of the mustangs in the Western United States today. The first horses brought over were smaller breeds, due to the size constraints of the smaller ships of the time, but as time went on, larger horses such as draft horses were also imported.
Navajo Bridle
After Columbus’ re-introduction, horses spread across the continent and many Native American societies developed their cultures around them. This is where problems emerge, because although they were once native to America thousands of years ago, horses are still technically a recently introduced species on the American plains. With few predators and a near-perfect habitat, wild horses flourished and quickly became a symbol of the West. However, their populations grew too quickly, and they began to compete with farmers for the natural resources the land held.
The Bureau of Land Management currently protects the wild herds, but it has to manage the population via sterilization techniques and round-ups, a practice that remains deeply controversial among animal rights activists.
Wild mare with foal and yearling
BLM Round-Up
The horses we see today are all examples of selective breeding by humans over the years (with the exception of the pure desert-bred Arabian), but they’re also a shared part of our mixed Native and European histories. Horses allowed humans to travel farther and faster, proved instrumental to armies in battle, and helped develop the country through labor-intensive agriculture.
There are currently around 9.2 million horses in the country, consisting of many breeds such as American Quarter Horses, Paints, Appaloosas, Missouri Fox Trotters, and rarer breeds such as the Shire, Lipizzan, Gotland, Caspian and Colonial Spanish Mustangs.
Classic Quarter Horse
Classic Paint
Classic Appaloosa
Missouri Foxtrotter
Dappled Bay Shire
Classic Lipizzaner
Gotland
Caspian
Oscar (above) is one of four horses who starred in Disney’s Hidalgo, the epic adventure movie based on the true story of Frank T. Hopkins and his Spanish mustang stallion, Hidalgo. Oscar is APHA registered as Impressively Better, born in 1991.
Most of the breeds in “The Encyclopedia of the Horse” are horses with existing registries that can trace bloodlines to ensure purity. In general, the number of horse breed registries is increasing as equine lovers recognize the need to compile data about rare and endangered breeds and types of horses.
Personally, I believe they knew the attack was coming, but the Military Industrial Complex and the warmongers in our government wanted to get into the war, so they allowed it to happen. Of course, there is no proof of that, but… there’s not much proof for a LOT of things!!!
National Pearl Harbor Remembrance Day, also referred to as Pearl Harbor Remembrance Day or Pearl Harbor Day, is observed annually in the United States on December 7 to remember and honor the 2,403 Americans who were killed in the Japanese surprise attack in Hawaii on December 7, 1941. The attack led the United States to declare war on Japan the next day, thus entering World War II.
On Sunday morning, December 7, 1941, the Imperial Japanese Navy Air Service attacked the neutral United States at Naval Station Pearl Harbor near Honolulu, Hawaii, killing 2,403 Americans and injuring 1,178 others. The attack sank four U.S. Navy battleships and damaged four others. It also damaged three cruisers, three destroyers, and one minelayer. Aircraft losses were 188 destroyed and 159 damaged.
Canada declared war on Japan within hours of the attack on Pearl Harbor, the first Western nation to do so. On December 8, the United States declared war on Japan and entered World War II on the side of the Allies. In a speech to Congress, President Franklin D. Roosevelt called the bombing of Pearl Harbor “a date which will live in infamy.”
There are a number of naval memorials around the US in honor of those who died at Pearl Harbor. The most well known and highly publicized is the USS Arizona Memorial.
USS ARIZONA
The USS Arizona Memorial in Pearl Harbor is built over the sunken battleship USS Arizona and was dedicated on May 30, 1962 (Memorial Day), in honor of the 1,177 crew members who were killed. The memorial remembers all military personnel who were killed in the Pearl Harbor attack. Note: This site is open to the public, with boat tours to the memorial provided by the US Navy from the visitor center.
Pearl Harbor Memorial
USS OKLAHOMA
In the first ten minutes of the battle, eight torpedoes hit the USS Oklahoma, and she began to capsize. A ninth torpedo hit her as she sank into the mud. Fourteen Marines and 415 sailors gave their lives. Thirty-two men were cut out through the hull, while others remained trapped beneath the waterline. Banging could be heard for over three days, and then there was silence.
USS Oklahoma Hit
In 1943, the Oklahoma was righted and salvaged. Unlike most of the other battleships that were recovered following Pearl Harbor, the Oklahoma was too damaged to return to duty. Her wreck was eventually stripped of her remaining armament and superstructure before being sold for scrap in 1946. The hull sank in a storm in 1947 while being towed from Oahu, Hawaii, to a breaker’s yard in San Francisco Bay.
USS Oklahoma Memorial at Pearl Harbor
USS UTAH
The USS Utah Memorial honors a former battleship that had been converted to a target ship in 1931 (and thus, at the time of the Pearl Harbor attack, carried the designation AG-16) and was sunk in the attack on December 7, 1941. A memorial honoring the crew, including the 58 who died aboard USS Utah, was dedicated on the northwest shore of Ford Island, near the ship’s wreck, in 1972. The ship, along with USS Arizona, was added to the National Register of Historic Places and declared a National Historic Landmark in 1989.
Ford Island
USS Utah Sinking
USS BOWFIN
The USS Bowfin Submarine Museum and Park is in remembrance of an American submarine that sank 44 ships in World War II. This site is adjacent to the USS Arizona Memorial Visitors Center.
The submarine is owned and operated by the Pacific Fleet Submarine Memorial Association, and is now part of the USS Bowfin Submarine Museum and Park in Pearl Harbor, on the island of Oahu, Hawaii. Visitors can tour the submarine with an audio narration of life in the vessel during World War II. The park’s museum features exhibits and artifacts about submarines and the history of the United States Submarine Service, including detailed models, weapon systems, photographs, paintings, battle flags, recruiting posters, and a memorial honoring the 52 American submarines and the more than 3,500 submariners lost during World War II.
The museum’s other exhibits include a torpedo and a 40-mm quad gun, along with Poseidon C-3 and Regulus I missiles. The park is located within walking distance of the visitor center for the USS Arizona Memorial, and it is across the harbor from the Battleship Missouri Memorial.
USS MISSOURI
While operating with the carriers on 11 April 1945, the USS Missouri came under attack from a kamikaze that struck the side of the vessel below the main deck. The impact shattered the aircraft, throwing gasoline on the deck that rapidly ignited, but the fire was quickly suppressed by her crew. The attack caused superficial damage and the battleship remained on station.
Top left of center you can see the Kamikaze
Two crewmen were wounded on 17 April when another kamikaze clipped the stern crane and crashed in the ship’s wake. Missouri left Task Force 58 on 5 May to return to Ulithi; in the course of her operations off Okinawa, she claimed five aircraft shot down and another probable kill, along with partial credit for another six aircraft destroyed.
On 21 August, Missouri sent a contingent of 200 officers and men to Iowa, which was to debark a landing party in Tokyo to begin the process of demilitarizing Japan. Two days later, Captain Stuart Murray, her commanding officer, was informed that Missouri would host the surrender ceremony, with the date scheduled for 31 August. The ship’s crew immediately began preparations for the event, including cleaning and painting the vessel. Missouri began the approach to Tokyo Bay on 27 August, guided by the Japanese destroyer Hatsuzakura. That night, the ships stopped at Kamakura, where a courier brought the flag that Commodore Matthew Perry had flown during his expedition to open Japan in 1853; the flag was to be displayed during the surrender ceremony. The flotilla then entered Tokyo Bay on 29 August, and Missouri was anchored close to where Perry had anchored his own vessels some ninety-two years earlier. Poor weather delayed the ceremony until 2 September.
Japanese foreign affairs minister Mamoru Shigemitsu signs the Japanese Instrument of Surrender aboard the USS Missouri as General Richard K. Sutherland watches, 2 September 1945
Allied sailors and officers watch General of the Army Douglas MacArthur sign documents during the surrender ceremony aboard Missouri on 2 September 1945. The unconditional surrender of the Japanese to the Allies officially ended the Second World War.
In 1990, leading up to the 50th anniversary of the attack on Pearl Harbor, Congress established the Pearl Harbor Commemorative Medal. Also known as the Pearl Harbor Survivor’s Medal, it was awarded to any member of the U.S. Armed Forces who was present in Hawaii on December 7, 1941, and participated in combat operations against the attack that day. The medal was also awarded to civilians who were killed or injured in the attack. A few years later, Congress amended the law to allow any person who was present in Hawaii on December 7, 1941, and was involved in combat operations against the Japanese military forces attacking Hawaii, to receive the award. In both instances, there was a limited time period to apply for the award, and it is no longer issued.
Pearl Harbor Survivor Medal
The battleships West Virginia and Tennessee burning after the Japanese attack on Pearl Harbor, on December 7, 1941.
Oil burns on the waters of Pearl Harbor, near the naval air station, after the Japanese attack on Pearl Harbor on December 7, 1941.
The battleship USS Arizona belches smoke as it topples over into the sea during a Japanese surprise attack on Pearl Harbor, Hawaii.
No country has a closer association with the language of Olde Englande than the USA. From the days of the first Puritan settlers to recent cross-Atlantic tweetings, the two countries have shared in the development of English.
Many words and phrases used in the USA have retained their Elizabethan English meanings and pronunciations, which have long since disappeared in England itself. There are also many American phrases which are used in the USA but haven’t been adopted anywhere else. Examples of this are:
BLUE PLATE SPECIAL: Webster’s Dictionary defines ‘blue plate’ as a restaurant dinner plate divided into compartments for serving several kinds of food as a single order and a main course (as of meat and vegetable) served as a single menu item.
One early citation of the phrase is in this advert for the Young Women’s Christian Association, printed in the Illinois newspaper The Decatur Daily Review, September 1924. However, it is believed that the term blue plate special first appeared on menus of the Fred Harvey chain of restaurants in 1892. These were located at stations along the Atchison, Topeka and Santa Fe Railroad. The blue plate special was designed to allow passengers to grab a quick bite to eat when the train stopped.
LEAD-PIPE CINCH: The ‘cinch’ that this expression derives from is the Spanish/Mexican word for a horse’s saddle-girth – cincha. The word is recorded in English, as ‘synch’ and later ‘cinch’ in various Canadian and US sources from the 1860s onward. From the 1880s the use was extended into a verb form and things which were tightly secured were said to be ‘cinched’ – for example, this piece from The Manitoba Daily Free Press, December 1882: “The next movement was to throw the bull, and then cinch a lasso and rope tightly around his body.”
The word cinch was also used in the USA as the name of sturdy fixing brackets, which were secure and unlikely to come loose.
The figurative use of cinch, meaning to tie-up or make certain, in non-animal contexts began around the same time. The usage was often in contexts where the rich and powerful used their status to form monopolies or indulge in insider trading in order to cheat the general public. An example of this comes from the Illinois newspaper The Morning Review, December 1889: “The briber and bribed would sit down to a game of poker and a lead-pipe cinch was nothing to the sure thing the legislators had.”
The common usage of ‘cinch’ now, that is, to mean ‘easy’ rather than ‘secure’, comes from this ‘easy money’ association. In October 1891, The Daily Morning Republican listed a number of ‘cinch’ superlatives to describe a punter’s certainty that his horse Firenzo would win the next day:
“The track will be heavy tomorrow, and I’ve got a copper riveted, lead pipe, copyrighted, air tight cinch. Firenzo in the mud – she swims in it.”
EIGHTY-SIXED: The term is American and originated in the restaurant trade. Whether it refers to a menu item that is unavailable or to a customer who is refused service, the meaning loosely comes down to something that was previously okay becoming not okay. The earliest known example of the expression in print is found in the journal of the American Dialect Society, American Speech, 1936: “Eighty-six, item on the menu not on hand.”
The actual origin is uncertain but is often suggested to be one of these: (1) Chumley’s Bar and restaurant at 86 Bedford Street in Greenwich Village NYC; (2) a reference to article 86 of the New York state liquor code which defines when bar patrons should be refused service; and (3) from Delmonico’s Restaurant in New York City. Item number 86 on their menu, their house steak, was often unavailable during the restaurant’s early years.
PRESTO CHANGO: Presto chango, used primarily in the USA, is a variant of the earlier exclamation ‘hey presto.’ Before either expression was coined, conjurers and other stage performers simply said ‘presto!’ to draw attention to the culmination of a trick.
Presto is an Italian word meaning ‘quickly’ and it was used in England with that sense from the 13th century. “Hey Presto” began being used in England in the 18th century. The English writer Henry Fielding used it in 1732 in his farce The Lottery.
Moving forward to the 19th century, ‘presto chango’ began being used in the USA. It took various spellings – ‘presto change’, ‘presto changeo’ and ‘presto chango’. ‘Presto! change’ is recorded in England in 1824, and it soon migrated to the USA and became ‘presto chango.’ One early US example can be found in the Pensacola Gazette & West Florida Advertiser, April 1824: “A tailor cannot drop his bodkin, a brick mason his trowel, or a grocer his cent per cent on coffee and candles; and become my Lord Coke or Hale by a presto change.” Another was in the Ohio newspaper The Huron Reflector, February 1844: “Hey! presto! chango! as the juggler says – Kitty Grimes was not to be married to James Duncan after all.” Although ‘presto change’ was first used in the UK, the ‘presto chango’ form can be said to be American – in fact, few people outside the USA would know what it meant.
Considering the debacle of an election we just experienced, I thought the following words were appropriate!!!!
CHEAT: Under medieval law a title to real estate could lapse in many ways. Property affected by such a lapse was called an “escheat” and became forfeit to the king. These cases were so numerous that some rulers employed escheators to look after their interests. Usually working on a commission basis, these fellows seized property at every opportunity. If they didn’t violate laws, they certainly trifled with justice. Because of the questionable practices of these royal agents, it became customary to call any dishonest person a cheat.
Cheater Leader in the House
CON MAN: Hard times following the Civil War forced criminals to resort to all sorts of tricks to gain relatively small amounts of money. One of the most common was the sale of fraudulent mining stock. Investors were reluctant to advance funds without examining property, so swindlers adopted the practice of asking a victim to make a small deposit “just as a gesture of confidence.” The full amount was to be paid only after a trip to the West on the part of the purchaser.
Con-Man in the Senate
A swindler would take the money advanced and decamp. This type of trick became known as the “confidence game” because it worked only if the victim had confidence in the proposal. Anyone who practiced confidence games came to be called a con man. This title was applied to many types of swindlers and is now used to describe a shrewd thief who finds suckers [voters] by means of the Internet or e-mail.
FEET OF CLAY: Nebuchadnezzar II was the Babylonian king who captured Jerusalem in 587 BC, destroyed the city, and took the Hebrew people into captivity, ending the Judean kingdom. The book of Daniel tells how the young Hebrew captive explained one of the king’s strange dreams. Nebuchadnezzar had seen a giant image with a golden head, silver arms and breast, brass thighs, and iron legs. Every part was metal except the feet, which were compounded partly of iron and partly of potter’s clay.
Daniel said that its feet made the metal figure vulnerable, meaning that Babylon would be broken into pieces. Impressed by this dramatic story, English readers of the Bible seized upon the weak spot of the strange figure as a symbol of weakness in general. Today, any noted person with a vulnerable point is still said to have feet of clay.
Feet of Clay Crenshaw
KANGAROO COURT: When the English explorer Capt. James Cook returned from Australia in 1771, he was branded a liar. People disbelieved his reports of a strange animal that hopped about on two legs and stood as high as a man, which he reported the natives called a “kangaroo.” Many who heard his accounts doubted their truth and there was great joking about kangaroos.
When a few specimens were brought to Europe, they created a sensation. Anything marvelous or unusual was likely to be termed “kangaroo.” For example, an 1835 issue of the Gentleman’s Magazine described an eccentric horseman as holding his reins with “kangaroo attitude.” Settlers in the New World used the word to stand for any type of irregular gathering. During Reconstruction following the Civil War, a “kangaroo convention” held in Virginia made national headlines.
Criminals who adopted the odd word applied it to a “court” held by inmates of prisons. In such a proceeding, old-timers charged newcomers with such offenses as breaking into jail or being lousy and trying to scratch. Influenced by the prominence of irregular political gatherings, any extra-legal sham hearing came to be known as a kangaroo court.
SMARMY: “Smarmy,” according to the Oxford English Dictionary, dates back to 1909 as an adjectival form of the word “smarm” or “smalm” which had been around for 100 years. Originally just a verb for smoothing, especially of hair, its meaning gradually moved to include the implication of a real smoothie. If you describe someone as smarmy, you dislike them because they are unpleasantly polite and flattering, usually because they want you to like them or to do something for them.
You place the dome in your hand, turn it over, and beautifully, magically, the item inside is engulfed in a swirling slow-motion blizzard. Everyone can relate to them – they evoke a childhood memory or the nostalgia of a simpler time. The first mention of a snow globe was one featuring a man with an umbrella, displayed at the Paris Exposition of 1878. At the 1889 Exposition, globes containing a model of the newly built Eiffel Tower are believed to have been created to commemorate the Tower’s inauguration.
This extremely rare Louis Vuitton Eiffel Tower dome, made of luggage, is a whimsical example that sold for $995 in 2017.
A few years later, a Viennese man, Erwin Perzy, hit on the same idea while researching a way to improve operating-room lighting. He used a glass globe filled with water, hoping to create a magnifying lens by increasing refraction. To enhance the reflected light, Perzy put ground glass in the water. When it quickly sank, he tried semolina, which drifted slowly to the bottom of the globe. It did nothing to improve the light quality, but the snowfall effect inspired him to make his first snow globe: he carved a small house and inserted it into the globe.
Erwin Perzy
Facsimile of Perzy’s first globe used in Citizen Kane
Mass production began in the US during the 1920s. Joseph Garaja of Pittsburgh was granted a patent in 1927 that altered how snow globes were made: his method required the spheres to be assembled underwater, eliminating any trapped air. This ingenious method made it possible for the industry to mass-produce globes, which drastically lowered their prices.
However, by the early 1960s, glass snow globes had been overtaken by plastic globes made in Hong Kong. It was soon discovered that the water in their spheres was filthy, drawn directly from the port. As a result, one Hong Kong snow globe producer got into significant trouble, and its globes were temporarily barred from entering the United States.
The “snow” in snow globes has a fascinating backstory as well. Snow was previously created in glass domes using tiny porcelain pieces, bone chips, or ground rice. Camphor wax, as well as meerschaum, was also used to make these snowflakes. Today, most “snow” is tiny particles or shards of white plastic. Also, the liquid hasn’t always been water; at one point, light oil was used. In addition, glycol (antifreeze) was added to help with the problem of freezing during winter shipping.
The snow globe fell out of favor in the 1970s, when it epitomized kitsch, but it has since evolved into something more sophisticated, intricate, and valued among designers and collectors. Novelty gift manufacturers have upgraded the designs and components, making them unique gift items that often include beautifully modelled landscapes.
Some incorporate lights, music, and motors, eliminating the need for shaking. Many high-end department stores introduce a custom design every year to commemorate the Christmas season.
Snow globes have become an increasingly popular collectible, for both antique and novelty globes. Actor Corbin Bernsen may be the most prolific collector, with about 8,000 – he began collecting snow globes in the ’80s. “There’s something that happens to a collector, this internal voice that says, ‘I want to have one of each that is in existence,’” Bernsen says.
Corbin Bernsen
Originally the globes were made of glass, and the figures inside were made of porcelain, bone, metal, minerals, rubber, or wax. The snow, or “flitter” as it’s called, could have been ground rice, wax, soap, sand, bone fragments, meerschaum, metal flakes, or sawdust. Producers tried everything. The base was either round or square and might have been stone, marble, ceramic, or wood. Some are quite bizarre!!!
“Snow domes are not only fascinating to look at, to hold, to play with, they are folk art,” says collector Nancy McMichael, author of Snowdomes (Abbeville Press). “They are a bridge back to an idealized past we think existed but is actually in our head. It is something we carry with us.”
Two hundred years after George Washington issued the first presidential proclamation of a day of public thanksgiving, President George H.W. Bush stepped before reporters, 30 schoolchildren and one antsy 50-pound turkey in the White House Rose Garden on November 17, 1989. The public presentation of a plump gobbler to the chief executive in the lead-up to Thanksgiving had been a time-honored photo op since the 1940s, but Bush would add a new presidential tradition of his own. After noting that the turkey appeared “understandably nervous,” Bush added: “Let me assure you, and this fine tom turkey, that he will not end up on anyone’s dinner table, not this guy. He’s granted a presidential pardon as of right now.”
Decades later, the presidential turkey pardon remains an annual Thanksgiving ritual. However, while Bush formalized the fowl tradition, he may not have been the first president to issue a stay of execution to a turkey. A story is told that while Abraham Lincoln occupied the White House, his young son Tad grew so close to a turkey destined for Christmas dinner that he named him Jack and led him around on a leash like a pet. Listening to Tad’s pleas to spare the turkey from his culinary fate, the Great Emancipator granted a reprieve and freed the bird.
A decade later during the administration of President Ulysses S. Grant, Rhode Island poultry dealer Horace Vose began to send plump turkeys to the White House for Thanksgiving dinners. Although a staunch Republican, Vose was non-partisan when it came to turkeys. He sent birds to presidents of both parties until his death in 1913. Beginning in 1946, a pair of poultry industry groups—the National Turkey Federation and the Poultry and Egg National Board—assumed the duties of presenting presidents with turkeys for the holidays. That year, the groups delivered a 42-pound Texas tom to President Harry Truman for Christmas.
While Truman began the ritual of appearing with the gift turkeys in staged photo ops, he is erroneously credited with starting the presidential pardon tradition. The misinformation is so prevalent that the Truman Library has issued a statement on its web site that its staff “has found no documents, speeches, newspaper clippings, photographs, or other contemporary records in our holdings which refer to Truman pardoning a turkey that he received as a gift in 1947, or at any other time during his Presidency.”
In fact, not only did the turkeys given to Truman and some of his successors fail to receive clemency, they suffered a much different fate by ending up on the presidential dinner table. In 1948 Truman told reporters that the turkeys given to him “would come in handy” for the 25 people expected for dinner at his Independence, Missouri, home that Christmas. Ten days before Thanksgiving in 1953, National Turkey Federation president Roscoe Hill presented a live 39-pound turkey to President Dwight Eisenhower, who hoped Hill would kill, freeze and return the gobbler to the White House “in plenty of time because I hope to spend Thanksgiving with my youngsters and I want to take him along.”
A president finally took pity on a gifted bird in 1963 when John F. Kennedy spared the life of a mammoth 55-pound white turkey wearing a sign around its neck—clearly not of its own volition—that read “Good Eating, Mr. President!” “We’ll just let this one grow,” Kennedy said with a grin. “It’s our Thanksgiving present to him.” As the president left the Rose Garden on November 19, 1963, the turkey prepared for its return to a California farm while Kennedy finalized preparations for his fateful trip to Dallas three days later.
Although newspapers in 1963 reported that “Merciful President Pardons Turkey,” the first president to actually use the word “pardon” at the National Thanksgiving Turkey Presentation may have been Ronald Reagan, albeit as a quip. During the throes of the Iran-Contra scandal in 1987, Reagan sidestepped reporters’ questions about whether he planned to pardon any of his aides accused of wrongdoing. When then asked about the fate of the 55-pound turkey he was just given, Reagan joked, “I’ll pardon him.”
Although the National Thanksgiving Turkey and its alternate (sent in case the primary turkey can’t fulfill its duties—mainly, staying alive to make it to the presentation ceremony) now receive stays of execution, their remaining days do not last too long. The skeletons and organs of turkeys bred for consumption are incapable of supporting extreme weights, and most of the reprieved turkeys die prematurely within the following year.
(I went in search of what the Pilgrims ate at the first Thanksgiving and came across this article by Mark Fleming at the newengland.com website.)
The Thanksgiving meal is remarkably consistent in its elements: the turkey, the stuffing, the sweet potatoes, the cranberry sauce. Barring ethical, health, or religious objections, it is pretty much the same meal for everyone, around the country, and through the years of their lives. We stick with the basics and simply change the seasonings.
But what about that first Thanksgiving in the fall of 1621 (historians don’t know the exact date, but place it sometime between September 21 and November 9), when British settlers hosted the first documented harvest celebration? What did they eat at the first Thanksgiving, and how similar is it to the traditional American Thanksgiving meal today?
Here’s how Edward Winslow described the first Thanksgiving feast in a letter to a friend:
“Our harvest being gotten in, our governor sent four men on fowling, that so we might after a special manner rejoice together after we had gathered the fruits of our labor. They four in one day killed as much fowl as, with a little help beside, served the company almost a week. At which time, amongst other recreations, we exercised our arms, many of the Indians coming amongst us, and among the rest their greatest king Massasoit, with some ninety men, whom for three days we entertained and feasted, and they went out and killed five deer, which we brought to the plantation and bestowed on our governor, and upon the captain and others. And although it be not always so plentiful as it was at this time with us, yet by the goodness of God, we are so far from want that we often wish you partakers of our plenty.”
What They (Likely) Did Have at the First Thanksgiving
Venison
Fowl (geese and duck)
Corn
Nuts (walnuts, chestnuts, beechnuts)
Shellfish
So venison was a major ingredient, as was fowl, which likely included geese and ducks. Turkey is a possibility, but it was not a common food at that time. Pilgrims grew onions and herbs. Cranberries and currants would have been growing wild in the area, and watercress may still have been available if the hard frosts had held off, but there’s no record of either having been served. In fact, the meal was probably quite meat-heavy.
Likewise, walnuts, chestnuts, and beechnuts were abundant, as were sunchokes. Shellfish were common, so they probably played a part, as did beans, pumpkins, squashes, and corn (served in the form of bread or porridge), thanks to the Wampanoags.
It’s possible, but unlikely, that there was turkey at the first Thanksgiving.
What They (Definitely) Did Not Have at the First Thanksgiving
A turkey centerpiece
Potatoes (white or sweet)
Bread stuffing or pie (wheat flour was rare)
Sugar
Aunt Lena’s green bean casserole
But how about bringing a little more truly traditional flavor back to your table? Back in 2003, we consulted with historians at Plimoth Plantation, the Wampanoag and English settlers living history museum in Plymouth, Massachusetts, and asked writer Jane Walsh to devise a menu that incorporated some of the foods that would have been served at the first Thanksgiving. We didn’t eliminate any favorites or try to go sugar-free. We skipped the venison. Really, like everyone else who will gather around a table on the fourth Thursday in November this year, we simply changed the seasonings.
The Wampanoag and English settlers may not have had access to all of the ingredients included in these recipes, but by including pheasant, goose, or venison in your Thanksgiving menu, you’re at least paying tribute to a meat they likely enjoyed back in 1621. Chestnuts and native corn were common, too. Here are a few dishes to get you further inspired — both reader-submitted and from the Yankee recipe archives.
TAKES THE CAKE: The phrase “takes the cake” comes from the cake walks that were popular in the late 19th century. Couples would strut around gracefully and well-attired, and the couple with the best walk would win a cake as a prize. Interestingly, cake walk was soon used to describe something that could be done very easily, and it’s very possible that from there we get the phrase “piece of cake.”
PARTING SHOT: A parting shot, which is a final insult tossed out at the end of a fight when you assume it’s over, was originally a Parthian shot. The Parthians, who lived in an ancient kingdom called Parthia, had a strategy whereby they would pretend to retreat, then their archers would fire shots from horseback. Parthian sounds enough like parting, and, coupled with the fact that not a lot of people knew who the Parthians were, the phrase was changed to parting shot.
DEAD AS A DOORNAIL: One could certainly argue that a doornail was never alive, but when a doornail is dead, it has actually been hammered through a door, with the protruding end hammered and flattened into the door so that it can never come loose or be removed or used again. The phrase “dead as a doornail” has actually been around since the 14th century, about as long as the word doornail has officially been in the English language.
DOWN TO BRASS TACKS: There are many theories about what “down to brass tacks” means, including that brass tacks is rhyming slang for hard facts. But it’s very likely that the brass tacks being mentioned here are actual brass tacks. Merchants used to keep tacks nailed into their counters to use as guides for measuring things, so to get down to brass tacks meant you were finally done deciding what you wanted and were ready to cut some fabric and do some actual business.
IT’S GREEK TO ME: The phrase “it’s Greek to me” is often attributed to Shakespeare, but it’s been around since well before his time. An earlier version of the phrase can be found in Medieval Latin translations: “Graecum est; non potest legi,” or “It is Greek; it cannot be read.”
SMART ALEC: You may have presumed the Alec in “smart Alec” was just a name that sounded good preceded by the word smart, but that’s not necessarily the case. Professor Gerald Cohen suggested in his book “Studies in Slang” that the original smart Alec was Alexander Hoag, a professional thief who lived and robbed in New York City in the 1840s. Hoag was a very clever criminal who worked with his wife, paying off two policemen so he could pickpocket and rob people unimpeded. He was eventually busted when he decided to stop paying the cops.
HEARD IT THRU THE GRAPEVINE: The grapevine people hear things through is a grapevine telegraph, the nickname given to the means of spreading information during the Civil War as a kind of wink at an actual telegraph. The grapevine telegraph is just a person-to-person exchange of information, and much like when you play a game of telephone, it’s best to presume that the information you receive has gone through a few permutations since it was first shared.
CAT’S OUT OF THE BAG: Farmers used to stick little suckling pigs in bags to take them to market. But if a farmer was trying to rip somebody off, they would put a cat in the bag instead. So, if the cat got out of the bag, everybody was onto the ruse, which is how we use the phrase today, just not quite so literally. (We hope.)
OUT OF WHACK: Today, “out of whack” means not quite right, but it took a long time to get there. Whack appeared in the 18th century as a verb meaning to strike a blow. The noun whack was the blow itself. But whack also grew to mean a portion or share, especially of loot being split by criminals. From there, whack grew to mean an agreement, as in the agreed share of loot, but it also meant in good order: if something was behaving as it was intended to, it was “in fine whack.” Eventually the opposite fell into common usage, and something that wasn’t in good shape was “out of whack.”
KIBOSH: Evidence of kibosh dates the word to only a few years before Charles Dickens used it in an 1836 sketch, but despite kibosh being relatively young in English, its source is elusive. One hypothesis points to the Irish caidhp bhais, literally “coif (or cap) of death,” explained as headgear a judge put on when pronouncing a death sentence, or as a covering pulled over the face of a corpse when a coffin was closed. Today, “to put the kibosh on something” is to shut it down.
BETWEEN A ROCK AND A HARD PLACE: Some people think that the phrase “between a rock and a hard place” is a kind of sloppy reference to Odysseus. But the phrase first appeared in print around 1921, describing miners who, during the Bankers’ Panic of 1907, had to choose between dangerous work for little or no money and outright poverty.
GOT UP ON THE WRONG SIDE OF THE BED: The generally accepted origin of the phrases “get up on the wrong side of the bed” and “wake up on the wrong side of the bed” is ancient Rome, where superstition was rampant. Ancient philosophers equated the right side of anything with the positive, and the left side with the sinister or negative. The story goes that Romans always exited the bed on the right side in order to start the day in contact with positive forces; one who rose on the left side of the bed started the day in contact with negative forces.
MAD AS A HATTER: The expression is linked to the hat-making industry and mercury poisoning. In the 18th and 19th centuries, industrial workers used a toxic substance, mercury nitrate, as part of the process of turning the fur of small animals, such as rabbits, into felt for hats. Workplace safety standards were often lax, and prolonged exposure to mercury caused employees to develop a variety of physical and mental ailments, including tremors (dubbed “hatter’s shakes”), speech problems, emotional instability, and hallucinations.
Edmund Fitzgerald, St. Mary’s River, 1975. Photo by Bob Campbell
The legend of the Edmund Fitzgerald remains the most mysterious and controversial of all shipwreck tales heard around the Great Lakes. Her story is surpassed in books, film, and media only by that of the Titanic. Canadian folksinger Gordon Lightfoot inspired popular interest in this vessel with his 1976 ballad, “The Wreck of the Edmund Fitzgerald.”

The Edmund Fitzgerald was lost with her entire crew of 29 men on Lake Superior on November 10, 1975, 17 miles north-northwest of Whitefish Point, Michigan. Whitefish Point is the site of the Whitefish Point Light Station and Great Lakes Shipwreck Museum. The Great Lakes Shipwreck Historical Society (GLSHS) has conducted three underwater expeditions to the wreck, in 1989, 1994, and 1995.

At the request of family members surviving her crew, the Fitzgerald’s 200-lb. bronze bell was recovered by the Great Lakes Shipwreck Historical Society on July 4, 1995. This expedition was conducted jointly with the National Geographic Society, Canadian Navy, Sony Corporation, and Sault Ste. Marie Tribe of Chippewa Indians. The bell is now on display in the Great Lakes Shipwreck Museum as a memorial to her lost crew.

The Fateful Journey
by Sean Ley, Development Officer

The final voyage of the Edmund Fitzgerald began November 9, 1975 at the Burlington Northern Railroad Dock No. 1, Superior, Wisconsin. Captain Ernest M. McSorley had loaded her with 26,116 long tons of taconite pellets, made of processed iron ore, heated and rolled into marble-size balls. Departing Superior about 2:30 pm, she was soon joined by the Arthur M. Anderson, which had departed Two Harbors, Minnesota under Captain Bernie Cooper. The two ships were in radio contact.
The Fitzgerald, being the faster ship, took the lead, with the distance between the vessels ranging from 10 to 15 miles.

Aware of a building November storm entering the Great Lakes from the great plains, Captain McSorley and Captain Cooper agreed to take the northerly course across Lake Superior, where they would be protected by highlands on the Canadian shore. This took them between Isle Royale and the Keweenaw Peninsula. They would later make a turn to the southeast to eventually reach the shelter of Whitefish Point.

Weather conditions continued to deteriorate. Gale warnings had been issued at 7 pm on November 9, upgraded to storm warnings early in the morning of November 10. While conditions were bad, with winds gusting to 50 knots and seas 12 to 16 feet, both captains had often piloted their vessels in similar conditions.

In the early afternoon of November 10, the Fitzgerald had passed Michipicoten Island and was approaching Caribou Island. The Anderson was just approaching Michipicoten, about three miles off the West End Light.

Captain Cooper maintained that he watched the Edmund Fitzgerald pass far too close to Six Fathom Shoal to the north of Caribou Island. He could clearly see the ship and the beacon on Caribou on his radar set and could measure the distance between them. He and his officers watched the Fitzgerald pass right over the dangerous area of shallow water. By this time, snow and rising spray had obscured the Fitzgerald from sight, though she remained visible 17 miles ahead on radar.

At 3:30 pm that afternoon, Captain McSorley radioed Captain Cooper and said: “Anderson, this is the Fitzgerald. I have a fence rail down, two vents lost or damaged, and a list. I’m checking down. Will you stay by me till I get to Whitefish?” McSorley was checking down his speed to allow the Anderson to close the distance for safety. Captain Cooper asked McSorley if he had his pumps going, and McSorley said, “Yes, both of them.”
“The Wreck Site II” by David Conklin
As the afternoon wore on, radio communications with the Fitzgerald concerned navigational information but no extraordinarily alarming reports were offered by Captain McSorley. At about 5:20 pm the crest of a wave smashed the Anderson’s starboard lifeboat, making it unusable. Captain Cooper reported winds from the NW x W at a steady 58 knots with gusts to 70 knots, and seas of 18 to 25 feet.
According to Captain Cooper, about 6:55 pm, he and the men in the Anderson’s pilothouse felt a “bump”, felt the ship lurch, and then turned to see a monstrous wave engulfing their entire vessel from astern. The wave worked its way along the deck, crashing on the back of the pilothouse, driving the bow of the Anderson down into the sea.
“Then the Anderson just raised up and shook herself off of all that water – barrooff – just like a big dog. Another wave just like the first one or bigger hit us again. I watched those two waves head down the lake towards the Fitzgerald, and I think those were the two that sent him under.”
Keeping Watch
Morgan Clark, first mate of the Anderson, kept watching the Fitzgerald on the radar set to calculate her distance from some other vessels near Whitefish Point. He kept losing sight of the Fitzgerald on the radar from sea return, meaning that seas were so high they interfered with the radar reflection. First mate Clark spoke to the Fitzgerald one last time, about 7:10 pm:
“Fitzgerald, this is the Anderson. Have you checked down?”
“Yes, we have.”
“Fitzgerald, we are about 10 miles behind you, and gaining about 1 1/2 miles per hour. Fitzgerald, there is a target 19 miles ahead of us. So the target would be 9 miles on ahead of you.”
“Well,” answered Captain McSorley, “Am I going to clear?”
“Yes, he is going to pass to the west of you.”
“Well, fine.”
“By the way, Fitzgerald, how are you making out with your problems?” asked Clark.
“We are holding our own.”
“Okay, fine, I’ll be talking to you later.” Clark signed off.
The radar signal, or “pip,” of the Fitzgerald kept getting obscured by sea return. Around 7:15 pm, the pip was lost again, but this time it did not reappear. Clark called the Fitzgerald again at about 7:22 pm. There was no answer.
Captain Cooper contacted the other ships in the area by radio asking if anyone had seen or heard from the Fitzgerald. The weather had cleared dramatically. His written report states:
“At this time I became very concerned about the Fitzgerald – couldn’t see his lights when we should have. I then called the William Clay Ford to ask him if my phone was putting out a good signal and also if perhaps the Fitzgerald had rounded the point and was in shelter, after a negative report I called the Soo Coast Guard because I was sure something had happened to the Fitzgerald. The Coast Guard were at this time trying to locate a 16-foot boat that was overdue.”
With mounting apprehension, Captain Cooper called the Coast Guard once again, about 8:00 pm, and firmly expressed his concern for the welfare of the Fitzgerald. The Coast Guard then initiated its search for the missing ship. By that time the Anderson had reached the safety of Whitefish Bay to the relief of all aboard. But the Coast Guard called Captain Cooper back at 9:00 pm:
“Anderson, this is Group Soo. What is your present position?”
“We’re down here, about two miles off Parisienne Island right now…the wind is northwest forty to forty-five miles here in the bay.”
“Is it calming down at all, do you think?”
“In the bay it is, but I heard a couple of the salties talking up there, and they wish they hadn’t gone out.”
“Do you think there is any possibility and you could…ah…come about and go back there and do any searching?”
“Ah…God, I don’t know…ah…that…that sea out there is tremendously large. Ah…if you want me to, I can, but I’m not going to be making any time; I’ll be lucky to make two or three miles an hour going back out that way.”
“Well, you’ll have to make a decision as to whether you will be hazarding your vessel or not, but you’re probably one of the only vessels right now that can get to the scene. We’re going to try to contact those saltwater vessels and see if they can’t possibly come about and possibly come back also…things look pretty bad right now; it looks like she may have split apart at the seams like the Morrell did a few years back.”
“Well, that’s what I been thinking. But we were talking to him about seven and he said that everything was going fine. He said that he was going along like an old shoe; no problems at all.”
“Well, again, do you think you could come about and go back and have a look in the area?”
“Well, I’ll go back and take a look, but God, I’m afraid I’m going to take a hell of a beating out there… I’ll turn around and give ‘er a whirl, but God, I don’t know. I’ll give it a try.”
“That would be good.”
“Do you realize what the conditions are out there?”
There was no reply from the Coast Guard, so Captain Cooper asked again:
“Affirmative. From what your reports are I can appreciate the conditions. Again, though, I have to leave that decision up to you as to whether it would be hazarding your vessel or not. If you think you can safely go back up to the area, I would request that you do so. But I have to leave the decision up to you.”
“I’ll give it a try, but that’s all I can do.”
The Anderson took the lead as the primary vessel in the search. With the ship pounding and rolling badly, the crew of the Anderson discovered the Fitzgerald’s two lifeboats and other debris, but no sign of survivors. Only one other vessel, the William Clay Ford, was able to leave the safety of Whitefish Bay to join in the search at the time. The Coast Guard launched a fixed-wing HU-16 aircraft at 10 pm and dispatched two cutters, the Naugatuck and the Woodrush. The Naugatuck arrived at 12:45 pm on November 11, and the Woodrush arrived on November 14, having journeyed all the way from Duluth, Minnesota.
The Coast Guard conducted an extensive and thorough search. On November 14, a U.S. Navy plane equipped with a magnetic anomaly detector located a strong contact 17 miles north-northwest of Whitefish Point. During the following three days, the Coast Guard cutter Woodrush, using a sidescan sonar, located two large pieces of wreckage in the same area. Another sonar survey was conducted November 22-25.
Finding the Fitzgerald
Restored Fitzgerald bell – Great Lakes Shipwreck Museum
The following May, 1976, Woodrush was again on the scene to conduct a third sidescan sonar survey. Contacts were strong enough to bring in the U.S. Navy’s CURV III controlled underwater recovery vehicle, operating from Woodrush.
The CURV III unit took 43,000 feet of video tape and 900 photographs of the wreck. On May 20, 1976, the words “Edmund Fitzgerald” were clearly seen on the stern, upside down, 535 feet below the surface of the lake.
On April 15, 1977, the U.S. Coast Guard released its official report, “S.S. Edmund Fitzgerald, official number 277437, sinking in Lake Superior on 10 November 1975 with loss of life.” While the Coast Guard said the cause of the sinking could not be conclusively determined, it maintained that “the most probable cause of the sinking of the S.S. Edmund Fitzgerald was the loss of buoyancy and stability resulting from massive flooding of the cargo hold. The flooding of the cargo hold took place through ineffective hatch closures as boarding seas rolled along the spar deck.”
However, the Lake Carriers’ Association vigorously disagreed with the Coast Guard’s suggestion that the crew’s lack of attention to properly closing the hatch covers was responsible for the disaster. It issued a letter to the National Transportation Safety Board in September 1977. The Lake Carriers’ Association was inclined to accept that the Fitzgerald passed over the Six Fathom Shoal Area, as reported by Captain Cooper.
Later, in a videotaped conversation with GLSHS, Captain Cooper said that he always believed McSorley knew something serious had happened to Fitzgerald as the ship passed over Caribou Shoal. Cooper believes that from that point on, McSorley knew he was sinking.
Conflicting theories about the cause of the tragedy remain active today. GLSHS’ three expeditions to the wreck revealed that it is likely she “submarined” bow first into an enormous sea, as damage forward is indicative of a powerful, quick force to the superstructure. But what caused the ship to take on water, enough to lose buoyancy and dive to the bottom so quickly, without a single cry for help, cannot be determined.
Twenty-nine men were lost when the Fitzgerald went down. There is absolutely no conclusive evidence to determine the cause of the sinking. The bell of the ship is now on display in the Great Lakes Shipwreck Museum as a memorial to her lost crew.
(If, like me, you are discouraged by the current political rhetoric, you might be surprised to learn that the last few elections have not been the weirdest or nastiest in our history. I stumbled upon this article by Stephanie Pappas detailing five of the nastiest elections in our history, in her opinion. Keep in mind when you read this that it’s dated 2012, so her list may have changed.)
Reading the political news, you’d think this election is the nastiest, most contentious and most important our nation has ever faced. No doubt the outcome matters, but in the annals of American elections, this one barely registers for sheer strangeness.
In fact, electoral politics have always been a down-and-dirty business, starting at least as early as 1800, when our founding fathers proved themselves adept at bitter battles. Other elections have featured nasty accusations, bizarre happenstance and even the death of one of the candidates.
Read on for five of the strangest presidential elections in U.S. history.
1. The very first one, 1788-1789
The first presidential election in our nation’s history was one-of-a-kind in that it was literally no contest. Organized political parties had yet to form, and George Washington ran unopposed. His victory is the only one in the nation’s history to feature 100 percent of the Electoral College vote.
The real question in 1788 was who would become vice president. At the time, this office was awarded to the runner-up in the electoral vote (each elector cast two votes to ensure there would be a runner-up). Eleven candidates made a play for the vice presidency, but John Adams came out on top.
2. It’s a tie, 1800
Electoral politics got serious in 1800. Forget the hand-holding peace of George Washington’s first run — political parties were in full swing by this time, and they battled over high-stakes issues (taxes, states’ rights and foreign policy alignments). Thomas Jefferson ran as the Democratic-Republican candidate and John Adams as the Federalist.
At the time, states got to pick their own election days, so voting ran from April to October (and you thought waiting for the West Coast polls to close was frustrating). Because of the complicated “pick two” voting structure in the Electoral College, the election ended up a tie between Jefferson and his vice-presidential pick, Aaron Burr. One South Carolina delegate was supposed to give one of his votes to another candidate, so as to arrange for Jefferson to win and Burr to come in second. The plan somehow went wrong, and both men ended up with 73 electoral votes.
That sent the tie-breaking vote to the House of Representatives, not all of whose members were on board with a Jefferson presidency and a Burr vice presidency. Seven tense days of voting followed, but Jefferson finally pulled ahead of Burr. The drama triggered the passage of the 12th Amendment to the U.S. Constitution, which stipulates that the Electoral College pick the president and vice president separately, doing away with the runner-up complications.
3. Things get nasty, 1828
Anything involving dueling war veteran Andrew Jackson was liable to get dirty, but the 1828 electoral battle between Jackson and John Quincy Adams took the cake for mudslinging. Jackson had lost out to Adams in 1824 after the deadlocked election went to the House of Representatives, where Speaker Henry Clay threw his support behind Adams. When Adams chose Clay as his Secretary of State, Jackson was furious and accused the two of a “corrupt bargain.”
And that was before the 1828 election even got started, when Adams was accused of pimping out an American girl to the Russian czar. Jackson’s wife, Rachel, was called a “convicted adulteress” because she had, years earlier, married Jackson before finalizing her divorce from her previous husband. Rachel died after Jackson won the election but before his inauguration; at her funeral, Jackson blamed his opponents’ bigamy accusations for her death. “May God Almighty forgive her murderers, as I know she forgave them,” Jackson said. “I never can.”
To round out a rough election, Jackson’s inauguration party (open to the public) turned into a mob scene, with thousands of well-wishers crowding into the White House.
“Ladies fainted, men were seen with bloody noses, and such a scene of confusion took place as is impossible to describe,” wrote Margaret Smith, a Washington socialite who attended the party.
4. Running against a corpse, 1872
In 1872, incumbent Ulysses S. Grant had an easy run for a second term — because his opponent died before the final votes were cast.
Grant had the election in the bag, however, even before his opponent, Horace Greeley, died. The incumbent won 286 electoral votes compared with Greeley’s 66 after election day. But on Nov. 29, 1872, before the Electoral College votes were in, Greeley died, and his electoral votes were split among other candidates. Greeley remains the only presidential candidate to die before the election was finalized.
5. The hanging chads, 2000
Democrat Al Gore beat Republican George W. Bush in the popular vote in the 2000 election, but the electoral vote was a close, and controversial, call. As election night drew to a close, New Mexico, Oregon and Florida remained too close to call.
It would be Florida that determined the winner, but not until the Supreme Court weighed in. For a month, the outcome of the election remained in recount limbo, as Gore’s campaign contested the vote count in several close counties and the Florida and U.S. Supreme Courts engaged in a tug-of-war over whether to halt the recounts or extend their deadlines. Among the challenges faced by the hand counts: determining whether semi-attached scraps of paper, or “hanging chads,” on punch-card ballots should count as votes.
Ultimately, on Dec. 12, the Supreme Court ruled 7-2 that the recount as ordered was unconstitutional, and 5-4 that no valid recount could be completed in time. The decision meant the original vote counts stood, giving the election to Bush.
GABARDINE: Few movements in history have been more thrilling than the pilgrimages of the Middle Ages. Many people traveled to shrines throughout Europe and even to the Holy Land. Pilgrims continued to visit some of the shrines at enormous sacrifice of time and money. They wore an unofficial but characteristic garb: a gray cowl bearing a red cross and a broad-brimmed, stiff hat. Pilgrims carried a staff, a sack, and a gourd. They usually traveled in company with other adventurers, singing hymns as they walked and begging food from those they met.
Medieval Pilgrims
Since a particular type of upper garment was worn by the pilgrim, it gradually came to be identified with the journey itself. A will filed in 1520 included this bequest: “Until litill Thomas Beke my gawbardyne to make him a gowne.” From the garment the term came to refer to the coarse material from which it was customarily made. Slight modifications in spelling produced gabardine – a kind of cloth that passed from the religious pilgrim’s vocabulary into general use.
Assorted Gabardine
RUBBER: On his second voyage to “East India,” Columbus found natives playing with a substance they called caoutchouc. It would stretch and then snap back into shape; when made into balls it would bounce. Scientists who examined the odd substance agreed that it was unlike anything known in Europe, yet they confessed themselves unable to imagine any use for it.
Small quantities of caoutchouc were brought to Europe, but it remained a curiosity for more than two centuries. Finally, someone discovered by accident that the material could be used for removing the marks of a lead pencil. Hence, bookkeepers termed it “lead-eater.”
Around 1780 Joseph Priestley experimented with a bit of caoutchouc, hoping to find some use more important than erasing errors made in ledgers. He failed and decided that it would never be of value except for rubbing out pencil marks.
Joseph Priestley
Consequently, he called it “East India rubber.” Soon the nickname of the one-job substance was abbreviated to rubber. The name serves as a perpetual reminder that civilization was once at a loss as to what to do with a substance of a thousand uses.
MAP: Greek geographers of the sixth century BC developed considerable skill in making charts to guide sailors and travelers. Then the Romans extended the art by engraving scale representations of the Empire on fine marble slabs. These devices, and the more abundant clay tablets, proved to be extremely cumbersome, so someone thought of painting geographical charts upon cloth.
Fragment of Greek “Map”
For this purpose, the most suitable material proved to be fine table linen, or mappa. This led to the practice of calling any flat geographical chart a map.
RECIPE: Since Latin was the universal language of medieval scholars, physicians used it in writing directions for compounding medicines. Virtually every prescription listed the ingredients in precise order and began with the Latin verb recipe, meaning “take.”
Ancient Apothecary “Recipes”
Care in measuring and blending the ingredients of a tasty dish is also essential. Therefore, when housewives began to master the art of reading and writing, they adopted the apothecary’s custom and made written lists of ingredients and steps in cookery. Inevitably, such a set of directions took the pharmaceutical name and became the familiar household recipe.
BUDGET: Struggling with a budget is no new problem; it dates back to the days of the Roman Empire. Housewives had to be cautious in their spending and they kept money for household expenses in a little leather bulga (Latin for bag). This custom also prevailed among businessmen, who may have borrowed it from their wives or vice versa.
Antique “Bulga”
Centuries later, the Latin word was adopted into Middle French as bougette (“little leather bag”). When the British Chancellor of the Exchequer appeared before Parliament, he carried the papers explaining the estimated revenue and expenses in a leather bag, then “opened the budget” for the coming year. Thus, budget (as it came to be pronounced) came to mean a systematic plan for expenditures, both for governments and for private individuals.
EAT ONE’S HAT: Many a man engaged in a contest of some sort has offered to eat his hat if he loses. In such a situation, a knowledge of etymology would be of great value, for the expression eat one’s hat once referred not to a Stetson or a Panama, but to a culinary product.
Napier’s famous Boke of Cookry, one of the earliest European cookbooks, gives the following directions: “Hattes are made of eggs, veal, dates, saffron, salt, and so forth.” In the hands of amateur cooks, the concoction was frequently so unpalatable that it required a strong stomach to eat it.
Even so, the early braggart who offered to eat a hatte had in mind nothing so distasteful as a felt or a straw!
FLOUR: During the Elizabethan Age, the word “flower” meant “the best,” as it does today in such expressions as “the flower of the nation’s youth.”
Millers of the period ground wheat by a crude process, then sifted the meal. Only the finest of it passed through the cloth sieve in a process called “boulting.” Reserved for tables of the nobility, this top-quality ground wheat was naturally called the “flower of wheat,” but in this context the word came to be spelled flour. The two spellings were used interchangeably until the 19th century. In Paradise Lost, Milton wrote the line, “O flours that never will in other climates grow.”
Boulting
COOKING TERMS: There is at least one serious gap in European culinary history: contemporaries failed to record the name of the woman who first thought of stuffing an egg. Nothing is known about her recipe, except that she was liberal with pepper. Her invention was so hot that folks who tried it were reminded of Beelzebub’s fiery furnaces. As a result, the tidbit came to be called a deviled egg.
Most other terms of cookery are prosaic by comparison. More than half were borrowed from the French, which suggests that English cooks were never very imaginative. Braise stems from the French for “hot charcoal.” Toast is but slightly modified from “toaster” (“to parch with heat”). Boil stems from a continental verb meaning “to make little bubbles.” Poach grew out of pocher, which meant “to pouch,” that is, to enclose an egg’s yellow in a little pouch of white.
Fry, grill, roast and baste were also adapted from French. Fricassee was taken as is from that language, but the ultimate origin is unknown.
The oldest term in cookery is probably cook, still much like Latin coquus. The Norse gave us bake, from baka (“hearth”). The Saxons contributed sear, spelled just as it is today. It originally meant to “wither with heat.” Scorch – the bane of a cook’s existence – has a long history that goes all the way back to the Old English scorkle, which started life as a term for skinning meat by searing.