A few minutes ago, I was contemplating the fact that my BD is a couple of days away. I had received more than one request for the date in order to host a little “celebration” for me on the blogs. I sincerely appreciate them thinking of me and wanting to do something for me.
The issue for me is this: birthdays mean nothing to me – I have literally forgotten it even WAS my birthday until my Sis called: “Oh, it IS my BD, isn’t it???” I had nothing to do with being born – I had no choice in the matter. In my eyes, it’s not a celebration of my accomplishments: how I’ve improved in various ways, good changes I was a part of bringing about, the people I’ve helped…..those are the kinds of things I believe deserve celebration.
Celebrating children’s birthdays, on the other hand, helps to instill confidence, lets them know they are loved and appreciated, that they have value and are deserving, etc., etc. Kids NEED that – adults should not! Of course, it would be hurtful for a person to ignore their significant other’s BD – certainly you want your partner to recognize the day with you. Beyond that….whatever!
As with any of my editorials, this is my opinion only and YMMV. I’m sure having lived alone for so many years also has something to do with it but even if HB or the grands or anyone else were to forget? Meh – I might jokingly poke them but I wouldn’t really care much.
In any case, as usual, my mind promptly went to the “why and how” of the “Happy Birthday” tradition. Source: happybirthday2all.com
What Does a Birthday Mean? Why Do We Celebrate Birthdays?
What is a birthday?
A birthday is a tradition of marking the anniversary of the birth of a person, fictional character, or organization – the day a person first entered the world. Birthdays are celebrated all over the world in different ways, and some people also celebrate the birthdays of their God, celebrities, and the founders of their religion.
Wondering who invented birthdays?
Birthdays are a way to celebrate surviving one more year and sharing and creating memories with your friends, family, and loved ones. A lot of research has been done to find the origin of celebrating birthdays and cutting a birthday cake, but it has remained inconclusive.
The 18th birthday holds special significance in many cultures because at that age a person is considered to have passed from childhood to adulthood. The 25th birthday is known as a silver jubilee, the 50th as a golden jubilee, and if you are lucky enough to celebrate your 100th birthday, it is known as a platinum jubilee.
What does a birthday really mean?
Your birthday is a day to celebrate the time you have spent on this earth. You have successfully managed to live one more year, gained a lot of experiences, created memories, learned new things, met new people and many more things might have changed from your last birthday. It’s time you spend some quality time with your friends and family and make a plan to achieve more in the upcoming year.
Why do we celebrate birthdays? And why are birthdays special?
Birthdays are special because they provide an opportunity to look back at the time you have spent from your birth until now. So, here are some reasons [excuses?] that make it fun and special to celebrate your birthday.
Birthday celebration makes it easy to eat tasty birthday cake and get lots of gifts
A chance to thank God for keeping you safe, happy, and healthy for one more year
Birthdays give you the excuse to party like there is no tomorrow
Meet with your relatives, friends, and family
An excuse to get drunk without the consequences
An excuse to take a holiday from the office to celebrate your birthday
An excuse to dress up in your best dress and then show off to your friends
Taking lots of funny pics and selfies and then posting them on social media to make your friends jealous
Going shopping and having a big fat dinner at the place of your choice
Getting blessings from your elders and grandparents
So, to conclude, you could say that it doesn’t matter when birthday celebrations originated or why birthdays are celebrated. What matters is that you spend some quality time and enjoy the day of your birth. If you are going to celebrate the birthday of a loved one, then you should get a gift, a beautiful greeting card, a lovely birthday wish, and maybe organize a small party to make them happy.
TRUCKEE, Calif. —Western stagecoach companies were big business in the latter half of the 19th century. In addition to passengers and freight, stages hauled gold and silver bullion as well as mining company payrolls.
Stage robbery was a constant danger and bandits employed many strategies to ambush a stagecoach. Thieves rarely met with much resistance from stage drivers, since they had passenger safety foremost in mind. The gang was usually after the Wells Fargo money box with its valuable contents. Passengers were seldom hurt, but they were certainly relieved of their cash, watches and jewelry.
Before the completion of the transcontinental railroad over Donner Pass in 1868, the only transportation through the Sierra was by stage. Rugged teamsters held rein over six wild-eyed horses as they tore along the precipitous mountain trails. The stagecoaches were driven by skilled and fearless men who pushed themselves and their spirited horses to the limit.
One of the most famous drivers was Charles Darkey Parkhurst, who had come west from New England in 1852 seeking his fortune in the Gold Rush. He spent 15 years running stages, sometimes partnering with Hank Monk, the celebrated driver from Carson City. Over the years, Parkhurst’s reputation as an expert whip grew.
From 20 feet away he could slice open the end of an envelope or cut a cigar out of a man’s mouth. Parkhurst smoked cigars, chewed wads of tobacco, drank with the best of them, and exuded supreme confidence behind the reins. His judgment was sound, and his pleasant manners won him many friends.
One afternoon as Charley drove down from Carson Pass, the lead horses veered off the road and a wrenching jolt threw him from the rig. He hung on to the reins as the horses dragged him along on his stomach. Amazingly, Parkhurst managed to steer the frightened horses back onto the road and save all his grateful passengers.
During the 1850s, bands of surly highwaymen stalked the roads. These outlaws would level their shotguns at stage drivers and shout, “Throw down the gold box!” Charley Parkhurst had no patience for the crooks despite their demands and threatening gestures.
The most notorious road agent was nicknamed “Sugarfoot.” When he and his gang accosted Charley’s stage, it was the last robbery the thief ever attempted. Charley cracked his whip defiantly, and when his horses bolted, he turned around and fired his revolver at the crooks. Sugarfoot was later found dead with a fatal bullet wound in his stomach.
In appreciation of his bravery, Wells Fargo presented Parkhurst with a large watch and chain made of solid gold. In 1865, Parkhurst grew tired of the demanding job of driving and he opened his own stage station. He later sold the business and retired to a ranch near Soquel, Calif. The years slipped by and Charley died on Dec. 29, 1879, at the age of 67.
A few days later, the Sacramento Daily Bee published his obituary. It read: “On Sunday last, there died a person known as Charley Parkhurst, aged 67, who was well-known to old residents as a stage driver. He was, in early days, accounted one of the most expert manipulators of the reins who ever sat on the box of a coach. It was discovered when friendly hands were preparing him for his final rest, that Charley Parkhurst was unmistakably a well-developed woman!”
Once it was discovered that Charley was a woman, there were plenty of people to say they had always thought he wasn’t like other men. Even though he wore leather gloves summer and winter, many noticed that his hands were small and smooth. He slept in the stables with his beloved horses and was never known to have had a girlfriend.
Charley never volunteered clues to her past. Loose fitting clothing hid her femininity and after a horse kicked her, an eye patch over one eye helped conceal her face. She weighed 175 pounds, could handle herself in a fistfight and drank whiskey like one of the boys.
It turns out that Charley’s real name was Charlotte Parkhurst. Abandoned as a child, she was raised in a New Hampshire orphanage unloved and surrounded by poverty. Charlotte ran away when she was 15 years old and soon discovered that life in the working world was easier for men. So she decided to masquerade as one for the rest of her life. The rest is history. Well, almost. There is one last thing. On November 3, 1868, Charlotte Parkhurst cast her vote in the national election, dressed as a man. She became the first woman to vote in the United States, 52 years before Congress passed the 19th amendment giving American women the right to vote.
The fire station in Soquel, California, has a plaque reading: “The first ballot by a woman in an American presidential election was cast on this site November 3, 1868, by Charlotte (Charley) Parkhurst who masqueraded as a man for much of her life. She was a stagecoach driver in the mother lode country during the gold rush days and shot and killed at least one bandit. In her later years she drove a stagecoach in this area. She died in 1879. Not until then was she found to be female. She is buried in Watsonville at the pioneer cemetery.”
In 1955, the Pajaro Valley Historical Association erected a monument at Parkhurst’s grave, which reads: “Charley Darkey Parkhurst (1812-1879) Noted whip of the gold rush days drove stage over Mt. Madonna in early days of Valley. Last run San Juan to Santa Cruz. Death in cabin near the 7 mile house. Revealed ‘one eyed Charley’ a woman. First woman to vote in the U.S. November 3, 1868.”
In 2007, the Santa Cruz County Redevelopment Agency oversaw the completion of the Parkhurst Terrace Apartments, named for the stagecoach driver and located a mile along the old stage route from the place of her death.
Fern J. Hill also wrote a book about Charley, “Charley’s Choice: The Life and Times of Charley Parkhurst,” which might be of interest.
Did you ever wonder, like I have, how we came to separate and name the various generations? The naming of the generations really started with the Boomers. Yes, the term Lost Generation came first, but the idea that demographic groupings of people born in a span of years should have a particular name really caught on with the post-WWII generation.
William Strauss and Neil Howe did not invent the idea of a generational schema, but they popularized it. In 1991, they published a book touting the idea that there were cyclical patterns in U.S. history based on generational differences. The names they gave the groups born in particular spans of years, however, were different from those most commonly used today. The generally accepted names today are as follows:
1883–1900: The Lost Generation
1901–28: The Greatest Generation (The G.I. Generation)
1929–45: The Silent Generation
1946–64: Baby Boomers
1965–80: Generation X (Gen X)
1981–96: Millennials (Generation Y)
1997–2012: Generation Z
2013– : Generation Alpha
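Since the generations are defined by nothing more than birth-year cutoffs, the whole scheme boils down to a simple lookup. Here is a quick illustrative sketch (just for fun – the cutoffs themselves vary by source, and the fallback label for pre-1883 births is my own invention):

```python
# Map a birth year to its commonly used generation name,
# using the year ranges listed above.
GENERATIONS = [
    (1883, 1900, "Lost Generation"),
    (1901, 1928, "Greatest Generation"),
    (1929, 1945, "Silent Generation"),
    (1946, 1964, "Baby Boomers"),
    (1965, 1980, "Generation X"),
    (1981, 1996, "Millennials"),
    (1997, 2012, "Generation Z"),
    (2013, 9999, "Generation Alpha"),  # open-ended for now
]

def generation(year):
    """Return the generation name for a given birth year."""
    for start, end, name in GENERATIONS:
        if start <= year <= end:
            return name
    return "pre-Lost Generation"  # hypothetical label for earlier births

# generation(1955) -> "Baby Boomers"
# generation(2000) -> "Generation Z"
```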
But where do these names come from?
Lost Generation (1883–1900)
The name for the generation that fought in the First World War has a literary origin, and it is both literal and metaphorical: literal in the sheer numbers of young men who died in the war, and metaphorical in that it represents a rootlessness and destruction of moral purpose as a result of the war. The term Lost Generation first appears in one of the epigraphs in Ernest Hemingway’s 1926 novel The Sun Also Rises. In the book, Hemingway attributed the phrase to Gertrude Stein in conversation. Four decades later, Hemingway described that conversation: “It was when we had come back from Canada and were living in the rue Notre-Dame-des-Champs and Miss Stein and I were still good friends that Miss Stein made the remark about the lost generation. She had some ignition trouble with the old Model T Ford she then drove and the young man who worked in the garage and had served in the last year of the war had not been adept, or perhaps had not broken the priority of other vehicles, in repairing Miss Stein’s Ford. Anyway he had not been sérieux and had been corrected severely by the patron of the garage after Miss Stein’s protest. The patron had said to him, ‘You are all a génération perdue.’”
“That’s what you are. That’s what you all are,” Miss Stein said. “All of you young people who served in the war. You are a lost generation.”
Greatest Generation / G.I. Generation (1901–28)
The earliest use of Greatest Generation is by Democratic Congressman Hatton Sumners of Dallas, Texas in 1940, before the United States was even in the war. Sumners used the term in a series of speeches, or the same stump speech, given multiple times that year. Sumners uses the term in an aspirational, rather than a descriptive sense, arguing that this generation must rise from the devastation of the Great Depression to fight fascism and right the world.
The other name for this particular generation is more prosaic: the G.I. Generation. It simply acknowledges the vast number of men of that cohort who served in uniform during the war.
Silent Generation (1929–45)
Bracketed by the war generation and the boomers and often overlooked, the Silent Generation would seem to be aptly named. The name first appears in the Detroit Free Press of 1 November 1951, but this is in an excerpt from a Time magazine piece of 5 November. The Time piece reads:
“Youth today is waiting for the hand of fate to fall on its shoulders, meanwhile working fairly hard and saying almost nothing. The most startling fact about the younger generation is its silence. With some rare exceptions, youth is nowhere near the rostrum. By comparison with the Flaming Youth of their fathers & mothers, today’s younger generation is a still, small flame. It does not issue manifestoes, make speeches or carry posters. It has been called the “Silent Generation.” But what does the silence mean? What, if anything, does it hide? Or are youth’s elders merely hard of hearing?”
Baby Boomers (1946–64)
Generic use of baby boom is much older than any of these generational names. It’s an Americanism dating to at least the 1870s to mark any uptick in births. The application of the term to the then-expected increase in births following the Second World War dates, as one might expect, to 1945. There had been a short increase in the birth rate following the U.S. entry into the war, but on 4 February 1945 the U.S. Department of Commerce reported this mini-boom was over and to expect a larger one in the year to come: “The Commerce Department reported Saturday night that the Nation’s birth rate, which rose 30 per cent above prewar levels in the year after Pearl Harbor, now is declining and will stay that way until the end of hostilities precipitates another baby boom.”
Generation X (1965–80)
Generation X first appears in the December 1952 issue of Holiday magazine, touting an upcoming photo-essay by photographer Robert Capa, although the term would not appear in the photo-essay itself:
“What, you may well ask, is Generation X? […] Our tag for what we believe to be the most important group of people in the world today—the boys and girls who are just turning 21. These are the youngsters who have seen and felt the agonies of the past two decades, often firsthand, who are trying to keep their balance in the swirling pressures of today, and who will have the biggest say in the course of history for the next 50 years.”
Millennials / Generation Y (1981–96)
More successful was Strauss and Howe’s naming of the Millennial generation. From their 1991 book: “At Burrville Elementary, 13ers in older grades found the uniforms slightly humiliating, but the younger kids hardly seemed to mind. These kids in green coats and yellow blouses are the vanguard of America’s MILLENNIAL GENERATION. Cute. Cheerful. Scout-like. Wanted. Not since the 1910s, when midlife Missionaries dressed child G.I.s in Boy Scout brown, have adults seen such advantage in making kids look alike and work together. Not since the early 1900s have older generations moved so quickly to assert greater adult dominion over the world of childhood—and to implant civic virtue in a new crop of youngsters.”
Millennials have also gone by the rather unimaginative Generation Y, as they are the cohort that follows the Gen Xers. Call them Generation Y, because Y comes after X, and maybe because they’re coming of age with the big questions laid out before them.
— Y can’t we go out in the sun?
— Y can’t the AIDS epidemic be stopped?
— Y is the environment in the state it is?
— Y is Canada in the state it is?
— Y can’t I get decent work?
Generation Z (1997–2012)
Of course, Generation Y led to “Generation Z,” which appears by 2010, likely due to a lack of a more creative term. Some refer to this generation as “iGen” since they have never known a world without the Internet. Martha Irvine of the Associated Press states, “they are the tech-savviest generation of all time… even toddlers can maneuver their way through YouTube and some first-graders are able to put together a PowerPoint presentation for class.” A teacher’s most complicated struggle with Generation Z is not necessarily how to relate lessons to them, but rather how to prepare these students for careers and jobs that don’t even exist yet.
Generation Alpha (2013– )
Having run out of letters in the Latin alphabet, we turn to Greek for the name of the next cohort. From the Australian newspaper Northern Star of 12 March 2011: “They are smart, cashed-up, career driven and are making their way to a place near you. It’s the newest addition to society’s demographic categories – Generation Alpha. Babies born from 2010 are part of this demographic, coming after the digital-native Generation Z and the want-want-want Generation Y.”
You may note that the same critiques and notes of despair are sounded whenever a new generation comes of age. The “problem with kids these days” has always been with us and presumably always will be.
Harriet Tubman was an escaped enslaved woman who became a “conductor” on the Underground Railroad, leading enslaved people to freedom before the Civil War, all while carrying a bounty on her head. But she was also a nurse, a Union spy and a women’s suffrage supporter. Tubman is one of the most recognized icons in American history and her legacy has inspired countless people from every race and background. NOTE: Harriet Tubman Day is celebrated on the day that she died, because her exact birthday is unknown.
When Was Harriet Tubman Born?
Harriet Tubman was born around 1820 on a plantation in Dorchester County, Maryland. Her parents, Harriet (“Rit”) Green and Benjamin Ross, named her Araminta Ross and called her “Minty.”
Rit worked as a cook in the plantation’s “big house,” and Benjamin was a timber worker. Araminta later changed her first name to Harriet in honor of her mother.
Harriet had eight brothers and sisters, but the realities of slavery eventually forced many of them apart, despite Rit’s attempts to keep the family together. When Harriet was five years old, she was rented out as a nursemaid; she was whipped whenever the baby in her care cried, leaving her with permanent emotional and physical scars.
Around age seven Harriet was rented out to a planter to set muskrat traps and was later rented out as a field hand. She later said she preferred physical plantation work to indoor domestic chores.
A Good Deed Gone Bad
Harriet’s desire for justice became apparent at age 12 when she spotted an overseer about to throw a heavy weight at a fugitive. Harriet stepped between the enslaved person and the overseer—the weight struck her head.
She later said about the incident, “The weight broke my skull … They carried me to the house all bleeding and fainting. I had no bed, no place to lie down on at all, and they laid me on the seat of the loom, and I stayed there all day and the next.”
Harriet’s good deed left her with headaches and narcolepsy for the rest of her life, causing her to fall into a deep sleep at random. She also started having vivid dreams and hallucinations, which she often claimed were religious visions (she was a staunch Christian). Her infirmity made her unattractive to potential slave buyers and renters.
Escape from Slavery
In 1840, Harriet’s father was set free and Harriet learned that Rit’s owner’s last will had set Rit and her children, including Harriet, free. But Rit’s new owner refused to recognize the will and kept Rit, Harriet and the rest of her children in bondage.
Around 1844, Harriet married John Tubman, a free Black man, and changed her last name from Ross to Tubman. The marriage was not good, and the knowledge that two of her brothers—Ben and Henry—were about to be sold provoked Harriet to plan an escape.
Harriet Tubman: Underground Railroad
On September 17, 1849, Harriet, Ben and Henry escaped their Maryland plantation. The brothers, however, changed their minds and went back. With the help of the Underground Railroad, Harriet persevered and traveled 90 miles north to Pennsylvania and freedom.
Tubman found work as a housekeeper in Philadelphia, but she wasn’t satisfied living free on her own—she wanted freedom for her loved ones and friends, too.
She soon returned to the south to lead her niece and her niece’s children to Philadelphia via the Underground Railroad. At one point, she tried to bring her husband John north, but he’d remarried and chose to stay in Maryland with his new wife.
Fugitive Slave Act
The 1850 Fugitive Slave Act allowed fugitive and freed workers in the north to be captured and enslaved. This made Harriet’s role as an Underground Railroad conductor much harder and forced her to lead enslaved people further north to Canada, traveling at night, usually in the spring or fall when the days were shorter.
She carried a gun for both her own protection and to “encourage” her charges who might be having second thoughts. She often drugged babies and young children to prevent slave catchers from hearing their cries.
Over the next 10 years, Harriet befriended other abolitionists such as Frederick Douglass, Thomas Garrett and Martha Coffin Wright, and established her own Underground Railroad network. It’s widely reported she emancipated 300 enslaved people; however, those numbers may have been estimated and exaggerated by her biographer Sarah Bradford, since Harriet herself claimed the numbers were much lower.
Nevertheless, it’s believed Harriet personally led at least 70 enslaved people to freedom, including her elderly parents, and instructed dozens of others on how to escape on their own. She claimed, “I never ran my train off the track and I never lost a passenger.”
Harriet Tubman’s Civil War Service
When the Civil War broke out in 1861, Harriet found new ways to fight slavery. She was recruited to assist fugitive enslaved people at Fort Monroe and worked as a nurse, cook and laundress. Harriet used her knowledge of herbal medicines to help treat sick soldiers and fugitive enslaved people.
In 1863, Harriet became head of an espionage and scout network for the Union Army. She provided crucial intelligence to Union commanders about Confederate Army supply routes and troops and helped liberate enslaved people to form Black Union regiments.
Though just over five feet tall, she was a force to be reckoned with, although it took over three decades for the government to recognize her military contributions and award her financially.
Harriet Tubman’s Later Years
After the Civil War, Harriet settled with family and friends on land she owned in Auburn, New York. She married former enslaved man and Civil War veteran Nelson Davis in 1869 (her first husband, John, had died in 1867), and they adopted a little girl named Gertie a few years later.
Harriet had an open-door policy for anyone in need. She supported her philanthropy efforts by selling her home-grown produce, raising pigs and accepting donations and loans from friends. She remained illiterate yet toured parts of the northeast speaking on behalf of the women’s suffrage movement and worked with noted suffrage leader Susan B. Anthony.
In 1896, Harriet purchased land adjacent to her home and opened the Harriet Tubman Home for Aged and Indigent Colored People. The head injury she suffered in her youth continued to plague her and she endured brain surgery to help relieve her symptoms. But her health continued to deteriorate and eventually forced her to move into her namesake rest home in 1911.
Pneumonia took Harriet Tubman’s life on March 10, 1913, but her legacy lives on. Schools and museums bear her name and her story has been revisited in books, movies and documentaries.
Harriet Tubman: $20 Bill
In 2016, the United States Treasury announced that Harriet’s image would replace that of former President and slaveowner Andrew Jackson on the $20 bill. Treasury Secretary Steven Mnuchin (who served under President Trump) later announced the new bill would be delayed until at least 2026. In January 2021, President Biden’s administration announced it would speed up the design process to mint the bills honoring Tubman’s legacy.
Tubman even had a World War II Liberty ship named after her, the SS Harriet Tubman.
The Victorian era was one of science and innovation. Cameras, cars, electricity and evolution were heralded in under the reign of Queen Victoria. In the world of Chemistry, Dalton and Faraday were making discoveries in atomic theory and electricity. One of the most famous chemists of this era was William Henry Perkin.
In 1856, an 18-year-old William Perkin, then August Wilhelm von Hofmann’s assistant at the Royal College of Chemistry, was tasked with creating a chemical synthesis of quinine, the anti-malarial found in tonic water. Perkin made several attempts at the synthesis over the Easter vacation in his home laboratory, using coal tar as a source of aniline. Oxidizing the aniline with potassium dichromate gave a black sludge which didn’t contain quinine: it contained something far more exciting. Perkin noticed whilst cleaning out a flask with ethanol that a purple solution had formed – an observation which led to Perkin becoming one of the most celebrated chemists of the Victorian era.
The purple substance – initially named aniline purple – was one of the world’s first synthetic dyes: mauveine. Mauveine’s significance as a dye is its elusive colour. Throughout history, purple clothes have been worn almost exclusively by the richest in society due to the expense of creating purple dyes. Phoenician dye, known as ‘Purple of the Ancients’, is a famous example made from predatory sea snails.
Perkin was encouraged by his family to test the purple substance for colouring clothes. A sample was sent to Messrs Pullar of Perth who gave their approval. Finding success, he quickly patented the method. He set up a factory with his brother, funded by his father. In doing so, he brought purple to the Victorian mass market.
Purple became the height of fashion in Paris and London in the late 1850s to early 1860s, and the frenzy over mauve became known as ‘mauveine measles’. Even Queen Victoria was not exempt from the excitement, appearing in 1862 at the International Exhibition wearing a silk dress colored by mauveine. Empress Eugénie, wife of Napoleon III, wore mauveine-dyed dresses to state functions.
But within this story there lurks a curious mystery. Closer inspection of Perkin’s synthesis method reveals that he may have been hiding something. There are eight bottles of mauveine alleged to have been made by Perkin left in the whole world, spread across six museums in four cities: London, Manchester, Bradford and New York. Museum-stored mauveine was tested in the 1990s and is rich in two main components – the chromophores of mauveine known as mauveine A and mauveine B.
Dr. John Plater at the University of Aberdeen repeated Perkin’s synthesis as it was written in the original patent, and here’s where the curiosity begins. The synthesis produces not two chromophores of mauveine, but four: A, B, B2 and C.
Did Perkin miss something out when he patented his method? Or are the samples in these museums not genuine Perkin’s mauveine? To solve this mystery, Dr. Plater began investigating the synthesis of mauveine. Perkin’s starting material was aniline extracted from coal tar (later it was made commercially from coal tar), which would also have contained two impurities, ortho- and para-toluidine, both of which have a similar chemical structure to aniline.
Dr. Plater’s attempts to make mauveine from different combinations of aniline and toluidines were always unsuccessful – he never managed to create a product with only the A and B chromophores. Every synthesis created four chromophores of mauveine. Removing the B2 and C chromophores was also impossible.
In the search for more information, Dr. Plater was given access to analyze three samples of Perkin’s mauveine. These samples were stored in museums: one each in Manchester, Bradford, and Sudbury, the London suburb where Perkin built both his family home and his factory. One sample is accompanied by a letter, addressed to Prof. Henry Armstrong, Fellow of the Royal Society, from William Henry Perkin’s son, Frederick Mollwo Perkin. The ‘Mollwo’ letter, as it is now known, provides evidence that the museum-stored mauveine samples are from Perkin’s factory.
Dr Plater used liquid chromatography-mass spectrometry (LC-MS) to identify the chromophores present in these mauveine samples. In LC-MS, liquid chromatography is used to separate compounds by running them along a long column filled with reverse phase silica gel. The different chromophores reach the end of the column at different times. Mass spectrometry can then be used to identify the structure of each chromophore from its molecular mass and the way it fragments. This revealed that the Bradford and Sudbury mauveines, like the Manchester mauveine, are highly rich in mauveine A and B.
Museum-stored mauveines match each other in their compositions, and the Mollwo letter provides evidence that the samples originate from Perkin. However, the synthesis described by Perkin doesn’t produce mauveine with the correct composition. Dr. Plater deduces from this that Perkin actually used a different synthesis method from the one he said he used.
A final clue in this mystery comes from sixpence stamps. Victorian postage stamps printed using mauveine dyes are available to purchase online. Dr. Plater analyzed the mauveine in 15 sixpence stamps using LC-MS. Each stamp had a slightly different composition, generally containing all four chromophores. The fluctuating composition provides further evidence that the method for synthesizing mauveine changed over time. Dr. Plater believes his own method is closer to the one Perkin actually used than the method in Perkin’s patent.
One question remains: why would Perkin patent one method for making mauveine, but use another? An answer may be to do with the yield of product. Dr. Plater notes that mauveine is actually very difficult to make. The yields are low – about 1 per cent. The method proposed by Dr. Plater increases the yield to about 5 per cent. Perkin discovered this synthesis by accident, but clearly understood the chemistry well enough to recognize the need for research and development.
But there may be another answer to this question. In a lecture in 1896, Perkin revealed his concerns about his competition: other manufacturers of mauveine were using copper chloride as an oxidizing agent in place of Perkin’s potassium dichromate. Dr. Plater has strong evidence now that Perkin never revealed his true method. Perkin may well have done this intentionally: as the demand for synthetic dyes grew, Perkin wanted to avoid his competition getting hold of his secrets.
Perkin was the first person to mass produce a synthetic dye, but this research uncovers a new aspect to Perkin’s achievements. Analysis of mauveine stored in museums, Victorian stamps, and Perkin’s original patent provides evidence that Perkin iterated and improved his method of making his dye, making him one of the first chemists to realize the value of research and development. Because Perkin never revealed his true method, we may never know how he did it – but with Dr. Plater’s research we are one step closer to the truth.
It took a war, famine, and poultry to develop the technological breakthrough responsible for saving thousands of premature infants. The Franco-Prussian war in 1870-1871, along with a concomitant famine, had contributed to a significant population decline in France. To increase the growth rate, the French needed to start having more babies, as quickly as possible. But one obstetrician realized that if he could find a way to reduce infant mortality, then the population growth rate problem could be solved far sooner.
That French obstetrician was Dr. Étienne Stéphane Tarnier, who, having observed the benefits of warming chambers for poultry at the Paris Zoo, had similar chambers constructed for premature infants under his care. These warm air incubators, introduced at L’Hôpital Paris Maternité in 1880, were the first of their kind. Dr. Pierre Budin began publishing reports of the successes of these incubators in 1888. The incubators had solved the deadly problem of thermoregulation that many premature babies faced.
Dr. Budin wanted to share his innovation with the world, but few in the stubborn medical establishment would listen. Many doctors viewed the practice as pseudo-scientific and outside the realm of standard care. But Dr. Budin was convinced that the Tarnier incubators would save so many lives that he enlisted the help of an associate, Dr. Martin Couney, in exhibiting the new incubators at the World Exposition in Berlin in 1896.
Apparently blessed with skills in showmanship as well as medicine, Dr. Couney took the assignment perhaps a step farther than what Dr. Budin had originally anticipated; Couney asked the Berlin Charity Hospital to lend him some premature babies for the experiment, and the hospital granted his request, thinking the children had little chance of survival anyway. Once he had hired a cadre of nurses to fully demonstrate the capabilities of the incubators, he was ready to take the show on the road.
Nestled between exhibits of the Congo Village and the Tyrolean Yodelers, “Couney’s Kinderbrutanstalt,” or ‘Child Hatchery,’ became a wild success. Remarkably, all six babies in the Tarnier incubators survived. From there, Couney took his entourage to the United States where he went on to share his show at virtually every large exhibition and at the World’s Fair. He ultimately settled at New York City’s Coney Island amusement park and connected parents eager to save the lives of their premature newborns with circus sideshow visitors willing to pay 25¢ to view the uncannily tiny babies. It was an odd connection indeed, but a brilliant one that kept the warming glow of the incubator lights on for over 40 years, and saved thousands of babies in the process.
The babies were premature infants kept alive in incubators pioneered by Dr. Martin Couney. The medical establishment had rejected his incubators, but Couney didn’t give up on his aims. Each summer for 40 years, he funded his work by displaying the babies and charging admission — 25 cents to see the show.
In turn, parents didn’t have to pay for the medical care, and many children survived who never would’ve had a chance otherwise. Lucille Horn was one of them. Born in 1920, she, too, ended up in an incubator on Coney Island. “My father said I was so tiny, he could hold me in his hand,” she tells her own daughter, Barbara, on a visit with StoryCorps in Long Island, N.Y. “I think I was only about 2 pounds, and I couldn’t live on my own. I was too weak to survive.”
She’d been born a twin, but her twin died at birth. And the hospital didn’t show much hope for her, either: The staff said they didn’t have a place for her; they told her father that there wasn’t a chance in hell that she’d live. “They didn’t have any help for me at all,” Horn says. “It was just: You die because you didn’t belong in the world.”
But her father refused to accept that for a final answer. He grabbed a blanket to wrap her in, hailed a taxicab and took her to Coney Island — and to Dr. Couney’s infant exhibit.
“How do you feel knowing that people paid to see you?” her daughter asks. “It’s strange, but as long as they saw me and I was alive, it was all right,” Horn says. “I think it was definitely more of a freak show. Something that they ordinarily did not see.”
Horn’s healing was on display for paying customers for quite a while. It was only after six months that she finally left the incubators.
Years later, Horn decided to return to see the babies — this time as a visitor. When she stopped in, Couney happened to be there, and she took the opportunity to introduce herself.
“And there was a man standing in front of one of the incubators looking at his baby,” Horn says, “and Dr. Couney went over to him and he tapped him on the shoulder.”
“Look at this young lady,” Couney told the man then. “She’s one of our babies. And that’s how your baby’s gonna grow up.” After all, Horn was just one of thousands of premature infants that Couney cared for and exhibited at world fairs, exhibits and amusement parks from 1896 until the 1940s. He died in 1950, shortly after incubators like his were introduced to most hospitals.
At the time, Couney’s efforts were still largely unknown — but there is at least one person who will never forget him. “You know,” she says, “there weren’t many doctors then that would have done anything for me. Ninety-four years later, here I am, all in one piece. And I’m thankful to be here.”
Although this incident happened over 90 years ago, it still intrigues me. This was one of the first mysteries I ever read about in high school and it’s stuck with me. March 1 is the 91st anniversary of the event, so I am bringing the tale to you from the All That’s Interesting website. (I have added some pictures because some of the pictures in the story would not post for me.)
The Tragic Story Of The Lindbergh Baby Kidnapping
By Katie Serena
On May 12, 1932, the tiny body of one-year-old Charles Augustus Lindbergh Jr. was discovered in the woods outside of Trenton, New Jersey. The coroner’s report stated that the child had been dead for over two months. The child’s skull had a hole in it as well as several other fractures, and the coroner ruled the cause of death as a blow to the head. Several of the baby’s body parts were also missing.
The Lindbergh baby, the son of Spirit of St. Louis pilot Charles Lindbergh Sr., had been missing for roughly three months after being kidnapped from his crib at the Lindbergh home.
The child had been put to bed by his nurse at 7:30 PM. Two hours later, Lindbergh Sr. heard a noise that he assumed was a wooden crate snapping in the kitchen. At 10:00 PM, the nurse discovered that the child’s crib was empty.
After confirming that the child was not with the nurse or with his mother, Lindbergh Sr. found a ransom note on the windowsill and a broken ladder outside the window. After reading the note, he fruitlessly searched the house and the grounds before calling the police.
For three months, the Lindbergh family, along with the FBI, searched for the child, even fulfilling an enormous ransom request and interviewing countless suspects and witnesses.
In the end, the official culprit named was Richard Hauptmann, an immigrant from Germany who had a criminal record back in his homeland. Police discovered Hauptmann in possession of $14,000 of the original $50,000 used to pay the ransom after tracking him through one of the $10 bills he had spent at a local gas station.
Hauptmann was arrested and charged with capital murder of the Lindbergh baby, a charge that allowed the death penalty as a possible option. The trial was dubbed the “Trial of the Century,” with one reporter even claiming it was the “biggest story since the Resurrection.”
As big as the trial was, the jury was surprisingly quick to return a guilty verdict. Hauptmann was immediately sentenced to death, and his two requests for appeal were both denied. On April 3, 1936, four years after the kidnapping, Richard Hauptmann was executed in the electric chair.
The Official Investigation Of The Lindbergh Baby Kidnapping
Though the case seemed open and shut on paper, the investigation was far from it. Between the media frenzy, the mysterious ransom letters, and the numerous side investigations, it’s a miracle anyone was convicted.
When the Lindbergh baby kidnapping was first reported, hundreds of loyal Lindbergh fans and concerned citizens descended upon the Lindbergh estate. While the media attention helped boost the case and spread the word about the missing toddler, the heavy traffic on the estate effectively destroyed any footprint evidence that might have been found outside the home.
It also encouraged hundreds of false reports of sightings and information. Military officials and investigators all offered their services, claiming to have expertise in kidnappings and law enforcement. However, only one of them truly did.
Herbert Norman Schwarzkopf, superintendent of the New Jersey State Police Department, along with Lindbergh, theorized that the Lindbergh kidnapping was part of an organized crime ring rather than a single perpetrator seeking the ransom money. Following that lead, they reached out to mobsters, both in and out of prison, hoping one of them would have information on the Lindbergh baby.
Al Capone himself even reached out to Lindbergh, offering his services in exchange for an early prison release, though he was quickly denied. It was likewise decided that mobsters were unlikely to be helpful when it came to offering up information for free.
Due to the media circus and the high profile of Lindbergh, President Herbert Hoover was notified of the kidnapping the morning after it happened. Though kidnappings were usually dealt with among local authorities, Hoover assigned the entire Bureau of Investigation (not yet Federal) to the case and authorized them to work with the New Jersey police.
As a reward for information pertaining to Charles Lindbergh Jr.’s case, the police department offered up $25,000. In addition, the Lindbergh family offered another $50,000 of their own.
The Unofficial Investigation
While the New Jersey Police were investigating alongside the Lindbergh family, a retired New York school teacher was also taking an interest in the Lindbergh baby case.
John F. Condon, at the time a well-known personality in the Bronx, wrote a letter to a local newspaper offering a reward of $1,000 if the kidnapper would return “Little Lindy” to a Catholic priest. Surprisingly, Condon received a letter back from people claiming to be the kidnappers, asking him to act as an intermediary between them and Lindbergh.
Lindbergh, desperate to find his son, agreed, allowing Condon to fulfill the letter’s request. Condon placed a classified ad in another newspaper and arranged a meeting with one of the kidnappers in Woodlawn Cemetery in the Bronx.
The meeting did indeed take place, though under cover of darkness, so the culprit’s face was never clearly seen. The man said his name was John and claimed he was part of a Scandinavian gang. He claimed to have the toddler in his possession on a boat off the coast and promised to return him for the ransom. When Condon doubted the man’s story, the man promised to send the baby’s pajamas as proof.
Indeed, a few weeks later, Condon received a toddler’s sleeping suit in the mail. Lindbergh confirmed that the pajamas were his son’s and asked Condon to continue communicating with the kidnappers and fulfilling their requests.
The Ransom For The Lindbergh Baby
Over the course of the Lindbergh kidnapping investigation, the Lindberghs and Condon received a total of seven ransom letters. The first was found by Charles in his son’s room immediately after discovering the boy was gone. It outlined the Lindbergh baby kidnapping and asked for $50,000 to be delivered to a yet-undisclosed location in small bills.
The first note was signed with a “signature,” a hand-drawn symbol comprising three circles and three punched-out holes. The second and third notes, delivered to the Lindbergh home and local investigators, carried the same symbol. The rest of the notes were delivered to Condon and did not carry the symbol, though their authenticity was confirmed.
After the delivery of the seventh note, the Lindberghs and the police authorized Condon to orchestrate a drop-off of the funds. The ransom money consisted of gold certificates, chosen because they were about to be withdrawn from circulation, placed inside a handmade box specifically designed to be easy to recognize later. The bills were not marked, but each bill’s serial number was recorded so it could be traced.
Condon met with “John” on April 2, 1932, to hand over the money. At the meeting he was told that Charles Lindbergh Jr. was in the custody of two innocent women, but “John” provided no further information.
Having no leads besides “Cemetery John,” the police began tracking the serial numbers of the ransom bills.
A pamphlet containing the serial numbers, along with instructions for what to do if one was found, was distributed to businesses in New York. Some of the bills turned up, though most went unseen. The bills that did appear showed up randomly, in scattered locations such as Chicago and Minneapolis, and the people who had spent them were never located.
A break in the case came on the day that the gold certificates, which made up a large sum of the ransom, were ordered to be turned in for other bills. A New York man brought $2,980 into a Manhattan bank, hoping to exchange them. It was only after he left the bank that it was discovered that the serial numbers matched those of the ransom bills.
Over a period of 30 months, police noticed that many of the bills had started popping up, specifically on the Upper East Side of Manhattan; even more specifically, they were being spent along the Lexington Avenue subway route. After a local gas station called to report that it had received one of the ransom bills, police were led to Richard Hauptmann.
Though Hauptmann is considered the official kidnapper of Charles Lindbergh Jr., that hasn’t stopped conspiracy theorists from coming up with their own version of what actually happened during the Lindbergh kidnapping.
Defenders of Hauptmann are quick to point out that his fingerprints were never found on the ladder or any of the ransom notes. They also note that the crime scene was a mess from the start and that any available evidence was quickly compromised by the media circus.
Some experts — both self-proclaimed and legitimate — have theorized that Hauptmann was a scapegoat and that Lindbergh knew who the real kidnapper was but was either in on it or too afraid to say anything.
In fact, one of the most popular, and some might say substantiated, claims is that the kidnapping was perpetrated by Charles Lindbergh himself. Some say that he accidentally killed his son while attempting a practical joke and then staged the kidnapping, pointing the finger at Hauptmann to cover his own deeds.
Some believe that Lindbergh orchestrated the kidnapping as a publicity stunt and that after the hired kidnappers didn’t get whatever it was Lindbergh had promised them, the stunt went horribly wrong.
Lindbergh, his family, and the New Jersey police have argued against the theories that he was responsible for the kidnapping, insisting that everything they knew about the case suggested it had been legitimate and that the toddler’s death was simply the result of the kidnapper snapping under pressure.
Whatever the case, though it is closed, the Lindbergh baby kidnapping has become one of the most controversial and conspiratorial cases to ever be discussed by the American public.
Outside of pop culture and media, the case broke ground when it pushed Congress to pass the Federal Kidnapping Act, which made transporting a kidnapping victim across state lines a federal offense. The law is commonly referred to as the “Lindbergh Law.”
Today is the 40th anniversary of the M*A*S*H series finale. The Mental Floss website has a list of 17 interesting things we might not know or remember (from an article dated February 28, 2018).
In 1968, surgeon H. Richard Hornberger—using the nom de plume of Richard Hooker—collaborated with writer W.C. Heinz to create the book MASH: A Novel About Three Army Doctors, based on his experiences with the 8055th Mobile Army Surgical Hospital during the Korean War. Two years later, Robert Altman used the book as the basis for a movie about the fictional 4077th unit (he cut the number 8055 in half). Two years after that, M*A*S*H came to life again in the form of an 11-season television series. And 35 years ago today, that show culminated in the most-watched series finale in television history. Here are some facts about the show that won’t get you a Section 8.
ALAN ALDA AND JAMIE FARR SERVED IN THE U.S. ARMY.
Alda (Hawkeye Pierce) was in the Army Reserve for six months in Korea. Farr enlisted, and was stationed in Japan when Red Skelton requested his services on his USO Tour through Korea. Wayne Rogers (Trapper John McIntyre) joined the U.S. Navy for a time as a ship navigator. Mike Farrell (B.J. Hunnicut) served in the U.S. Marine Corps.
MCLEAN STEVENSON AUDITIONED FOR HAWKEYE, AND COMEDIAN ROBERT KLEIN TURNED DOWN THE ROLE OF TRAPPER JOHN.
Stevenson was convinced to take the role of Lt. Colonel Henry Blake instead. As for Klein, he denied the claim that he lived to regret the decision.
LARRY GELBART WROTE THE PILOT IN TWO DAYS FOR $25,000.
The veteran screenwriter had been living in London after growing tired of Hollywood, but he couldn’t pass up the opportunity to try to adapt Robert Altman’s movie for television audiences.
KLINGER WAS ONLY SUPPOSED TO BE IN ONE EPISODE.
He was also supposed to be gay. Jamie Farr’s character was changed to a heterosexual who cross-dressed to try to get himself kicked out of Korea. Allegedly, the Klinger character was influenced by comedian Lenny Bruce’s claim that he got discharged from the Navy for claiming to have “homosexual tendencies.”
ONLY THE NETWORK WANTED THE LAUGH TRACK.
Gelbart and executive producer Gene Reynolds were against the canned laughter; unfortunately, CBS knew of no other way to present a 30-minute “comedy.” Gelbart and Reynolds did manage to get the network to agree to take out the laughing during the scenes in the operating room, and as the seasons progressed, the track got quieter and quieter. In the U.K., the BBC omitted the laugh track entirely.
CBS DIDN’T WANT ONE “UNPATRIOTIC” EPISODE.
An episode where soldiers stand outside in the freezing cold so that they can make themselves sick enough to be sent home was rejected by CBS. That soldier tactic was apparently actually used during the Korean War.
THE WRITERS CAME UP WITH AN INGENIOUS WAY OF DEALING WITH SCRIPT COMPLAINTS.
After growing tired of listening to cast members’ notes about their scripts, M*A*S*H writer Ken Levine and his fellow scribes twice changed their scripts so that the actors were forced to pretend it was parka weather on 90- to 100-degree days at their Malibu ranch set. The actors took the hint, and the “ticky tack” notes stopped.
WAYNE ROGERS WAS ABLE TO LEAVE THE SHOW BECAUSE HE NEVER SIGNED A CONTRACT.
Rogers was threatened with a breach of contract lawsuit. The problem was that he had never signed a deal, objecting to the standard contract given to TV actors when he had started playing Trapper John, particularly the “morals clause,” which he considered antiquated. Rogers said that aside from missing the cast—and his friendship with Alda in particular—he had no regrets about leaving the show after season three.
ALDA WAS THE ONLY ACTOR WHO WAS AWARE OF HENRY BLAKE’S FATE UNTIL MOMENTS BEFORE SHOOTING THE FINAL SCENE IN “ABYSSINIA, HENRY.”
Gelbart and Reynolds used McLean Stevenson’s desire to leave after the third season to “make a point” about the “wastefulness” of war, and decided to kill off Henry Blake. After distributing the script without the last page and shooting all of the scenes written therein, Gelbart asked the cast to wait a few minutes before the start of the end-of-season wrap party and gave each of them a copy of the final page, in which Radar enters the O.R. and announces that Henry didn’t make it.
Larry Linville (Frank Burns) immediately remarked that it was “f***ing brilliant.” Gary Burghoff (Radar) turned to Stevenson and called him a son of a bitch, because he was going to get an acting Emmy for the episode. (He didn’t.) They then shot the scene in two takes. Gelbart and Reynolds claimed they received over 1000 letters from people upset over the ending. Reynolds also claimed that CBS was so unhappy with the decision that in at least one repeat airing, they cut out the final scene.
THE WRITERS RAN OUT OF NAMES.
During season six, there’s an episode that features four Marine patients named after the 1977 California Angels infield. Throughout season seven, the patients were named after the 1978 Los Angeles Dodgers. Ken Levine didn’t just use baseball players’ names though; in “Goodbye Radar,” Radar’s new girlfriend was named after one of Levine’s former lady friends, Patty Haven.
THE SERIES LASTED MUCH LONGER THAN THE ACTUAL KOREAN WAR.
The series spent 11 years telling the story of Army doctors and nurses dealing with a three year, one month, and two day war.
ALDA CO-WROTE 13 AND DIRECTED 31 EPISODES OF THE SERIES.
That 31 count includes the series finale. Alda was the first person to ever win an Emmy for acting, directing, and writing on the same program.
A METRIC TON OF FUTURE STARS MADE GUEST APPEARANCES.
Ron Howard played an underage Marine. Leslie Nielsen played a Colonel. Patrick Swayze portrayed an injured soldier with leukemia. John Ritter, Laurence Fishburne, Pat Morita, Rita Wilson, George Wendt, Shelley Long, Ed Begley Jr., Blythe Danner, Teri Garr, and even Andrew Dice Clay also all visited the 4077th.
THE SERIES FINALE IS STILL THE MOST WATCHED EPISODE OF TELEVISION IN AMERICAN HISTORY.
Seventy-seven percent of the people watching television in the United States on the night of Monday, February 28, 1983 were watching the two-and-a-half-hour series finale, “Goodbye, Farewell and Amen.” That was 121.6 million people. A company only had to pay $30,000 to run a 30-second commercial when M*A*S*H got started in 1972. For the series finale, a 30-second spot cost $450,000.
THERE WERE THREE SPINOFFS.
Trapper John, M.D., aired from 1979 to 1986 and was about Trapper John McIntyre’s present-day tenure as chief of surgery back in San Francisco (it didn’t star Wayne Rogers). AfterMASH featured Col. Potter (Harry Morgan), Father Mulcahy (William Christopher), and Klinger (Jamie Farr) working at a veterans’ hospital in Missouri right after the events of M*A*S*H; it was cancelled in its second season as it was unable to compete with The A-Team. W*A*L*T*E*R followed the new adventures of Walter “Radar” O’Reilly (Burghoff again), who became a St. Louis cop after losing the family farm and his wife (not Patty Haven) and attempting suicide. The pilot wasn’t picked up, and only aired once, and only in the eastern and central time zones, on CBS on July 17, 1984.
RADAR’S TEDDY BEAR WAS SOLD AND RETURNED TO BURGHOFF.
Burghoff said Radar’s teddy bear had been lost for 30 years until it suddenly turned up at an auction in 2005. A medical student bought it for $11,500, and promptly sold it back to Burghoff.
A CONSTRUCTION WORKER FOUND THE SHOW’S TIME CAPSULE ALMOST IMMEDIATELY.
In the series’ penultimate episode, “As Time Goes By,” the characters bury a time capsule under the Fox Ranch. Two months later, the land was sold. Soon after, a construction worker found the capsule and got in contact with Alan Alda to ask what he should do with it. After he was told to keep it, Alda claimed the construction worker “didn’t seem very impressed.”
“Every word carries a secret inside itself; it’s called etymology. It is the DNA of a word.” — Mary Ruefle, “Madness, Rack & Honey”
“Etymology” derives from the Greek word etumos, meaning “true.” The practice of etymology is uncovering the truth by tracing the root of a word. If you’re interested in language, it can be quite exhilarating, like being a linguistic detective. There will be few pictures in this one, so settle down….this will take you a few minutes!!!
“anything that befalls of ruinous or distressing nature; any unfortunate event,” especially a sudden or great misfortune, 1590s,
from French désastre (1560s),
from Italian disastro, literally “ill-starred,” from dis- (here pejorative) + astro “star, planet,”
from Latin astrum,
from Greek astron “star.”
The origin of the word “disaster” points to unfavorable events being blamed on the positions of certain planets. Destiny was written in the stars: in some mythological conceptions of fate, the universe is fixed and inevitable.
A far cry from World’s Strongest Man, the origin of the word ‘muscle’ is perhaps the most surprising.
“contractible animal tissue consisting of bundles of fibers,”
late 14c., “a muscle of the body,”
from Latin musculus “a muscle,” literally “a little mouse.”
Rather than relating to strength and brawn as we understand it, ‘muscle’ derives from the appearance of a muscle under the skin, particularly the biceps, which speakers of both Latin and Greek thought resembled a mouse running beneath the skin.
Perhaps you’ve been told by an English teacher in the past to avoid using the word ‘nice’. This is because the word is so commonly used in our language that it’s not highly descriptive or imaginative. Many English teachers consider it a cop-out. Yet its origins are far more interesting than the word appears.
late 13c., “foolish, ignorant, frivolous, senseless,”
from Old French nice (12c.) “careless, clumsy; weak; poor, needy; simple, stupid, silly, foolish,”
from Latin nescius “ignorant, unaware,” literally “not-knowing.”
Old English clud “mass of rock, hill,” related to clod.
The modern sense “rain-cloud, mass of evaporated water visible and suspended in the sky” is a metaphoric extension that begins to appear c. 1300 in southern texts, based on the similarity of cumulus clouds and rock masses.
The usual Old English word for “cloud” was weolcan (see welkin).
In Middle English, skie also originally meant “cloud.”
The last entry for cloud in the original rock mass sense in Middle English Compendium is from c. 1475.
The origins of the word ‘cloud’ are surprising. You wouldn’t automatically associate a cloud’s wispy appearance with the solidity of rocks, but the etymology suggests the name refers to the massed, heaped shape of a cloud, which resembles earth formations.
The word ‘oxymoron’ is a great example of a word being an example of itself.
in rhetoric, “a figure conjoining words or terms apparently contradictory so as to give point to the statement or expression,”
1650s, from Greek oxymōron, noun use of neuter of oxymōros (adj.) “pointedly foolish,”
from oxys “sharp, pointed” (from PIE root *ak- “be sharp, rise (out) to a point, pierce”) + mōros “stupid” (see moron).
Now, it’s used more broadly to denote a contradiction in terms. Originally, though, it was a clash of terms around sharpness and dullness.
The origins of ‘quarantine’ may interest you.
1660s, “period a ship suspected of carrying disease is kept in isolation,”
from Italian quaranta giorni, literally “space of forty days,”
from quaranta “forty,” from Latin quadraginta “forty,” which is related to quattuor “four” (from PIE root *kwetwer- “four”). So called from the Venetian policy (first enforced in 1377) of keeping ships from plague-stricken countries waiting off its port for 40 days to assure that no latent cases were aboard. Also see lazaretto.
The extended sense of “any period of forced isolation” is from the 1670s.
Earlier in English the word meant “period of 40 days in which a widow has the right to remain in her dead husband’s house” (1520s), and, as quarentyne (15c.), “desert in which Christ fasted for 40 days,” from Latin quadraginta “forty.”
We understand ‘quarantine’ as a period of isolation to prevent the spread of an illness, but the background on this is very interesting. The root of the word is more specific to the period of time elapsed.
Without the word ‘tragedy’, we wouldn’t have one of the greatest songs by the Bee Gees. But there is also an interesting word history to be grateful for.
late 14c., “play or other serious literary work with an unhappy ending,”
from Old French tragedie (14c.), from Latin tragedia “a tragedy,”
from Greek tragodia “a dramatic poem or play in formal language and having an unhappy resolution,”
apparently literally “goat song,” from tragos “goat, buck” + ōidē “song” (see ode), probably on model of rhapsodos (see rhapsody).
Although the specificity of the goat connection is debated, the connection to goats, in general, is accepted. There are a few different possibilities as to why. The etymology includes the literal translation “goat song”. Tragedy as we know it has its roots in ancient Greece, where it’s thought people dressed as goats and satyrs in plays. There are other theories surrounding goat sacrifices. Either way, who knew goats were involved at all?
What would a list of surprising etymology be without the word ‘surprise’ itself?
also formerly surprize, late 14c.,
“unexpected attack or capture,” from Old French surprise “a taking unawares” (13c.),
from noun use of past participle of Old French sorprendre “to overtake, seize, invade” (12c.).
Meaning “something unexpected” first recorded 1590s, that of “feeling of astonishment caused by something unexpected” is c. 1600.
Meaning “fancy dish” is attested from 1708.
When you think of the word ‘surprise’ today, you might think of smiling faces. Historically, though, it had a much more violent sense: the word is rooted in invasion, where having the element of surprise was an advantage. It is also interesting that its root words mean “to seize, grasp,” which relates it to words like “comprehend.”
It is interesting how the word ‘comrade’ is considered a non-neutral term, whether used by a veteran recalling time spent with his old army comrades or among the political left. Its origins point to it being more widely applicable.
1590s, “one who shares the same room,” hence “a close companion,”
from French camarade (16c.),
from Spanish camarada “chamber mate,” or Italian camerata “a partner,”
from Latin camera “vaulted room, chamber” (see camera).
In Spanish, a collective noun referring to one’s company.
In 17c., sometimes jocularly misspelled comrogue.
Used from 1884 by socialists and communists as a prefix to a surname to avoid “Mister” and other such titles.
Also related: Comradely; comradeship.
With this considered, you could call any of your cohabitants “comrade”. And it’s perfectly acceptable to use it for your partner, no matter what your politics are.
To end where we started, with the spirit of investigation, let’s have a look at the word ‘clue’.
“anything that guides or directs in an intricate case,” 1590s, a special use of a revised spelling of clew “a ball of thread or yarn” (q.v.).
The word, which is native Germanic, was clewe (also cleue) in Middle English; some words borrowed from Old French and Middle French were later reformed, and this process was extended to native words (hue, true, clue) which had ended in a vowel and -w.
The spelling clue is first attested mid-15c.
The sense shift is originally in reference to the clew of thread given by Ariadne to Theseus to use as a guide out of the Labyrinth in Greek mythology. The purely figurative sense of “that which points the way,” without regard to labyrinths, is from 1620s.
As something which a bewildered person does not have, by 1948.
Word origins rooted in old stories like this are the most fascinating. A clue could be any object now, but once upon a time it was explicitly a ball of yarn a character used to find his way.
In honor of George Washington’s birthday, I am sharing an article, written by Dave Roos, called 8 Crazy Facts About the Washington Monument. Enjoy!
On Sept. 19, 2019, the Washington Monument reopened to the public after a three-year renovation. Eager tourists got in line early to experience the zippy new elevator and take in one of the best views east of the Mississippi.
The Washington Monument is an impressive structure dedicated to an American icon, but its construction was less than smooth (it was actually derailed for decades by a political coup). Here are eight surprising facts about America’s favorite obelisk.
A Memorial for Washington Was Planned Way Before He Died
It’s hard to overstate how much Americans loved George Washington. As early as 1783, when Washington was very much alive, plans were in the works for erecting a large statue of the first president on horseback near the Capitol building. In fact, the architect of Washington, D.C., the French landscape engineer Pierre Charles L’Enfant, left an open place for the statue in his drawings. And that’s almost exactly where the Washington Monument sits today.
Congress failed to act on the equestrian statue, and even after Washington died in 1799, legislators couldn’t agree on what kind of monument best suited the national hero. Frustrated with congressional foot-dragging, a private organization called the Washington National Monument Society was formed in 1833 to raise money and solicit designs for a large-scale homage to America’s beloved first president.
The Original Design Was a Mashup
In 1836, the Washington National Monument Society announced a design contest for the future Washington Monument and the winning sketch was submitted by 29-year-old architect Robert Mills, who would go on to design the U.S. Post Office, the Patent Office and the Treasury Building.
Mills’ original design was a mashup of architectural references. First, there was to be a 600-foot obelisk with a flattened top, a nod to the Egyptomania that had captured the early 19th-century imagination. (Note that soon after Washington’s death, the House of Representatives proposed the construction of a marble pyramid, 100 feet on each side, to serve as the first president’s mausoleum. The pharaohs would have approved, but Congress didn’t.)
In Mills’ original sketch, the giant Egyptian obelisk was to be encircled at its base by a neoclassical temple with 30 towering columns. On top of the circular temple would be a statue of Washington on a chariot, and in between each of the 30 columns would stand statues of 30 different Revolutionary War heroes.
The National Park Service called Mills’ original plan “audacious, ambitious and expensive,” which explains why all but the obelisk was eventually scrapped.
There’s a Zinc Time Capsule in the Cornerstone
An estimated 15,000 to 20,000 people crowded the National Mall to witness the laying of the Washington Monument’s cornerstone on July 4, 1848. But first the 24,500-pound hunk of pure white marble had to be dragged through the streets on a cart with bystanders grabbing lengths of rope to help the cause.
After a droning two-hour speech by the Speaker of the House, the assembled dignitaries placed mementos in a zinc box that would be sealed in the monument’s cornerstone for eternity (or until an alien race plucks it from the ruins of Western civilization). Included in the zinc time capsule were copies of the Declaration of Independence and the Constitution, a portrait of Washington, an American flag, all the coins in circulation and newspapers from 14 states. The laying of the cornerstone was performed by a grand master of the Masonic lodges, and its actual location apparently is still a mystery.
Construction Was Stalled by the Pope’s Stone Saga
The unfinished stump of the Washington Monument, as it looked for over 25 years. During the U.S. Civil War, the site was used for the grazing and slaughtering of government cattle, earning it the nickname Beef Depot Monument.
By 1856, after eight years of slow and painstaking construction, the obelisk stood 156 feet high and would remain that way — an unfinished eyesore that Mark Twain called “a hollow, oversized chimney” — for the next 21 years. The reason, weirdly enough, had to do with the Pope.
In 1853, the Washington National Monument Society was dangerously low on funds, so they came up with a scheme whereby large donors could have a commemorative stone placed in the interior of the obelisk. One of those donors ended up being Pope Pius IX, who shipped over a 3-foot piece of marble from the Temple of Concord in Rome.
The Pope’s gift really ticked off members of the new “Know-Nothing” party, who were virulently anti-immigrant and anti-Catholic. On the night of March 6, 1854, a gang of men locked the night watchman in his shed and stole the Pope’s stone, allegedly tossing it in the Potomac.
The controversy over the stolen stone brought donations to a standstill. But even worse was what happened next; a contingent of Know-Nothings staged a coup and overthrew the leadership of the Monument Society. Donations dried up entirely and the Know-Nothings only managed to add 20 more feet to the obelisk by the outbreak of the Civil War, when construction was halted altogether.
Yes, the Monument is Three Different Colors
After the Civil War, during which the grounds of the stubby Washington Monument were used as a cattle yard and slaughterhouse, Congress finally decided to take over. On July 5, 1876, in time for the centennial celebration of the Declaration of Independence, Congress appropriated $2 million for the completion of the monument and construction resumed in 1877.
The first task of the new chief engineer, Thomas L. Casey, was to reduce the total height of the obelisk to 555 feet, exactly 10 times the width of the structure, and to spend years reinforcing the foundation with concrete.
The next issue was the masonry. The original quarry in Baltimore had shut down, so Casey tried shipping down rock from Massachusetts. But after placing only a few layers of this stone, it was clear that it was a different color and of poorer quality than the original. So, the builders changed tack yet again and brought in stone from another Baltimore quarry, which was used to finish the final two-thirds of the obelisk.
The result is that the Washington Monument is nearly white on the bottom, a tannish-pink on the top with a thin belt of light brown in the middle. Classy, Casey.
The Priceless Capstone Would Cost a Few Bucks Today
Construction of the obelisk was finally completed on Dec. 6, 1884, more than 36 years after the first cornerstone was laid, with the ceremonial setting of the capstone. When you think of precious metals befitting the capstone of a 555-foot monument dedicated to the nation’s greatest hero, you think of gold, maybe silver, but certainly not aluminum.
Yet back in the late 19th century, pure aluminum was a very rare commodity, and it was chosen for this important feature, as the metal would not tarnish. (In 1884, aluminum cost $1.10 per ounce or $26 per ounce in 2019 dollars; in 2019, aluminum cost around 78 cents per pound.) The 100-ounce aluminum capstone for the Washington Monument was the largest single piece of cast aluminum in the world. The final cost of the Washington Monument was $1.18 million in 1884 or nearly $30 million in 2019 dollars.
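To see just how far aluminum has fallen in value, here is a quick back-of-envelope calculation using the prices quoted above. It assumes avoirdupois ounces (16 to the pound), since the article mixes ounce and pound prices without saying which ounce it means:

```python
# Rough value of the 100-ounce aluminum capstone, then and now,
# using the prices quoted in the article (assumed avoirdupois ounces).
capstone_oz = 100            # capstone mass in ounces
price_1884_per_oz = 1.10     # 1884 price, dollars per ounce
price_2019_per_lb = 0.78     # 2019 price, dollars per pound

value_1884 = capstone_oz * price_1884_per_oz          # dollars in 1884 money
value_2019 = (capstone_oz / 16) * price_2019_per_lb   # 100 oz = 6.25 lb

print(f"1884 value: ${value_1884:.2f}")   # 1884 value: $110.00
print(f"2019 value: ${value_2019:.2f}")   # 2019 value: $4.88
```

So a capstone worth a respectable $110 in 1884 money would be under five dollars of scrap metal today, which is the whole joke behind the section title.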
Before the capstone was shipped to Washington, D.C., it went on exhibit on the showroom floor of Tiffany & Co. in New York City, where visitors could say they “jumped over the Washington Monument.” Yay!
For Five Glorious Years, It Was the World’s Tallest Manmade Structure
And then Eiffel built his silly tower in 1889, which at 1,063 feet is nearly twice as tall as the Washington Monument.
But the Washington Monument is — and probably always will be — the tallest structure by far in Washington, D.C., although not for the reasons you might have heard. It has nothing to do with city planners who didn’t want any building to block the view of the Capitol Building or the Washington Monument. That’s actually a myth.
The height limits on buildings in the District of Columbia were established by the Height of Buildings Acts of 1899 and 1910, which were primarily concerned with the fire safety of new construction methods that allowed buildings to be raised to incredible new heights. The laws, which are still on the books in D.C., restrict the height of buildings to the width of the street in front of them, which is 130 feet in most places and 160 feet on Pennsylvania Avenue.
Half a Million Tourists Ride Up the Monument Every Year
The Washington Monument is one of the most popular tourist destinations in Washington, D.C., and untold millions of people visit the monument grounds every year. But given that there’s only one elevator that zips people to the observation deck, only 55 people can be admitted into the monument every half hour. That means that around 500,000 people enjoy the spectacular view from the top of the Washington Monument every year.
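The 55-people-per-half-hour figure and the 500,000-a-year figure can be squared with a little arithmetic. The article doesn’t state the monument’s operating hours, so the hours below are an assumption chosen to show the two numbers are at least consistent:

```python
# Back-of-envelope check of the "around 500,000 visitors a year" claim.
people_per_half_hour = 55
open_hours_per_day = 12.5    # assumed average daily hours (not from the article)
days_per_year = 365

# Two half-hour admission slots per open hour.
visitors_per_year = people_per_half_hour * 2 * open_hours_per_day * days_per_year
print(f"{visitors_per_year:,.0f}")  # 501,875
```

With shorter winter hours the real number would land a bit lower, but the order of magnitude checks out: one elevator really does cap attendance at roughly half a million people a year.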
The newly installed elevator system will only take 70 seconds to carry visitors to the 51-story observation deck, where they will take in panoramic views of the National Mall, the Capitol Building, the White House and the wilds (suburbs, actually) of Virginia and Maryland up to 25 miles in all directions on a clear day.
Now That’s Cool
The first tourist elevator was installed in the Washington Monument in 1889, just five years after its completion.