Do You Know This Good Luck Saying for the First of the Month?
Saying “rabbit rabbit” is a quirky tradition that’s believed to bring good fortune on the first day of the month. But how did this unusual phrase come to hold such power?
Could the superstitious phrase “rabbit rabbit” have influenced the outcome of the 1932 U.S. presidential election? Some, including President Franklin Delano Roosevelt, seemed to believe it could. There’s a tradition of saying “rabbit rabbit” on the first day of the month to ensure good luck for the days ahead, and journalists documented FDR’s practice of reciting the phrase, as well as carrying a lucky rabbit’s foot (which is now on display at the FDR Presidential Library and Museum). That year, he made history as the first Democratic candidate in 80 years to win the presidency with an outright majority of the popular vote — though whether luck had anything to do with it is anyone’s guess.
Regardless, the president wasn’t alone in his devotion to the lagomorphic phrase. During World War II, British fighter pilots reportedly recited “rabbit rabbit” before taking flight, hoping for a successful mission. The earliest known print citation appears in a 1909 British periodical, in which a reader recounts that their child would say “rabbit” as the first word spoken on the first day of the month, believing it would bring good fortune.
But superstitious belief in rabbits extends far beyond modern Western culture. In Chinese tradition, the rabbit is regarded as the luckiest of the 12 animals in the zodiac. Across many ancient cultures, the rabbit symbolized fertility and life, and in modern symbolism, rabbits serve as a beacon for the coming spring and a religious representation of renewal. All of these beliefs contribute to the enduring superstition that rabbits are harbingers of good fortune.
According to its adherents, saying “rabbit rabbit” is most effective when recited upon waking up on the first of the month. But if you forget to say it, don’t worry. According to NPR, saying “black rabbit” or “tibbar tibbar” (“rabbit” spelled backward) before bed will still do the trick in keeping any misfortune at bay for the month ahead.
Rachel is a Washington, D.C.-based freelance writer. When she's not writing, you can find her wandering through a museum, exploring a new city, or advocating the importance of the Oxford comma.
In the ever-expanding email lexicon, few abbreviations are as ubiquitous as “cc.” How has this 19th-century term come to be so important in 21st-century tech?
The evolution of technology is unrelentingly swift. Generations have witnessed state-of-the-art inventions become obsolete in the blink of an eye. The fax machine, the pager, and the landline, all once revolutionary, have earned their place in the annals of history. Yet email — born in 1971 when engineer Ray Tomlinson sent the very first message between networked computers — has remarkably endured. This persistent form of communication continues to redefine itself for the digital age, but one enigmatic email feature hasn’t changed for 50 years: the “cc.”
The “cc” field is a familiar sight perched within the recipient line of an email. The purpose is to send a copy of the message to an additional recipient who might need the information, but isn’t integral to the action of the message. Its modern definition traces back to its original usage in the late 19th century, when a duplicate was called a “carbon copy.”
Around the 1870s, long before photocopiers, documents were painstakingly duplicated by hand using carbon paper placed between two sheets of plain paper. Something written or typed on the top page would be transferred to the bottom sheet through pressure on the carbon coating (essentially ink). The exact replica was called a “carbon copy.” By the 1920s, the term “carbon copy” had shifted to figuratively describe something that was a near-identical replica, such as “Mark was a carbon copy of his father.” The term made its way into corporate America in the 1930s as business shorthand for ensuring that multiple parties received the same information.
When email emerged in the 1970s, “cc” was quickly adapted, as the jargon was already familiar in professional circles. The concept of the “bcc” (blind carbon copy) soon followed, allowing a sender to conceal recipients entered in the “bcc” field. In the 1980s, the use of “cc” became so prevalent that it evolved into a verb, as in, “I cc’d Amy on that message.”
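For readers who like to see the mechanics, here’s a minimal sketch of how the “cc” and “bcc” fields behave in practice, using Python’s standard library. The addresses and mail server below are hypothetical placeholders, not real accounts.

```python
# A minimal sketch of "cc" vs. "bcc" behavior (all addresses and the SMTP
# host are hypothetical placeholders).
from email.message import EmailMessage
import smtplib

msg = EmailMessage()
msg["From"] = "sender@example.com"
msg["To"] = "primary@example.com"    # the person expected to act on the message
msg["Cc"] = "manager@example.com"    # copied for information; visible to every recipient
msg.set_content("Quarterly numbers attached for your review.")

# "Bcc" recipients are deliberately left out of the message headers; they are
# only handed to the mail server as envelope recipients, so no one else ever
# sees them.
bcc = ["auditor@example.com"]
all_recipients = [msg["To"], msg["Cc"], *bcc]

# Uncomment to actually send through a real SMTP server:
# with smtplib.SMTP("smtp.example.com") as server:
#     server.send_message(msg, to_addrs=all_recipients)

print(msg)  # note that no "Bcc" header appears in the serialized message
```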
Even as digital technology seems to be on the cusp of the next big thing, “cc” is here to stay. The anachronism has survived 150 years, evolving from a reference to an industrial-era hand-copying technique into an indispensable feature of email etiquette, cementing its place in communication with one simple click.
Even Shakespeare grappled with this grammar choice. Here’s how to avoid the common mix-up between “which” and “that” by identifying what type of clause you’re writing.
“Caesar, thou art revenged, / Even with the sword that killed thee.” This pivotal verse from Shakespeare’s Julius Caesar uses “that” to introduce a restrictive clause, which means it provides essential context. But sometimes, you’ll see similar sentences incorrectly written with “which,” as in, “Even with the sword which killed thee.” Although these sentences convey the same meaning, the latter is actually a grammatical error. Here’s how to get it right.
The distinction between “which” and “that” can confuse even the most experienced writers. The key to using these words correctly lies in distinguishing between restrictive and nonrestrictive clauses. A restrictive clause provides critical information about the noun it modifies. You can’t remove the clause without altering the sentence’s meaning. For example, “The book that I borrowed is on the table.” The intent of the sentence is to specify the borrowed book, not just any random book, so “that I borrowed” is necessary information. A nonrestrictive clause, however, adds extra detail that can be omitted without changing the sentence’s core meaning. For instance, “The novel, which I read on the bus to kill time, was a thriller set in Nantucket.” Remove “which I read on the bus to kill time,” and the sentence still makes sense: “The novel was a thriller set in Nantucket.”
A quick check is comma usage: Use “that” for restrictive clauses (no commas) and “which” for nonrestrictive clauses (set off with commas). For example: “The pen that I broke is in the trash” (restrictive), versus “The pen, which I bought in Maine, broke in my purse” (nonrestrictive).
If you’re still having trouble telling them apart, don’t sweat it. Even Shakespeare wasn’t so strict about this rule. In The Winter’s Tale, he wrote, “It is a heretic that makes the fire, not she which burns in’t.” Current grammar rules would dictate “that” in the restrictive clause, but Shakespeare wrote in what’s called Elizabethan English, which is a precursor to modern English, and the rules don’t exactly match up. It’s also worth noting that Shakespeare’s works were meant to be performed on a stage, and it’s less likely that someone will nitpick your use of “that” versus “which” in spoken word.
Sometimes banishment can be a good thing — just look at Napoleon’s exile to Elba or when Pete Best was pushed out of the Beatles. The concept applies to the English language as well, which is rife with words that have overstayed their welcome. Linguists at Michigan’s Lake Superior State University are particularly passionate about the topic, so much so that they’ve released a Banished Words List annually since 1976. This quirky tradition seeks to playfully ban any words, acronyms, or slang for “Mis-Use, Over-use, and General Uselessness.” Let’s take a look at the 2025 contenders for linguistic banishment.
Some “banished” words have been used so frequently that they’ve lost all meaning. This includes “cringe” — the No. 1 entry — as saying it is likely to make anyone within earshot do that exact action. We also have “game changer,” which has been said so often that it’s nothing more than a cliché. “Era” is the third entry on the list, as a certain megastar’s tour encouraged marketers to make everything in the past year an “era.” Next up is “dropped,” which used to be reserved for major album debuts. However, I recently heard someone use it for a new grocery product, so I can confirm it is indeed over.
The 2025 Banished Words List also includes acronyms and slang, such as “IYKYK.” This translates to “If you know, you know,” and was lambasted for being both cryptic and unhelpful. Another bit of modern slang to get rid of is “Sorry, not sorry,” a phrase that’s more backhandedly disingenuous than it is sincere. LSSU also recommends doing away with youthful slang such as “Skibidi” (a true nonsense word) and “100%,” the latter of which has been overused in place of more straightforward phrases like “I agree.”
To complement LSSU’s Banished Words List, I have a few other recommendations. Let’s start with “very” — overuse of this term is a sign of lazy writing. English is full of so many descriptive terms, so forgo phrases such as “very happy” and use “ecstatic,” “overjoyed,” “blissful,” or another synonym instead. I’d also like to get rid of the phrase “It is what it is,” as it’s unhelpful and fails to contribute much, if anything, to the conversation.
Obviously none of these words is actually being banished, but it’s worth using this list to analyze your personal vernacular. If you find yourself guilty of falling back on these terms and phrases, try to incorporate some more interesting alternatives in their place.
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
A squinting modifier is a grammatical quirk that creates ambiguity, often confusing readers who are unsure if the modifier is attached to the word before or after it.
Imagine you’re approaching a traffic light intersection with both signals red, and you’re unsure which direction will turn green first. You’re forced to shift your gaze back and forth, trying to determine your next move. This feeling mirrors the experience of reading a sentence with a squinting modifier. It forces the reader to pause, looking back and forth, unsure which word is actually being modified.
For example, consider the sentence, “Studies show that reading often improves memory.” The adverb “often” creates a dilemma. Does it modify “reading” (suggesting reading occurs frequently) or “improves memory” (indicating that improvement happens often)? Either interpretation is possible, leaving readers in doubt.
A squinting modifier is an adverb or adjective placed between two words or phrases so that it could plausibly modify either one. In the above example, the adverb “often” is the squinting modifier. Here’s another: “The house that got a new roof recently was sold.” It’s unclear whether the house recently got a new roof or was recently sold. This confusion gives the modifier its name — squinting — as it seems to look in both directions.
These modifiers can be fixed by rearranging or rephrasing the sentence. The previous example can be fixed this way: “The house that got a new roof was recently sold.” Similarly, “Studies show that frequent reading improves memory” is another simple fix. By repositioning the modifier or rephrasing the sentence, you can eliminate ambiguity and clarify your meaning for readers, making your writing clear and concise.
These complimentary terms are often used interchangeably, but they carry distinct meanings. Knowing the difference can help you choose your words more thoughtfully.
The old adage “cruel to be kind” might sound catchy, but the saying probably should be “cruel to be nice.” Both “kind” and “nice,” by definition, are positive traits, but “nice” often reflects a surface-level politeness motivated by social conformity. Think of holding a door open for someone — it’s a courteous gesture, but the motivation isn’t necessarily rooted in goodwill. You might be thinking, “Hurry up,” while the person walks to the door — so is this actually a sincere act if you’re slightly annoyed by it? There are a few usages of “nice,” but the one we’re talking about means “pleasant; agreeable; satisfactory.” This definition reveals surface-level intentions of being polite and doing what is needed to maintain the status quo — traits often motivated by societal expectations. By contrast, “kind” means “having or showing a friendly, generous, and considerate nature.” It describes an innate sense of being.
Ethicists at Santa Clara University agree: “The distinguishing factor seems to lie in the motivation of a person or act.” For example, an act of politeness, such as helping someone who dropped their groceries, might be considered nice, but kindness is more consistent and entails a heartfelt willingness to help over time. In essence, being nice is an action, while being kind is a lasting personality trait.
Looking into the etymology further clarifies this divergence. “Kind” comes from the Old English cynde, meaning “natural” or “innate,” often linked to familial warmth. “Nice,” however, evolved from the Latin nescius (meaning “ignorant”), once used to describe qualities such as carelessness or foolishness before it transformed into its current meaning of “pleasant” by the mid-18th century.
Rest assured, both terms are complimentary today. However, “kind” implies a more genuine nature related to personality, while “nice” often describes polite actions fueled by social obligations, whether authentic or not. These terms can overlap but are not entirely synonymous, so recognizing these nuances can help you be more accurate in your descriptions.
Pop culture shapes the way we talk, but some phrases have become so widespread that you may have forgotten where they came from. These words and phrases originated in classic American television.
The impact of television is hard to overstate: It has given us countless hours of entertainment, provided important information, and served as a platform for the life-changing products sold during late-night infomercials (we’re looking at you, George Foreman Grills).
TV also has had an indelible impact on the English language, as it has introduced us to a variety of new phrases and words that didn’t exist prior to those particular TV shows or episodes. In many cases, you may not realize that these terms were coined on TV — it seems like they’ve always been a part of our collective lexicon. Here are a few examples of phrases that leapt from the small screen to the pages of the dictionary.
Jump the Shark
The phrase “jump the shark” is a slang term that, according to the Oxford English Dictionary (OED), means “to begin a period of inexorable decline in quality or popularity.” While it can be used in general today, the origin is a specific storyline from the 1977 Happy Days episode “Hollywood: Part 3,” in which the character Fonzie leaps over a literal shark while he’s on water skis. However, no characters in the episode used the actual phrase “jump the shark.” It wasn’t until 1987 that the idiom itself was coined. According to the Los Angeles Times, it was conceived of by future radio personality Jon Hein. To him, the shark storyline showed the writers resorting to ever more outrageous stunts in an effort to win back viewers, and he considered it the pivotal point that marked the show’s decline in quality, a slide reflected in the waning popularity of Happy Days during its later years.
Hein’s phrase has not itself “jumped the shark”; it continues to be used colloquially for anything once popular or high-quality that has taken a sudden turn for the worse or the outlandish. It’s most often applied to TV criticism, but it can describe writing, personal behavior, relationships, or nearly anything else with an ongoing run.
Friend Zone
The OED cites the first recorded use of the term “friend zone” in a season 1 episode of Friends. The term is defined as any relationship in which “one person has an unreciprocated romantic or sexual interest in the other.” In the context of Friends — specifically in the episode “The One With the Blackout” — Joey used “friend zone” to describe Ross’ relationship with his unrequited crush Rachel. The term has since evolved, and is now used as both a noun (as it was on Friends) and a verb. To be “friend zoned” means that one party has made clear that they are keeping another person (who has romantic interest) firmly in a friendship category.
Saying the Quiet Part Out Loud
The Simpsons has contributed much to the English language over its illustrious run, from “embiggen” to “meh,” and it also helped originate the idea of “saying the quiet part out loud.” In the season 6 episode “A Star Is Burns,” the character Krusty the Clown was bribed in exchange for a vote in a movie festival. When asked how he could vote for an inferior film, Krusty replied, “Let’s just say it moved me … to a bigger house! Oops, I said the quiet part loud and the loud part quiet.”
While it’s hard to trace the exact origins of concepts such as this, The Simpsons is believed to be the earliest use of the general idea in any form, as there’s no evidence of an earlier instance. Today, it’s more commonly written as “saying the quiet part out loud,” and the phrase is generally used when someone unintentionally reveals the subtext of a statement (or perhaps intentionally reveals it in a sarcastic or ironic manner).
Debbie Downer
The hilarious Rachel Dratch debuted the character Debbie Downer on the May 1, 2004, episode of Saturday Night Live, and it became a recurring bit. The character was known for interrupting otherwise pleasant conversations with depressing facts about the real world, often unintentionally and never maliciously. Today, calling someone a Debbie Downer is essentially telling them to stop bringing down the mood and lighten up.
Google (as a Verb)
What is the greatest legacy of Buffy the Vampire Slayer? Buffy fans might have a lot of answers for you, but the most widespread answer is that the show popularized the use of “google” as a verb. In the 2002 episode “Selfless,” computer geek and sidekick Willow asks Buffy, “Have you googled her yet?” — referring to the idea of using the internet to search for information about someone else. At the time, Google was still relatively new, and it wasn’t yet the de facto search engine for everyone. Co-founder Larry Page had used “google” as an intransitive verb as early as 1998, but Buffy showcased the newer transitive use, googling a specific person or thing, rather than the general activity Page had described.
This brief line in an episode of Buffy contributed to “google” moving from company name to general term. The verb “google” was named the most useful word of 2002 by the American Dialect Society, and it entered the Oxford English Dictionary in 2006.
It’s doubtful that we can trace the exact origin of the phrase “Doubting Thomas,” but we do know who it refers to. Are you familiar with the Thomas in question?
When talking about someone who’s habitually incredulous, why do we say “Doubting Thomas” instead of “Skeptical Samuel” or “Mistrusting Mary”? It’s because the Thomas in question happened to be quite famous. No, not Thomas Edison, and not Thomas the Tank Engine either. This Tom was known for his skepticism: The origins of this phrase refer to Thomas the Apostle from the Bible.
Thomas’ doubting nature appears in chapter 20 of the Gospel of John in the New Testament. According to that account, Thomas was told that a resurrected Jesus Christ had appeared before the other apostles, but he refused to believe them until he saw it with his own eyes. One week later, Jesus appeared before Thomas, causing the apostle to change his mind and believe. This scene has been a popular topic among artists as far back as the sixth century, with one of the earliest examples found in a mosaic at the Basilica di Sant’Apollinare Nuovo in Ravenna, Italy. It was later painted by Caravaggio in his 1601-02 work “The Incredulity of Saint Thomas.”
The phrase “Doubting Thomas” doesn’t appear in the Bible, though, and it’s difficult to know for sure when it became a popular idiom. The Oxford English Dictionary cites an early example of the phrase in print in an 1883 article in Harper’s Magazine: “Doubting Thomases, who will only believe what they see, must wait awhile.” Given the magazine’s long-lasting popularity, it stands to reason that it helped spread the phrase among its readership.
Those of us who are from the 1900s (as the kids say) likely equate the idea of being “preppy” with boat shoes, sweaters tied around our shoulders, and yachting with grandpapa in Nantucket. But that classic definition of preppy has changed in recent years, especially among Gen Zers. This made me wonder how the term has evolved over the generations since it was first coined.
The Oxford English Dictionary (OED) notes the word “preppy” was first used in 1900 as a synonym of “immature.” The word referred to “prep school,” where the students were children and teens — hence the “immature” connotation. This use was sporadic until the 1950s, a decade that marked the emergence of the baby boomer generation. The adjective grew out of an earlier noun, “prep”: Since at least the mid-1800s, calling someone a “prep” meant they were a prep school student, and the schools’ names themselves were often shortened. A famous depiction of mid-20th-century prep school culture, J.D. Salinger’s The Catcher in the Rye, offers an example: “Pencey Prep is this school that’s in Agerstown, Pennsylvania.”
During the early 1960s, “preppy” came to refer to a specific fashion style worn by those who attended prep school — individuals who tended to belong to a conservative, wealthy social set. This style could be described as neat and classic, characterized by ties, sweaters, and school emblems. During the peak Generation X years (roughly the 1980s and ’90s, when those born between 1965 and 1980 came of age), the preppy style included plaid, herringbone, and houndstooth patterns, as well as tailored suits, loafers, and silk blouses.
Preppy fashions and attitudes spanned Gen X and millennial pop culture, depicted by conservative teen Alex P. Keaton on the 1980s TV show Family Ties, the Walsh twins in the 1990s TV phenomenon Beverly Hills, 90210, the snotty rich girls in the 1989 dark comedy Heathers, and high-society characters such as Carlton Banks from The Fresh Prince of Bel-Air and the prep school students of Gossip Girl.
But while the word “preppy” had a similar connotation for the latter half of the 20th century, it means something very different for a Gen Zer. Instead of being restricted to the prep school social set, it has evolved into an aesthetic that anyone can achieve. TikTok user @preppyygals defines “preppy” as a fashion style that includes flowy, colorful dresses and shoes stamped with bright stars and other bedazzled elements. In a Today show story, a Generation Alpha member defined the aesthetic by pointing to the retailers Lululemon and Lilly Pulitzer. Another said it’s “when you wear pink and wear smiley faces.” This shift of “preppy” from a strict reference to prep schools to an open aesthetic (very different from tailored prep school uniforms) demonstrates how slang terms continue to expand and evolve the more people use them. What will “preppy” become for Generation Beta?
Regionalisms can be found all over the menu, but breakfast is home to plenty of them. While you’re debating scrambled or over-easy, make sure you know whether to order “pancakes” or “flapjacks” in your town.
A full English breakfast traditionally includes eggs, bacon, sausages, baked beans, tomatoes, mushrooms, and toast. While there’s no such specific definition for a full American breakfast, if you were to ask for such a thing at an American diner, you’d likely get some form of eggs, bacon, and a stack of thin, hot, and starchy cakes, smothered in syrup. Depending on where you’re from, those cakes may be called “pancakes” or “flapjacks” — though we wouldn’t blame you if you scarfed them down so quickly, you don’t care about the name. This choice is one of several linguistic food debates common throughout America, with the words varying based on region.
The word “pancake” was the first of these similar terms to be coined, with an etymology dating to circa 1400. The food is defined as “a thin, flat cake of batter, usually fried and turned in a pan. Pancakes are usually eaten with syrup or rolled up with a filling.” The word was created as a literal reference to the cooking process. It’s perhaps the most common variant in the country, especially in large urban areas and the northern United States.
“Flapjack” is a Southern term that can describe pan-fried cakes, but also a specific type of apple turnover. However, in Canada and the United Kingdom, a flapjack is an entirely different food item akin to a biscuit containing rolled oats, as noted by Dictionary.com.
A third contender is “hotcake” — a term you’ll find on the McDonald’s menu. This word was coined in the United States in the late 17th century, referring to “any of various types of cake which are baked on a griddle or fried.” You’re probably familiar with the idiom “sell like hotcakes,” so it makes sense for the fast-food giant to use the alternative term as a tactic to help the food item stand out on the menu (compared with the more generic “pancake”).
Other hyper-regional terms for pancakes include “johnnycake,” which was used by New Englanders in colonial times, but might still be heard in the area. In parts of the South, you also may come across “hoe cakes” on a menu. Put any modern associations with the first word aside — this is related to the practice of cooking cakes on the metal part of a field hoe, which was common among enslaved individuals in that part of the country. Whichever term you decide to use, rest assured that you’ll be getting some decadent goodness on your breakfast plate. Now the only question is: chocolate chips or blueberries?