
Why Do We Say “Cooling One’s Heels”?

The idiom “cool your heels” has galloped its way into our everyday language — but it originated with hooves, not heels.

by Rachel Gresh
Close-up of galloping horses’ hooves

Many captivating tales begin with racehorses — real-life champions Seabiscuit and Secretariat had their legacies immortalized in film and literature. But there’s one equine anecdote that didn’t make it to the silver screen: the origin of the idiom “cool your heels.” It’s used today to evoke a sense of waiting impatiently or for an extended time, but this expression originated on the racetrack, with roots tracing back to equestrian practices from several centuries ago.


The idiom comes from a literal sense of cooling hooves. After a tiring ride or a nail-biting race, horses need to be cooled down to prevent injury, often by dousing their lower legs with water, a practice called “cooling.” This practice, still done today, restricts blood flow to reduce inflammation and soothe weary muscles, promoting recovery. Without this equine remedy, star racehorses might not have succeeded, and we would be down one idiom.

The phrase “cooling the hooves” was used in the literal sense during the 16th century, and shortly after, it left the stable and entered everyday language as the figurative expression we use today. By the mid-18th century, we see a clear transition from a horse’s steady hooves to the sturdiest part of a person’s foot, the heel. Henry Fielding wrote in Amelia (1752): “In this Parlour, Amelia cooled her Heels, as the Phrase is, near a Quarter of an Hour.”

Today, Merriam-Webster defines this idiom as “to wait or be kept waiting for a long time especially from or as if from disdain or discourtesy.” You’ll see it used casually, whether lightheartedly or out of frustration, in situations like this: “I know you’re anxious to get started, but cool your heels while we get the paperwork together.” You might notice the striking connection between an exuberant racehorse needing a cooldown and an eager human forced into a delay, both parties expelling energy and a sense of urgency.

Featured image credit: KateLeigh/ iStock
Rachel Gresh
Freelance Writer
Rachel is a Washington, D.C.-based freelance writer. When she's not writing, you can find her wandering through a museum, exploring a new city, or advocating the importance of the Oxford comma.

How Punctuation Marks Change in Other Languages

While many languages that use the Latin alphabet have similar punctuation marks, other languages use different symbols. Let’s learn how some languages around the world punctuate sentences.

by Jennifer A. Freeman
Closeup of a book on a white surface with writing in Japanese

The English language uses the Latin alphabet — the “ABCs” that toddlers learn to sing — and those building blocks of words come with punctuation marks that add structure and nuance to sentences. While many languages that use the Latin alphabet have similar punctuation marks, there are many more languages and alphabets with additional options. Even when the function is the same (ending or pausing a sentence, for example), different symbols do the work. Let’s learn how some languages around the world punctuate sentences.

A Brief History of Punctuation

Many ancient alphabets and languages didn’t include a system of punctuation, but the earliest known record of punctuation is the Mesha Stele, also called the Moabite Stone, found in what is now Jordan. The artifact, which dates to 840 BCE, is written in a version of the Phoenician alphabet, with points and horizontal strokes to separate words.

Most ancient languages developed in written form without punctuation, or even spaces between words, making it tricky for readers to parse the meaning. Around the fifth century BCE, however, Greek playwrights began to use marks to help with reading their stories out loud. The Greek scholar and librarian Aristophanes of Byzantium is sometimes credited as the inventor of punctuation because he marked sections of writing with different types of dots — symbols that would come to be known as commas, colons, and periods.

Punctuation became even more widely used with the advent of the printing press in the mid-15th century. And when the typewriter was invented in the mid-19th century, it became important to standardize the use of punctuation marks to aid both reading and comprehension.

Commas

A comma is used in English to indicate a short pause or to separate items on a list. Several other languages also use commas, but they have a slightly different look.

In Arabic, words are written from right to left instead of left to right, so the comma faces the opposite direction (،) compared to a comma in English. Japanese commas don’t curve (、), and the straight-line punctuation mark is used extremely liberally in Japanese writing. Instead of following specific grammar rules for usage, the Japanese comma can be inserted anywhere the writer wants a break or a pause. The Japanese language also uses full-width spacing (giving that extra room around the comma), as opposed to half-width spacing in English.

Quotation Marks

In English, these marks (“ ”) are used at the beginning and end of a quoted passage to set it off from the rest of the sentence. The Filipino and Hindi languages handle quotation marks the same way American English does, but others take a different approach.

Several language systems, including German, Dutch, Hungarian, Hebrew, Romanian, and Icelandic, use one quotation mark at the bottom and one at the top at either end of the quoted text („…”). French, Greek, Italian, and Spanish occasionally use angular quotation marks, called guillemets («…»). The marks are flipped in the Danish language (»…«). In traditional Chinese, quotation marks appear like little bars on either side of the quote (「…」), but in simplified Chinese, the Western-style quotation marks are used.

Question Marks

Virtually every language has some sort of mark to indicate a query. As with commas, Arabic question marks appear backward (؟). In Armenian, the question mark is shaped like a slight curve (՞), and in Greek, the symbol for a question mark looks like a semicolon (;). In Spanish, sentences are bookended with upside-down and right-side-up question marks (¿…?).

Periods

One of the oldest punctuation marks, a period, indicates a full stop at the end of a sentence. This basic dot is not so simply represented in every language.

In the Bengali language, spoken in Bangladesh, the period is simply a straight line (।), which is an excellent way to indicate a divide between sentences. In the Armenian language, the period resembles an English colon (։), and in Japanese, the period resembles a small circle (。), not a simple dot.

Featured image credit: States of Mind/ Adobe Stock
Jennifer A. Freeman
Senior Editor, Word Smarts
Jennifer A. Freeman is the Senior Editor of Word Smarts and Word Daily. When she's not searching for a perfect synonym or reaching "Genius" level on Spelling Bee, she's playing with her Welsh Terrier in Greenville, SC.

What Is the Difference Between a Hotel and a Motel?

They both have beds, cable television, and ice machines that make way too much noise. So what exactly differentiates a hotel from a motel?

by Bennett Kleinman
View of luxury hotel in Dubai

When booking an overnight stay, semantics are likely the last thing on your mind — you’re probably more concerned with a comfy bed and some free breakfast the next morning. But if you’re curious about what makes “hotels” and “motels” distinct from one another, we have the answer. 

The word “hotel” refers to a place providing overnight accommodations, meals, and other similar services for tourists. This often excludes hostels and Airbnbs, which don’t exactly fall under the same umbrella. “Hotel” is an English word whose etymology dates to the 1640s. It’s derived from the French hôtel, which originally meant “a mansion” or “large house,” but now refers to the lodging-for-hire “hotel” in French as well.


“Motel,” on the other hand, describes a specific kind of hotel. The portmanteau combining the words “motor” and “hotel” was coined in the 1920s amid the burgeoning popularity of the personal motor vehicle. Because of this connection, you’ll usually find motels located along major thoroughfares rather than in the heart of a densely packed city — though not always today, as the term now refers more to a style than to a location. Motels tend to have large, free parking lots for anyone staying the night, and each room is typically accessed directly from that parking area. This differs from other types of hotels, where guests are more likely to access rooms through a communal central lobby.

So, while all motels are hotels, not all hotels are motels. While they’re both likely to offer a comfortable place to sleep at the end of the day, their etymological distinctions set the two types of accommodations apart.

Featured image credit: Imaster/ Adobe Stock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

Why Do We Call an Unmarried Man a “Bachelor”?

Discover how a term once reserved for a knight in shining armor evolved into a modern label for an unmarried man.

by Rachel Gresh
Young man looks at the nature view

Many medieval English titles have withstood the test of time — the monikers of “queen,” “duke,” and “earl,” for example, evoke the same significance now as when they were coined. However, one particular title has undergone a surprising transformation: bachelor. Once a term for a young knight, “bachelor” now designates an unmarried man, a shift that took centuries to complete.

Like many English words, “bachelor” was derived from Old French, appearing in various Middle English spellings including “bacheler,” “bachelier,” and the familiar “bachelor.” During the Middle Ages, the title was styled as “knight bachelor,” denoting the most common or basic type of knight, typically a young man without a title or land. Eventually, knights ascended through the hierarchy and gained new titles. The subsequent ranking, the knight banneret, served as a commander of a company of knights.


The etymology of “bachelor” earlier than Old French is debated. One leading theory points to the Medieval Latin baccalarius, meaning “vassal farmer, adult serf without a landholding.” This term is rooted in baccalaria, referring to fields or land under a lord’s control. Another theory proposes a simpler origin from the Latin baculum, meaning “a stick,” because a squire would practice with a staff instead of a sword, symbolizing a beginner status.

By the end of the Middle Ages, knights became obsolete on battlefields with the rise of gunpowder, and the term “bachelor” expanded in usage, melding the two Latin roots with a sense of a novice young man with no land or title. It slowly became a word for any unmarried man, regardless of knighthood status. It conveyed a sense of youth and inexperience, and as such, by the end of the 14th century, “bachelor” also represented someone who had obtained the lowest degree at a university — a bachelor’s degree.

As the centuries progressed, the medieval connection between “bachelor” and knighthood was nearly forgotten. The word is firmly entrenched in modern English wedding vernacular, where “bachelor party” has described prewedding festivities since the end of the 19th century. This is also when “bachelorette” emerged in English, formed by adding the French suffix “-ette” to “bachelor.” At first glance, the “bachelor” of the raucous “bachelor party” or a modern “bachelor pad” seems to stray far from its original usage, but its etymology reveals a strong connection between the definitions by exemplifying how societal roles change (and how they stay the same) over time.

Featured image credit: gawrav/ iStock

Why Are Nosy People Called “Eavesdroppers”?

Listen — we’ve all been guilty of eavesdropping, accidental or otherwise. But for as familiar an action as it may be, its origin story is less well known.

by Bennett Kleinman
Young woman listening intently to what she hears through a megaphone

Maybe you’re in a quiet train car and someone is talking loudly on their phone, or you’re around the corner from colleagues chatting in the break room and they don’t know you’re there. However you end up in that situation, sometimes the juicy details of another person’s private conversation are too tantalizing to resist, and now you’re an eavesdropper. “Eavesdropping” is a catch-all word that refers to any sort of aural snooping, but it wasn’t always that way. Early eavesdropping took place in a very specific location, from which the term got its name.

The word “eavesdrop” comes from “eavesdrip,” which was coined in an Old English charter from 868 CE, though it had little to do with snooping and more to do with architectural features of a house. “Eavesdrip” referred to the area around a house where rain fell off the edge of the roof (extensions known as “eaves”) and onto the ground. 


Sometime before the 1500s, the architectural term evolved into “eavesdrop,” and it also acquired its modern connotation of snooping. If someone was an eavesdropper, they were known to stand in close vicinity around a house so they could secretly listen in on the private conversations happening inside. In 1515, one text warned citizens about “Euesdroppers vnder mennes walles or wyndowes… to bare tales.” Another citation from 1611 reads, “To eaue-drop, to prie into men’s actions or courses.” Despite the variations in spelling, you can see “eavesdroppers” had much the same reputation back then as they do now.

In time, “eavesdrop” shed its very literal definition. Someone no longer has to stand beneath an eave in order to eavesdrop — neighborhood busybodies are just as comfortable picking up gossip on their front porch or over the phone as they would be standing under the eaves.

Featured image credit: halbergman/ iStock

When Should I Use “Different From” vs. “Different Than”?

Though interchangeable in everyday conversation, “different from” and “different than” carry subtle distinctions that can elevate your grammar game.

by Rachel Gresh
Two direction arrows on chalkboard

Recently I was chatting with a friend about our favorite morning beverages, comparing the qualities of espresso, lattes, matcha, and tea. As the conversation unfolded, I noticed we freely alternated between “different from” and “different than” without a second thought. For instance, I said, “Matcha is different from regular green tea,” while my friend declared, “An Americano definitely tastes different than a regular coffee.” With our coffee and tea preferences settled, the grammar nerd in me was left wondering: Which of these phrases is grammatically correct?

It turns out that “different from” is the more formal and universally accepted phrase. If you aim to please style guides, it’s the safer choice. The statement “sympathy is different from empathy” flows naturally. However, “different from” isn’t the only acceptable variation; “different than” and “different to” have been around for centuries, each with its own set of nuances.


“Different to,” as in, “Her approach to the issue was different to mine,” is most often used in British English. It’s best reserved for informal situations, in the same way that its American counterpart, “different than,” should be treated. 

According to Merriam-Webster, “different than” got a bad rap in the 19th century when grammarians adopted a little-known guideline stating that “than” should only be used following a comparative adjective showing a higher or lower quality or degree, such as “taller” or “worse.” Following that rule, the correct usage of “than” is “My brother is taller than my sister but shorter than my dad.” However, “different” can function as a comparative adjective, which is why it might feel more natural to use “than” instead of “from,” as seen in the example, “The second book was different than the first — it had a much darker tone.”

So, where does this leave us? Although “different than” has been shunned by grammar purists for centuries, it’s perfectly acceptable to use, especially in informal language. However, for those bound to formal writing conventions or style guides, “different from” remains the go-to choice in American English.

Featured image credit: stevanovicigor/ iStock

What Are Metonymy and Metalepsis?

Using one word to refer to another thing can sound counterintuitive, but we do it daily. Metonymy and metalepsis are two concepts that explain how we use substitutions in our speech.

by Julia Rittenberg
Crown in hands close up above a table

Even if you pride yourself on being plainspoken or delivering a direct message, figures of speech are likely sprinkled throughout your conversations. The words “metonymy” and “metalepsis” may sound intimidating, but you’ve probably been using these poetic devices in your personal lexicon without even knowing it. 

Metonymy is “the substitution of the name of an attribute or adjunct for that of the thing meant.” Like many terms that describe linguistic elements, “metonymy” comes from Greek — metōnumia translates to “change of name.” Think about how often people say “Hollywood” to refer to the entire film industry, whether or not a movie was filmed in the small Hollywood neighborhood of Los Angeles. Even a movie production based in Atlanta is part of Hollywood, in the sense of the industry.

Another common metonymic word is “crown.” Literally, it’s what a monarch wears on their head, but we also use the word to refer to a monarchy in general. Netflix’s hit show The Crown, for example, focuses first on the life of Queen Elizabeth II, but also incorporates the lives and experiences of her siblings, children, and family members.


On the other hand, metalepsis (from the Greek for “substitution”) is when “one thing refers to another thing that is only slightly related to it,” or substituting an unrelated term for the original concept. When I get obsessed with a new fantasy book series and spend days researching fan theories, I might use the metaleptic phrase “falling down the rabbit hole” to explain my obsession. I’m not physically going anywhere when I’m in a fandom rabbit hole, and I’m not actually reading Alice’s Adventures in Wonderland (where the “rabbit hole” reference comes from), but the metalepsis helps me effectively communicate how much a new fixation has taken over my life. 

Using a figure of speech that falls into the category of metonymy or metalepsis is like calling in a talented understudy on a night the star can’t perform — the audience will appreciate the great performance from the word you choose to use in place of the original concept.

Featured image credit: Natali/ Adobe Stock
Julia Rittenberg
Freelance Writer
Julia Rittenberg is a culture writer and content strategist driven by a love of good stories. She writes most often about books for Book Riot. She lives in Brooklyn with a ton of vintage tchotchkes that her cat politely does not knock over.

How To Never Make These Common Punctuation Mistakes Again

Punctuation errors abound. Here’s how to create coherent midsentence breaks and end your thoughts on a high note.

by Rachel Gresh
punctuation marks from typewriter

A sentence without appropriate punctuation is like a highway without road signs — it still technically functions on a basic level, but it lacks clarity. Using punctuation properly can be tricky, though, and common errors, from comma splices to misused semicolons, can make writing feel disjointed. Even the most seasoned writers fall victim to these mistakes. Fortunately, there are some practical guidelines and tips for avoiding punctuation errors and ensuring clear, coherent communication.

Prevent Comma Splices

A comma splice occurs when two independent clauses are linked by a comma without a conjunction. Consider this example: “I went to the library, I found the book I was looking for.” If both segments of the sentence can stand alone as a complete sentence, as in this case, it’s a comma splice.

The easiest way to fix this is to add a conjunction, a word that links the clauses. You might fix the aforementioned sentence by saying, “I went to the library, and I found the book I was looking for.” Throwing in a coordinating conjunction is typically a safe bet. These are easy to remember using the acronym FANBOYS (for, and, nor, but, or, yet, so). 

If a conjunction doesn’t work in the context or tone of your sentence, you can change the comma to a semicolon, which is made for linking independent clauses: “I went to the library; I found the book I was looking for.” If none of these options seems appropriate, simply split your clauses into two separate sentences.

Don't Use Commas With Incomplete Clauses

While commas are essential to use before conjunctions that connect two independent clauses, a comma isn’t necessary if one of the clauses is dependent. This requires knowing the difference between independent clauses (those that can stand alone as complete sentences) and dependent (or incomplete) clauses that cannot stand alone. 

Consider this incorrect example: “I went to the gym, after I finished my work.” The latter clause is dependent (or incomplete) because “after I finished my work” cannot stand alone as a sentence, so the comma between clauses is unnecessary. The correct way to format this sentence is, “I went to the gym after I finished my work.”

Drop Hyphens if the Compound Modifier Is an Adverb Ending in "-ly"

Hyphens are useful for compound modifiers used before a noun, e.g., “well-known,” “high-quality,” “state-of-the-art.” However, you don’t need the hyphen if one of the words in the compound modifier is an adverb that ends in the suffix “-ly,” such as “loudly,” “quickly,” or “extremely.”

This rule is why you won’t find hyphens in compound modifiers such as “highly respected” and “deeply rooted.” A few other adverbs also shouldn’t be hyphenated when used as a modifier, including “very,” “most,” and “too.” Some incorrect examples include “very-talented,” “most-wanted,” and “too-kind.” Instead, drop the hyphen and leave a space between the modifying words. 

Don't Confuse Semicolons and Colons

These similar-looking punctuation marks are often mixed up, but here’s how to tell them apart. A semicolon essentially acts as a bridge joining two related clauses, as seen in the comma splice fix we looked at earlier. Here’s another example: “I went to the dentist; she gave me a good report.” This punctuation mark also has one other job: separating items in a list if the items themselves require commas. For example: “I’ve visited London, England; Dublin, Ireland; and Athens, Greece.” The semicolon provides a more significant delineation where throwing in additional commas would cause confusion.

Colons have more functions than semicolons. They’re used to introduce lists, explanations, and quotations, or to emphasize a particular idea. They often precede information set up in the first clause. For example, “She had one goal: to finish her first marathon before she turned 30.” This sentence uses a colon to create a pause and add emphasis. Colons can also be used to list things, as in, “I need three things on Monday morning: coffee, headphones, and more coffee.” However, if the word before the list is a verb or preposition, a colon isn’t necessary. For instance, “My favorite holidays are Christmas, St. Patrick’s Day, and Halloween” does not require a colon.

Another rule to keep in mind is that a colon must be preceded by an independent clause (a stand-alone sentence). Consider the incorrect example, “Her worry was: the deadline was quickly approaching.” The correct colon usage in this case is “She had one worry: the quickly approaching deadline.” Finally, if the clause following the colon is dependent (as seen in the last example), the first word does not need to be capitalized. However, if the clause is independent (a complete sentence on its own), it should be capitalized, as seen in the example, “She had one worry: The deadline was quickly approaching.”

Avoid Using Hyphens, En Dashes, and Em Dashes Interchangeably

As if English wasn’t complicated enough, it also features dashes of three different lengths that can significantly alter words and sentences. The shortest dash, the hyphen (-), is the most common. It’s used to create compound words such as “sister-in-law” or modifiers including “clean-cut” and “well-dressed.”

The em dash (—), the longest of the three, is the second most common. It functions as a pause, and depending on your goal, it can replace a comma, a colon, or parentheses. It’s used to set off extra information, as in, “The concert — though highly anticipated — ended early due to weather concerns.” It can also add emphasis: “The choice was clear — leave now or stay forever.” Whether or not to use spaces around the em dash depends on the specific style guide to which you adhere.

The en dash (–) is the least used of the three dashes. In standard American English, it indicates a range between numbers, as in, “pages 40–55,” and it replaces the word “to” in statements such as, “I took the New York–Los Angeles red-eye.” Interestingly, in British English, it sometimes performs the job of an em dash, creating a break to emphasize additional information.

With so many punctuation rules and nuances, it’s no wonder things can quickly get confusing. If you struggle with any of these errors, rest assured you’re not the only one. Knowing the tips and tricks to avoid them will elevate your writing and leave a lasting impression on your reader(s).

Featured image credit: spaxiax/ Shutterstock

How Many Of These Products Do You Recognize As Brand Names?

Learn some fascinating stories behind the products that started as trademarks but became so popular they’re now used as everyday names.

by Bennett Kleinman
Man holding a roll of plastic bubble wrap

More often than shoppers might realize, everyday products come to be known by the name of the brand most associated with making them. Although a product may have a generic name (e.g., facial tissue), a company name (e.g., Kleenex) has become the more universally accepted term for the product. 

This phenomenon is called “genericization,” and it happens when a trademarked (aka brand) name is so widely used that it becomes the product’s identifier. Just look at Band-Aids, the brand name now commonly used to refer to any sort of adhesive bandage; Q-tips, synonymous with cotton swabs; and, of course, Kleenex. These are some of the more well-known examples, but plenty of other brands have benefited from genericization as well.

Vaseline

“Vaseline” is a brand name for petroleum jelly, a multiuse product found in virtually every drugstore. Today, you might hear “Vaseline” used to refer to petroleum jelly-based products in general, even though competitors such as Aquaphor serve the same purpose. So why has “Vaseline” become the preferred term? The answer is simply that it’s been around the longest: Vaseline became the first commercial petroleum jelly 150 years ago, and is now the most popular and enduring manufacturer of the stuff.

Styrofoam

“Styrofoam” is the brand name of a polystyrene foam product created by the Dow Chemical Company in 1941. The name is often used to describe any foam container, especially of the variety used to contain food and beverages, but these containers are actually made of expanded polystyrene foam — not Styrofoam at all. The real Styrofoam is manufactured by Dow for building insulation.

Dumpster

The term “Dumpster” was coined in November 1936, when George Dempster of Knoxville, Tennessee, introduced his line of large garbage bins that could be picked up by a specialized truck. He named the bins after himself, calling them “Dempster Dumpsters.” Thus, “Dumpster” became a genericized term for a mobile garbage receptacle in the United States. In the U.K. and other English-speaking countries, they’re usually called “skip bins.”

Popsicle

“Popsicle” is in fact a trademarked brand name for ice pops, owned by Unilever. The history of the Popsicle begins more than 100 years ago in 1905, when 11-year-old Frank Epperson invented flavored ice on a stick and called it an “Epsicle,” using the root of “icicle.” In 1923, Epperson patented the Popsicle (renamed because his children insisted on calling them “Pop’s ’sicles”), which was eventually bought by Unilever; the company soon began creating other product lines such as the Creamsicle and Fudgsicle (which are also trademarked terms).

Frisbee

Most Americans refer to plastic flying discs as “Frisbees,” but that term is actually the brand name of Mattel Toy Manufacturers’ patented disc. Mattel bought the toy in 1994 from Wham-O, which became the first company to produce them in 1957. 

The unique name stems from the Frisbie Pie Company in Bridgeport, Connecticut, which supplied pie tins to college campuses, where students would throw the empty tins at each other, yelling “Frisbie!” The nickname for the flying discs caught on, and Wham-O decided to adopt it for their new “Frisbee” toy. The Wham-O designer also developed the concept of “Frisbee golf,” another extension of genericization, because the popular game can be played with any type of flying disc.

Velcro

“Velcro” received the genericization treatment because “hook-and-loop fastener” doesn’t have quite the same ring to it. Velcro is a trademark owned by the U.K.’s Velcro Companies. The original design was named by combining the words “velvet” and “crochet,” an apt textural description of the product’s two different sides. 

In a satirical music video, company “lawyers” beg consumers to stop using the name “Velcro” when referring to other generic products. The “Don’t Say Velcro” campaign attempts to educate consumers on the difference between genuine Velcro products and other similar products in an attempt to prevent further genericization of the brand.

Jacuzzi

“Jacuzzi” is often treated as a synonymous term for any type of bubbling hot tub. But the word is actually a proprietary name belonging to Jacuzzi — a private company that makes bathtubs, showers, and other similar products. The origins of both the company and its name date to the early 20th century, when two brothers named Valeriano and Francesco Iacuzzi immigrated to the United States from Italy. Upon their arrival, an immigration official mistakenly wrote down their surname as “Jacuzzi,” and it stuck. 

The five other Iacuzzi/Jacuzzi siblings eventually made their way to the U.S. as well, including Candido, who invented a hydrotherapy pump in the 1940s. This pump was developed into a product meant for home use, which could turn any normal bathtub into a spa-like experience — and thus the first Jacuzzi-brand tubs were born. The term is trademarked by the company, meaning only hot tubs manufactured by Jacuzzi can accurately be referred to as such.

Novocain

Anyone who’s ever undergone a serious dental procedure has probably been injected with Novocain, a powerful numbing agent. But Novocain (the original German spelling — it’s spelled “Novocaine” in the United States) is the name of a product rather than a generic drug. 

This trademarked term is owned by Hospira, Inc., and is used for a drug known as “procaine hydrochloride.” Novocain was invented by German chemist Alfred Einhorn in 1905 as a replacement for cocaine, which was a popular anesthetic prior to Novocain’s creation. Einhorn chose the name “Novocain” because it combined the Latin nov- (“new”) with the suffix -caine, which was commonly ascribed to alkaloid anesthetics.

Bubble Wrap

In 1957, Marc Chavannes and Al Fielding created a textured plastic intended to be used as wallpaper, which was originally called “Air Cap.” Shortly thereafter, the product was renamed “Bubble Wrap,” which has developed into a widely used genericism today but is actually a trademarked term owned by the Sealed Air Corporation. While similar sheets of plastic containing fun-to-pop bubbles of gas are available on the market from other manufacturers — under generic terms such as “bubble packing” — true Bubble Wrap comes from that specific company.

Realtor

This may not be a product, per se, but Realtors provide an important paid service to those searching for a new home. However, the term “Realtor” refers to a very specific individual who’s a registered member of the National Association of Realtors (NAR). So while you may receive similar services from a nonregistered real estate agent, it would technically be incorrect to call them a “Realtor.” 

Registered Realtors adhere to a strict code of ethics and are authorized to use the term, which signals a standard of service beyond that of an unregistered agent. The NAR says the preferred way to format the word is “REALTOR” in all caps, as that’s how it’s registered.

Seeing Eye Dog

The Seeing Eye is an organization based in Morristown, New Jersey, that’s been training dogs to guide visually impaired people since 1929. It was the first guide dog school founded outside Europe and is the oldest extant guide dog school in the world. Because it’s so well known, the term “Seeing Eye dog” has become synonymous with guide dogs in general, especially those trained to assist people who are blind or visually impaired. When you’re speaking generically, however, you should use the term “guide dog” — save “Seeing Eye dog” for pups that have been specifically trained by the Seeing Eye organization. 

Featured image credit: stocksnapper/ iStock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.

When Should I Use “Well” vs. “Good”?

“How are you?” “I’m good.” It’s one of the most natural things to say, but it’s grammatically incorrect. Let’s learn the rules for why it’s wrong.

by Samantha Abernethy

To remember the distinction between “well” and “good,” I think of a scene from the sitcom 30 Rock. Tracy Jordan (portrayed by actor Tracy Morgan) asks someone how they’re doing, and the person says, “I’m doing good.” Jordan responds: 

Superman does good; you're doing well. You need to study your grammar, son.

He was right. To explain this in the simplest terms, the two words are different parts of speech. “Good” is an adjective, meaning it modifies nouns, and “well” is an adverb, meaning it modifies verbs, adjectives, and other adverbs. But of course there are exceptions, namely that “well” can also be an adjective, especially when referring to health. All of the following examples are grammatically correct but demonstrate different meanings:

  • “She smells good” suggests that she has a pleasant scent.
  • “She smells well” implies that she has a strong sense of smell.
  • “He feels well” implies that he is in good health.

Linking verbs are a common source of confusion when choosing between “well” and “good.” The verbs “feels,” “seems,” “looks,” and “is” are called linking verbs because they connect the subject to a description rather than showing an action, which may be why deciding between the adverb “well” and the adjective “good” after them is tricky.

In the example from 30 Rock, “I’m doing good” is incorrect because “good” is an adjective; the adverb “well” is needed to modify the verb “doing.” Another trick for remembering the difference is to substitute the adjective “quick” and the adverb “quickly”: wherever “quickly” sounds right, the sentence needs “well,” and wherever “quick” fits, “good” is correct.

  • She walked quickly. She walked well. 
  • He is a quick thinker. He is a good thinker.
  • I am quickly doing my homework. I am doing my homework well.

So, was James Brown grammatically incorrect when he sang, “I feel good”? Grammarians can argue “I feel well” is more accurate in a health context, but it doesn’t have quite the same ring to it. “I feel good” is fine for casual speech (and legendary songs). 

Featured image credit: pixdeluxe/ iStock
Samantha Abernethy
Freelance Writer
Samantha Abernethy is a freelancer in Chicago. When she isn't staring at a laptop, you can find her sniffing out the best coffee with her greyhound Ruby, or chasing her kids around the nearest library.