3 MIN READ

What Is the Oxford Comma (And Why Is It Debated)?

This tiny punctuation mark has been the source of heated debate for over a century, but whether you use it or not comes down to your personal writing style.

by Rachel Gresh

In 1905, a great punctuation war was sparked when Horace Hart, printer and controller of Oxford University Press, first codified what would become known as the “Oxford comma.” This punctuation mark, also called a “serial comma,” a “Harvard comma,” or, to some, an unnecessary comma, comes after the penultimate (next-to-last) item in a list, immediately before the conjunction “and” or “or.”

In the sentence “Her favorite types of tea are peppermint, chamomile, and oolong,” the Oxford comma comes right after “chamomile.” However, omitting that final comma — “Her favorite types of tea are peppermint, chamomile and oolong” — is also correct. Those who omit the comma consider it superfluous because the conjunction already separates the final item from the rest of the list. Grammatically, neither side of the serial comma debate is wrong. You can use it or skip it; it’s a matter of personal style. Just be consistent throughout a piece. However, you might have to set personal preference aside if you follow a specific set of writing rules. For example, the AP Stylebook discourages the Oxford comma in simple lists, while the MLA Style Manual, The Chicago Manual of Style, and APA Style champion it. Of course, the Oxford University Press style guide (also called New Hart’s Rules) and Harvard University Press retain the comma, which explains its aliases.

In what might be one of the greatest grammar paradoxes in history, the University of Oxford Style Guide, which differs from New Hart’s Rules, discourages using the Oxford comma in most situations. This instruction is straight from its style guide: “Note that there is no comma between the penultimate item in a list and ‘and’/‘or’, unless required to prevent ambiguity.”

There are certainly instances where an Oxford comma can prevent ambiguity. Consider: “We invited the kids, Mary and William.” This can be interpreted in two ways. It could mean that the kids, whose names are Mary and William, were invited, or it could mean that the kids were invited, along with two other people named Mary and William. If you meant the latter, you might add an Oxford comma: “We invited the kids, Mary, and William.”

The Oxford comma debate remains an enduring feud between passionate supporters and detractors. You might even see “Has opinions on the Oxford comma” on the dating app profile of someone who wishes to convey a certain nerd chic. While we stand on the side of the Oxford comma at Word Smarts, our best advice is to embrace your personal preference unless otherwise directed by a style guide.

Featured image credit: hxdbzxy/ Shutterstock
Rachel Gresh
Freelance Writer
Rachel is a Washington, D.C.-based freelance writer. When she's not writing, you can find her wandering through a museum, exploring a new city, or advocating the importance of the Oxford comma.
3 MIN READ

When Would You Use This Super-Rare Verb Tense?

Future perfect continuous in the passive voice is indeed a legitimate verb tense, but its complexity makes it an elusive part of the English language, rarely seen outside the walls of a classroom.

by Rachel Gresh

When trying to make sense of historical dialogue, I’m often struck by the elaborate language — superfluous words, archaic vocabulary, and formal tones. While not quite a historical reproduction, the modern hip-hop musical Hamilton gave us some memorable tongue-twisting lines: “If it takes fighting a war for us to meet, it will have been worth it.” Here, “will have been” is in the future perfect tense. While the phrasing could be reduced to the simple past tense — “If it takes fighting a war for us to meet, it was worth it” — the rhythm of the future perfect adds a layer of emphasis and interest, holding the audience’s attention as Alexander Hamilton meets Eliza.

Let’s decode an even more intricate verb tense: the future perfect continuous in the passive voice. (According to one grammarian, this is the most rarely used verb tense in English.) Transforming Hamilton’s words into this tense results in: “If it takes fighting a war for us to meet, it will have been being made worth it.” This construction is not only a mouthful, but nonsensical in this usage. The tense isn’t avoided because it’s complicated to say; it’s simply useful only in certain contexts.

The future perfect continuous passive tense has useful, albeit limited, applications. It depicts an action that will have been ongoing by a specific point in the future. For example: “The magazine will have been being published for three years by next January.” Here, the emphasis is on the duration of the magazine’s publication, highlighting the action itself rather than who or what is performing it.

To construct a sentence in the future perfect continuous tense, you need three auxiliary verbs: “will have been.” To shift into the passive voice, add the verb “being,” followed by a past participle (often an “-ed” verb), and a “by” phrase to indicate the future point by which the action will have been in progress. Here’s the general formula for this uncommon tense: subject + “will have been” + “being” + past participle + “by” phrase.

Putting it all together looks like this: “The skyscraper will have been being constructed for a decade by this time next summer.” Alternatively, you can reposition the “by” phrase to the beginning: “By this time next summer, the skyscraper will have been being constructed for a decade.” On even rarer occasions, a “by” phrase isn’t needed: “Next year celebrates the 10th year that students will have been being trained at this center.”

This rare tense is undoubtedly long-winded and unwieldy, but it is a valid grammatical construct. That said, don’t feel pressured to use it in everyday conversation. Simpler alternatives often suffice, though examining the versatility of language can give us ideas for how to be better communicators. 

Featured image credit: Fauzi Muda/ Shutterstock
Rachel Gresh
Freelance Writer
Rachel is a Washington, D.C.-based freelance writer. When she's not writing, you can find her wandering through a museum, exploring a new city, or advocating the importance of the Oxford comma.
3 MIN READ

Is It “Make Due” or “Make Do”?

We have to make due (or is it make do?) in tough situations. Let’s at least clear up the spelling.

by Samantha Abernethy

Short answer: The correct phrase is “make do,” but “make due” is a common mistake. We’re a curious bunch, though, so let’s examine why.

This phrase combines the separate verbs “make,” meaning “to form (something) by putting parts together or combining substances; construct; create,” and “do,” meaning “to perform (an action, the precise nature of which is often unspecified).” When combined, “make do” means “to manage to get by,” whether that involves settling for less or improvising a different solution. If you add a hyphen, “make-do” can also be used as an adjective and a noun, as in “a make-do repair” or “the make-dos are working.” When the verb phrase originated in the early 19th century, it was sometimes said as “make it do,” as it appears in Charlotte Brontë’s Jane Eyre in 1847: “‘Oh, very well!’ returned Miss Temple; ‘we must make it do, Barbara, I suppose.’” 

The adjective and noun forms came later, with the noun form showing up at the end of the 19th century and the adjective in the 1920s. Rudyard Kipling wrote in “Tales of ‘The Trade’” in 1916: “The full tale of their improvisations and ‘make-do’s’ will probably never come to light.” 

Writing “make due” is not a misspelling, but rather a malapropism, which is a mistake in which someone uses a similar-sounding word or phrase in place of the correct one. What makes this malapropism particularly tricky is that “make due” is not completely incorrect. As an adjective, “due” can mean “proper” or “sufficient,” as in “due process” or “due diligence.” And until the 1940s, “make due” was a commonly accepted phrase.

However, “make do” stood the test of time thanks to the popularity of this saying: “Use it up, wear it out, make it do, or do without.” The quote has been attributed to President Calvin Coolidge, but its origin is uncertain. The sentiment certainly suited life during the Great Depression, when people had to make do. During World War II, the saying became a popular slogan in the U.S. and the U.K. as people were encouraged to ration food, gasoline, and other materials to support the war effort. The attitude extended to the noun form as well: around WWII, “make-do” grew into the longer phrase “make-do-and-mend,” which implied a process of ongoing repair. A 1947 publication includes the line, “This age of bits and pieces, queues, rationing, and make-do-and-mend.”

What I love about “make do” is that the phrase is itself improvised. There wasn’t a word for exactly what was needed, so speakers invented one by throwing two verbs together. We do the best with what we have on hand.

Samantha Abernethy
Freelance Writer
Samantha Abernethy is a freelancer in Chicago. When she isn't staring at a laptop, you can find her sniffing out the best coffee with her greyhound Ruby, or chasing her kids around the nearest library.
2 MIN READ

Is It Spelled “Burnt” or “Burned”?

While both are technically correct, they have distinctions — namely the part of speech and the context of the burning.

by Bennett Kleinman

While “burnt” and “burned” aren’t exactly homophones (words that sound the same but have different meanings, spellings, or both), choosing the correct version can still trip people up. Both words are considered acceptable forms of the word “burn,” making them somewhat interchangeable in most English-speaking countries. But for American English speakers, there’s a distinction, depending on whether the term is being used as an adjective or a verb.

Long before “burnt” or “burned” were in play, the Old English word brent was an adjective used to describe items scorched by fire. In the late 16th and early 17th centuries, “burnt” and “burned” became the preferred spellings. This parallel evolution makes it hard to distinguish between the terms, and non-American English speakers still use them interchangeably as adjectives (with a preference leaning toward “burnt”). But in the U.S., we usually use “burnt” when describing something’s appearance, as in the popular Crayola color burnt sienna. While both are technically acceptable, you’re unlikely to see an American write “burned sienna” or “burned toast.”

But the opposite holds true if the context calls for a past-tense verb. American English speakers are far more likely to use “burned” in an example such as “the chicken burned in the oven” or “I burned in the sun yesterday.” You can say “the chicken burnt” and still be grammatically correct, but it’s less common in the United States.

If you live in England, Australia, or any other non-U.S. country where English is predominantly spoken, it may be hard to find a difference between “burnt” and “burned.” But in the U.S., use “burnt” as an adjective and “burned” as a past-tense verb to avoid issues or confusion.

Featured image credit: Aleksandrs Samuilovs/ Shutterstock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

Why Do We Say “O’Clock”?

From sundials to mechanical clocks, a simple phrase reveals centuries of time-keeping history.

by Bennett Kleinman

“O’Clock” sounds like the surname of an Irish family whose quirk is being punctual. (Actually, that’s not a bad idea for an animated children’s TV series.) But back to the topic at hand: if someone says something is “happening at 5 o’clock,” it’s understood that it’s either at sunrise or at happy hour. The term “o’clock” is an adverb that always follows a numeral to indicate the time of day. That numeral is almost exclusively a whole number; you’d never say “it’s one-thirty o’clock” — that just sounds odd.

“O’clock” is a shortening of the phrase “of the clock,” which itself comes from the Middle English “of the clokke.” Mechanical clocks, with faces and automatic hand movements, originated in the late 14th century in Europe. These clocks replaced traditional light-based timekeeping methods (sundials) that had been used for centuries. When someone said “of the clokke,” it referred to the position of the mechanical hands on the clock’s face. As modern English evolved, the term shortened to “o’clock” around 1720. While many modern timekeeping devices lack a traditional clock face, it’s still standard to say “o’clock” as if you were imagining one.

The word “clock” didn’t exist prior to the 14th century. Instead, timekeeping devices were called daegmael — the Old English word for sundials and other similar devices. As new mechanical timekeeping devices were invented, the term “clock” was coined. Many early clocks had a chime or gong to mark certain times, and “clock” developed from the French word cloche, meaning “bell.”

Featured image credit: Stas Knop/ Pexels
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

When Should I Use “Further” Versus “Farther”?

Learn a simple trick to master the difference between “further” and “farther” — two commonly confused words that even skilled writers mix up.

by Jennifer A. Freeman

It’s a common conundrum: Is that store further or farther away? What’s the difference? Or do they mean the same thing?

“Further” and “farther” are two of the most notoriously confused words in English. While they have distinct usages today, they sprouted from the same etymological root hundreds of years ago. As the Oxford English Dictionary entry for “further” explains, both “further” and “farther” come from the Old English word fyrðrian, and the original usage of both words was a comparative form of “far,” meaning “more forward, more onward.” The only difference was in the vowels used to spell the words, and that was simply because spelling was not yet standardized at the time.

However, as English developed, the words deviated slightly in meaning, and now there’s a distinct usage for each. “Farther” holds onto the sense of “more far” in relation to physical distance. For example, “We walked farther today than we did yesterday.” There’s a measurable, direct comparison. “Further” relies on the meaning of “more onward” in a metaphorical sense; it’s an extension of time or degree. For example, “I need to look into the issue further before I decide,” or “Let’s move the party further into the month.” There’s still a comparison, but no specific measurement.

A good trick for deciding between “further” and “farther” is to ask the question, “How far?” If there’s a simple answer to that question, use “farther.” For example, “How far did you walk yesterday? A mile.” Compare that to “How far did you look into the issue?” There’s no concrete answer to that question, so “further” is appropriate. You could technically answer “How far should we move the party?” with a specific date, but there are always exceptions in the English language. Just remember that “further” is used with extensions of time.

Featured image credit: Brian Wangenheim/ Unsplash
Jennifer A. Freeman
Senior Editor, Word Smarts
Jennifer A. Freeman is the Senior Editor of Word Smarts and Word Daily. When she's not searching for a perfect synonym or reaching "Genius" level on Spelling Bee, she's playing with her Welsh Terrier in Greenville, SC.
2 MIN READ

What Is a Malapropism?

Whether “dancing the flamingo” or “visiting the Sixteenth Chapel,” this theater-inspired literary device is designed for a laugh.

by Bennett Kleinman

Speaking in English isn’t always that sample (simple), as it’s easy to make my steaks (mistakes). One of the most common arrows (errors) that people make is using a similar-sounding term in place of the correct word. While often unintentional, it produces a humerus (humorous) effect in many dramatic and comedic literary works. The concept is known as a “malapropism,” and this opening paragraph contains quiet (quite) a few examples of the literary device.

The term “malapropism” was inspired by an 18th-century English play called The Rivals. This 1775 comedic work by Richard Brinsley Sheridan features a character named Mrs. Malaprop, who unintentionally uses incorrect — but similar-sounding — words, producing an amusing effect. Sheridan likely constructed her name from the French phrase mal à propos, meaning “inappropriate.” Mrs. Malaprop’s lines include: “He is the very pine-apple [pinnacle] of politeness” and “I have since laid Sir Anthony’s preposition [proposition] before her.”

It took about 50 years after The Rivals premiered for the word “malapropism” to appear in publication, drawing a direct connection to the Mrs. Malaprop character. An 1830 theater review read: “Mrs. Glover’s … Mrs. Malaprop … wants the highest relish of contrast in its malapropism.” But Sheridan’s play didn’t invent the literary device; it merely gave it a name. An 1890 edition of Harper’s Magazine pointed to classic examples: “Lemaître has reproached Shakespeare for his love of Malapropisms.”

In modern use, a malapropism is “the usually unintentionally humorous misuse or distortion of a word or phrase.” You might recognize examples in phrases such as “dancing the flamingo” (instead of “flamenco”), “Jesus healing the leopards” (“lepers”), or “going to Vatican City to visit the Sixteenth Chapel” (“Sistine”). If you find yourself inadvertently using the wrong word, just laugh and accept a gentle correction, because the malapropism probably amuses those around you as well.

Featured image credit: Irene Miller/ Shutterstock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
3 MIN READ

How Are Emojis Chosen?

Discover how new emojis make their way onto your phone — and why you might want to submit your own proposal.

by Jennifer A. Freeman

If someone sends a text that says, “i need to tell you something,” it might inspire worry about whether it’s good or bad news. But what if they added some emojis? “i need to tell you something 👀😉” — now we have the allure of some potentially hot gossip coming our way. Text-based communication has always lacked context, even before computers and smartphones, but there’s something about the ease of digital channels that makes it even more likely someone will misunderstand the nuance of a message. 

In the early days of 1990s chat rooms and message boards, smileys (also called “emoticons”) conveyed a simple frown :( or a wink ;) using keyboard characters alone, and the truly sophisticated could shrug ¯\_(ツ)_/¯. Soon, developers had an idea for image-based communication. The first emoji set was created in 1999 by Japanese artist Shigetaka Kurita for DOCOMO, Japan’s main mobile phone carrier. The 176 pixelated characters are now on permanent display at the Museum of Modern Art in New York. Emojis exploded in popularity across Japan in the early 2000s, and by 2007, a petition was in front of the Unicode Consortium to standardize their encoding.

Unicode is a global nonprofit, incorporated in 1991, that provides character-encoding standards, and getting the organization on board with emojis meant these tiny graphics were recognized as a legitimate form of communication. The turning point came when engineers from Apple and Google joined forces and submitted an official proposal to adopt 625 new emoji characters into the Unicode Standard. The proposal was accepted in 2010, bringing emojis under the purview of Unicode. That means Unicode decides what is included in the library of emojis and provides guidance on what each one should look like. Each vendor (Apple, Facebook, Google, etc.) has its own designers to dictate how the artwork appears on its platform, but the set of available emojis is consistent across the board.
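
What does “under the purview of Unicode” mean in practice? Here’s a minimal Python sketch (the code points are real; the snippet itself is just our illustration, not anything published by Unicode or a vendor). Whatever artwork Apple or Google draws, the winking face is the same single code point, U+1F609, on every platform:

    import unicodedata

    # The winking face emoji is one Unicode code point: U+1F609.
    # Vendors draw their own artwork for it, but the encoded
    # character is identical across platforms.
    wink = "\U0001F609"

    print(f"U+{ord(wink):04X}")    # U+1F609
    print(unicodedata.name(wink))  # WINKING FACE
    print(wink.encode("utf-8"))    # b'\xf0\x9f\x98\x89'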

Emoji 1.0 was released in 2015 with 722 emojis, and we’re now at Emoji 16.0, with more than 3,600 emojis, including variations of faces, genders, skin tones, and more. A new release may contain dozens of new emojis or just a few. (We talked about the newest emojis in a recent edition.) But selecting a new emoji is a democratic process: Anyone can submit a proposal to the Unicode Consortium, and it might end up in a release. The key is to submit ideas that fulfill needs not served by existing emojis and that have cultural importance.

For example, the Lime 🍋‍🟩 emoji is one of the most recent additions to the lexicon, added as part of Emoji 15.1 in 2023. The proposal suggests multiple uses, including “hanging out and socializing” and representing “the limelight.” The proposal writers offered: “Limes are visually and purposefully distinct fruits that have cultural significance around the world. Limes are of particular prominence in Asian and Hispanic culture due to their abundance in [crops] and inclusion in foods and drinks.”
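
Fittingly, the lime also shows how new emojis are often built from old ones: rather than receiving a brand-new code point, it’s encoded as a zero-width-joiner (ZWJ) sequence of two existing characters. A short Python sketch (again, just an illustration; it assumes Python 3.8 or later, whose Unicode tables include these characters) unpacks the pieces:

    import unicodedata

    # The lime emoji is not a single character but a ZWJ sequence:
    # LEMON + ZERO WIDTH JOINER + LARGE GREEN SQUARE.
    lime = "\U0001F34B\u200D\U0001F7E9"

    for char in lime:
        print(f"U+{ord(char):04X}  {unicodedata.name(char)}")
    # U+1F34B  LEMON
    # U+200D  ZERO WIDTH JOINER
    # U+1F7E9  LARGE GREEN SQUARE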

What would you submit for an emoji proposal?

Featured image credit: J Studios/ DigitalVision via Getty Images
Jennifer A. Freeman
Senior Editor, Word Smarts
Jennifer A. Freeman is the Senior Editor of Word Smarts and Word Daily. When she's not searching for a perfect synonym or reaching "Genius" level on Spelling Bee, she's playing with her Welsh Terrier in Greenville, SC.
2 MIN READ

What Is a Euphemism?

Euphemisms allow us to lessen the blow linguistically by choosing a word or expression that is less offensive or more palatable than the original.

by Rachel Gresh

If you have a friend who was recently laid off, you might describe them as “between jobs” instead of as “unemployed.” This swap is called a euphemism — a form of figurative language used to discuss sensitive, negative, or taboo topics in a gentler or more socially acceptable way. 

Euphemisms are used in all styles of speech, from everyday conversation to formal communication. They tend to be kinder, milder, and less abrasive than the alternatives, or, at the very least, more indirect or vague in meaning. A euphemism can soften the impact of a negative or sensitive topic. For instance, instead of saying someone is broke, you might say they’re “in a rough patch financially,” which has a more polite tone. Similarly, “let go” often replaces “fired,” and if you back out of something, you might say you “threw in the towel” instead of “quit.” In the same vein, “oh my gosh” often stands in for an exclamation that invokes a religious figure and might offend. Euphemisms such as “darn,” “fudge,” “heck,” and “shoot” are stand-ins for curse words — we’ll let you guess the corresponding profanities.

Although euphemisms are incredibly popular in modern dialogue, they aren’t new. The term “euphemism” entered the English language around 1650, derived from the Greek word euphemismos, meaning “use of a favorable word in place of an inauspicious one, superstitious avoidance of words of ill-omen during religious ceremonies.” By the end of the 18th century, “euphemism” gained a broader usage of “choosing a less distasteful word or phrase than the one meant,” a definition that still holds true today.

Euphemisms aren’t exclusively used in harsh or embarrassing situations; they can also create variety, humor, or irony. If you come home to a shriveled-up houseplant post-vacation, it’s perfectly acceptable to say, “It died,” but you might lightheartedly use the idiomatic euphemism “It kicked the bucket” instead. “Kicked the bucket” is both an idiom (a phrase that means something different from how it sounds) and a euphemism serving as a less direct way of saying something died. 

Featured image credit: BlackSalmon/ iStock
Rachel Gresh
Freelance Writer
Rachel is a Washington, D.C.-based freelance writer. When she's not writing, you can find her wandering through a museum, exploring a new city, or advocating the importance of the Oxford comma.
2 MIN READ

Why Do We Call It a “Pet Peeve”?

Do the sounds of someone chewing drive you to distraction? If you call that, or some other annoying habit, a “pet peeve,” you might want to learn where the term came from.

by Bennett Kleinman

Don’t you hate it when people arrive late after you tell them to be somewhere at a specific time? Or how about when people wear outside clothes on the bed? Or maybe it annoys you when people loudly scroll through TikTok on a public train and all you want to do is yell at them to use headphones!

… Sorry, that last one really rubs me the wrong way.

These are examples of pet peeves, or seemingly minor but persistently annoying issues. The term “pet peeve” is a relatively recent creation, dating back a little over a century, but let’s break down each component.

The word “pet” has been used as an adjective since the late 16th century, originally referring to privately owned animals. In the 19th century, the meaning broadened to include things or beliefs we hold near and dear. Soon, people began using it ironically in terms such as “pet hatred” and “pet aversion.” 

“Peevish,” meanwhile, is a 15th-century term used to describe things that evoke a feeling of spite or fretfulness. “Peeve” was created from “peevish” via back formation in the early 20th century to fill the need for a term describing personal annoyances. A back formation is a new word created by chopping off a real or supposed affix from an existing word.

“Pet” and “peeve” were combined in print in the 1910s, cementing the term in the public lexicon. A 1916 article from The Chicago Daily Tribune asked, “What is your little pet peeve? Hearing the baby scratch hubby’s collar.” This marked one of the first known published instances of the term “pet peeve,” and people kept complaining in print throughout the 20th century. From a 1976 edition of the National Observer: “Poorly designed parking garages have riled me for a long time, but they’ve become a full-fledged pet peeve in recent years.” And from a 2002 copy of Time Out New York: “This touches on my biggest peeve with today’s … society.” More than a century after the term’s creation, folks continue to vent their pet peeves. The Instagram hashtag #petpeeve has 158,000 posts and counting.

Featured image credit: Slladkaya/ Shutterstock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.