3 MIN READ

When Should You Use “Which” Instead of “That”?

Elevate your writing with this guide to using often-confused words such as “which” and “that” correctly.

by Bennett Kleinman
Man pointing index fingers in different directions

“Which” and “that” are a lot like identical twins, in that they’re pretty easy to mix up at first glance. But just like identical twins have unique traits that set them apart, so do these similar yet distinct terms. One reason we mix up “which” and “that” so often is that the words were used interchangeably until the 1700s, and old habits die hard. But under today’s grammatical guidelines, there are appropriate contexts for “which” and separate occasions for “that.”

Both “which” and “that” are relative pronouns, meaning they introduce clauses that refer back to a noun mentioned earlier in the sentence. But to understand how they differ, it’s useful to define the concepts of restrictive and nonrestrictive clauses. A restrictive clause adds essential information to a sentence: “The album that came out after her child was born changed her musical style.” In this example, the restrictive clause “that came out after her child was born” is crucial to the meaning of the sentence.

A nonrestrictive clause, however, works as a conversational aside, adding nonessential information: “The band’s first album, which was my favorite, had great backup singers.” It might be nice to know that you like an album, but the point of this sentence is the prowess of the backup singers, so the clause within the commas is considered nonrestrictive. As these examples show, it’s appropriate to use “that” in restrictive clauses, and “which” in nonrestrictive clauses.

Nonrestrictive clauses often appear at the end of sentences, not just in the middle as in our example above. For example, “David Bowie’s album Young Americans had famous backup singers, which included Luther Vandross.” Here’s our memory tip: If you need a comma, you’re probably dealing with a nonrestrictive clause, meaning “which” is almost always the correct choice. Commas set off info that, when removed, doesn’t impact the sentence’s clarity or meaning. If you don’t need a comma, use “that.”

“That” has a variety of uses that “which” cannot serve. It can act as a demonstrative pronoun pointing to a specific noun (“That is my favorite album”), as a conjunction to connect two clauses (“I didn’t know that it was their first time performing together”), or as an adverb to add emphasis before an adjective or adverb (“I don’t want to spend that much money on concert tickets”). This makes “that” more versatile than “which.”

Featured image credit: Krakenimages/ Shutterstock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

What Is a Double Negative?

Learn why two negatives don’t always make a positive in the world of grammar.

by Bennett Kleinman
Wrong choice X button

In the real world, purging negativity is an important skill for a happy life. In the grammar world, purging double negativity is crucial for clear and concise writing. Double negatives are redundant constructions built from multiple negative words, and they result in complicated, confusing sentences. However, they aren’t not useful. Every once in a while, an appropriately used double negative can improve your writing, but those occasions are rare.


A double negative is any statement with two negative words. A person might say, for example, “I didn’t see nobody.” The two negatives are “did not” and “nobody.” The problem is that double negatives muddle the speaker’s intent, resulting in the opposite of the intended meaning; “I didn’t see anybody” would be clearer. Think back to math class — multiplying two negative numbers together cancels out the negatives and turns the result into a positive. It’s the same with words: Two negatives cancel each other out and turn the statement into a positive. Combining “didn’t” and “nobody” flips the meaning to imply the speaker did see somebody, which wasn’t the goal of the statement.
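If the math-class analogy helps, here’s a toy sketch in Python (purely illustrative, and not any kind of grammar tool) showing how two negations cancel each other out:

```python
# Toy illustration of the "two negatives make a positive" analogy.

# Multiplying two negative numbers cancels out the negatives:
assert -1 * -1 == 1

# Stacked logical negations behave the same way:
saw_somebody = True
# "I didn't see nobody" piles a second negation on top of the first...
didnt_see_nobody = not (not saw_somebody)
# ...and the two cancel, landing back on "I saw somebody."
assert didnt_see_nobody == saw_somebody
print(didnt_see_nobody)  # True
```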

Common words in double negatives include negative determiners (“no” and “none of”), negative pronouns (“neither,” “no one”), negative adverbs (“not,” “never”), and negative verbs, which are created by adding “not” or folding it into a contraction (“wouldn’t,” “don’t”). The good news is, a double negative is usually easy to fix by removing or replacing one negative word. For example, “I cannot go nowhere tonight” can be fixed by dropping “nowhere” (“I cannot go tonight”) or by swapping it for “anywhere” (“I cannot go anywhere tonight”).

There are rare instances where double negatives can add flair to your writing, however. If you’re hoping to emphasize a point, you might say, “I can’t not go to this party” for added oomph and drama. Or just ask the Rolling Stones, who famously sang, “(I Can’t Get No) Satisfaction.” In these cases, the double negatives are used for rhetorical effect. But otherwise, they should be avoided.

Featured image credit: Thx4Stock/ iStock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

What Does “Playing Devil’s Advocate” Really Mean?

From the halls of the Vatican to modern-day debates, discover the true meaning of “playing devil’s advocate.”

by Bennett Kleinman
Creepy Devil silhouette

When you hear “playing devil’s advocate,” your mind might first go to Keanu Reeves’ role in the 1997 thriller The Devil’s Advocate. And while that’s a pretty solid film, today’s edition is about something different: a figure of speech. Let’s examine the idiom’s origins, which date back to the 16th-century Roman Catholic Church.


The term “devil’s advocate” stems from a position in the Catholic Church known as the Promoter of the Faith (promotor fidei). This role emerged in the early 16th century during Pope Leo X’s reign and was formalized in 1587 by Pope Sixtus V. Whenever an important individual was nominated for beatification or canonization (parts of the process for granting sainthood), the promotor fidei was responsible for bringing to light any past wrongdoings or sins. This would fuel a critical debate to examine whether the candidate’s positives outweighed the negatives. Given the promotor fidei’s focus on past wrongs, the role came to be called advocatus diaboli, which translates to “devil’s advocate.”

In modern parlance, the idiom “playing devil’s advocate” is applied more broadly to debates on any topic, not just canonization. A “devil’s advocate” is defined as “a person who champions the less accepted cause for the sake of argument.” The person playing devil’s advocate doesn’t need to believe the case they’re arguing; they may just be bringing up those points to make the argument more interesting or to promote a more critical lens. Worst case, the devil’s advocate is just being annoying for the sake of argument, but let’s hope your friends aren’t that cruel.

Featured image credit: ardasavasciogullari/ iStock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

Why Do We Start Letters With “Dear”?

Learn the history of this centuries-old greeting and discover how it’s evolved in the age of email.

by Samantha Abernethy
Close-up of pen writing on white paper

In an early job, I got feedback that my email correspondence needed to be more sophisticated. I was firing off, “Hey, do you have that expense report?” while my boss preferred the formalities of a traditional letter, including a salutation (a “Dear” and title/name combo) and a complimentary closing (such as “Best regards” or “Sincerely”). That level of formality has dropped out of all but the most professional email communications, but the etiquette persists for handwritten letters. Where did “Dear” come from, what does it mean, and what other options are there for conscientious email writers?


The word “dear,” from the eighth-century Old English “deore,” originally meant something was precious or costly, but it evolved into calling out something or someone as special. (That first usage still exists, but it’s less common.) Starting around the 14th century, “dear” was used as a salutation for only the most intimate letters: “Dearest sister,” “To my dear friend,” etc. The phrase “dearly beloved” was popularized by the Book of Common Prayer, the Church of England’s liturgical book (its landmark edition was published in 1662), and it became a traditional component of wedding ceremonies (and the classic opener of a Prince tune), furthering the association of “dear” with loved ones.

Around the 17th century, the term became the standard opening for most polite communication and a way to start any letter as a show of respect: “Dear Mr. Smith,” “Dear Sir or Madam,” etc. Eventually we moved on from regular letter writing as the primary mode of communication between family and friends, and in the 20th century, people wrote more memorandums than love letters. Once email became standard, some people retained the formality of the written structure, and others took the opportunity to let the “To” and “From” fields do the work for them — no salutations or closings needed. 

But many people — including the Washington Post’s Miss Manners — aren’t ready to let go of “dear,” no matter the format. Career experts recommend using “dear” as a salutation in formal email correspondence, such as cover letters, but only when you know the recipient’s name; the impersonal “Dear Sir or Madam” is definitely extinct. (A Google or LinkedIn search can help you out with names.) For standard missives between colleagues, consider starting the first message with “Hi [name]” and closing with your name as well. Future replies in the chain don’t need a salutation. 

Featured image credit: CurrywurstmitPommes/ Shutterstock
Samantha Abernethy
Freelance Writer
Samantha Abernethy is a freelancer in Chicago. When she isn't staring at a laptop, you can find her sniffing out the best coffee with her greyhound Ruby, or chasing her kids around the nearest library.
3 MIN READ

Is It OK To Start a Sentence With a Conjunction?

This grammar myth-busting article might just change your mind about kicking off sentences with conjunctions.

by Bennett Kleinman
Conjunctions word concept on cubes

Some bits of advice are instilled in us from a very young age: Eat your vegetables, look both ways before crossing the street, and, of course, never begin a sentence with a conjunction. That last one comes to us directly from grammar class, but is it really a rule? Nope. In fact, it’s merely a suggestion. Starting sentences with a conjunction is perfectly OK in a grammatical sense, and it may even improve your writing.


But let’s go back to basics first. A conjunction is a word that connects words, phrases, or clauses. Subordinating conjunctions (“because,” “since,” “after,” etc.) link independent and dependent clauses. Correlative conjunctions (“either/or,” “neither/nor,” “such/that,” etc.) join together two words or phrases of equal importance. For example: “Either I’m going to eat this sandwich, or I’m going to eat at home.” People use both types of conjunctions to start sentences, and nobody bats an eye. The controversy usually arises with a third type: coordinating conjunctions.

Coordinating conjunctions link two independent clauses, and they can best be remembered with the acronym FANBOYS: For, And, Nor, But, Or, Yet, and So. Reputable grammar guides, including the Chicago Manual of Style and Merriam-Webster’s Dictionary of English Usage, say it’s acceptable to begin sentences with coordinating conjunctions. The main objection is that starting sentences with these conjunctions too often can lead to bad writing habits. For instance, stringing together multiple sentences that start with a conjunction can make you sound like a 7-year-old: Today, I went to the park. And then I ate lunch. And I saw a dog. But the dog ran away. And then he ran back again. So I smiled. And my lunch fell on the ground.

As a general rule of thumb, it’s best to avoid beginning your sentences with conjunctions as you’re developing your writing skills. But as you’re honing your voice and writing style, it’s fine to experiment a bit, as evidenced by some of history’s most iconic prose. Consider Lord of the Rings author J.R.R. Tolkien, who once wrote, “Yet the Lord of Gondor is not to be made the tool of other men’s purposes, however worthy.” Or read S.E. Hinton’s The Outsiders, which contains the line, “I lie to myself all the time. But I never believe me.”

Featured image credit: Shutterstock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

Why Can’t Something Be “Very Unique”?

Discover why grammar sticklers cringe at “very unique” and when bending the rules might actually enhance your writing style.

by Bennett Kleinman
Unique toy duck among many other plain toy ducks.

There are times when you might want to add a bit of extra oomph to your words to get your point across. Consider the opening line of the U.S. Constitution: “We the People of the United States, in Order to form a more perfect Union…” Grammatically, describing something as “more perfect” is impossible. “Perfect” is, by definition, perfect. 


Absolute adjectives are words such as “unique,” “perfect,” or “impossible” — these terms are unequivocal on their own and shouldn’t be compared or intensified, because they don’t come in varying degrees. They differ from comparative adjectives (“smarter,” “faster,” “lesser,” etc.), which compare degrees between things, and superlative adjectives (“smartest,” “fastest,” “least,” etc.), which express the highest or lowest degree. But despite the fact that absolute adjectives technically should stand on their own, without modification, many writers opt to modify them as a point of style.

Let’s examine this passage from Kenneth Grahame’s The Wind in the Willows: “‘Toad Hall,’ said the Toad proudly, ‘is an eligible self-contained gentleman’s residence very unique; dating in part from the fourteenth century, but replete with every modern convenience.'”

Should we presume to correct Grahame’s classic tale? The word “very” in the selection above is unnecessary from a strictly grammatical perspective, but it still plays an important role. It adds to Mr. Toad’s quirky tone and gives the reader a feel for the character and his home. In this case, his house is not only one-of-a-kind, but also so interesting that it’s worth noting. Another instance where a writer might find it useful to modify an absolute adjective is with an adverb that implies completeness. For instance, you may call something “completely final” in an effort to add emphasis and hammer home your point — as this topic is totally complete. 

Featured image credit: Lee Yiu Tung/ Shutterstock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

Why Is “Could Of” Wrong?

Discover why this common phrase is a grammatical misstep, and learn how to avoid this sneaky error that even native English speakers make.

by Bennett Kleinman
Man checking his phone

English is full of sentences that sound awkward but are grammatically correct. Consider trying to untangle the mind-bending “Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo.” (Hint: “Buffalo” has three distinct meanings and functions as a noun, proper noun, and verb in this riddle.) Then there are phrases that sound OK, but are actually grammatical errors. These mistakes are easier to catch when writing, and more difficult to recognize when someone is speaking (for example, “intensive purposes” vs. “intents and purposes”). One of the biggest culprits of the “easy to mishear” swap comes with contractions and prepositions; we’re saying one thing, but people are hearing another. 


Consider the contraction “could’ve.” “Could” implies both possibility and willingness. It often acts as a helper verb, meaning it’s paired with a main verb to complete the thought. For example, “I could go to the dance” implies a chance of attending the dance. But let’s imagine you missed that opportunity and you’re telling a friend about it. You might say, “I could of gone to the dance.” But wait — that’s only what it sounds like. The proper construction of the sentence is “I could’ve gone to the dance,” where “could’ve” is a contraction of “could” and “have.”

As already noted, “could” requires a second verb to make grammatical sense. This is why we need “have” instead of “of” — the latter may sound similar, but it’s a preposition that has no place in this construction. The contraction “could’ve” sounds nearly identical to “could of,” and while it may be near impossible to tell the two apart in speech, swapping them makes a big difference in writing.

Featured image credit: Tim Samuel/ Pexels
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

How Soon Is “Once in a Blue Moon”?

Explore the truth behind the phrase “once in a blue moon” and why it might not be as rare as you think.

by Bennett Kleinman
Night sky with full bright moon in the clouds

We often say “once in a blue moon” to describe an event that happens infrequently, like a Detroit Lions championship or the McRib returning to the menu at McDonald’s. This usage dates back to the 16th century, when “the moon is blue” was used to describe an event that seemed impossible. In 1821, Pierce Egan used the idiom in his work Real Life in London to describe how long it had been since two people had seen each other. This marked a shift in the meaning of the phrase from “impossible” to “unlikely.” But why does a blue moon specifically carry this connotation? Why not a green or gold moon?


In reference to the actual celestial body, people reported seeing a blue-colored moon after the eruption of Krakatoa in 1883. This hue was likely caused by the sulfur dioxide and ash in the air scattering the longer red wavelengths of light and letting blue light pass through, making the moon appear blue.

When it comes to the lunar cycles, there are usually 12 full moons in a year, but occasionally there are 13 — this 13th full moon is known as a “blue moon.” The concept was codified in the 1937 Maine Farmers’ Almanac. However, in 1946, amateur astronomer James Hugh Pruett misinterpreted that extant meaning and wrote an article describing a “blue moon” as the second full moon in any given month. By this second definition, two blue moons can occur in a single year, though that happens only about once every 19 years. When it does, the doubled-up full moons fall in January and March: Because the average span between full moons is 29.5 days, February’s short 28-day (sometimes 29-day) month can pass with no full moon at all, leaving both January and March with two full moons apiece.

The two definitions are similar lunar concepts, and either way you track them, blue moons happen roughly once every 2.7 years. So while we may use “once in a blue moon” in an extremely vague sense, it actually refers to a fairly specific period of time.
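If you’re curious where that 2.7-year figure comes from, here’s a quick back-of-the-envelope calculation, sketched in Python and assuming the average 29.53-day gap between full moons (the synodic month):

```python
# Back-of-the-envelope arithmetic behind "once every 2.7 years."
days_per_year = 365.25
days_per_lunar_cycle = 29.53  # average synodic month, in days

full_moons_per_year = days_per_year / days_per_lunar_cycle
print(f"Full moons per year: {full_moons_per_year:.2f}")  # ~12.37

# The ~0.37 yearly surplus accumulates into an extra, 13th full moon
# roughly every 1 / 0.37 years:
years_between_blue_moons = 1 / (full_moons_per_year - 12)
print(f"Years between blue moons: {years_between_blue_moons:.1f}")  # ~2.7
```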

Featured image credit: muratart/ Shutterstock
Bennett Kleinman
Staff Writer
Bennett Kleinman is a New York City-based staff writer for Optimism. He is also a freelance comedy writer, devoted New York Yankees and New Jersey Devils fan, and thinks plain seltzer is the best drink ever invented.
2 MIN READ

What Is a Backronym?

You’ve heard of acronyms, but are you familiar with backronyms? In this linguistic case of chicken-and-egg, the abbreviation is conceived before the phrase.

by Rachel Gresh
SOS phone pole

Government agencies love their acronyms — and backronyms, too. For instance, NASA once named a treadmill “COLBERT” after late-night television host Stephen Colbert. Before unveiling a new module for the International Space Station, the agency launched an online poll for the public to submit ideas for the module’s name. Instead of the expected space-y options, people went with their favorite comedian. While NASA ultimately named the module “Tranquility,” the agency still honored the public’s choice by naming the ISS treadmill the “COLBERT” — a backronym that stands for “Combined Operational Load Bearing External Resistance Treadmill.” A backronym is essentially a reverse-engineered acronym; it turns an existing word into an acronym by piecing together relevant words until their first letters correctly spell the desired abbreviation. The name “COLBERT” came before the treadmill’s full name, which was created with the final abbreviation in mind.
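Since a backronym’s defining property is that the expansion’s initials must spell the pre-chosen word, the check is purely mechanical. Here’s a minimal sketch in Python (the function name is my own invention, and real backronyms often skip small words like “for” or “of,” which this toy version doesn’t handle):

```python
def spells_backronym(word: str, expansion: str) -> bool:
    """Check whether the first letters of the expansion spell the word."""
    initials = "".join(w[0] for w in expansion.split())
    return initials.upper() == word.upper()

# NASA's treadmill, as mentioned above:
print(spells_backronym(
    "COLBERT",
    "Combined Operational Load Bearing External Resistance Treadmill",
))  # True
```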


The portmanteau of “backward” and “acronym” came about from a 1983 neologism (a newly coined word) contest in The Washington Post. Nowadays, backronyms are found everywhere, especially in the entertainment industry. The title of the James Bond thriller Spectre is a backronym for “Special Executive for Counterintelligence, Terrorism, Revenge, and Extortion.” The British English word “spectre” (or “specter” in American English) means “ghost” — an apt name for a covert organization tracking global supervillains. 

One of the most enduring backronyms must be “SOS,” which, contrary to popular belief, is not an acronym for “Save Our Ship,” nor does it stand for “Save Our Souls.” In 1906, “SOS” (dot-dot-dot, dash-dash-dash, dot-dot-dot) was chosen as the standard Morse code distress signal because of its simple and distinct pattern of dots and dashes. The popular backronyms were invented later as a creative way to explain the origin of the code.

Thanks to these backronym origin stories, it can be tricky to tell an imaginative creation apart from a genuine acronym. While no one will fault you for misidentifying a backronym, digging into its true origins can turn up some interesting stories.

Featured image credit: Leonie Zettl/ Unsplash
Rachel Gresh
Freelance Writer
Rachel is a Washington, D.C.-based freelance writer. When she's not writing, you can find her wandering through a museum, exploring a new city, or advocating the importance of the Oxford comma.
2 MIN READ

Why Is the Letter “Z” Associated With Sleeping?

The last letter of the alphabet has a surprisingly rich history behind its association with sleep and snoring.

by Samantha Abernethy
The letter Z's in a dream cloud

My 4-year-old niece has a favorite blanket that she can’t sleep without. When I asked her if it had a name, she said something that sounded like “zhuzh.” Her parents helped translate: “It’s spelled with three ‘Zs.’” 


My niece has already learned enough of the English language to pick up on the relationship between sleep and the letter “Z,” but where did this link come from? We could chalk it up to simple onomatopoeia, but I know when I snore, it doesn’t sound nearly as sweet as it did when my niece tried to pronounce “zzz.”

Other languages have their own onomatopoetic ways to depict snoring, including “rrrrrr” in Spanish and “хррр” in Russian. But the three “Zs” are globally recognized, thanks to the popularity of American comics. The first instance of “zzz” being used as shorthand for sleep has been traced to a 1903 installment of the comic strip “The Katzenjammer Kids,” which portrayed a man snoozing in a hammock. Figuring out how to depict sleep in comic strips and comic books is a tricky task. Sometimes sleep has been depicted as “grrk,” “honk-shoo,” “ZZRRGGHH,” or just “snore.” Or the illustrator has taken artistic license by adding a little drawing of a saw and a log to imply the rhythmic rumble of the idiom “sawing logs.” In time, that phrase evolved with technology into “snoring like a chainsaw.”

But the “Zs” won out. In the 1940s, the verb “zonk,” meaning “fall or cause to fall suddenly and heavily asleep or lose consciousness,” entered the lexicon, and in the 1980s, cartoonist Jim Davis used one big “Z” to demonstrate sleep in the “Garfield” comics.

As for how to pronounce “zzz,” it’s usually not meant to be pronounced out loud, although in the 1960s, the phrase “get some Zs” became common slang, and now the British have adopted the word “zizz” to mean “nap.” Whether said as “zhuzh,” like my niece does, or written “zzz” in the funny pages, we could likely all use a few more hours of sleep. 

Featured image credit: Peshkova/ Shutterstock
Samantha Abernethy
Freelance Writer
Samantha Abernethy is a freelancer in Chicago. When she isn't staring at a laptop, you can find her sniffing out the best coffee with her greyhound Ruby, or chasing her kids around the nearest library.