2 MIN READ

Who Is Susan of ‘Lazy Susan’?

Found in kitchens around the globe, the lazy Susan has revolutionized the dinner hosting game. Here’s how it earned its unusual title.

by Rachel Gresh

How did such a useful kitchen accessory earn such a lackadaisical name? There are several clues as to the etymology of this peculiar moniker. Tracing the origins of the device and of the name leads us down different paths, and the name itself is a surprisingly modern invention.

Rotating trays have existed in kitchens around the world for centuries. They’re popularly used in Chinese restaurants, where they’re called cānzhuō zhuànpán or simply dinner-table turntables — pretty straightforward. But in the American kitchen, they boast an interesting name: lazy Susan.

Susan likely wasn’t a real person, despite what some folk etymologies say about Thomas Jefferson, who neither invented the lazy Susan nor had a daughter named Susan. Instead, this term likely stems from an 18th-century practice of using the name Susan as a generic title for a household servant. Because these trays make serving easier without the need for extra help, “lazy” was tacked on, perhaps as a marketing tactic. 

According to a Q&A in the Chicago Tribune, one of the first published mentions of “lazy Susan” dates to an ad in a 1917 issue of Vanity Fair, specifically a Christmas promotion that highlighted household gifts. Here’s how the copywriter described this sensational kitchen device: “$8.50 forever seems an impossibly low wage for a good servant; and yet here you are; Lazy Susan, the cleverest waitress in the world, at your service!” 

At that time, World War I was being waged across the Atlantic, and women in the U.S. were likely looking for ways to increase efficiency in their homes. It seems that a humble serving tray, one that cleverly rotates and bears a slightly off-putting name, was the answer to their troubles. We prefer the name lazy Susan to another hostess helper: the dumbwaiter. This unfortunately named tool is either “a portable serving table or stand often with revolving shelves arranged in tiers” or “a small elevator used for conveying food and dishes or small goods from one story of a building to another.”

Featured image credit: JTobiason/ Adobe Stock
4 MIN READ

Who Invented Punctuation?

While it’s sometimes tricky to know how to properly use a semicolon, and English majors love to debate Oxford commas, we’d be lost without punctuation marks. But written language existed long before em dashes and exclamation points. Who invented punctuation marks?

by Stewart Edelstein

CANYOUREADTHISIBETITSHARDWITHOUTPUNCTUATIONITSHARDTOREADEVENSHORTSENTENCES 

Can you read this? I bet it’s hard. Without punctuation it’s hard to read even short sentences. Initially, ancient Greek was written in all caps with no punctuation or spacing. We even find inscriptions from ancient Rome written in all caps with only small dots breaking up the words. Speech, especially the eloquent and persuasive speech of politicians and elected officials, was valued more highly than the written word. But now punctuation makes all the difference. For example, the versatile “OK” can be a question (“OK?”), an agreement (“OK.”), or an exasperated exclamation (“OK!”). So, where did these punctuation marks come from?

Aristophanes of Byzantium, a third-century BCE Greek grammarian, introduced punctuation marks based on three dots, placed in high, middle, and low positions after a letter, designating the end of different types of sections — what we would now call sentences, paragraphs, and perhaps chapters as the longest sections. The dot system was adopted by the Romans and developed by using additional points to separate each word. Aristophanes’ system of dots was further expanded through the rise of Christianity in the fourth and fifth centuries. To spread the word of God, Christians preferred the written word, made clearer with punctuation. 

In the seventh century, Archbishop Isidore of Seville updated Aristophanes’ system, rearranging the dots so that pauses were short (low dot), medium (middle dot), or long (high dot) — our comma, colon, and period, respectively. Isidore’s breakthrough was to associate punctuation with meaning for the first time. The archbishop’s system used dots within sentences, and ends of sentences might have been marked with a group of two or three dots.


The question mark evolved from the punctus interrogativus created by eighth-century English scholar Alcuin of York, used to signal an upward inflection at the end of a sentence. The original form was a diagonal line above the last letter of the last word in a sentence, but it eventually morphed into the curled question mark we use today. The exclamation mark is from the Latin io, meaning “joy,” which medieval scribes wrote at the end of a sentence to express that emotion. In all caps, an “I” with an “O” below it (to save space) became the exclamation mark, with the “O” shrinking to a dot over time.

In 1494, Italian printer Aldus Manutius invented the semicolon. Its initial purpose, to mark a pause longer than a comma’s but shorter than a colon’s, was revealed in its very form, which combines half of each of those marks. Manutius also notably contributed to the development of punctuation by writing on its main purpose: clarifying syntax. 

Through the Renaissance, a hodgepodge of punctuation marks was found across writings, but the Gutenberg Bible, published around 1455, changed all that. Within 50 years of that publication, most of the punctuation marks we use today were established, likely due to the spread of the printing press. However, the rules guiding their usage wouldn’t become standardized for some time. 

Those rules were largely established by the 17th century. A period (.) marks the end of a complete sentence or an abbreviation. The colon (:) introduces a list, summary, or quotation. True to its original intermediate pause, the semicolon (;) serves a job halfway between a comma and a period; it can substitute for a period between two grammatically complete sentences that are closely connected. A comma (,) serves many roles. It can indicate a pause, going back to the original use of dots, and it can separate items in a series. It also may be used to set off information or before a coordinating conjunction (such as “and” or “but”) joining a complete clause. Standard English punctuation also includes quotation marks, apostrophes, hyphens, dashes, parentheses, ellipses, and more, and the rules are detailed, with many exceptions. 

Featured image credit: lisica1/ Adobe Stock
2 MIN READ

Should I Use ‘Less’ or ‘Fewer’?

Less is more, except when you should be using the word “fewer” instead. Here’s a quick explanation as to when you should use each of these two similar words.

by Bennett Kleinman

The words “less” and “fewer” are like a set of identical twin babies. It may seem difficult to distinguish between the two at first, but there are subtle differences that parents know to look for. Learning how to use these words properly isn’t as important as identifying your children, of course, but it’s still worth knowing.

The general rule of thumb is to use “fewer” when discussing things that can be counted, such as “fewer children” or “fewer books,” and to use “less” when the context is a measurable quantity, such as “less water” or “less rice” (you wouldn’t count individual grains of rice). While “less” is generally used for measurable quantities and “fewer” for countable amounts, there are some exceptions that muddy the waters a little bit.

For example, units of time are countable, but when using these comparative words, they fit better under the measurable quantity umbrella. It makes more sense to say “I have less than 24 hours to finish the project.” “Less” is also used in comparisons of distance (e.g., “less than 10 miles”), money (“less than a dollar”), and weight (“less than 5 pounds”). That’s because people generally treat those concepts as measurable amounts rather than exact countable numbers. These examples show that despite a general rule, there is almost always an exception when it comes to the English language. 

Despite these distinctions, the confusion is understandable because “less” and “fewer” were once used interchangeably. That shifted in the late 18th century, at least in part due to writer and grammarian Robert Baker’s Remarks on the English Language, in which he discussed his own personal preference for when to use “less” vs. “fewer.” In time, people adopted Baker’s opinion as conventional wisdom, essentially redefining each word in line with the modern usage.

Featured image credit: cbies/ iStock
5 MIN READ

What Your Handwriting Says About You

According to graphologists, our handwriting can reveal much about our personalities, laying bare our souls through the dotting of an “i” or the crossing of a “t.”

by Tony Dunnell

Handwriting is one of our distinct biometric traits, just like our fingerprints, iris, face, voice, and gait — all of which, to varying degrees of reliability and uniqueness, can be used as physical identifiers. For at least a century, handwriting analysts have helped solve a wide variety of criminal cases by examining a suspect’s writing, normally by comparing and matching handwritten samples. And then there are graphologists, who take the examination of handwriting even further, seeing it as a direct expression of our personality. They look at various aspects of our handwriting — from the size of our letters to how we dot an “i” or cross a “t” — with each element potentially revealing a wealth of information, such as our confidence levels, emotional state, and even our general approach to life. 

Here we take a look at some of the key components that graphologists look for when examining handwriting, and what each element can say about us. It’s important to remember, however, that graphologists don’t normally focus on just a single, isolated symbol — it’s best to consider the various elements as a whole to get a bigger and potentially more accurate personality profile. 

Furthermore, graphology is nowhere near as precise a discipline as forensic handwriting analysis, so take its assumptions about personality traits and their connection to handwriting with a grain of salt. Just because you form your letters in a certain way doesn’t mean you’re destined for a particular outcome. Instead, have some fun with the possibilities of graphology and what your handwriting might reveal.

Emotional Indicators

The amount of pen pressure a person uses when writing is a key indicator of emotion for graphologists. A writer using heavy pressure (meaning the imprints of their handwriting can be felt on the other side of the page) is thought to be displaying more emotional intensity. More generally, heavy pressure writers often have strong constitutions and enjoy being active but may also be temperamental or irritable. Soft pen pressure, on the other hand, can be indicative of a yielding, hesitant personality. According to an interview with graphologist Annette Poizner in Reader’s Digest, “This may be somebody who grew up with a dominant or aggressive caretaker or sibling and never learned how to be assertive.” 

The way you dot your lowercase “i” also can be particularly telling. A small, precise dot directly above the letter stem suggests someone who is detail-oriented, precise, and methodical. But if the writer replaces the standard dot with a round circle — or even a small smiley face — it’s a strong sign of playfulness (or a desire for attention). A hastily drawn dot that turns into a small, sharp line, on the other hand, could be a sign of irritation or anger.

A preference for strong angularity, similarly, may indicate irritability and anger-management issues, but also honed critical thinking and debating powers. On the other hand, the presence of loops and rounds can indicate a writer with a more expressive, emotional, or sensitive state of mind. 

Introvert vs. Extrovert

The angle at which your letters lean can reveal whether you are an extrovert (right slant) or introvert (left slant), as can the size of your handwriting. Large, bold letters often belong to extroverts who enjoy being noticed and appreciated. These writers tend to be self-confident and may do well in leadership roles and social situations. Small writing often indicates the opposite — an introvert. According to graphologist John Beck, “The small writer by contrast does not like to be noticed, takes up an analytical attitude to everything, and likes to play a low social profile.” If your writing is medium-sized, it may represent a more balanced personality that can adapt well to different social situations.

Spacing comes into play, too. Generally speaking, the space between handwritten words tends to be about the width of one letter (a good rule of thumb being the width of a lowercase “o”). Gaps that are wider or narrower than normal can be revealing. Wide spacing may indicate that a person values their freedom highly, and particularly dislikes feeling overwhelmed or crowded. Narrow spacing suggests that the writer might crave contact with other people, doesn’t like being alone, and is possibly a bit needy.  

Cursive vs. Print

Graphical continuity — the way in which letters connect with each other in each word — is one of the fundamentals in handwriting. As children, we are taught cursive (joined-up) handwriting and print handwriting. What we stick with as adults can reveal information about our way of thinking. People who connect their letters apply logic over emotion — and might also be more conformist, conventional, and predictable. Conversely, disconnected letters might suggest someone who thinks more intuitively or creatively, and acts more on instinct. These shunners of cursive might well prefer to work independently and may be naturally artistic or inventive.

Featured image credit: Hiraman/ iStock
2 MIN READ

Why Do We Say ‘How the Tables Have Turned’?

While turntables are found in DJ booths, the phrase “how the tables have turned” originated in an entirely different entertainment medium.

by Bennett Kleinman

Unless you’re eating a family-style meal with a lazy Susan in the middle, you’ll find that most tables aren’t designed to turn. This raises the question: How did “how the tables have turned” (or, if you’re a fan of The Office, “how the turn tables”) become a common expression? According to Merriam-Webster, “how the tables have turned” means “to bring about a reversal of the relative conditions or fortunes of two contending parties.” While this explanation might make sense to those who have used the turn of phrase, it doesn’t explain the history of the metaphor, or that it originated in the world of board games.

The saying comes from a specific type of game known as a “table game.” One of the earliest known examples is an ancient Roman game called tabula, derived from the Latin word for “board” or “plank.” Later, medieval people played a popular game called “tables” that acquired a new name in the mid-1600s: backgammon. It remains a popular table game today; in a nod to the legacy of its early years, the playing board is called a table. 

So, whether you say “how the tables have turned,” “turn the tables,” or follow Michael Scott’s lead, they all come from the idea of playing table games. According to the blog Phrase Finder, the saying refers to the literal act of reversing the board so that players are forced to play from an opponent’s position. Some games have it built into the rules, but it might also be a superstition for a losing player to turn the board to change up their luck and play from the winning side, whether there’s a real advantage to doing so or not.

Table games existed long before the phrase appeared in print in 1634, in a work by Robert Sanderson titled XII Sermons. It read, “Whosoever thou art that dost another wrong, do but turn the tables: imagine thy neighbor were now playing thy game, and thou his.” The metaphor became common by the 1800s, and its use increased dramatically after the 1950s, possibly as it made the leap from gaming jargon to widespread metaphorical usage. 

Featured image credit: Thales Antonio/ iStock
2 MIN READ

Is ‘Supposably’ a Real Word?

Supposedly, “supposably” is in the dictionary. While people often mix up these terms, they’re unique words with slightly different meanings.

by Bennett Kleinman

Much like with identical twins, it’s easy to mix up two words that look similar. This is why so many people confuse “supposedly” and “supposably,” the latter of which is often thought to be a misspelling or misuse of the former. But “supposably” is a very real word in its own right, so let’s take a look at what distinguishes these two words.

According to Merriam-Webster, “supposedly” means “according to what is or was said.” It dates to the end of the 16th century and comes from the Middle English supposen, meaning “to make a hypothesis.” “Supposably,” on the other hand, essentially acts as a synonym for the word “conceivably.” Its first usage is dated just a few decades later, in 1627. At first glance, these definitions are similar, but there’s a difference that we can attempt to clarify with the following hypothetical scenario.

Imagine your friend tells you about a new restaurant in town. She says, “Supposedly, the chef trained in Paris and makes the best croissants outside of France.” This statement signals that the claim is secondhand: your friend is repeating what she has heard, without definitive confirmation. 

Now you’re looking forward to a visit to the café and you think to yourself: “If they really can make a croissant that rivals the ones I had in Paris, they would supposably be the best in the country.” “Supposably” here means it’s conceivable or possible that these croissants could be the best in the United States. 

These adverbs are very similar, and over time, “supposedly” has become the default choice. But there’s a nuance to their distinction, and “supposably” deserves to be used for those theoretical claims. 

Featured image credit: skynesher/ iStock
5 MIN READ

The Evolution of Measurement and the Words to Describe It

When coming up with new words, it’s important to take a measured approach — and in this case, we mean that quite literally. Here’s how measurement words such as “mile” and “foot” inched their way into the English language.

by Bennett Kleinman

Various tools and units of measurement have served ancient and modern civilizations alike in helping to build, travel, and develop land. The earliest measurements were often based on materials directly available to people — namely, body parts and nature. But discrepancies arose from there. For instance, one person’s hand might be larger than another’s, so one man’s “12 hands” could be very different from his neighbor’s. As more precise measurements were needed, units of measurement were standardized. While the various amounts changed over time, we still use many of these early words for measurements today.

Using the Body as a Ruler

The human body has always been an easily accessible measuring tool, but over time, some standard measurements have been added to the definitions.

  • “Foot” has been used since medieval times to denote approximately the length of a man’s foot. It was standardized in the United States in relation to the meter by the Metric Act of 1866 and later the Mendenhall Order of 1893, which set 1 foot equal to 0.304801 meters.
  • “Inch” comes from the Latin uncia, meaning “a twelfth part” — apropos since 1 inch is 1/12 of a foot. At first an inch was roughly equal to the width of a man’s thumb, but in 1324, King Edward II declared it to be equal to three grains of barley lined up end to end.
  • “Span” is the length of a spread hand measured from the tip of the little finger to the tip of the thumb. The Romans (and later the English) considered this to be roughly 9 inches, while the Greek span was only 7 inches, as they measured the thumb to the forefinger. That pales in comparison to some modern spans — for NBA legend Michael Jordan, a span is equal to 11.375 inches.
  • “Yard” initially referred to the width of a man’s waistline. The definition evolved in the 12th century, when King Henry I determined that a yard was the distance from his nose to his thumb when his arm was outstretched. As for the word “yard,” it’s derived from the Middle English yerd, meaning “stick” or “rod.”
  • “Handbreadth” is the width of the average hand, generally accepted to be anywhere between 2.5 and 4 inches. Its etymology is quite literal, as it refers to the breadth (or width) of a hand.
  • “Pace” is the length of one step, while “double pace” is a step with each foot. The word came about in the late 13th century, from the Old French pas, meaning “a step.”
  • “Cubit” is a body-related unit of measurement that may have originated in ancient Egypt around 3000 BCE. One cubit equals the space between the tip of one’s elbow and the tip of the middle finger, which is generally around 18 inches.
Measuring From Farm to Table

While the metric system is commonly used in Europe, the British also use a supplemental system of measurement known as the imperial system, which was adopted through the British Weights and Measures Act of 1824. It replaced the Winchester system, which was in use from about the 15th century. Imperial measurements were based on nature and everyday activities and, similar to ancient measurement systems, on the human body. As agriculture expanded in England, larger measurements were needed.

The imperial system established some new terms and units of measurement for area, in particular.

  • 1 thou = 1/1,000 of an inch
  • 1 barleycorn = 1/3 of an inch
  • 1 chain = 66 feet
  • 1 furlong = 10 chains
  • 1 league = 3 miles
  • 1 perch = 272.25 square feet
  • 1 rood = 40 perches
  • 1 acre = 4 roods
  • 1 square mile = 640 acres

While some of the etymological origins for these words are clear (e.g., “thou” being derived from “thousandth”), others require explanation. Take “furlong,” for instance — it comes from the Old English furlang, referring to the length of a furrow (trench) in a 10-acre field. “Barleycorn” comes from the aforementioned anecdote about King Edward II, who declared 1 inch equal to 3 grains of barley. “Perch” is derived from the Old French perche, meaning “unit of linear measurement,” equal to 5.5 yards. The imperial perch in the list above is the corresponding square unit: 5.5 yards is 16.5 feet, and 16.5 feet times 16.5 feet equals 272.25 square feet.
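The unit relationships in the list above chain together arithmetically. As a quick sanity check, here is a minimal Python sketch (the constant names are my own, chosen for illustration) that expresses each unit in feet or square feet and confirms the conversions are mutually consistent:

```python
# Imperial length units, expressed in feet.
INCH = 1 / 12              # 1 inch = 1/12 foot
THOU = INCH / 1000         # 1 thou = 1/1,000 inch
BARLEYCORN = INCH / 3      # 1 barleycorn = 1/3 inch
CHAIN = 66                 # 1 chain = 66 feet
FURLONG = 10 * CHAIN       # 1 furlong = 10 chains = 660 feet
MILE = 8 * FURLONG         # 1 mile = 8 furlongs = 5,280 feet
LEAGUE = 3 * MILE          # 1 league = 3 miles

# Imperial area units, expressed in square feet.
YARD = 3                   # 1 yard = 3 feet
PERCH = (5.5 * YARD) ** 2  # square perch: 16.5 ft x 16.5 ft = 272.25 sq ft
ROOD = 40 * PERCH          # 1 rood = 40 perches
ACRE = 4 * ROOD            # 1 acre = 4 roods
SQUARE_MILE = MILE ** 2    # 1 square mile, in square feet

print(PERCH)               # 272.25
print(ACRE)                # 43560.0
print(SQUARE_MILE / ACRE)  # 640.0 — matches "1 square mile = 640 acres"
```

Note how the 640-acre square mile emerges exactly from the chain and furlong definitions; the agricultural units and the statute mile were designed to mesh.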

How Long Is a Mile?

Before the metric system was adopted in most European countries, the mile had cognates in many languages. The Old English mil, akin to the Old Norse mila, became the English “mile.” The Germanic root milja led to the Dutch mijl, the Middle Dutch mile, the German meile, and the Old High German mila. Romance languages derived their forms from the Latin milia: French mille, Italian miglio, and Spanish milla.

While the words may have been (almost) the same, the actual distance was not standardized. In ancient Rome, a mile equaled 1,000 double paces — roughly 5,000 Roman feet, which equates to 4,860 modern feet. A medieval English mile measured 6,610 feet, and the Old London mile measured 5,000 feet. Under the reign of Queen Elizabeth I, a 1593 statute did a few things for measurements: It established a shorter length for a foot, set the length of a furlong to 660 feet, and set the length of a mile to 8 furlongs, or 5,280 feet, a definition that stands today.

Fun fact: In Middle English, a mile was also a measurement of time of about 20 minutes, which was roughly how long it took to walk a mile.

Featured image credit: Steven Lelham/ Unsplash
2 MIN READ

Why Do We Say ‘Take It With a Grain of Salt’?

Ever wonder why we say to “take it with a grain of salt”? This phrase traces back to ancient Rome — and may have once been quite literal.

by Tony Dunnell

When we tell someone to take information “with a grain of salt,” we’re recommending a healthy dose of skepticism — to not accept something at face value, or to have some doubt about a claim’s accuracy. But why salt, and why only a grain of it?

This idiom seems to have been around for so long that tracing its precise roots is complicated. But the leading theory goes all the way back to ancient Rome and Pliny the Elder’s encyclopedic 37-volume Naturalis Historia, published between 77 and 79 CE. Pliny recounts the story of how Roman general Gnaeus Pompeius found a poison antidote among the belongings of Mithridates VI, the ruler of the Hellenistic Kingdom of Pontus, following Mithridates’ defeat in 66 BCE. The instructions for the antidote, as described by Pliny, read as follows: “Take two dried walnuts, two figs, and twenty leaves of rue; pound them all together, with the addition of a grain of salt; if a person takes this mixture fasting, he will be proof against all poisons for that day.”

In Mithridates’ antidote, the grain of salt was quite literal — salt may have been included in the recipe due to the belief it could help neutralize poison, or simply because it would make the antidote more palatable. According to the theory, Pliny’s account of using salt to make poison ineffective became, over the centuries, a fitting metaphor for exercising caution when consuming questionable information. This theory isn’t beyond the realm of possibility, as Pliny’s Naturalis Historia has been studied for centuries — including during the 17th century, when the phrasing “with a grain of salt” reappeared.

According to the Oxford English Dictionary, one of the expression’s first known appearances in written English, in the sense of taking a statement with a certain amount of reserve, comes from John Trapp’s A commentary or exposition upon all the Epistles, which was published in 1647. Written examples of the idiom then became scarce for two centuries, before becoming far more frequent during the late 1800s and through the 20th century, by which time “taking it with a grain of salt” had become commonplace. It has lost Pliny’s literal connotation, yet it still stands as a guard against the poison of misinformation.  

5 MIN READ

The Rise of the ‘Linguistic Side-Eye’ — How To Convey Sarcasm Over Text

Sarcasm doesn’t always translate well over text — but that hasn’t stopped us from trying. Including emojis and alternating caps, here’s how giving a snarky side-eye is evolving in the digital age.

by Rachel Gresh

Sarcasm is a form of verbal irony in which the speaker’s intended meaning is opposite to their literal words — for example, when someone mumbles, “What a great day,” after just missing the bus. It can bring humor, but it can also sting, reminding us just how powerful words can be. The etymology of “sarcasm” offers insight into its nature. It was derived from the Greek sarkazein, meaning “to speak bitterly” or “to sneer” but literally translating to “to strip off the flesh (like dogs).” This origin certainly paints a vivid picture of the power of sarcasm.

In modern times, understanding sarcasm is more difficult than ever due to generational divides and the emergence of new forms of communication technology. But for all its challenges, sarcasm can be a nuanced form of expression used to enhance our daily connections.

Social Impacts of Textual Taunting

Verbal sarcasm plays an integral role in human communication. Our understanding of it begins at a young age — many children can identify sarcasm and use it by the time they reach kindergarten. In one study of telephone conversations in the U.S., the phrase “yeah, right” was used sarcastically almost one-quarter of the times it was said.

This form of sarcasm is relatively easy to pick up on, but when we lose indicators such as vocal tone, facial expressions, and body language, we miss out on context. In three separate studies, psychologists at Chatham University concluded that, in general, we’re bad at detecting and gauging the emotional tone of emails. Most surprisingly, the results were poor regardless of whether a friend or a stranger sent the message. So, the next time you’re about to write a snarky or teasing text, you may want to consider its possible interpretations before hitting “send.”

Another of Chatham University’s studies showed that readers could recognize a sender’s anger but couldn’t gauge its intensity. This nuance is crucial in sarcasm. While the sender might be only slightly annoyed, a reader could misinterpret the message as if the sender were furious. Say you tell a friend you lost their sweater and they text back, “That’s just great, thanks!” You have no way of knowing how angry they really are. 

While sarcasm can foster rapport among friends, family, and co-workers who share a common understanding, it can also alienate or confuse those unfamiliar with the tone or context of the conversation. When we dish out the linguistic version of side-eye, we’re hoping the reader grasps this nuance without the need for face-to-face communication.

Modern Methods To Signal Sarcasm

Sarcasm isn’t just about what you say; it’s how you say it. And despite its potential for misinterpretation, digital sarcasm is a natural result of people communicating in new ways.  There are tools to help prevent ambiguity in your sarcastic messages by adding clarity, humor, and a touch of snark. Experts refer to these mechanisms as “textual paralinguistic devices.” Emojis, excessive capitalization, typed laughter, repeated letters, and excessive punctuation are all elements of this modern, nonverbal communication style. Each of these can convey a variety of emotions or tones, including sarcasm.

Punctuation — including ellipses, quotation marks, exclamation points, and asterisks — plays a big role in this phenomenon. If you ask your friend, “How was the interview?” and they respond, “It went great…,” you might infer that it actually went poorly. The use of ellipses in this manner reveals a generational divide. Younger generations use them to convey confusion, passive aggression, sarcasm, or uncertainty, while older generations use them more traditionally to separate thoughts or simulate spoken pauses.

Capitalization and italics work overtime in sarcastic situations, too: “It went GREAT” could be excited or sarcastic, depending on what they say next (context is key with textual sarcasm). Still, alternating capitalization almost always delivers sarcasm, as in, “It went gReAt.” This style, intended to mimic a singsong or mocking tone of voice, is seen across social media posts and memes.

Emerging Irony: Emojis

The newest tool for displaying sarcasm is the emoji. Used across social media platforms and in text messages, these tiny icons can help clear up ambiguity and convey tone. For instance, adding the Face With Rolling Eyes 🙄 or the Unamused Face 😒 after a cryptic text can clarify the intended meaning: “The interview went great 🙄” or “I’m SO excited for the party 😒.” Similarly, the Eyes 👀 emoji, which often implies that something is foolish, can be used alongside sarcasm, as in, “They SAID they’d finish it on time 👀.”

Some of these digital expressions can be interpreted quite differently, depending on which generation you ask. Psychologists note that older generations tend to use intuitive and straightforward emojis to avoid confusion. In comparison, younger generations often use emojis to express complex emotions and social intentions, including sarcasm. As a result, older generations may feel puzzled (🤔) while trying to interpret these messages.

For instance, even a simple Grinning Face 😀 can indicate pure sarcasm from a Gen Zer. “I failed my final 😀” conveys the same sentiment as “I failed my final 😡.” Older generations might be confused as to why a person would smile over a bad grade, while younger generations see the angry face as too on the nose, opting instead for a sarcastic grin. Similarly, the Upside-Down Face 🙃 can add sarcastic humor to a difficult situation: “I ruined my new shoes in the rain 🙃.” Even a well-placed Thumbs Up Sign 👍 can deliver sarcasm.

While sarcasm remains a vital part of language, it’s essential to consider your audience when using it in text, especially when there is a generational gap. Used correctly, however, linguistic sarcasm can be one of the most creative forms of digital communication, rich in nuance and humor, providing a playful way to connect with others. 

Featured image credit: Ivan Pantic/ iStock
3 MIN READ

Why Is the Etymological Origin of ‘Dog’ a Mystery?

“Dog” is one of the simplest words in English. However, its origins are anything but. For centuries, “hound” ruled the lexicon, leaving linguists puzzled by how “dog” suddenly took over.

by Tony Dunnell

In the English language, there are few words simpler than “dog.” Succinct and monosyllabic, it’s a perfectly commonplace word for good boys, pooches, mutts, pups, and man’s best friend. Etymologically, however, “dog” is a mystery. About seven centuries ago, dogs were commonly known as hounds, a word that came from the Old English hund. But by around 1500 CE, “hound” had been largely replaced by the word “dog” — a surprising replacement that seemingly appeared out of the blue, with few earlier forms to even fully explain the word’s existence. 

One theory among linguists is that “dog” comes from the Old English word dox, which described a type of color or shade that could have been dark, golden, or yellow, all of which could apply to dogs. Another possible connection comes from the Old English word dugan, meaning “to be good,” “of use,” or “strong” — all of which, again, can be applied to our faithful four-legged friends. Then there’s the Old English docga, a rarely used word that may have been applied to a specific, strong breed of dog, possibly the mastiff. 

Adding to the lexicographical confusion is the fact that docga was used more often in early Middle English as a deprecatory or abusive term directed at people, with no dogs involved. This, however, could be one explanation for the rise of the word “dog.” According to linguist Colin Gorrie, it’s possible that “dog” — in its canine sense — began as a term for a particular, despised kind of dog. But then with time, the word “dog” lost its negative implication, was repurposed as a term of affection, and somehow stuck. By the 16th century, it had become commonplace and largely supplanted “hound,” which today is typically used to refer to specific breeds of hunting dogs. Versions of “dog” appeared in many European languages around this same time: dogue in French, dogge in Danish, and Dogge in German.  

All in all, there’s a lot of speculation and theorizing involved in the origin story of Fido, with not much hard evidence in the historical record, ensuring that the etymology of the word “dog” remains one of the great mysteries of English. And perhaps that should come as no surprise. After all, the connection between humans and dogs can be traced back some 11,000 years, to the end of the last ice age — a long way back for even the simplest of words.  

Featured image credit: walik/ iStock