In September 1982, a handful of researchers at Carnegie Mellon University got into a multi-day discussion about the behavior of elevators in free fall, conducted over the ARPANET, the proto-internet network established by the Department of Defense that hosted several forum-like bulletin boards. It involved several physics questions. What would happen if someone left a lit candle in there? A small puddle of mercury? A helium balloon? Let’s say several two-pound pigeons flew the coop straight into the cable-cut elevator—would they fly around in panic? If they sucked up the helium from the balloon, would their squawks reach whistle range?
The answers flitted between sincerity and sarcasm, theoretical physics and the empirical fact that every poster shared the same university building. “Because of a recent physics experiment, the leftmost elevator has been contaminated with mercury,” one researcher joked. “There is also some slight fire damage.” For some, the gag did not land. A somewhat self-serious administrator followed up to clarify that the elevators were fine, that mercury spills were dangerous, that yelling “fire” in a crowded chat room was bad news. “The reaction was: yeah, OK, Rudy,” said Carnegie Mellon Professor Emeritus Scott Fahlman, the man credited with inventing the emoticon.
To cure Rudy’s terminal seriousness, the researchers proposed a system for flagging jokes. One suggested an asterisk (*) in the subject line of sarcastic messages. Another insisted the percent sign (%) was a superior symbol. A third said an ampersand (&) resembled a “jolly fat man in convulsions of laughter.” A fourth maintained the hash (#) looked like two lips with teeth showing. (“This is the expected result,” he added, “if someone actually laughs their head off.”) It was Fahlman, then a junior faculty member, who guided the falling elevator debacle to the ground. Mid-morning on Sept. 19, 1982, he wrote:
I propose that the following character sequence for joke markers:

:-)

Read it sideways. Actually, it is probably more economical to mark things that are NOT jokes, given current trends. For this, use

:-(
Almost 38 years after Fahlman’s message, Friday marks the seventh “World Emoji Day.” The informal anniversary held each July 17 (a nod to the date displayed on the calendar icon) began in 2014, when Emojipedia founder Jeremy Burge noticed every week seemed to involve some “fun, dumb Twitter holiday.” Billed as a “global celebration” of emoji—where participants vote in the World Emoji Awards, attend Facebook emoji events, or sing the #WorldEmojiDay anthem—the holiday is mostly used for hashtag campaigns, brand announcements, and corporate promotions, somewhat to Burge’s dismay. But it is also a quiet gauge for how widely the familiar icons have proliferated in pop culture.
The first emoji arrived in 1997, when designers at SoftBank, the Japanese cellphone carrier, released a set of 90 coded images, many of which resembled those available on smartphones today. (Others have contended that the set of 176 emoji Shigetaka Kurita released at NTT DoCoMo marked the first; that collection was acquired by New York’s Museum of Modern Art in 2016.) Much like the members of Fahlman’s bulletin board, these initial versions had trouble communicating. They were all encoded differently. A heart on one device might appear as scrambled numbers on another. It wasn’t until 2010, after several Google employees petitioned the Unicode Consortium, the nonprofit that standardizes character encoding across devices, that a shared digital language for emoji emerged.
The origins of emoji and emoticons have no obvious relationship. The similarity of their names is pure coincidence. Emoji comes from the Japanese e (picture) and moji (character); emoticon, from the English blend of emotion and icon. “I think the idea of trying to put a face or a tone with your text is a lot older than any kind of computer-era way to implement it,” Burge said. “There’s been earlier examples of typewriters and other books that have stickers in them, and the like.”
But it is likewise true that Fahlman’s initial irony hack—and that of a rival invention claim from a man named Kevin McKenzie—spread as the ARPANET grew to include more universities and later gave way to private networks, before its decommission in 1990. “There were only 12 or 15 universities on the ARPANET at that time. That was the edge of the universe,” Fahlman said of his initial post. “Once it had infected all those places there was nowhere else it could spread. But that very year, the ARPANET, which had only been D.O.D. and universities, got turned over to civilian control. Then, there were 100 universities. It was like these sailing ships discovering new islands, and suddenly all these islands have rats.”
In 1991, when The New Hacker’s Dictionary—an update of the informal programmer glossary known as the Jargon File—came out, it had an entry on emoticons, citing Fahlman. “Note for the newbie,” the definition added. “Overuse of the smiley is a mark of loserhood! More than one per paragraph is a fairly sure sign that you’ve gone over the line.”
Much like emoji debates of recent years, the emoticon became a battleground in the larger conversation of how to communicate online. “When it caught on there were mixed feelings,” Fahlman said. “Originally, I’d say 80 percent thought it was cool, and 20 percent thought it was a certain kind of slime—a perversion of civilized communication.”
Few people had stronger opinions on these late additions to the lexicon than Neal Stephenson, the science-fiction author whose novels, Snow Crash and Cryptonomicon, became cyberpunk classics. In a 1993 edition of The New Republic, Stephenson railed against the perceived need to mark sarcasm in online communication. “Irony, it seems, is like nitroglycerin,” he wrote, “too tricky to be good for much, and so best left in the hands of fanatics or trained professionals.” Stephenson later recanted his complaints. In 2003, the author noted on his website that, upon rereading both Fahlman’s and his own writing on the subject, “I end up agreeing with Fahlman, and thinking that this Stephenson kid must be living in some kind of fantasy world.”