A few days ago, a friend of mine posted an article on Facebook concerning the use of the word ‘literally’ in the increasingly popular figurative sense. The article has unfortunately since disappeared, but the gist of it was that most dictionaries have now appended this alternate meaning to the word’s definition, and that this usage may have originated with, or was at least first recorded in, Frances Brooke’s The History of Emily Montague (published in 1769).
Naturally, this sparked a conversation on whether or not this sort of language development is acceptable. Prior to encountering this, I had no idea there was even a movement to gain legitimacy for this non-literal use of ‘literally’, as it’s kind of the butt of diction jokes everywhere, but there are apparently many who feel that rejecting it (or any other semantic shift) amounts to needless linguistic authoritarianism.
My own take on the matter was as follows:
The evolution of language and words is a natural, inevitable thing, and in the general case it is something to be embraced. That said, stability is a necessity of language’s functionality; thus, any given modification cannot be assumed to possess intrinsic merit.
Language is a tool of communication, of which clarity is an important aspect, and I should assert that preserving its ability to convey meaning is not an unworthy goal, particularly in an instance such as this wherein the suggested secondary interpretation of a term, when used in the same context, implies something strictly antithetical to what the accepted definition would. This dilution of precision, while admittedly neither entirely untenable nor without precedent, is nevertheless customarily unfavorable.
(That the word was used in this sense a few centuries ago scarcely argues its virtue – words have been used improperly since words were first words, and most such misuses do not incur a change in their respective societal perceptions!)
But, I’ll concede to being guilty of a little linguistic snobbery. People will say what they will, and language will be thus, regardless of what may or may not be in its own best interest. (>^-‘)>
I’m curious to hear where others lie on the issue, though. Any thoughts to add?
I saw that screencap of the new definition floating around social media. While I’m all for the fluidity of language, this is upsetting in the sense that the use of the word has been essentially reversed. It’s no longer just a matter of diction or nuance: from an etymological standpoint, the word has a long history of meaning what it means. Plus, there is already a word for what “literally” has become.
It’s the principle that upsets me more than anything: do something the wrong way enough times and eventually it will become accepted. This is a poor precedent to set. I know it’s a losing battle, though. The same has happened to much more beautiful languages throughout history. English was never the most sophisticated beast from a linguistic standpoint, but it’s still frustrating to see things like this slide.
I’m…just going to see myself out before I get on a soapbox about Greek and Latin…
Great post. :)
That’s essentially my concern in this instance – if a word can mean its own opposite in any given scenario, it effectively means nothing at all. If I were to literally eat a horse out of hunger, I’d like to be able to say so without being assumed to have meant that I did not, in fact, eat a horse at all. (>^-‘)>
The “wrong so long it’s right” phenomenon has indeed claimed many victims. I still cringe every time I hear ‘irregardless’ or see bastardized clitics in the form of “Charles’ apple” or some such!
Exactly. If “literally” can mean “figuratively”, then what word do I use when I mean “literally”? Really and truly? In actuality?
Maybe literally-literally, to emphasize that you literally mean the word ‘literally’ literally? Can ‘literally’ be an escape word to denote that what follows (including itself) is to be taken at face-value? Then again, isn’t it supposed to, anyway?
…I guess that’s the problem. (>^-‘)>
I have always watched the changes that have taken place in the language with a deep suspicion. Working in news, I saw many young reporters make up a word, or use one they had heard, rather than finding the right word. Yes, the magnificent thing about English is that it is fluid and ever-evolving, but condoning a misuse because it has become the norm is not evolution but rather the reverse.
Indeed. One should not mistake the benefit of the ability to change with the notion that every change is a positive one.
It is hardly the first word to become its own antonym; it is in fact listed on Wikipedia’s [auto-antonym page](http://en.wikipedia.org/wiki/Auto-antonym).
That said, just because auto-antonyms are a common and accepted part of the language, that doesn’t mean I have to like it when I see a perfectly useful word join their ranks.
Which is precisely why I said “not without precedent, but nevertheless customarily unfavorable”. (>^-‘)>
I am a word snob as well. You can’t have the same word with two conflicting meanings.
It makes it needlessly difficult to express the intended meaning!