Artificial intelligence has grown by leaps and bounds over the past few years – even over the past few months. It’s now possible to generate convincingly human text with just a simple prompt telling an AI what to spit out. Some have even advocated the practice of using AI to write blog posts and other content, greatly speeding up the process and freeing up humans for other tasks.

While that’s not a practice that I can support – for a variety of reasons – it’s led many to wonder if artificial intelligence could one day replace human writers altogether. And I have to admit, the thought has crossed my mind.

Fortunately, I don’t think it’s going to happen – at least not for now and not in the way some think.

How do AI writers “learn”?

Despite what some might claim, AI writing assistants like Jasper don’t actually think. Instead, they’ve been trained with buckets of data, having “read” gigabytes of text from real, human authors in an effort to sound human.

These tools know how to sound like many famous authors – both present and past – and can make true-sounding claims based on data they’ve collected from other sources. While that may sound impressive, this isn’t learning in the sense that you and I learn.

When we – humans – learn, we take in information, some of which is true and some of which is false, and we form our own thoughts and opinions. We can formulate our own new material based on the information we’ve taken in instead of rearranging and rewording content from others.

Think of the sum of all the data used to train the models behind AI writing assistants as a pile of building blocks. What these tools do is rearrange and place those blocks in whatever configuration they predict is most likely correct for a given prompt. Let me say that again: even the best AI writer can only take previously written content and rearrange it in the way it predicts is most likely correct based on a prompt.

In other words, they can build a tower with blocks they already have, but they can’t create new blocks like you and I can.
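To make the building-block analogy concrete, here’s a minimal sketch of a toy next-word predictor in Python. (This is purely illustrative and vastly simpler than a real model like the one behind Jasper; the corpus and function names are my own invention.) Notice that every word it emits already existed in its training text – it rearranges blocks, but never creates a new one:

```python
import random
from collections import defaultdict

# Toy "training corpus" -- the only blocks the model will ever have.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Build a bigram table: for each word, record which words have followed it.
followers = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current].append(nxt)

def generate(prompt_word, length=5, seed=0):
    """Predict a continuation by rearranging previously seen words."""
    rng = random.Random(seed)
    word, output = prompt_word, [prompt_word]
    for _ in range(length):
        options = followers.get(word)
        if not options:  # nothing ever followed this word in training
            break
        word = rng.choice(options)
        output.append(word)
    return " ".join(output)

print(generate("the"))
```

However long you let it run, the output vocabulary is bounded by the training data: no prompt can ever coax a word out of it that it hasn’t already “read.”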

At least not yet.

Writing with AI is like a digital game of Telephone.

I really love the analogy that fellow Medium writer William Egan uses in his article, Why AI Can’t Replace Writers. He writes:

“Like a long round of The Telephone Game, each AI-generated re-write decreases the chance that the article is factually accurate. What begins as factual accurate information will become more distorted with each iteration, and eventually, it will become impossible to separate fact from AI-generated error.”

Even if an AI writing tool generates content that sounds enough like a human to publish, it may not be factually correct. It might be relying on incorrect information from another source – meaning your newly written content is also incorrect.

This presents a real problem for anyone hoping to use AI to spit out huge chunks of text, like blog posts, while relaying factual information that provides value to a potential audience.
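You can even put rough numbers on the Telephone Game effect. Assuming (hypothetically) that each AI rewrite independently preserves any given fact with some probability p, the chance a fact survives n rewrites is p to the nth power – and that shrinks fast:

```python
# Illustrative only: assume each AI rewrite independently preserves
# a given fact with probability p. After n rewrites, the fact survives
# with probability p ** n.
def survival(p: float, n: int) -> float:
    return p ** n

for n in (1, 5, 10, 20):
    print(f"after {n:2d} rewrites: {survival(0.95, n):.1%} of facts intact")
```

Even a generous 95% per-rewrite accuracy leaves well under half of the original facts intact after twenty iterations – exactly the distortion William describes.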

William gives an example in which he asked Jasper “What is Digital Nursing?” The answer he received sounds close enough to something a human would write, but when you look closely, you can spot a glaring issue: The AI completely misunderstood what digital nursing is and then proceeded to “write” several paragraphs based on that faulty information.

You can imagine how someone who isn’t as familiar with the subject as William is might have relied on Jasper’s output blindly, only further propagating false information.

And here’s the real kicker: If someone “writes” something using AI and that content is later used to train AI further, isn’t that just compounding the issue?

Google (rightly) doesn’t like content written with AI.

The undisputed king of search, Google, clearly doesn’t like this trend of using AI to write website content. And as someone who happens to be a flesh and blood human (at least I think so), I’m glad they’ve taken the stance they have.

On August 18, 2022, Google published More content by people, for people in Search.

While Danny Sullivan, the author of the post, never comes out and says “artificial intelligence” or “AI,” you can read between the lines:

“Many of us have experienced the frustration of visiting a web page that seems like it has what we’re looking for, but doesn’t live up to our expectations. The content might not have the insights you want, or it may not even seem like it was created for, or even by, a person.” (Emphasis added)

He goes on to say:

“We know people don’t find content helpful if it seems like it was designed to attract clicks rather than inform readers. So starting next week for English users globally, we’re rolling out a series of improvements to Search to make it easier for people to find helpful content made by, and for, people.” (Emphasis added)

What’s not a person? An AI writer. An artificial intelligence writing assistant can churn out content that sounds human, but human it is not.

Google wants to prioritize content that’s been created by real people – real, live, flesh and blood people – not a machine. That’s why AI content doesn’t rank as well in search – nor should it.

As William Egan says in his Medium post, “Google isn’t naming names, but the message is very clear.”

Indeed it is.

AI writers can only simulate opinions and emotion.

However human an AI may sound, it can only ever simulate human thought and feeling. As I mentioned above, AI can’t take in information, think about it, form an opinion, and express feelings. Sure, it might be able to string together the words “in my opinion” or “I feel…” but it’s just not the same.

I can’t speak for anyone else, of course, but I know that I don’t care what a machine “thinks” about an issue. I want to know what a real, flesh and blood human being thinks. Would I want to read or listen to a sermon at church written by an AI pastor? Do I care about what Jasper thinks about current events? Can a machine-written piece about mental health really offer support from someone who has experienced the same thing?

To all of these questions, I answer with a resounding, “NO.”

There’s just something unique about reading the words of another human. It is a profoundly lonely thing to read only the words of a computer – words that didn’t come from the mind of another real intelligence.

After all, there’s a reason that AI stands for artificial intelligence.

As a human, you are able to read a news story, think about it, form your own opinions, and support your opinion with facts. You’re able to feel based on information you’ve received and, as you feel, create content that’s informed by true human emotion.

A machine can’t do that.

How should writers today use AI tools?

Much to the chagrin of some, I detest the suggestion that artificial intelligence should be used to spit out full blog posts and articles. I think that’s a lazy practice that cheapens the writing profession for everyone and makes a mockery of those who actually put effort into crafting content that provides value. In fact, I think there are some dangers to consider when using AI to write articles.

But that doesn’t mean that artificial intelligence is bad or that it has no place in an ethical writer’s repertoire. On the contrary, I firmly believe that AI can be a huge asset to those who want to write their own content. I’ve enjoyed using Jasper, but there are plenty of great alternatives out there – with more on the horizon.

I’ve found AI very helpful for:

  • Coming up with content topic ideas
  • Planning content outlines
  • Rewording phrases and sentences in a way that flows better
  • Doing very preliminary research and fact-finding that will be verified by a human before publication

It’s been a great help to me for overcoming writer’s block and planning out my content. It can make suggestions that I either run with or disregard. (And I do reject a fair bit of what it suggests.)

Whether you’re creating website content or writing for Medium, AI might be helpful. But it should never take the place of actually writing your own content. Please don’t buy the lie that it’s okay to have AI generate posts for you, removing the need to do your own writing. Some well-meaning people suggest that, but I don’t think it’s a good plan.

Even in my limited experience using AI, I’ve found it to be a powerful tool to assist with my writing, but it can’t replace me. It can’t come up with new, original thoughts. All it can do is take previous building blocks handed to it – either by a prompt or through training with other, previously-written content – and rearrange them into something that sounds new, even if it isn’t really.

AI might be able to suggest I write a post on a certain topic, but I don’t trust it to write that post for me, because it might be working with incorrect information and simply regurgitating it. That’s no way to provide value to my readers. And if you’re doing that, I don’t want to read your AI-written content. Period.

How will AI impact writing in the future?

Many people have taken it upon themselves to speculate and make predictions about the future and what it will look like with artificial intelligence around every corner. I won’t say I’m not concerned. I do think that AI will have an impact on jobs, but just what kind of impact and how big… well, that’s yet to be determined.

I used to think that AI writing assistants meant that in the future, human copywriters and editors would be out of a job – or at least, less employable. But as I’ve come to understand more about how these tools work and just what they can (and can’t) do, I’ve come to the position that AI will be a powerful tool for content creators, not the means of mass unemployment – at least… not in the way I once thought.

Some jobs may be in less demand due to the advent of AI content creation tools. When AI can block out in seconds content that would take a human hours, why pay the human – right? But because AI can only remix information it has previously “learned” to generate what it predicts is correct based on a prompt – and do so without real human thought or emotion – there will still be a place for humans.

On the other hand, these new tools could, conceivably, power a massive growth in the content creation marketplace.

With AI by our sides to help us get past writer’s block, how much more content could we produce? And better content at that, in theory.

I believe writers of the future will be expected to be familiar with, or even use, tools like Jasper and its many alternatives. But I can’t imagine that anyone familiar with these tools could truly think they could replace real human beings: people who can think, genuinely evaluate information, relay it in an understandable way, and build lasting connections with the people on the other side of the screen.

Much could be said about this, though I suspect about as much would be speculation. The reality is no one knows for sure just how much artificial intelligence will impact writers or those in other creative industries. But I think we can say that no matter how much AI advances, it will never be truly human. No matter how much it spins and reworks prior content into something it calls “new” (but isn’t really), it can never create what’s most important about human-generated content.