
How Writers Should Be Using AI Right Now

AI Writers Are Scary Good

Artificial intelligence is having a moment right now, especially if you are a professional creator, and especially if you are a writer. In this article, I want to propose one way that authors can use AI writers ethically, saving time and money while producing more writing: by using AI writers as research assistants.

If you have tried any of the artificial intelligence engines such as ChatGPT, Notion AI, or Lex, you know what all the fuss is about. The technology is jaw-dropping in its ability. You can ask ChatGPT to write a story for you, and it will. You can ask it to outline a book for you, and it will. You can ask it to make a spreadsheet for you, and it will. You can ask it to write functional Python or JavaScript code, and it will do that too.

If you’ve spent your professional career as a writer, wrangling words on a page to express meaning, the ease with which these AI writers can produce coherent prose can seem like a dark and enticing magic.

The technology holds huge promise, and it raises many pressing questions about ethics. The implications for education are staggering. If a university student can produce a convincing essay on any topic in under thirty seconds, and the result can evade plagiarism detection, how can professors hope to police learning? How will the writing profession absorb, react to, and develop alongside AI writers as they proliferate?

For writers, it feels like something fundamental has changed.

And this is only the tip of the iceberg. We will look back on these early examples and consider them but a drop in a far vaster ocean, one beyond what we can imagine now.

One natural reaction to the emergence of AI writers is despair: why continue writing or creating, if AI can do it faster, more accurately, and better than you? Another, opposite reaction is greed: greed at the promise of getting something for nothing. Imagine ChatGPT content farms that churn out search-engine-optimized prose that attracts clicks and subscriptions and harvests ad money at a volume previously unimaginable. Surely, they already exist.

But how can we take a more nuanced view? If history is any tutor, AI writers will find a place in our ecosystems just like all other technology has.

ChatGPT is not a savior, nor should it be burned at the stake as a heretic. It's a tool. A tool, just like the typewriter was a tool. Like the encyclopedia, personal computers, Google search, and Wikipedia are tools. Is using a thesaurus cheating? AI writers are a tool that we can use ethically, to our advantage, to make us more human and creative.

To start, I propose that a good metaphor for the AI writer is the research assistant.

Using AI Writers as Research Assistants

What is a research assistant? Let’s ask ChatGPT:

Q: In one sentence, what is a research assistant for an author?

Here is the response I received:

“A research assistant is a person who helps an author with research for a book, article, or other written work by gathering and organizing information, conducting interviews, reviewing and synthesizing literature, and completing other tasks to support the author's research process.”

So a research assistant is valuable because they do lots of time-consuming legwork, which saves the author time and energy. Often, a research assistant does non-creative tasks, such as compiling statistics (how many books has Salman Rushdie written, what are their titles, when were they released, and who published them?), finding examples of a theme (who were the most prominent feminists during the first year of World War I?), or summarizing large chunks of literature on a topic (what are the most influential books on the creative process written in the 1960s?).

Answering these questions is pretty straightforward, as long as you have strong research skills, attention to detail, and a good measure of persistence. Oh, and a whole lot of time. These questions could be answered by a research assistant doing part-time work in a week or two.

They are also questions that ChatGPT can answer in under 90 seconds (74 seconds, to be precise; I timed it). And while the answers aren’t perfect, they are actionable. And that’s all an author could ask of a research assistant’s work anyway: just enough direction for the author to follow up on from there.

Having a research assistant is a status symbol for an author, and something of a dream for many writers. I have always listened to other authors talk about their research assistants with a bit of envy.

And being a research assistant can be great training for a future writing career. Ryan Holiday often mentions as part of his origin story that he dropped out of college to apprentice as a research assistant under the author Robert Greene, of The 48 Laws of Power fame. There is something of an aura around research assistants.

But research assistants are scarce, and most authors don’t have them. This is probably because there is only a meager market for research assistants—most people who have the skills needed want to be creators themselves (low supply), and most authors who need research assistants don’t have the money to pay for one (low demand).

I’ve often wondered how much better and more productive authors could be if we all had research assistants. A couple of months ago, that was an unimaginable dream. Now, with AI writers, it doesn’t seem so far-fetched.

Using AI Writers Ethically

The prospect of AI writers becoming widely used in writing-based industries such as the news media, law, scholarship, and business communication makes me extremely uncomfortable. The research assistant metaphor is also useful for discerning where to draw the line between using a tool and committing plagiarism. Here are some principles to keep in mind as you use AI writers in your own work:

Don’t “plagiarize” the AI writer. In the same way an author wouldn’t use passages written by a research assistant wholesale and without modification, writers shouldn’t use text written solely by AI.

Use AI writers for traditional research assistant tasks. The research assistant fills in dates and figures, summarizes texts, and gathers information on entire subject areas and historical examples.

Use AI to cut through the noise. Using an AI writer can help you, the author, home in on examples—books, historical figures, theories—that are relevant to the thesis you are trying to support. You can use an AI writer to find these examples, then follow up on them with your own research.

Stay in close proximity to your expertise. It’s better to use AI as a research assistant in areas where you have some expertise. Ask about fields that you understand, or that are tangential to your own, so that you can judge the quality of the answers. (Don’t ask for theoretical math principles if you have no experience with them.)

You are always responsible. Human research assistants are fallible. They make mistakes. And the same is true of AI writers. You have to fact-check and double-check their work.

Understand the nature of your results. AI will tell you what is already known. By corralling and communicating existing knowledge so easily, AI writers may lead to an increased emphasis on truly novel research: adding knowledge that is not currently part of the consensus.

AI writers are just getting started, and it seems inevitable that they will change the nature of the writing profession in fundamental ways. Using AI writers as research assistants is one way that professional writers can use this technology ethically to support and extend their output.

What do you think?

Is using an AI writer as a research assistant ethical? Are there other uses for AI writers that you have uncovered? Share your thoughts in the comments.

Mark Samples