
Today, we discuss writers and A.I. Keep in mind that this is an ever-evolving topic where the ins and outs change quickly, but here are a few things writers should know about A.I. in early 2024.

True A.I. attempts to create programs that can think. Right now, we don’t have any true artificial intelligence programs, though programmers are working toward that goal. According to Ted Chiang: “[We are] a long way off from being able to create a single human-equivalent A.I., let alone billions of them.”
But there are programs that people are calling artificial intelligence, even though they are something very different. The kind of programs we’re talking about (things like Midjourney for art or ChatGPT for text) aren’t trying to imitate thought. They aren’t trying to get computers to think like humans, or even to create like humans. Not really.
These programs today are basically compilers. In the words of author Jason Sanford: “[Machine-learning] programs train on and then generate new versions of what people have already created before.”
In effect, machine learning uses complex math to analyze massive amounts of data (written text in the case of programs like ChatGPT, images in the case of programs like Midjourney). Then, when given a prompt, the program draws on this pool of analyzed data to create an approximation of what a human would produce in response to the same prompt. The results can be difficult to tell apart from a piece produced by a human. Well, that's not quite true. An artist has no trouble spotting A.I.-created images, and a writer, editor, or well-read nonwriter will have no trouble spotting A.I.-created text.
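For the curious, the "analyze a pile of text, then approximate it" idea can be shown with a toy sketch. This is a drastic simplification for illustration only; real systems like ChatGPT use neural networks trained on billions of words, not a simple word-pair table, but the basic pattern is the same: the program only recombines what it was fed.

```python
import random
from collections import defaultdict

def train(text):
    """Analyze the training text: record which word follows which."""
    model = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, start, length=8, seed=0):
    """Produce new text by repeatedly picking a word that followed
    the previous one in the training data. No thought involved, just
    a statistical echo of what humans already wrote."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat slept on the rug"
model = train(corpus)
print(generate(model, "the"))
```

Every word the toy program "writes" was lifted from its training text; it can shuffle its sources but never step outside them, which is the point the article is making.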
A.I. makes mistakes that humans generally don't because humans interpret data differently. Humans think. Computers do not. As a result, A.I. art (at least early on) often included obvious errors like too many fingers or nostrils in odd places on faces. And A.I. text is usually stilted and error-ridden. But as developers tweak the programs, the output is getting better. Will it ever mirror the work of a human? Probably not, or not well anyway, for a variety of reasons.

That doesn't mean the creators of these programs aren't working to reduce the flaws, but they are dealing with computers. And with computers, everything is numbers. You can get the computer to ignore certain things if you can devise a way to do it, but the computer will only be following directions. It won't be thinking and evaluating. It isn't actual intelligence. It's a machine that has analyzed a mass of data so it can use complex math to predict what a person might create. Not a smart person. Not a talented person. It can't make those kinds of evaluations. It's simply going for "generic human," and even then, it falls short. A lot.
Another problem is that the massive amount of data needed to train A.I. had to come from somewhere. It was all created by humans. (This is pretty much true at this point, though if A.I. continues to train on things found on the Internet, it will eventually be training on A.I.-generated material along with human-generated material. That is hardly likely to produce good results.) The humans who created and own the rights to the art and writing were not asked before their materials were used to "train" the programs. Overall, artists and authors have not been pleased, and they have certainly not benefitted from being the unpaid teachers of these programs.
To quote a statement by the Writers Guild of America: “A.I. software does not create anything. It generates a regurgitation of what it’s fed. If it’s been fed both copyright-protected and public domain content, it cannot distinguish between the two. … To the contrary, plagiarism is a feature of the A.I. process.”
Do writers ever use A.I. in acceptable ways to help them create? It depends on how you define acceptable. In his ebook, Creativity in the Age of Machine Learning, Jason Sanford said he talked to an author who used Midjourney to help visualize scenes for stories. Another used ChatGPT when brainstorming ideas for stories. And another used A.I. to help create a synopsis of their book for submission. For some, each of these situations would fall under the heading of assistive tech for creators. For others, any use of A.I. is suspicious and unacceptable. Those with this viewpoint tend to believe that since the programs are also being used to create complete works that are then sold, or used by companies that might once have employed an artist or writer, the programs should be avoided by all creators.

Also, short stories and articles created by A.I. have flooded submission systems at magazines and newspapers. When editors don't have time to weed out these submissions, the logical result is to close submissions entirely. That certainly doesn't help living, human authors, but magazines and newspapers are often on the edge financially already. They cannot afford the extra staff needed to screen out this inappropriate material.
An interesting side note on the flood of A.I. stories: editors say they have no problem spotting stories created by machine-learning programs. This isn't shocking. People whose jobs depend on evaluating good writing are going to spot the things that make machine-compiled stories awkward and unusable. These kinds of machine-learning programs tend to produce material with awkward language and serious flaws. The producers of these programs know it, though they use romantic terms for these errors, saying the programs "hallucinate" or suffer from "delusions." This is simply another way to say the materials they produce are flawed and should not be accepted at face value.

Right now, it’s difficult to foresee what will happen with these machine-learning programs. Many writers worry about what these programs will mean for their livelihood, especially if the serious flaws in the programs are eventually ironed out. And the legality of the way machine-learning was trained is being tested in the courts.
Right now, public opinion is mixed on whether these programs are a plus or a minus for society. In the meantime, it will be interesting to watch the proceedings. But don’t be sucked in by companies that want you to think of these programs as alive or creative or aware. They’re programs. They don’t dream or hallucinate, though sometimes they’re glitchy, as you might expect with complex programs.
They are meant to be a tool, but the jury is still out as to whether this tool will be a good thing or not. Either way, it’s likely to be something we’re still talking about for the next few years as the software continues to change and improve.
With over 100 books in publication, Jan Fields writes both chapter books for children and mystery novels for adults. She's also known for a variety of experiences teaching writing, from one-session SCBWI events to lengthier Highlights Foundation workshops to these blog posts for the Institute of Children's Literature. As a former ICL instructor, Jan enjoys equipping writers for success in whatever way she can.

1000 N. West Street #1200, Wilmington, DE 19801
© 2024 Direct Learning Systems, Inc. All rights reserved.
2 Comments
Thanks, Jan. As always an excellent article.
I've already had a client move to using software to compile drafts based on client interviews. Authors then edit the drafts for a lower fee than they received before we had this tool.