12 Comments
James Wang:

One general comment I'll make myself—I think one thing I didn't cover in the piece is that the problem with AI writing, when done poorly, is asymmetry. Specifically, expecting the reader to give you time and attention... when you didn't spend much yourself. Ultimately, this is what characterizes "AI slop."

Utilizing AI in writing, when done poorly, does this. This is still new, and I'm experimenting. But I suspect that when done well, it should be indistinguishable or better than writing without AI. Why? You don't really do anything different in the craft—at least not for anything you actually release to the world. And, if anything, it does the "low differentiation" parts of linking news articles or cross-checking stats, freeing you up to spend more time on what actually makes your writing yours. For me, it certainly doesn't "save time" I need to spend on a piece. It mainly reallocates how I spend it.

Shwetank Kumar:

One thing worth adding on the asymmetry is that it runs in the reader's head before they finish the first paragraph. Readers know AI can generate fluent prose, so they pre-filter — deciding whether you spent real effort before they invest theirs. Writers using AI well end up paying a trust tax generated by writers using it badly.

Which reframes "indistinguishable or better." The bar isn't output quality anymore, it's visible signals of effort — specificity, a point of view that couldn't have come from a generic prompt. AI makes the generic parts of writing cheap, which puts more weight on the parts only you can produce.

Devansh:

I think part of the distrust for AI comes from an association of input with output: I spent 10 hours on this, so it must be good.

Lacking ways to measure quality, we use proxies, many of which are effort-related. This is where AI pushes back, changing our value calculations.

Brent Naseath:

So essentially you are saying that AI is ghostwriting your articles and you are reviewing them and taking credit for it like everybody else does so it's okay? When I see books supposedly written by famous people that are obviously written by ghostwriters, even if the famous person gives them some information and tells them the topic, I have no respect for the famous person at all. It actually hurts my relationship with them, i.e. their brand. But that's just me.

I've also used AI to see if I can get better quality writing. I write the chapter and have AI review it. I've tried it several times with all four major AIs. It often changes the meaning of what I'm saying and it introduces AI speak. So, even though some of it was more eloquent, I chose to be authentic instead. But that's just me.

I think the basis of people reading the work and following it is the relationship you have with them which is built through your writing. So essentially, the relationship is now with AI when AI does the writing. I might as well just take the intro and plot of a book and ask AI to write a book for me based on the same and read that book and skip the first that's for sale. Right?

James Wang:

Not really. Ghostwriting is much more “end-to-end.” That, in my opinion, is truly not authorship—it’s someone slapping their name on the work at the end. I think it’s pretty hard to argue in either my case or Megan McArdle’s case that authorship is in question. It’s not to say that some people’s use of AI doesn’t straight up veer into “ghostwriting,” but the point I’m making here is more that it isn’t that different from the normal editorial process with humans in the loop.

What I do (even putting aside this particular article, which, as said, was actually written from beginning to end with good ol’ fashioned typing and zero expansion anyway), I’d personally characterize as similar to having an analyst expand my (quite extensive) outline into prose…

And then largely changing a bunch of things to make the prose my own.

In cases of more… viewpoint based articles, it’s usually a quite extensive revamp. In the case of more “here’s some economic data and what I think,” it’s usually much less extensive. If you look at the fully open process I share in the link where I went through it, you’ll see the draft’s feel/tone pretty radically shifts from my edits. And then I often go and change/shift things multiple times.

Regardless, I think I’m mainly trying to make the point that the tools don’t make the writing what it is—it’s the decision of what and how you craft the writing.

I think perhaps I should have gotten into “AI slop” in this article (I did when I talked about this topic in my book), but putting aside the obviously bad “close my eyes and just ship what the AI writes” approach, what you decide to ship is what matters. You mention not taking AI’s “better” suggestions in order to be authentic. I think that means the AI suggestions aren’t better! The entire point of the craft of writing is choosing how to express yourself, regardless of whatever technical finesse there is in it or not.

I never allow something to get posted if it isn’t written in the way I’d write it… which likely means the AI, no matter how much I give it feedback over time, will likely never have a “zero changes” article. Because even if it perfectly emulates me (which it doesn’t), my prose will probably change based on simply mood or feel at the time I write. But that’s all right and is part of what it means to write something.

Brent Naseath:

Well, it must be working because I've always taken your articles as your voice and authentic. I'm not sure you've said anything in your comments that wasn't in your article but the discussion helped clarify the topic for me. So thank you for the effort.

James Wang:

Of course. I think it’s a really important conversation and I appreciate your push and engagement here as always. It helped draw more things out and clarify things I should have covered in more depth.

I view my responsibility this way: I’m going to push the boundaries—because if AI really is going to replace thinking/writing/everything in a serious way… I want to be first to know and then let everyone know what’s coming.

As it stands, it remains a useful tool, but still merely a tool. It’s no more capable of replacing a human author’s craft than Microsoft Word—even if it does a better facsimile at first glance. It’s still quite obvious something is missing when a human is absent or delinquent (well, aside from rote reports or “business writing,” which already are missing “soul,” human or AI).

Bill Bishop:

Thank you—illuminating and useful post. Such a smart way to leverage AI.

Geoff Campbell:

The mechanics of writing are pretty simple.

What would AI contribute to the learning process?

James Wang:

It doesn't. Specifically, all of the hardest things in writing... it doesn't help with at all. How should I think about the topic? What is an interesting way to construct the argument/metaphor/etc.? How do I make this flow better?

Mechanics-wise, though... In non-fiction/technical writing, there are many specific things you need to do that don't add huge value, but you just need to do them. Linking the specific sources (even if they're just news articles). Pulling specific examples from a text you know you want to grab them from. Grabbing and downloading specific charts. Ensuring that specific numbers are right, versus "roughly right" that I know off the top of my head.

This isn't "writing" per se. It's NOT hard. Which, to your point, is pretty simple... but has to be done. THOSE are the things I think AI can help the most with, but they're also things that certain writers have people to help with as well—research assistants, editors (of various capacities), copyeditors, fact-checkers, etc. It doesn't help with the "core" of writing, whatever that is.

Geoff Campbell:

Yeah, sorry. I was imagining kids in school learning how to create a paragraph and present an idea or argument. The AI I know is a fabulous search tool but not a creator. It's quite the production to put together a professional paper!

James Wang:

Oh, that. Yeah, I'm not certain. A vague, nagging worry I've had while writing this piece is that some students/younger people still developing their skills might use AI to "opt out" of learning how to write. I've mostly said in my articles that I don't think it'll be a systematic issue, but it could be for certain individuals.

For how it could benefit... Just thinking out loud, AI is poor at creating novel arguments itself, but it would likely be able to give fast feedback/critiques if students want to iterate on their writing. It's a different "skill" to be able to use rules to evaluate whether or not an argument is good versus creating one from scratch.