Axios got traction because it heavily condensed news into more scannable content for the Twitter, Insta, TikTok crowd.
So AI is this on massive steroids. It's unsettling, but it seems there's a recurring need to point out that, across the board, many "it's because of AI" things were already happening. "Post truth" is the one I'm most interested in.
AI condenses it all on a surreal and unsettling timeline. But humans are still humans.
And to me, that means I will continue to seek out and pay for good writing like The Atlantic. By the way, I've enjoyed listening to articles via their auto-generated NOA AI voice feature.
Additionally, not all writing serves the same purpose. The article makes sweeping claims about "all of writing". That gets clicks, I guess, but most of what people read serves some immediate, functional need: work, some indirect way to make money, some hack, some fast-forwarding to "the point". No wonder AI is taking over that job.
And then there's creative expression and connection. And yes, I know AI is taking over the creative industries too. What I'm saying is that we've always separated "the masses" from those who "appreciate real art".
Same story.
Same. The New Yorker is the other magazine I subscribe to.
Until three weeks ago I had a high-cortisol morning read: NYT, WSJ, Axios, Politico. Then I went on a weeklong camping trip with no phone, and I haven't logged into any of them since. It's fine.
People think I'm nuts when I tell them I ditched subscriptions for those sites and only check them maybe once a week, if that.
But what you said is 100% true, it's fine. When things in your life provide net negative value it's in your best interest to ditch them.
I agree with this in general, but with caveats. For example, I think reading national news every day sucks. But depending on your demographic, it can be useful to stay current on specific issues: if you're a gun owner, you'll probably want to keep up with gun licensing in your area, and if you're a trans person, it's important nowadays to be aware of laws being passed that dictate which bathroom you can legally use.
> "Post truth" is one I'm most interested in.
I have this theory that the post-truth era began with the invention of the printing press and gained iteratively more traction with each revolution in information technology.
So slightly before 1440 was peak Truth for humanity?
I think you're right, but I also think it's worthwhile to look at Edward Bernays in the early 1900s and his specific influence on how companies and governments, to this day, deliberately shape public opinion in their favor. There's an argument that his work and that of his contemporaries was a critical point in the flooding of the collective consciousness with what we would now consider propaganda, misinformation, or covert advertising.
"Is Claude Code junk food, though? ... although I have barely written a line of code on my own, the cognitive work of learning the architecture — developing a new epistemological framework for “how developers think” — feels real."
Might this also apply to learning about writing? If I have barely written a line of prose on my own, but have spent a year generating a large corpus of it aided by these fabulous machines, might I also come to understand "how writers think"?
I love the later description of writing as a "special, irreplaceable form of thinking forged from solitary perception and [enormous amounts of] labor", where “style isn’t something you apply later; it’s embedded in your perception" (according to Amis). Could such a statement ever apply to something as crass as software development?
My current bugbear is how art is held up as creativity and worthy of societal protection and scorn against AI muscling in on it
While the same people in the same comments say it’s fine to replace programming with it
When pressed they talk about creativity, as if software development has none…
Art has two facets. The first is whether you like it. If you do, you don't need to care where it came from. The second is art as cultured and defined by the artistic elites. They don't care whether art is liked or likable; they care about pedigree, i.e. where it came from, and whether it fits what they consider worthy art. Between these two is what I call filler art: stuff that's rather indifferent and not very notable, but that often crosses some minimum bar of acceptance and may be popular among average people who aren't seriously interested in art.
In the first category, AI is no problem. If you enjoy what you see or hear, it makes no difference whether it was created by an artist or an AI. In the second category, for the elite, AI art is no less unacceptable than current popular art or, for that matter, anything at all that doesn't fit their own definition of real art. Makes no difference. Then there's filler art: the bar there is not very high, but it will likely improve with AI. Nobody has seriously invested in it so far, and it's cheaper to let AI create it than to pay poorly paid people to.
The easiest job to automate is someone else’s.
A lot of artists don't mind using AI for art outside their own field.
I was in a fashion show in Tokyo in 2024.
I noticed the fashion was all human-designed, but they had a lot of posters, video, and music that were AI-generated.
I point-blank asked the curator why he used AI for some things but didn't enhance the fashion with AI. I was a bit naive, because I was genuinely curious whether AI wasn't ready for fashion or whether they were going for an aesthetic. I was trying to learn, not point out hypocrisy.
He got mad and didn't answer. I guess it's because they didn't want to pay for everything else. Big lesson learned in what to ask, lol.
I think people hate AI-generated writing more than they like human-curated writing. At the same time, I find that people like AI content more than my own writing. I write, comment, and blog in many different places, and I notice that my AI-generated content does much better in terms of engagement. I'm not a writer, I code, so it might be that my writing isn't professional. My hand-written code, on the other hand, still edges out the AI's.
We need to value human content more. I find that many real people eventually get banned, while the bots, which are always careful to follow the rules, never are. The Dead Internet hypothesis sounds more inevitable under these conditions.
Indeed, we all now have a neuron that fires every time we sense AI content. Maybe we need to train another one that activates when content is genuine.
As much as the general public seems to be turning against AI, people only seem to care when they're aware something is AI. Those of us who pay deliberate attention are better tuned to identify LLM-speak and generated slop.
Most human writing isn't good. Take LinkedIn, for example: it didn't suddenly become bad because of LLM-slop posts; humans pioneered its now-ubiquitous style. And now, even when something is human-written, we're already seeing humans absorb linguistic patterns common to LLM writing. That said, I'm confident slop on any platform with user-generated content will eventually fade from my feeds, because the algorithms will pick up on that as a signal.
What concerns me most is that there's absolutely no way this isn't detrimental to students. While AI can be a tool in STEM, I'm hearing from teachers among my family and friends that everything students write is from an LLM.
Leaning on AI to write code I'd otherwise write myself might be a slight net negative for my ability to write future code, but brains are elastic enough that I could close an n-month gap in n/2 months or something.
From middle school to university, students are doing everything for the first time, and there's no recovering habits or memories that never formed in the first place. The ACT was made easier two years ago (the number of questions was reduced), and the US average score has set a new record low every year since. Not only is there no clear path to improvement; there's an even clearer path to things getting worse.
I wonder whether we will see a shift back toward human-generated, organic content: writing that is not perfectly polished or exhaustively articulated. For an LLM, it is effortless to smooth every edge and fully flesh out every thought. For humans, it is not.
After two years of reading increasing amounts of LLM-generated text, I find myself appreciating something different: concise, slightly rough writing that is not optimized to perfection, but clearly written by another human being.
About the article referenced at the beginning: the sentiment presented in it honestly sounds like the AI version of cryptocurrency euphoria just as that bubble burst. "You are not ready for what's going to happen to the economy"; "crypto will replace tradfi, experts agree." The article is sitting at almost 100M views after just a week and has strong FOMO vibes. To be honest, I find it hard to believe, because I've been using AI, and compared to crypto it doesn't just feel like magic, it actually does magic. Still, I can't help but think of this parallel and the possibility that the AI bubble could be starting to stall or regress right now. The only problem is that I just don't see how such a scenario would play out, given how good and useful these tools are.
I agree with the assessment that pure writing (by a human) is over. Content is going to matter a lot more.
It's going to be tough for fiction authors to break through. Sadly, I don't think the average consumer has sufficiently good taste to tell when something is genuinely novel. People often prefer carefully formulated, familiar garbage over creative gems; this was true before AI and, IMO, will continue to be true after it. This isn't just about writing; it's about art in general.
There will be a subset of people who can see through the form to the substance, and they will be able to identify non-AI work, but they will remain a minority. The masses will happily consume the slop. The masses have poor taste, and they're more interested in "comfort food" ideas than in actually novel ones. Novelty just doesn't do it for them; most people are not curious, and new ideas don't interest them. These people will live and breathe AI slop, and they will feel uncomfortable when presented with new material, even material wrapped in a layer of AI (e.g. human-written core ideas rewritten by AI).
I feel that way about most books, music, and pop culture in general: it was slop, and it will continue to be slop. The same basic ideas about elves, dragons, wizards, orcs, kings, queens, etc., just reorganized and mashed together under different overarching storylines ("a difficult journey", "epic battles") with different wording.
Most people don't understand the difference between pure AI-generated content (seeded by a small human input) and human-generated content rewritten by AI (seeded by a large human input), because most people don't care, and never cared, about substance. Their entire lives may be about form over substance.
Who or what is "the masses" actually?
What is the difference between writing and content?
I would guess he's looking to compare the equivalent of fast food to fine dining or nutritious eating.
As we move further into a world where data exfiltration is becoming more sophisticated, local-first processing isn't just a luxury—it’s a necessity. Hardware is finally powerful enough to handle what used to require a massive backend infrastructure.
My spidey-sense: the "it isn't X, it's Y" construct and the dreaded em dash.