2023: The Rise and Fall of AI (that took out Social Media with it)

Three different images of a “pretty girl” generated in Stable Diffusion seem to be variants of the same model.

The woman you see in these three pictures is not real, but her image can be found everywhere these days. She is the 2023 version of “Super Smiley Ariane,” the Canadian stock photo model who appeared in ads all over the world during the 2010s.

The woman in the picture is called “default generic AI face” girl, according to this Reddit post, and she even sort of shows up in the experimental AI images I made earlier this year.

In less than a year, AI went from a promised new future to an industry with no real future, at least not as everyone envisioned it.

Sure, it brought us some tools to make life easier. I use Eleven Labs AI voicing in my games, mostly because it is all I can afford, and cheap voicing is better than no voicing. But I can also recognize that the voice work would be significantly better with professional voice actors, and no, I don’t buy the argument that “it will eventually get better and exceed humans,” because it won’t.

This is 4 years of computer science studies in college talking. The limitations of machine computation have been known since the beginning; they are literally the math that created the field of computer science.

As a result, I go to Reddit and see the Daz3D subreddit blooming again while the Stable Diffusion subreddit is lagging in new content. That surprises even me.

“default generic AI face” girl shows up in all styles of AI-generated art (Pictures from DeviantArt)

The decline of AI image art

Many of the things that caused the decline of AI art I predicted in my many early essays on the topic:

1. You can create an incredibly good one-off image like the one above, but you can’t make a series, as there is too much variance between generated images.

2. It is becoming more and more clear that most AI art crosses the plagiarism and copyright lines. Frankensteining images from multiple sources isn’t “fair use”.

3. The US Copyright Office refuses to register copyright for AI-generated art or writing, meaning it cannot be used to make money.

4. Steam will not approve games made with AI art or written by AI.

5. YouTube will now flag videos that use AI art, which may be grounds for demonetization unless they are properly labeled.

Are there legitimate uses for AI art? Well, yes. After generating AI versions of some of my game art, I found that it wasn’t the “realism” that made better pictures, it was the lighting. So I improved the lighting of many images from Date Ariane Remastered and Something’s In The Air Redux but kept them as fully 3D images.

The main reason for keeping them 3D is consistency. AI images may be prettier or more realistic, but they are very inconsistent. Consistent animation and storytelling in AI is actually quite hard; 3D rendering consistency is much easier.

The truth is AI art went from “amazing” to “bland” very quickly in most people’s eyes. Doing it myself, I created some decent images, but I felt I wasn’t really making anything myself. I haven’t played with AI since last May because of this.

While I agree that the AI version looks better than the 3D one, it took over two dozen attempts to generate the AI version, plus a lot of fancy ControlNet and custom SD models, and I can’t consistently make these for an entire production. 3D is a consistent aesthetic that is much easier to work with.

In 3D I can control everything, letting the render appear exactly how I want it.

Compared to AI tools, 3D tools are easier to use, easier to understand, and feel more natural. I can control the camera’s position, focus, and depth. I can put two subjects in a picture and not have them look like each other. I can very easily create simple animations in 3D, with hands that look like hands. I can fully control where all the light is coming from and how bright it must be.

That is not the case in AI. I use all the recommended AI tools to generate pictures, and AI Ariane still looks too much like “default generic AI face” girl.

Making custom AI models and ControlNet control images is time consuming and not user friendly. In the time it takes to create two consistent images in AI, I can render an entire scene in 3D.
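To give a sense of what that workflow involves, here is a minimal sketch of a single ControlNet generation pass using Hugging Face’s open-source diffusers library. The model names, the pose reference file, and the prompt are illustrative assumptions rather than my actual setup, and this produces exactly one image; keeping a whole scene consistent means repeating and tweaking this for every single shot.

```python
# Minimal ControlNet sketch using the diffusers library.
# Model names, the pose reference, and the prompt are illustrative
# assumptions, not a description of my actual pipeline.
import torch
from diffusers import (
    ControlNetModel,
    StableDiffusionControlNetPipeline,
    UniPCMultistepScheduler,
)
from diffusers.utils import load_image

# A ControlNet trained on OpenPose skeletons constrains the subject's pose.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-openpose", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)

# The pose reference could be exported from a 3D render (hypothetical file name).
pose = load_image("pose_reference.png")

# Even with the pose pinned down, the face, clothing, and lighting still
# vary from run to run, which is the consistency problem described above.
image = pipe(
    "photo of a woman in a red dress standing in a park",
    image=pose,
    num_inference_steps=30,
    generator=torch.Generator("cuda").manual_seed(42),
).images[0]
image.save("ai_ariane_attempt.png")
```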

AI art will never go away, but its uses will not dominate. Creating backgrounds, 3D textures, or test images in AI that artists can use in their real art will be the primary use of the tools.

Video discussing why AI art will never catch on as a legit art field.

The decline of AI writing

At the height of AI mania in March, there were huge numbers of warnings about students using AI to write college essays and professional writers using AI to write articles and books. The problem with these doom predictions became evident very quickly: AI-written stuff is boring, inaccurate, and forgettable. Books and essays written entirely by AI are super obvious and easily detectable.

The fix has quickly come to light: people are using AI to assist in writing, researching, finding more examples, creating writing prompts, and correcting spelling and grammar mistakes. Tools to do these things are already in our software and phones. This is the useful side of AI tech: helping humans do faster and better work.

The problem for AI companies is that the real money was in the AI-generated books and articles that nobody is reading, because they are shit. They also suffer from the same issues as AI pictures: uncopyrightable, and bordering too close to plagiarism and copyright theft.

The truth is we still don’t know what AI can and can’t do

AI as an assist process seems to be where the future is. One example is Nvidia’s video cards using AI routines called DLSS, or “Deep Learning Super Sampling,” to upscale rendered frames (and, in its latest version, generate additional frames) so games look smoother. There will no doubt be other “good” examples in the future.

There are big negatives too, though. Bethesda basically used AI to fill 1,000 planets with junk in Starfield, leading to Starfield‘s “Mixed” reviews on Steam, the biggest complaint (including mine) being that the game is filled with the same repeated content over and over. That’s AI-enhanced procedural generation making a mess of an otherwise fun game. Those 1,000 planets were supposed to be the source of endless exploring. Instead, once you have explored about 10 planets, you have explored them all.

And don’t get me started on the AI-generated “suggested content” algorithms used by social media platforms, which are ruining the fun of social media. These companies are using this software to try to boost their “engagement” numbers, and instead they are discouraging engagement. YouTube creators are discovering the joys of sexy thumbnails with clickbait titles that have little to do with the actual content. More on the decline of social media below…

The Writers and Actors Strikes did serious harm to AI’s potential

The Copyright Office decision was black eye number 1 for the AI industry. The WGA and SAG-AFTRA strikes were black eye number 2. Both strikes had AI and its uses in film production as major dispute points. The AMPTP wanted to freely use AI to replace human writers, generate AI characters to replace extras in background shots, and fully AI-generate actors who have died.

Obviously, this did not go over well with the striking actors or writers. It was during these months of the strikes that the limitations of AI became more obvious. Ultimately, the WGA strike ended with the AMPTP fully surrendering to the writers’ take on AI: AI can be used by writers if it helps their workload, but it is up to them. Producers cannot force writers to use AI and cannot pay them less for using it.

The AMPTP ended the actors’ strike with similar concessions. AI-generated actors cannot be used without the full permission and compensation of the original actor or their estate.

The only part of film and TV production that is likely to see an increase in AI is special effects, which has become a thankless job, and if AI can do the hard, time-consuming stuff, it is better for everybody.

But no, people using AI to generate full movies is an AI bros’ pipe dream.

The legality of scraping copyrighted sources for AI training data is still rolling through the courts and could be black eye number 3, or a knockout punch, depending on the outcome.

AI causing the decline of social media

The sudden rise of AI got a lot of backlash. First there was the issue of “scraping the internet” for words to use. Twitter (it was still called Twitter then) started charging for API access to make money from, or block, the AI bots scraping the site. Reddit thought that was a good idea and soon followed, prompting a swift backlash from third-party app developers. Traffic on both sites suffered greatly.

DeviantArt embraced the AI movement and opened the door for the AI art bros, and now that’s mostly all there is there.

Early AI-generated parodies of Wes Anderson directing non-Wes Anderson movies were funny, for about a month. Oddly, the best Wes Anderson parody this year was a movie called “Asteroid City,” which wasn’t made with AI; it was made by the actual Wes Anderson.

AI-generated memes went from funny to boring very quickly. Watching odd AI rendering mistakes like Will Smith eating spaghetti is funny the first time, but different celebrities eating different foods is the same joke over and over.

Memes have been so prevalent in the internet age that AI could generate them with ease. Social media became inundated with AI memes that were not very funny and were often filled with lies. Humor is funniest when it is based on truth.

The invasion of AI into social media was not a good thing, and in a way it killed the popularity of ALL memes. Social media suffered because of it. All the big sites declined in popularity because there were so many bots generating so many lies. One of the less popular sites, Tumblr, has retreated to “maintenance mode,” with Automattic, its current owner, moving most of the staff to other projects (a rare non-layoff wind-down; Automattic also owns WordPress, which hosts this site).

Threads is the only social media platform still growing, because it was started by Meta as a Twitter replacement. Most of the former Twitter influencers moved there, and AI-generated posting is forbidden, but the overall traffic is still well below that of the Twitter golden era.

YouTube, the original video version of social media until TikTok overtook it, has also declined in viewership. Whether that is due to its failing algorithm, its increase in advertisement time, its growing library of AI-generated content, or its public fight with ad blockers is hard to say. But YouTube recognizes the problem of videos made from AI images and text, and it is threatening to demonetize all of them.

Good! Sitting through 2 minutes of ads just to be algorithmically fed an inaccurate AI-generated video about a topic you accidentally googled 5 months ago is not an enjoyable experience.

AI art and text have lowered our acceptance of things we see online, which may eventually be a good thing, except it means wading through a lot of crap to find the accurate version of what we are looking for.

AI Assisted search is literally killing the Internet

One of the new technologies of 2023 is the AI-assisted search engine. It started with Bing at the beginning of the year (I was an early adopter), and it is now available to everybody.

Google, seeing a drop in its share of the search market, quickly fast-tracked its own AI-assisted search engine, relegating its awful “old” search engine to the trash heap. The problem is that this new AI-assisted search breaks the optimization most of the web relies on to make money: because AI search just pulls the most relevant info from a site and posts it as the result, it is killing websites’ ability to make money from visits.

AI search engines are quickly destroying the $50 billion Search Engine Optimization (SEO) industry.

According to this video, there are many famous websites now up for sale because of search changes like this.

This phenomenon, plus this article in The Atlantic called “Life Really Is Better Without The Internet,” makes me question whether the internet has any future at all.

The AI World is a Cult

The drama surrounding the firing and rehiring of Sam Altman as CEO of OpenAI is to me primary evidence that no one in the AI community has any clue what they are doing. The AI world has become a cult of its own marketing pitch, a pitch so interwoven into our modern culture that it is easy to see how anyone can get swallowed up.

“Artificial Intelligence” is a sci-fi idea whose potential movies and TV have been teaching us to be excited about, and frightened of, for the last 60 years.

What we have today are “Large Language Models,” or LLMs, or as I call them, “Artificial Impersonation,” which is also exciting and frightening in its own way (and that is what is selling the illusion) but is not actually intelligent.

What the AI world is praying for is AGI, or “Artificial General Intelligence,” and despite all the marketing, it will never be achieved by LLMs.

LLMs are very good at impersonating human-quality writing because they use millions of internet writers as a baseline. LLMs are very good at impersonating human-quality art because they use millions of internet artists as a baseline. AGI is the goal of exceeding human intelligence, an artificial Einstein of sorts. The problem is that there are not a million Einsteins online to use as a baseline.

The official story of Sam Altman’s fall and re-rise at OpenAI is documented in a series of articles at The Atlantic, the latest being this one.

AI Art is Amplifying itself

I want to close out with speculation that is hilariously funny to me, and may prove to be the real Achilles heel of LLMs.

LLMs gained their power and reputation from scraping the internet of its content and using that as the basis for the models. But as these models grow in popularity, as more of the internet becomes AI generated, and as “default generic AI face” girl gets more overused, the LLMs of the future are going to use their own output as input: what is called in engineering an informational “feedback loop.”

The effect this may have on AI is only starting to become known. AI works by generating something in the generic middle of all its scraped samples, and as the scraped samples become more AI generated, “Artificial Impersonation” will just impersonate itself ad infinitum until all the AI girls look like “default generic AI face” girl.
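Here is a deliberately simplified toy sketch of that feedback loop (nothing like how real LLMs or diffusion models are actually trained, and the tiny sample size exaggerates the effect on purpose): each “model generation” fits the average and spread of its training data, then generates the next generation’s training data from that fit. Run long enough, the spread tends to drift toward zero and everything converges on one generic middle.

```python
# Toy feedback-loop simulation: each "model generation" fits a mean and
# spread to samples produced by the previous generation, then generates
# new samples from that fit. A deliberately simplified illustration,
# not how real LLMs or diffusion models train.
import random
import statistics

random.seed(2023)

SAMPLES_PER_GENERATION = 25   # kept small on purpose, to exaggerate the effect

# Generation 0: "human-made" data with plenty of variety.
data = [random.gauss(0.0, 1.0) for _ in range(SAMPLES_PER_GENERATION)]

for generation in range(1, 201):
    mu = statistics.mean(data)          # the fitted "model":
    sigma = statistics.pstdev(data)     # just an average and a spread
    # Train the next generation on the current model's own output.
    data = [random.gauss(mu, sigma) for _ in range(SAMPLES_PER_GENERATION)]
    if generation % 25 == 0:
        print(f"generation {generation:3d}: spread = {sigma:.4f}")

# The printed spread tends to shrink as the generations go by:
# the variety of the original data collapses toward one generic output.
```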

ETA: The Atlantic does their own story on AI killing social media

This is a later story in The Atlantic; it is likely paywalled, so I’ll give a brief summary.

An interesting report in The Atlantic called “Nobody Knows What’s Happening Online Anymore” is quite eye-opening.

Has society gotten so divided that there are no longer any major trends going on to follow?

“The sprawl has become disorienting. Some of my peers in the media have written about how the internet has started to feel “placeless” and more ephemeral, even like it is “evaporating.” Perhaps this is because, as my colleague Ian Bogost has argued, “the age of social media is ending,” and there is no clear replacement. Or maybe artificial intelligence is flooding the internet with synthetic information and killing the old web. Behind these theories is the same general perception: Understanding what is actually happening online has become harder than ever.”

“Nobody Knows What’s Happening Online Anymore” by Charlie Warzel in The Atlantic

Examples listed in the article include Netflix saying that its most popular show in the spring was The Night Agent, with more than 800 million hours watched globally, a show most people have never even heard of.

Claims that Osama bin Laden’s “Letter to America” went viral produced more articles about the phenomenon than there were people who actually saw any of the so-called viral videos about it, a trivially small number. And finally:

“According to TikTok’s year-end report, the most popular videos in the U.S.—clips racking up as many as half a billion views each—aren’t topical at all. They include makeup tutorials, food ASMR, a woman showing off a huge house cat, and a guy spray-painting his ceiling to look like Iron Man. As a Verge headline noted earlier this month, “TikTok’s biggest hits are videos you’ve probably never seen.” Other platforms have the same issue: Facebook’s most recent “Widely Viewed Content Report” is full of vapid, pixelated, mostly repackaged memes and videos getting tens of millions of views.”

ibid

It seems the age of “viral memes and videos” is truly over, fractured into pieces, thanks at least in part to AI.
