I really don't like the future of YouTube.

Igor Berlenko
8 min read

The future of YouTube risks losing its human connection as AI takes over, transforming creators into mere content machines.

Before we start this video, I want to make something very clear: this video is not a knock on Colin and Samir. This video is not a knock on any creator that is doing reporting or doing podcasts on the future of YouTube or AI. I think Colin and Samir are great creators, and the amount of help they've given me personally as a creator, going from zero to 700k subscribers, has been immense. I can never pay them back enough.

In this video, they are doing some reporting on the future of YouTube. They attended a conference where the CEO of YouTube and many executives discussed what YouTube will look like in the next 5 to 50 years. The video is titled "YouTube is About to Change." Before you watch this video, do me a favor: go watch their video from start to finish and gather your own personal thoughts on it.

I did this this morning because it's Saturday, and I always use the weekends to gain better insight into what's happening in the world of YouTube. Personally, I found this video extremely disturbing. If you've watched this channel before, you may already know my opinion on AI. I think AI, in general, is an interesting technology, but the way that companies and investors have been viewing it as a $5 billion cash grab is sucking the humanity out of the internet. This is particularly concerning as it applies to YouTube, where I believe the only remaining advantage is the ability for someone to connect with a creator.

I hope that as you watch these videos, you not only take away the content or the technology or the insights that I provide but also my personal opinion on these matters. I encourage you to use that perspective to connect with me a little bit closer.

Now, I want to draw a picture real quick before we dive deeper into this topic and describe why I think the future of AI, as Colin and Samir describe it, is so dangerous for the platform. If you remember YouTube during the 2013 to 2015 era, you had the Casey Neistat phenomenon. He was someone you wanted to get personally attached to; you would watch his vlogs to see how his life was going. He didn’t teach you how to make a million dollars or how to start your next online business, and he didn’t bombard your brain with dopamine. You connected with Casey.

I think this was the beginning of what I call the Casey-Mr. Beast roller coaster. On one end, you have Casey, where you are emotionally connecting with a creator, and on the other end, you have Mr. Beast, who represents the kind of dopamine rush—fast-paced content and retention editing. It’s important to note that being on the top or the bottom of this spectrum is not a testament to quality or ethics; it simply reflects different approaches to engaging with the audience.

Both creators tap into different aspects of the human psyche, depending on what’s happening in the world. I think Casey Neistat resonated during the 2016 political climate, which was a bit rough, while Mr. Beast emerged during a time when we needed stimulation due to the challenges posed by COVID-19.

=> 00:04:30

We're craving deeper human connections in content, not AI-generated replicas.

Picture it like a sine wave between those two poles. Casey Neistat caught the moment around 2016, when politics were a little rough. Then Mr. Beast emerged during a period when we needed stimulation because of how badly COVID affected us, when we spent so much time in our rooms looking for something to stimulate us. Now, with the onset of another rough political period, Mr. Beast has not fallen off, but emotionally, I think people are striving for something new. We are looking for another Casey.

I think we are entering another Casey Neistat era of YouTube, where humans want more connection. You have to remember that almost the entirety of human psychology is wired around the tribalism of human beings, where they want to emotionally and socially attach to other people, tell rumors, tell stories, and be a part of a tribe. That's how we're wired.

It's alarming to see YouTube go in this direction; at this point in the video, the caption literally reads "AI takes over." Mark Zuckerberg was actually at this conference. It was a YouTube conference, but he was there representing Meta. They showcased a creator whose likeness YouTube had trained a model on, and then Mark Zuckerberg was able to have a conversation with it.

Let's watch this real quick: he is essentially FaceTiming the AI video version of himself, and Zuck is having a conversation with the AI avatar version of Don Allen. "Congrats on the new book that you just released. What's the main thing that you're hoping people take away from it?" That's not him; it's the model, and it has to query what they want to talk about. "The main thing I want people to take away from my book is..."

Now, if you are a human being watching this, I don't see how you could view that as a good thing. YouTube became popular over the last decade or so almost exclusively off of human connection: learning people's stories and following people on their journeys, like Michelle, who makes videos about trying new things, or Casey Neistat, going through his life as an entrepreneur, husband, or boyfriend in New York. I don't understand how the platform could now push toward making AI models of creators accessible to viewers.

They go even a step further in this video, discussing how YouTube wants to change the way we search for content. Instead of searching for content that a person has created, they are offering the idea that maybe in the future, we can put into a search bar the kind of video we want to watch and then have AI generate the video as a function of the model trained by the creator's content.

Now again, if you are a human being who wants to connect with a person, or who goes on YouTube for some kind of discovery or emotional connection, I don't see how anybody would want this feature.

=> 00:08:41

The future of YouTube shouldn't be about AI-generated content; it's about the genuine connection with creators that keeps us coming back.

In a recent discussion, the question arose: does the average user care that they're watching AI-generated content? Is it even a problem for people? For instance, consider watching a Johnny Harris video labeled "made with AI." If you cannot discern which parts are AI-generated, would you care? The answer seems to be no; many viewers likely wouldn't mind. However, the calculus changes in cases like Johnny shooting in Hawaii and using AI so he could still appear on his usual set.

The crux of the issue lies in emotional investment. I would care if I watched a video from a creator that I was emotionally invested in. If I eagerly anticipated their uploads every Thursday to discover their latest insights, only to find that the entire video was generated by AI while the creator was on vacation, I would be deeply disappointed. There is no scenario in which I would continue to support a creator who relied entirely on AI for their content.

While I don't want to dwell on this topic too long, I feel compelled to express my concerns. As someone who has been consuming YouTube content since its inception, I find this trend detrimental to the platform. If we acknowledge that these waves exist, we might be at the bottom of the curve, on the verge of another connection-driven resurgence, but replacing genuine creators with AI models could lead to the collapse of the platform. I genuinely believe that YouTube will struggle to survive another decade if it becomes synonymous with AI-generated content. People may begin to associate YouTube with AI content farming, similar to how Millennials distanced themselves from Facebook once it became associated with Boomers.

On this channel, I typically focus on programming content, including software security and cybersecurity. Recently, I wrote a fuzzer in Go and am learning about fuzz testing, which involves identifying vulnerabilities in software. If you enjoy these topics, please consider subscribing.
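Since the fuzzer came up: a mutation-based fuzzer of the kind mentioned can be sketched in a few dozen lines of Go. Everything below (`parseRecord`, `mutate`, `fuzz`) is an illustrative toy, not the actual fuzzer from the channel: it randomly mutates a seed input, feeds each mutant to a target function, and records any input that makes the target panic.

```go
package main

import (
	"fmt"
	"math/rand"
)

// parseRecord is a stand-in target with a deliberate bug:
// it panics when the input is exactly three bytes long.
func parseRecord(data []byte) {
	if len(data) == 3 {
		panic("boom: unhandled record length")
	}
}

// mutate returns a copy of seed with one random change:
// a flipped bit, an inserted byte, or a deleted byte.
func mutate(rng *rand.Rand, seed []byte) []byte {
	out := append([]byte(nil), seed...)
	switch rng.Intn(3) {
	case 0: // flip one bit of a random byte
		if len(out) > 0 {
			out[rng.Intn(len(out))] ^= byte(1 << rng.Intn(8))
		}
	case 1: // insert a random byte at a random position
		pos := rng.Intn(len(out) + 1)
		out = append(out[:pos], append([]byte{byte(rng.Intn(256))}, out[pos:]...)...)
	case 2: // delete a random byte
		if len(out) > 0 {
			pos := rng.Intn(len(out))
			out = append(out[:pos], out[pos+1:]...)
		}
	}
	return out
}

// fuzz hammers the target with mutated inputs and stops at the
// first input that triggers a panic, returning it as a crash.
func fuzz(target func([]byte), seed []byte, iters int) (crashes [][]byte) {
	rng := rand.New(rand.NewSource(1)) // fixed seed for reproducibility
	for i := 0; i < iters; i++ {
		input := mutate(rng, seed)
		func() {
			defer func() {
				if r := recover(); r != nil {
					crashes = append(crashes, input)
				}
			}()
			target(input)
		}()
		if len(crashes) > 0 {
			break
		}
	}
	return crashes
}

func main() {
	crashes := fuzz(parseRecord, []byte("ab"), 10000)
	fmt.Printf("found %d crashing input(s)\n", len(crashes))
}
```

A real fuzzer would also keep a corpus of interesting inputs and use coverage feedback to guide mutation; since Go 1.18 the standard `testing` package ships native fuzzing (`go test -fuzz`) that does exactly that.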

This morning, I felt particularly emotional after watching a YouTube video by Colin and Samir. I want to clarify that this is not a critique of them as creators; I think they are great. However, I find the content of this specific video and some of their opinions to be slightly disturbing and out of touch. I am curious to hear your thoughts on this matter. Please share your opinions in the comments below. Interestingly, many comments on their video express sentiments like, "this is terrifying, I hate this, no one wants AI." What do you think? I look forward to seeing your responses. Goodbye!

