Tim Kendall | Netflix’s ‘The Social Dilemma’, Pinterest President, Facebook Director of Monetization

Moment CEO Tim Kendall joins Alex Moazed to discuss a range of topics concerning social media platforms, online censorship, matchmaking algorithms, and the impact digitization has had on social well-being.

Tim architected early monetization strategies for Facebook and Pinterest as Facebook’s first Director of Monetization and President of Pinterest. Tim currently serves as CEO of Moment, which has helped over 8 million people build healthier relationships with their phones and is fighting to reimagine the tech industry as one built for its users. Additionally, Kendall serves as a board member of UCSF Benioff Children’s Hospital.

Download the Moment app here: app.inthemoment.io/

House Committee on Energy and Commerce Testimony of Tim Kendall: Link

Originally Aired: 11/09/20

00:00 – Subscribe for Tech & Business News Daily
01:08 – Algorithms Want Users to Spend the Maximum Amount of Time
03:55 – An Attention Extractive Based Business Model
08:36 – Realization That Something Was Wrong
09:44 – Digital Behaviors Mimicking Qualities of Addictive Substances
11:43 – How Moment App Helps Develop Awareness
14:28 – Never Been More Connected and Lonely
15:17 – The Importance of Social Well Being
20:13 – Matchmaking vs. Censorship
22:10 – Do Social Algorithms Reward “Fake News”?
26:06 – What Happens to Startups in Survival Mode
29:44 – Facebook’s Market Dominance
32:39 – Fixing the Social Dilemma
37:35 – Can Matchmaking Be Externalized?
40:05 – Even Netflix is Guilty
41:48 – Importance of Transparency
44:22 – Platforms Don’t Want to be Arbiter of Truth
50:06 – Tech Platforms Protecting Political Speech
54:01 – Do You Allow Your Kids to Use Facebook?
56:24 – Future of Social Platforms for Children

Subscribe to the Applico YouTube Channel

Full Transcript:

Alex Moazed (00:00:08):
I’m Alex Moazed and welcome to Winner Take All, where we talk about the constant battle between large tech monopolies and traditional incumbents. I’m very excited that today we have guest speaker Tim Kendall. You may have seen him in the recent documentary The Social Dilemma. He has presented testimony in front of Congress on what’s going on with platforms. He worked at Facebook in the early days and at Pinterest, and most recently he is CEO of Moment, which we will definitely be getting into today. So, Tim, great to have you and thanks for joining us.

Tim Kendall (00:00:46):
Yeah. Thanks, Alex, for having me on.

Alex Moazed (00:00:46):
If I were to give my one-sentence overview and overgeneralize things, and then I want to hear it from you: it seems like what you were saying, whether it’s in the testimony, which we’ve actually covered on the show, or obviously the documentary that so many folks have seen, The Social Dilemma, is that with these big content platforms like Facebook, the matchmaking, the algorithms that connect users with the content they see, has been, I would say, corrupted by greed, right? These algorithms have been trained to send you stuff that maximizes engagement, and what the algorithms have figured out is that the way to get the most engagement is to send you salacious, triggering content, much of it fake content. But nonetheless, these algos have performed their job and have really maximized engagement, and then they serve more ads and make more money. In my offer [inaudible 00:01:54], how would you put it in your words?

Tim Kendall (00:01:56):
Well, I think it’s pretty close. I think that the word greed implies that there’s some intent… that we can absolutely know the intention of the company, which I don’t think we can, their underlying motive. It also implies that maybe the algorithm has a motive, which it doesn’t. But what I think it’s safe to say is that the algorithm has been given a mandate, which is to say, “Okay, we’ve got Alex on this service; algorithm, go figure out how to get Alex to spend more time on it tomorrow, and a little bit more time on it the next day.” And the company doesn’t supply the algorithm with a reason why. But look, Facebook and these other companies are publicly traded, and they need to grow at a steady and persistent rate in order to maintain their large valuations, and their large valuations are how they retain and attract the people who build the service.

Tim Kendall (00:03:16):
So, it’s this interwoven system that’s built upon itself, that really does start with the product and the service, which is designed to extract more and more of our attention in order to generate increasing amounts of revenue, which then leads to a persistent and growing valuation, which then allows us to hire more and more people to work on these algorithms to make them better and more sophisticated. Maybe a slightly different tack in terms of describing the problem is: you’ve got an attention-extractive business model paired up with an all-knowing technology that is getting smarter and more sophisticated every day. And it’s a technology that in many ways, in certain dimensions, knows you better than you know yourself, and certainly knows the dimensions of your human weaknesses better than you understand them. And I think that’s what’s scary.

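To make the mandate Tim describes concrete, here is a minimal sketch of an engagement-ranked feed in Python. Every field name, weight, and predicted signal below is a hypothetical illustration, not Facebook’s actual system; the point is only that an objective of “more time tomorrow” rewards whatever captures attention, regardless of whether it is true or good for the user.

```python
# Hypothetical illustration of an engagement-ranked feed. The weights,
# field names, and scoring model are invented for this sketch; they are
# not Facebook's actual system.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    p_click: float         # model-predicted probability of a click
    p_comment: float       # model-predicted probability of a comment
    expected_dwell: float  # predicted seconds of attention

def engagement_score(post: Post) -> float:
    # The objective never asks "is this true?" or "is this good for the
    # user?", only "how much attention will this capture?"
    return 3.0 * post.p_comment + 1.0 * post.p_click + 0.01 * post.expected_dwell

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Highest predicted engagement goes to the top of the feed.
    return sorted(candidates, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("measured-news", p_click=0.05, p_comment=0.01, expected_dwell=20),
    Post("outrage-bait", p_click=0.30, p_comment=0.12, expected_dwell=90),
])
print([p.post_id for p in feed])  # ['outrage-bait', 'measured-news']
```

Under this kind of objective, the more provocative item wins the top slot every time, which is the dynamic the conversation keeps returning to.
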
Tim Kendall (00:04:38):
And I think the other thing that’s scary is this separation between the leaders of the company and the actual algorithm, because the management leaders can say, “What’s so bad about getting Alex to spend a little bit more time on the platform tomorrow?” And they can say that with a straight face when we’re talking about an artificially intelligent algorithm that’s taking care of the content aggregation and sorting on its own, because they don’t have to necessarily know the how of this algorithm. And in many cases, I don’t think they completely understand the how. And this is really how we’ve gotten into this jam: you’ve got an algorithm that, by virtue of what you’ve mandated it to do, started to wreak havoc on the individual in terms of their mental health, and on society in terms of the fabric of how society functions.

Tim Kendall (00:05:52):
And what Facebook and others seem to be continuing to do is be a bit late to the party in understanding the severity of the ramifications. I think the best illustration of this… And I’ll pause. The best illustration of this is the 2016 election, whereby it is quite clear in retrospect that the 2016 election was swayed, and the artificial intelligence algorithms, combined with meddling, probably tipped the election, and the company, including its leader, didn’t acknowledge or completely understand that until months or years later. And that is this disaggregation of the leadership and the actual underlying algorithm. That’s an example [inaudible 00:06:50].

Alex Moazed (00:06:51):
This idea that there’s a black box, you’ve got matchmaking, is in the book. We actually describe matchmaking as one of the four core functions of the platform business model, along with rules and standards, which I think we’ll get into a little bit later today when it comes to censorship and how you curate that ecosystem. But just around the matchmaking, I think what you’re getting at here is that there’s a disconnect between management, their understanding, their goals as a public company, what they need to do to serve shareholders, et cetera, and then, essentially, the mechanism by which this black box, this matchmaking, this algorithm, this AI, is achieving these results. And to what end? And I actually have up here your testimony to the House Committee on Energy and Commerce; we had covered this when you first did it, where you say, “I was Director of Monetization at Facebook.”

Alex Moazed (00:08:00):
And it seems like there is this moment for you, when you start to say, “This is great, we can achieve these KPIs and growth and engagement, but at what cost?” And there is this kind of moral epiphany or tipping point, where you said… You look at the impact that this engagement is having and you look at how the algorithm is accomplishing this engagement and you look at what that’s doing to the individual and you say, “This isn’t good.” Right? Was there that kind of moment in time, that threshold where you just said, “Ah, this is just too much?”

Tim Kendall (00:08:44):
Yeah. I’m a little embarrassed to admit that it wasn’t that long ago. I think that I came at it maybe a little bit differently than most who are now speaking out about it, in that initially the flag in my head was really around mental well-being. I started noticing, when I was still at Pinterest, looking around the restaurant and seeing everybody on their phone, seeing families of four or five where no one was talking to each other. And I’ve shared this in other forums: I’ve been around addiction for a long time in my family and extended family, and so I know how it operates and how corrosive it can be. And I know the general behaviors that are associated with it. And so I was noticing in myself, and I was noticing in people around me, that there are absolutely ways we treat this phone that mimic how people treat addictive substances. In terms of denial, in terms of not wanting to part with it, having withdrawal characteristics when separated from the phone.

Tim Kendall (00:10:11):
And so that’s what threw the flag up for me originally: “Wow, there is something about this supercomputer in our pocket and the services that have been built on top of it that is just sucking us in.” And this is really where the definition of addiction comes into play: we are unable to make decisions that are in our best interest in the medium and long term, because of the sheer magnitude of the temptation in the short term. This is what happens with drugs and alcohol, right? The alcoholic or the drug addict knows that the next day they’re not going to feel good. And they know that there could be even longer-term ramifications of drinking a lot tonight. But the near-term feedback loop is just too enticing. And we see that same pattern play out with the phone every hour of the day for a lot of us.

Alex Moazed (00:11:15):
Absolutely. And I want to come… Let’s go to… We’re going to have a lot of nitty-gritty to dig into on Facebook and these content platforms. But I think this is a nice segue. We’ve seen, for example, that they have tried; Apple’s giving you some controls and lock-in. But I’ve got Moment up here and I’m sharing it. Take me through this. To me, what’s interesting is, there is a component here about how you cut down on your phone usage and this addiction, which is absolutely real, and there’s a myriad of evidence to support that. But it’s also about building stronger and closer relationships with that key circle. I thought that was also a very interesting part of this. So, what’s your overview? How would you describe Moment? And I think that’s some of your day-to-day these days, right?

Tim Kendall (00:12:23):
Yeah, I think there are three pieces to Moment. And the third piece has actually kind of turned into a whole other project that I’m working on, which I can touch on, because it does relate to how we interact and have relationships and sustain relationships with people. But with Moment, it’s really about helping people develop awareness. People are not aware most of the time of how much they use their phone. Most people think they use it two hours a day; we ask them that at the front end of their experience, and we have asked them that historically. And then they measure it and they realize, “Oh, it’s like four hours a day.” So they’re off, but that’s how disconnected perception is from reality. And just that awareness wake-up is helpful for people. Then we have a whole series of tips and tricks, for people who want them, to get their device into a realm where they can behave more deliberately around their phone, as opposed to sort of unconsciously, where they can’t really predict whether they’re going to spend an hour on their phone today or five hours. “I don’t know, we’ll see what happens.”

Tim Kendall (00:13:32):
And then the last thing is, we have found, and this is true in a lot of the research on behavioral addiction, which is really what this phone thing is, that groups of people co-committing to behavior change is really one of the most effective ways to catalyze behavior change and sustain it. So within Moment, we have an ability for you to form a group of people who can co-develop habits and keep each other accountable, because the group feature within Moment allows you to see how much each person in the group is using their phone. So, that’s pretty useful.

Alex Moazed (00:14:12):
And you’ve had over eight million users, right? Something like that today?

Tim Kendall (00:14:15):
Yeah, we have eight million people who have, at various points in the last several years, downloaded the product. So, a lot of demand for it. And then I think, in looking at this problem, we’ve learned that there’s an adjunct problem, which I would summarize as: we’ve never been more connected to one another in the history of the world, while at the same time we’ve probably never, as a collective species, felt more lonely.

Alex Moazed (00:14:47):
Right.

Tim Kendall (00:14:48):
Which to us presents an opportunity, because that suggests that maybe these services that are supposed to help with our socialness aren’t actually working as designed, or certainly aren’t working in our best interest. So, we’ve actually endeavored to build a product, and we’ve built about five different prototypes in this area. We’re still experimenting; it’s going to take a while, I think, to get it right. The idea is: okay, if I started from scratch and I just wanted to build something for Alex that would really help him create and sustain, really sustain, because your friendships already exist… Think of the five to 10 people in your life whom you care about the most. How do we help you dedicate your mind share and attention to those people in a sustained way that helps you feel close to them? Because it turns out that if you can achieve that, you actually can do more for someone’s health and well-being than if you put them on a diet, give them an exercise regimen, or have them quit smoking cigarettes.

Tim Kendall (00:16:03):
Your social well-being is actually the biggest predictor of longevity, it’s the biggest predictor of pushing out the onset of disease, and it’s the biggest predictor of how good you feel about the quality of your life at the end of it. So it seems silly that there isn’t something that guides and helps people to do that; we sort of leave it all to chance. We’re counting calories over here and making sure that I spend two hours a week on my Peloton, but I’m not keeping track of anything as it relates to my relationships. It’s tricky, because you don’t want to turn it into a customer relationship management software program, right? You’ve got all your friends in Salesforce. So it’s just trying to thread that needle. But we think it’s a really interesting problem, and one that really is a byproduct opportunity of how social networks have been built with this attention-extractive model, which has allowed them to develop into things that don’t really serve us along this dimension of just helping me stay close with the people that really matter.

Alex Moazed (00:17:18):
It’s breadth versus depth-

Tim Kendall (00:17:19):
Yes.

Alex Moazed (00:17:20):
… to a centimeter deep. And I’ve got all these people, some I know, some I don’t know, and I’m just scrolling through the feed. Versus now, especially with COVID: humans are social creatures, you would normally have that proximity, and now you literally can’t have it in many circumstances. And not to mention, the vacuum that has probably been created is happily sucked up by these social networks and content platforms.

Tim Kendall (00:17:51):
Yes.

Alex Moazed (00:17:51):
And there’s your dilemma in a nutshell. And so I really love that idea. The time tracking and screen time, that is great. And I think having an independent product, separate from the actual addiction creator, that is, say, Apple on my iPhone, makes sense. I’d want a neutral party to help me cure my own addiction. Curious if you ever run into any issues with Apple, but I’m sure not yet. Maybe eventually. But anyway, this next thing: how do you deepen those relationships and just be mindful, attentive, and aware? That’s something society and people just need. So I think that’s very exciting, and I hope you keep us posted on when that product comes out. Let’s go back to the favorite punching bag that is Facebook. As I mentioned, on the show we talk a lot about platform models and all these kinds of things. One of the things we talk a lot about is matchmaking on one side; we touched on that a little bit, the algorithms: “Hey, Alex, here’s your newsfeed, here’s what’s going on in it.” Maximizing engagement.

Alex Moazed (00:19:13):
The other side, which you also talk a lot about and we haven’t gotten into yet, is around what we would call rules and standards. How do you figure out who has access to the network, and then how do you curate and regulate the usage once they’re in the network, using these rules and standards to incentivize good behavior and not bad behavior? You talk a lot about censorship in addition to the matchmaking and the algorithm part of this equation, but those are both gargantuan topics in and of themselves. If you were to sit here and say which one of these is the bigger problem, or if you could only solve one, which do you think has the bigger impact? Is that a fair question? If you had to choose one or the other, does one pop out at you as, “This is the big kahuna”?

Tim Kendall (00:20:13):
Label the problems again.

Alex Moazed (00:20:14):
Matchmaking, the AI black-box algorithm giving you this salacious content to maximize engagement. That’s one. And then censorship: what is right, what is wrong to censor? Hey, people are posting this, it’s harassing, it’s insensitive, what are we going to do? Are we kicking them off? Are we violating free speech? That’s the censorship bucket, which is pretty big these days too.

Tim Kendall (00:20:42):
My intuition is that you… I think, if you solve the incentives around the matchmaking, you actually get at the censorship issues.

Alex Moazed (00:20:56):
Yeah, I am right there with you.

Tim Kendall (00:20:59):
Yeah. So, I think that it’s… I’ve always thought, and I’m not the first person to say this, I’ve always thought it was silly that Facebook said they’re not the arbiters of truth. Maybe technically they’re not, but the algorithm is. The algorithm is absolutely playing the arbiter role on an individualized basis. They’re distorting what truth is depending on who’s on there. And we can talk about this later or we can talk about this now: it has been, I would say, astounding and heartening and surprising and encouraging how much Facebook has moved on this issue in the last six to eight weeks.

Alex Moazed (00:21:50):
You were the Director of Monetization. I think, for example, if we look at… You’ve got these algorithms, they’re maximizing attention. And the algorithms don’t really care if the news is fake or true. If anything, with the algorithms, maybe they don’t know, maybe they do know, but the more salacious, more triggering material is probably more often fake than real. And that material actually does better in the algorithm.

Tim Kendall (00:22:23):
Correct.

Alex Moazed (00:22:24):
And then if you say, “Okay, well, let’s look at the media industry.” Well, Facebook and Google have pretty much destroyed their business model. You’ve got these media organizations that are just struggling to survive; I feel like every few months there are more layoffs. And is it almost that Facebook has brought this along? The media industry has had to learn how to survive, and they are now playing in a system. I’ve got another little chart here that shows the percentage of people getting their news on social media just going up and up and up, and going down for the traditional media folks. So you’re being disintermediated, and Facebook and Google are doing the disintermediating. And what those algorithms are saying is, the more salacious and triggering stuff, regardless of whether it’s true, is probably going to get you more ad dollars. And by the way, you’re barely profitable as it is. So it’s just a self-fulfilling prophecy. But going back to your point, if you can solve that matchmaking challenge, it can help cure some of these other things that we see going on as a byproduct.

Tim Kendall (00:23:41):
Yeah, I think that’s fair. It’s interesting. I have seen some commentary on this that I happen to agree with. When you think about the degree to which our country has become polarized by virtue of misinformation and echo chambers, much of the commentary, if you read it in a larger context going back decades, says the very beginning of this is actually cable news. I’m blanking on exactly what metric they used to understand polarized groups of people, but they really started seeing a divergence when cable news took off. So when CNN and Fox, in the mid to late 90s, really became these…

Tim Kendall (00:24:50):
They had their own algorithms, in effect; they were tuned for different parts of the political spectrum, and people tuned in, and they each offered a different reality distortion field. So that was news, traditional media, curated to fit a group. And in a sense, what Facebook has done is just taken that playbook, automated it, and multiplied it, created a monster on the basis of that. Multiplied out to, as Roger McNamee said in the film, three billion Truman Shows.

Alex Moazed (00:25:33):
Yeah. I’ve got my other chart up here now, which is just the downward-sloping graph of Americans’ trust in mass media.

Tim Kendall (00:25:41):
Yeah.

Alex Moazed (00:25:42):
It’s much easier to have standards, really strongly upheld standards, when you’re profitable. When you’re not profitable and death is around the corner, you’re in survival mode. And you know this as a startup guy: survival mode and profit mode are two very different modes.

Tim Kendall (00:26:05):
I think what happens when you’re in survival mode is… And it’s hard for people to imagine Facebook being in survival mode, but there really were times when we were. There were periods over which growth was flat; there were periods over which it wasn’t clear how we were going to grow revenue year over year. And the interesting thing that happens, from an organizational psychology standpoint, is that as an organization in survival mode, what you tell yourself, certainly what I told myself at times, and I’ve heard about this playing out in other organizations, is, “Well, we’ll go do the right thing later.” Right? We’ll make sure that this thing is buttoned up once we have a little slack in the system. And the realities of capitalism and quarterly reporting, et cetera, make that really hard.

Alex Moazed (00:27:13):
You’re mentioning that user growth is so critical, right? And to me, the interesting thing, if we look at the past few quarters of performance: in Q2 you had the boycott-Facebook stuff go into full gear. How did Facebook do in Q2? Actually, fantastic. I bought the dip; Facebook was not damaged. Then you have Q3, where there’s just a lot of disagreement on either side of the political spectrum in terms of how Facebook has handled itself, for better or for worse, though probably both sides actually think for worse. But we also just saw their Q3 results, and they’re fine. If anything, they were actually great. You line that up against Twitter: Twitter had flat user growth. Flat. From Q2 to Q3 of this year, 36 million DAUs in the US. And you know that their product matchmaking teams were hitting that algo as aggressively as possible just to maintain par. God forbid it should decline.

Alex Moazed (00:28:24):
But we saw Twitter’s stock fall over 20% the day after they released earnings. To me, what that’s reflective of, and maybe some of this goes to what you’re getting at here… The name of the book is Modern Monopolies. I actually don’t think Twitter is a monopoly. They’re a $30 billion market cap company, they have a strong niche, but it’s a niche. They don’t have multiple content platforms. They should have, I think, had their own version of TikTok, and they missed that boat. But Facebook has Facebook and WhatsApp and Instagram. It is full-on platform conglomerate status.

Tim Kendall (00:28:59):
Yeah.

Alex Moazed (00:29:01):
And to me, even though there are disgruntled users almost on both sides, or all over the place, you see the juggernaut just continue on pretty much completely undamaged, if not stronger for it. And I think you do see vulnerability with the smaller ones like a Twitter, which would be happy to be one-tenth the size of Facebook and is obviously way smaller. Do you see that monopoly status as something that a Facebook or a Google holds, and, I guess, their willingness to change being influenced by that market dominance position? Or am I off here?

Tim Kendall (00:29:46):
I think both companies clearly have tremendous power. I listened to an interview between Emily Chang from Bloomberg and Bill Gurley, who’s a well-known venture capitalist from Benchmark, and she asked the same question about Facebook and Google. And he was diplomatic, but he said, “Look, we at Benchmark invest in these startups, and a lot of times they get into various flavors of conversations with Facebook or Google. And those conversations feel very one-sided. There is a clear David and Goliath dynamic.” And I think that’s been true for a while. And I don’t spend as much time thinking about Google’s business as I do Facebook’s, because I’ve never worked there, and I just haven’t spent as much time thinking about the positional, structural dynamics of Google’s business.

Tim Kendall (00:31:08):
But Facebook’s business, they’ve got a series of Metcalfe’s law networks. As long as the service is providing a generally good enough experience for people, Metcalfe’s law network-effect services are unassailable in a way that a service like Microsoft’s or Amazon’s is not; those have a different kind of network effect. But the Metcalfe’s law telephone-type network effect, the original AT&T network effect, that’s a hard thing for startups to put a dent in.

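For context on why a Metcalfe’s law network is so hard to assail: the law says a network’s value grows roughly with the square of its user count, since each of n users can reach n − 1 others. A quick illustration in Python (the value constant k is an arbitrary placeholder, not a real estimate):

```python
# Metcalfe's law: a network's value scales with the number of possible
# pairwise connections, n * (n - 1) / 2, i.e. roughly n^2.
def metcalfe_value(n_users: int, k: float = 1.0) -> float:
    return k * n_users * (n_users - 1) / 2

incumbent = metcalfe_value(2_000_000_000)  # a Facebook-scale network
challenger = metcalfe_value(200_000_000)   # a rival with a tenth of the users
print(incumbent / challenger)              # ~100: a 10x user lead is a ~100x value lead
```

The quadratic gap is why a challenger with even a tenth of the users is, on this rough model, two orders of magnitude behind in network value.
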
Alex Moazed (00:32:05):
So let’s run with that thread. They’re kind of unassailable; that’s why you’re going to the congressional Commerce Committee, right? That’s why, I think what we’re saying here is, this is the role of government: to try to fix someone that is unassailable and is harming their users, their customers. That’s why we have multiple different ways the government could get involved, whether it’s through the courts or congressionally with laws. Let’s say that you had the power to say, “Okay, I’m going to fix this.” What do you think is the best mechanism to do that? There’s a lot of talk about Section 230, and there are a lot of different theories out there. If you had the power to do something, what do you do? What’s your first inclination?

Tim Kendall (00:33:18):
I have a somewhat low-probability, idealistic path. But look, if I had a magic wand, this is what I would suggest we do. I would like to see the leaders of these companies, and let’s just, for the moment, to simplify it, say Facebook; and governments, which presumably are responsible for regulating, or as the case may be not regulating, Facebook; and then their consumers, and there are self-anointed and institutionally anointed consumer advocates out there who have the voice of consumers’ interests at heart. I think those groups, so Mark, governments from all over the world, and then consumer advocates, an anointed leader or two, need to get together and try to align on what we think reality is today. And what reality is today, in my mind, is that we likely have an existential crisis on our hands with the combination of an extractive-based business model and an all-knowing, increasingly sophisticated AI. That can actually be mitigated if we can arrive at a business model for Facebook that aligns users’ interests with the interests of Facebook, because right now they’re divergent.

Tim Kendall (00:35:10):
Alex’s best interests are just not the best interests of Facebook, because your best interest, for your health and well-being and the goodness of society, is likely to spend less time on the product, and they need you to spend more. So what I would love is for that group to get together and say, “Okay, let’s share collective responsibility for how we got here.” The government is responsible: they let it all happen, they didn’t challenge a single acquisition, they didn’t take a peek at Section 230 until the last year or two. Consumers have been complicit in this in a sense; we make decisions, we do have independent will. And then the companies have a part to play in this, for sure. So, if we can spiritually say, “Look, the three of us all played a part in this, let’s co-create a path out of this.” What does the path out look like? Well, I think it’s not too dissimilar from going from extractive, fossil-fuel-dependent energy economies to an economy that gets energy from clean and green sources.

Tim Kendall (00:36:27):
So we start to come up with the solar version of Facebook, or the electric version of Facebook. And then we need Facebook’s commitment and government’s commitment, and then consumer support, to allow that to happen. Consumer advocates would need to advocate for consumers to pay for this sort of thing, if that’s in fact the model we agree to go with. Government would need to create incentives, I think probably tremendous tax incentives, for this to happen, because we need a credible path for them to segue, and not a road to $100 billion a year in revenue. And I think you can do that with tax incentives, potentially.

Alex Moazed (00:37:16):
You brought up AT&T. There, you had the government say, “Basically, we’re going to accept that you’re a monopoly, but we’re going to put guardrails around you; we’re going to protect against things you might take advantage of,” right? Like kicking people off the network or overcharging, these kinds of things. You made the point earlier about the matchmaking, the algorithm, that curation. Is there a way to open that up, to externalize curation and matchmaking? For example, if we took these media companies, whose business model has been destroyed, and said, “You know what? Facebook is not going to be allowed to have its own matchmaking algorithm. Now, it’s going to be up to third parties.” That could be CNN and Fox News; they could use humans or algorithms or a mixture of both. It reminds me of the Drudge Report, for example, where for 20 years Matt Drudge was a one-man show, and he’s basically just a curator. Is there a form of value creation in the form of curation?

Alex Moazed (00:38:36):
And can we remove the platform’s ability to own that, or at least put handicaps on it, and maybe focus this kind of amazing brainstorm session that we have and say, “Your matchmaking isn’t going to remain the same. Let’s brainstorm a better version of matchmaking, which, Facebook, you’re not going to be happy about, because now you can’t predict to a tee what your earnings are going to be and it’s going to be out of your control, but it’s going to be better for society, and maybe we’ll give you some incentives to make you happy.” Is there a focused conversation in that structure that you think would at least have a shot?

Tim Kendall (00:39:19):
I think it’s a really compelling brainstorm to have. And I do think it is a way to get at this misaligned incentive and to give people some agency: okay, I know I don’t have total agency, but at least I can have some preference around who curates what I’m seeing. So, I think it’s interesting. I’ll also add a caveat on even going down the path of subscription. I think even Netflix is guilty here of being an extractive attention-economy participant, even though they’re going down what we’d say is sort of best practice for a consumer service, which is subscription. But look, they are trying to… And you can just see all the tactics in the product: the pre-rolls, the aggressive episode-one-flows-into-episode-two, and then the programming is increasingly being influenced by the viewership data from the past. They’re absolutely using technology to prey on our human weakness.

Tim Kendall (00:40:51):
Now, the model is subscription, but there was a crazy earnings call they had, I think three or four years ago, where some analyst asked them, “Can you talk a little bit about your competition?” expecting them to make a commentary on Apple TV or HBO or whatever, Disney Plus. They said, “Oh, our competition is really our customers’ sleep and their relationships.” Which couldn’t be a more appropriate characterization of the misalignment of their interests with your interests. Their interest is that you get less sleep and have fewer high-quality relationships, or certainly more shallow relationships, because you just don’t have the time to nurture them. That’s their incentive. Your incentive is, I think, not that.

Alex Moazed (00:41:49):
That black box, what goes into that black box? How can we… Transparency is the cure for all evil and misgivings. So, how can we shine a big fat light on it? And maybe we don’t get all the way there, but-

Tim Kendall (00:42:08):
Look, I think transparency helps. I totally agree with you; that’s a key principle. The part of transparency that seems critical is us, as end users, having more transparency. And we talk about this at Moment, and we’re not the first ones to think about it. What is the impact on you, the individual, of spending five hours a day on your phone? What happens quantitatively to your psychological well-being, and what happens physiologically as a result? And what’s happening neurologically? That is a mystery, that is not transparent, but it is a knowable problem that we can get to. And I think if people understood the cost… It’s now relatively clear that if I eat a lot of cookies every night, there will be a consequence to that, and it’s probably increasingly quantifiable in people’s intuition. But it would be helpful, for instance, to know, “Okay, if I subscribe to Netflix, what is the average impact on the hours of sleep I get at night, versus the people who don’t subscribe to Netflix?”

Tim Kendall (00:43:29):
There’s a transparency, just in terms of a service’s impact on health and well-being, that is not there. But I believe it is knowable, and I do think there’s an opportunity there. And then it allows us to make more informed choices. This was the issue with tobacco: you could smoke cigarettes in the 30s, 40s, and 50s and feel real good about it, because you didn’t think you were doing anything bad to your body.

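The measurement Tim is asking for is, in principle, a plain cohort comparison. Here is a minimal sketch with invented numbers, purely to show the shape of the question, not real data:

```python
# Hypothetical illustration only: the numbers below are invented, not
# measurements. Compare average nightly sleep between subscribers and
# non-subscribers of some attention-hungry service.
from statistics import mean

sleep_hours_subscribers = [6.1, 6.8, 5.9, 7.0, 6.4]
sleep_hours_non_subscribers = [7.2, 7.6, 6.9, 7.4, 7.1]

gap = mean(sleep_hours_non_subscribers) - mean(sleep_hours_subscribers)
print(f"Average sleep gap: {gap:.2f} hours/night")
```

A real study would of course need far larger samples and controls for confounders, but, as Tim says, nothing about the question is inherently unknowable.
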
Alex Moazed (00:43:56):
And all of that is before you even talk about kids, and what’s happening to these kids when they get a cell phone in their hands at age seven, and what that’s doing to the wiring and the development of the brain. So, absolutely. But let’s touch on the censorship thing a little bit. You’ve talked elsewhere about how these platforms, Facebook, don’t want to be the arbiter of truth. And for whatever reason, they have been forced to start taking a side. And once you do take a side, or once you kick off someone here or shadow-ban there, how do you draw that line in an objective way? It seems like an oxymoron. There is no way to continue to draw that line objectively; you’re always going to be put in a position where you took some kind of action and you know you have pissed off another part of your community.

Tim Kendall (00:45:11):
And I think that’s why, by the way, Mark, probably smartly as it related to Facebook, put this off for as long as possible.

Alex Moazed (00:45:18):
Philosophically, do you see a difference among these content platforms, these social networks? Are they all too closed off and need to be more open? Are some getting the balance right and some too closed? Are they all too open and need to be more closed?

Tim Kendall (00:45:44):
What do you mean by open and closed in this context?

Alex Moazed (00:45:48):
I mean the level of censorship, which is hard to measure; those stats are difficult to get. But there are multiple kinds of censorship that we’ve talked about on the show. One extreme is, I’m kicking you off, I’m banning you from the platform forever. Another is, I’m suspending your account. Another is shadow banning: I’m limiting the visibility your message gets. Another would be, I’m going to put an alert on it, I’m going to modify the content. And when we have this conversation on the show, I try to use non-political examples, because everything is so partisan these days. But when it comes to COVID, for example, there are all these stories about that Chinese virologist from Hong Kong who was saying that COVID was made in a lab. And all those videos and her account have been… She’s been banned from multiple platforms; the videos have been taken down.

Alex Moazed (00:46:50):
If you even show the video of her talking about it, then your account is going to be banned. That’s what I mean by censorship. Do you think these platforms are getting too comfortable with censoring, maybe not everyone, but certain populations, and it’s hard to draw back those lines because you’ve crossed them? Or how do you see it?

Tim Kendall (00:47:19):
Well, it’s so hard. And I know that’s not meant to be a cop-out. But look, I have no reason to believe that COVID was created in a lab; I don’t believe that. But if you look back in history and think about some of the newest ideas that then became mainstream, the new ideas are condemned and the people who talk about them are maligned. But we know historically that some small percentage of those become mainstream and adopted. And so what is the process by which craziness gets properly filtered, while the potentially interesting doesn’t get overly penalized just because it’s not proven? That’s hard. I don’t envy the social networks having that on their shoulders right now. But guess what? They do, and they’re going to have to come up with a principled way to… I don’t know what you call that problem, but if it’s solved incorrectly, and we’ll never have the counterfactual, you could imagine a bunch of really powerful ideas in the future that never see the light of day because of the way the lines are drawn.

Alex Moazed (00:49:17):
Earth is the center of the universe, and if you say otherwise, well, you’re an idiot and you’re out of here. I’m with you. I personally would say that when it comes to the censorship debate, which is obviously very tough to measure, Facebook has played that game better than others. And Zuckerberg has come under a lot of fire for it on both sides. To me, it comes back to the leader and the individual. I feel like there’s still something in Zuck which is saying, “We need to resist playing too much whack-a-mole and censoring too much.”

Tim Kendall (00:50:02):
I agree with you. I actually think that his defending, and coming under a lot of scrutiny for protecting, political speech is very important. In principle I agree with him. And I also agreed with the caveat, the qualifier to political speech that they came up with, which is: “Look, political speech we’re not going to correct, we’re not going to label. But if it’s going to incite violence, it has crossed the line.” And I think that’s the right line to draw with political speech. And I think what we’ve seen recently, if you rewind two, three, four months ago, is that there was a view that as long as hate speech doesn’t incite direct violence, it’s okay. Holocaust denial is okay, as long as in your Holocaust denial group you don’t say, “Go do the following things that are violent and hurtful to a group of people.”

Tim Kendall (00:51:16):
And I think what… I respect the principle that they used to draw that original line. However, I think it’s a really tough thing to argue that a group that is inciting hate is not, almost definitionally, inciting violence by implication. And I think that’s where they landed; I wasn’t in the room. But to me, that’s why they are now starting to say, “Well, look, if you incite hate, you are inciting violence.”

Alex Moazed (00:51:55):
Well, it’s also interesting what we saw: leading up to the election, they started to limit the sharing capabilities. You can only send this message to five people or-

Tim Kendall (00:52:08):
Yes.

Alex Moazed (00:52:10):
I don’t think they were tampering with the algo, because they kept that black box humming, but it’s more: how do I, on the fringe, clamp down on some of these features I’ve built that get this viral nature going?

Tim Kendall (00:52:27):
Yeah, yeah. And they’re having to make judgment calls and invent on the fly. Deciding to throttle the distribution of Stop the Steal on Thursday or Friday of last week, do I think that was the right thing to do given the risk of violence? Yeah, I think it’s probably defensible. But it’s a judgment call, and it’s reasonable that 70 million people who voted for Trump think that it’s censorship.

Alex Moazed (00:53:09):
The last thing: we have one or two questions from the audience, and we’re going a little bit over, Tim, thank you for your time. The funny thing is, there are those leaked audio clips. Although Zuckerberg has clearly made gobs and gobs of money off of Facebook, that guy’s day-to-day is at times probably unenviable, though at times probably very enviable. But anyway, what he was saying in these leaked audio clips is, “Hey, guys, actually, the majority of our users seem to be conservatives, and the majority of our user complaints are that we are censoring conservatives,” rather than the other way around, which was the narrative from the employees at Facebook, whose political donations are through the roof for the left. So you’ve got all these really interesting, very complicated dynamics. But the question here is, “Hey, Tim, do you allow your kids to use Facebook? At what age is it okay to start?”

Tim Kendall (00:54:10):
Well, I get to punt, because my kids are four and six. So, no, they’re not. And for what it’s worth, we’re pretty draconian about using devices at all. Mainly through trial and error, we’ve realized that when they’re on an iPad for a long time, at the end of it they’re not as pleasant to be around. Yeah, they’re occupied for that hour I gave them the iPad, but I pay for it when they have to put it down, and my wife and I have learned it’s not worth it. So their screen time is probably 10 minutes a week, 20 minutes a week at most. And look, we’re really fortunate in the pandemic in that we both have enough flexibility to do that. I don’t say that should be the standard for quarantine at all, because people’s circumstances vary and that just may not be realistic, particularly in quarantine. I’m just sharing with you.

Alex Moazed (00:55:26):
20 minutes a week, does that even count? That just seems like a tease. You’re just saying, “There’s a whole universe, but I’m not going to let you see it, child.” That’s very interesting.

Tim Kendall (00:55:45):
With kids that young, they forget. What we found is that with kids that young, when it’s daily, it’s habitual. It’s in their day-to-day consciousness: “Hey, I want to watch a show.” But if it’s every four or five or eight days, it doesn’t get habituated and they forget. And it’s not tempting them in the same way… Well, at least that’s what I’m telling myself. I might be delusional.

Alex Moazed (00:56:10):
No, that’s the whole premise here about Moment, about addiction. How do you break that cycle? And I think your point is, once you’re in the cycle, it’s near impossible to break. It’s just a matter of, can you limit it?

Tim Kendall (00:56:25):
That’s right. Yeah. And then look, I think it’s a really tricky situation as kids get older, because you can be draconian about these social media services, but if your kid goes to school with 30 other kids, it’s unlikely that all their parents are going to be aligned with you. Now, there are frameworks; probably the most well known is Wait Until 8th, which is basically a framework that you can employ with your kids and the other kids’ parents in your son or daughter’s classroom. People do it as early as kindergarten. They try to get the whole group of parents to align on a norm: we’re not going to give our kids smartphones until they’re in eighth grade, which is, whatever, 13, 14. You can give them a dumb phone, but the point is to keep them off of these social networking services. I am hopeful, and I do think there’s an interesting business to be built around this, whereby there’s almost a self-contained social network that people younger than 13 can use, but it’s literally just a messaging platform. Maybe this already exists and I just don’t know it.

Tim Kendall (00:57:41):
A messaging platform for third graders or fifth graders that they can use to communicate and be social with one another, but it’s closed, it’s private, the data is protected, and it has this natural way of deferring the point at which they get on a Snap or an Instagram or TikTok. I think that’s where we’re going to head, because I just think there are too many stories of 10-year-olds being on TikTok, and that just leads to all kinds of downstream issues. And the other thing you can do with these self-contained networks, where the school can potentially be an administrator, is address a lot of the challenges kids have at that age, 10-plus. There’s bullying online, but it’s a he-said-versus-she-said, and because there are no administrative privileges on TikTok for me as a parent, or for an administrator at the school, I don’t know who to believe. So it’s a he-said-she-said battle between kids and parents and administrators. It’s a mess.

Tim Kendall (00:59:03):
I think that has to get solved, and likely will get solved, especially on the heels of things like The Social Dilemma and this just coming into consciousness. And when it does, I think it’s going to alleviate, I hope, at least some of the challenges that parents are facing with smartphones and social media services, and when the right age is to let kids get on them.

Alex Moazed (00:59:28):
That’s a great point to leave it on. That’s a trillion-dollar idea if anyone wants to pursue it. And Tim, I just wanted to truly say thank you for putting the word out there, taking a stand, and obviously coming on the show today. If you haven’t already, go check out Moment. We definitely hope to have you back on; keep us posted on when the friend-relationship app is coming out. Thanks again, sir. Thanks for joining us.

Tim Kendall (01:00:02):
Thanks, Alex.

#SocialHealth #SocialMediaAddiction #FacebookInsider

