Episode Transcript
0:00
This is the BBC.
0:30
This is the BBC.
1:00
This is The Media Show from BBC Radio 4.
1:24
In a moment: the executive chairman of the Sky News Group, David Rhodes, on his plans for Sky News and on his years as a senior TV news exec in New York.
1:33
If you look at the front page of the Financial Times, the headline reads "OpenAI and Meta poised for artificial intelligence leap with bots that reason". "Bots that reason" is definitely a phrase that requires some explanation. We'll talk to Madhumita Murgia, who wrote the story and who's looking closely at how AI is changing how media is made.
1:54
Madhumita, it's your job to report on AI for the FT. I wonder what you see as the primary challenges of getting across the concepts within it to your readers and viewers.
2:07
I think the big challenge is the complexity of the technology, and how much people believe they already know about this. Second is making sure that we're breaking down the misconceptions: that it's going to be the next Terminator that's going to destroy us all, or that it's going to save everything and solve every problem. You know, it isn't these two very clear binaries; it's all of the nuance in between, and it's bringing that across. I think that's the hardest challenge, and you'll see that from all the comments on that story. Like, four hundred of them.
2:37
So we're going to get into how best to explain artificial intelligence with your help, and with the help of David Rhodes from Sky News. And a little later we'll also be focusing on Donald Trump's social media platform, Truth Social. Billy is the former chief product officer of Truth Social, and he is with us. Billy, welcome to The Media Show. We won't be speaking to you for a bit, but I really wanted to ask
3:00
at the top: when you met Donald Trump at Mar-a-Lago, what did he ask you to build?
3:07
He wanted to build something that continued to help his voice get distributed, and it was paramount to him that we build something that allowed everyone to have free speech. So his focus was always free speech.
3:22
And we'll be hearing a lot more about that later; we'll speak to Billy in a little while. But first of all, David Rhodes is here in The Media Show studio. Hi David, thanks for coming in.
3:29
Thanks, thank you guys for having me. I'm used to being on your side of the table, or at least working with the people on your side of the table, so it'll be new for me, actually answering questions.
3:40
And are you looking forward to it? Not as much, I suspect, as you'd hoped.
3:46
Right, your job title now is Executive Chairman of the Sky News Group, but you'd been President of CBS News in the US, the youngest person ever to take such a senior job in US TV news. Before that you'd been head of US television at Bloomberg, and before that a vice president of news for Fox News.
4:01
So, some big gigs in the US, and I wonder how you compare being an exec within the UK media with being a big news exec in New York.
4:13
It's coming up on four years living here in the UK. The journalism culture here is second to none. And when you are at an institution like Sky News, you get to do things, cover things, that in a lot of other organizations I've been involved with, you'd have to find a small place to fit that story. But here, the audience proposition, the audience expectation, is much more of that sort of coverage.
4:41
So something like the Alexei Navalny death comes along, and in a lot of places in America you'd be trying to find seventy-five seconds in a program where you can address that. Here, that's special coverage; it's an audience expectation. We're going to spend a lot on it. We had, at Sky, a lot of resource out in the field on it, and we saw real audience impact from responding to it.
5:07
And what actually is your job? "Executive chairman": does that mean you're the guy in charge of all hair and makeup?
5:14
You've caught me a bit early with that one. No, not as such.
5:23
Now, what it means is we have a group that spans about a seven-hundred-person newsroom in the English language, with bureaux around the world. We have an Italian-language service; I was with them this morning, about two hundred people based in Milan but also obviously in Rome. We have joint ventures, and we plug into a larger family of news operations at Comcast, our parent company, which includes NBC, CNBC, MSNBC, Telemundo, and local television stations.
5:55
But are you getting involved in what the lead story is, or how much time the lead story is on air for? Or are you not dealing with that kind of thing at all?
6:04
You can't do these jobs without spending some time in the editorial, and my whole career, as we've talked about, has been coming up through an editorial operation: working on assignment desks, sending crews out, editing. That being said, we've got a really capable team. We have an excellent executive editor here in the UK, Jonathan Levy. He and I were together at a Lords committee last week about some of the kinds of issues we're talking about today. So I try to let people do their jobs: produce programs on all platforms, go out every day and cover stories and call it like they see it, and then just support that offering so it actually reaches an audience and has a commercial future.
6:51
And what kind of boss are you? Are your first phone calls at 2am to ask what's on, or are you a 6 or 7am, "this is how it's distributed", "those are the great stories" type?
6:57
I think I have to be. I make the first phone call, generally. Am I one of those 3am phone call bosses? I've had dozens of bosses, and I've not gotten a call once at 3am, or even four or five, though to be fair that wasn't in the media, that was before. You know what's worse? I've worked for a lot of really impactful, incredible bosses.
7:27
You know what? If anything, and this is actually true of the whole profession I'm in, the worst thing is if they haven't seen the show. Like, what do you mean you haven't seen the show? Haven't listened? Of course, of course you want both.
7:38
So you want me to call you at 2am? I hope I do.
7:44
And before we get into the strategic challenges at Sky News, which I know you're grappling with in your current role as, I suppose, leader of a big news organization, we also want to understand your experiences in America as a leader within a number of newsrooms, because they inform what you're doing at Sky. So let's go back quite a few years, to when you were a vice president of news for Fox. How did you get that gig?
8:11
I got the gig the same way as everyone: I worked my way up from running the autocue. Honestly, I started working there in nineteen ninety-six, at the beginning of this career, and basically they hired seven hundred people, about the size of the newsroom we now have working in English at Sky, and were doing that over the space of about three months. Not to diminish my credentials for that opportunity, but I mean, basically it was easy to get in the door. They were hiring a lot of people in a really short span of time, and you sort of work your way up. That was my entry-level opportunity.
8:48
And when you were at Fox, you presumably were aware that it was becoming quite a divisive presence within the American media. Why did you leave in two thousand and eight?
8:57
I worked for News Corp a second time when I first moved to this country, at that time to spec out what later became Talk TV. And then I left the company a second time, about three years ago, when I went to work at Sky on the business side.
9:15
Did you know Rupert Murdoch back in those Fox days, when you were a vice president at Fox? At what point did you encounter him?
9:23
You know, it's interesting. We talk about somebody who actually has seen the program, engages with it, has an opinion: I mean, he is one of those people, and I think that he's interested to know what's going on. He's interested in what you've heard, what you're reporting, what's going to be in the program, what's going to be in the paper. In those early years, that first time working at the company, I really had no interaction with him, because I was, relatively speaking, at a line level. This second time around, and he just turned ninety-three last month, there was also Covid, and I was relocating from the United States, but he was still very interested in the work we were doing. I just mainly found him to be an engaged proprietor, and maybe that falls short of the remarks and all the stories people expect, but he was just interested and engaged.
10:18
At the Fox of the time you worked there: we'll often hear from people who have worked within Fox News saying there's a division between the opinion programs that are on in prime time in the evening and the broader news operation. Critics of Fox would reject that distinction. When you were there, did you feel it?
10:36
Well, you know, I left in two thousand and eight, ten days after that year's presidential election, so there's been a lot of road since then. I guess I'd say there's been an entirely different prime-time lineup since, it's a different kind of medium now, and that's part of where we are today. I mean, consider for instance: Sky at that time was a listed company in this country which was controlled by the Murdoch operation. Sky today is a wholly owned subsidiary of Comcast NBCUniversal, and so are we as the Sky News operation. So that's changed, and then the media landscape of course has changed.
11:18
Cable news in the United States was in an extraordinary growth phase from about two thousand until two thousand and seven. But peak cable, as far as household distribution in the United States, was sort of mid two thousand and seven. Now it's going to have a long tail.
11:28
But when you think about that: there haven't been net new homes into which those networks, like Fox News, can go since that time. And here we are in twenty twenty-four. It's a very different, very much more digital, very much more technologically informed media diet.
11:58
But when you look at Fox now, do you buy into the argument that there is a dividing line between news and opinion? And what was it like back then?
12:04
I think that a channel like that, and many of these channels, they're effectively programmed by the audience. I mean, you can see in real time what people respond to, and so I think it's only natural. And it's interesting, because in the rest of the conversation we'll be talking about artificial intelligence and how things are algorithmically determined, but this is one way in which things sort of haven't changed. I mean, producers, you can't help it: you look at what you did the day before. You have some data on how the audience did or didn't respond to it. While you're programming, you have a little bit of a sense of whether it's going to work or it's not. So I don't think that's changed, whether there was or wasn't a divide between news, current affairs and opinion.
12:52
I think all media products, whether it's Fox or Bloomberg or CBS or Sky, and I'm probably leaving out a couple of other places I've worked that are worthy, they all work off of one thing, which is: what's the audience expectation? If you meet that expectation or exceed it, it works. And if you fail to meet it, it doesn't work. And you can modify that, the audience's expectation of you, over time. But fundamentally, you can't change what people expect from your organization. You have to hit that every day.
13:27
It's interesting you're mentioning the ratings that come in from the night before. I was telling Katie as we were getting ready to come into the studio that I remember talking to one person who worked in US cable news in the early part of the noughties, saying that everyone knew down to the minute when the ratings were going to arrive in the afternoon, and the whole day kind of revolved around it, and they would break down each item: that worked, that didn't work, to a degree that sounded like it went beyond. Everyone cares about the ratings in the UK too, of course, but this sounded like it was on another level.
14:00
But think
14:02
about where we are now, where you
14:04
walk around any newsroom like the newsroom
14:06
here at the BBC, you know, it's
14:08
fair to say our newsroom at Sky,
14:10
you know, we have up the,
14:13
you know, Adobe screens, the real
14:15
time data, the, you know, click
14:17
rate, you know, the
14:19
performance of different items of text.
14:22
I mean, now you're, you're sort
14:24
of buried in this. So yes, there used
14:26
to be a time when you got
14:29
sort of an average minute audience against
14:31
linear television, and you kind of checked
14:33
your decisions against that in a commercial
14:35
context. Now, are you getting, you know,
14:38
54 flavors of that? And
14:41
I'm interested in, you know, your assessment
14:43
of the differences between the US experience,
14:46
the US media experience that you had in the UK media
14:48
experience when you went to CBS. I'm
14:52
always asking because I love The Morning Show. I know that
14:54
wasn't CBS. But you know, in terms of how, for
14:57
example, you know, certain
14:59
anchors, you know, American anchors have
15:01
a certain reputation. Did
15:04
you, did you observe anything there?
15:07
Well, okay, first, the
15:09
you know, what, what do you mean? On
15:11
the, in terms of, is the show right? Well,
15:13
kind of, you know,
15:15
look, and the power that anchors have, I mean,
15:18
do anchors here in
15:21
Britain have less power than American
15:23
anchors? How does it work? Look, one thing
15:25
we've tried to do now
15:29
is actually emphasize a little
15:31
bit more the talent contribution.
15:33
And we've done that not
15:35
because we're trying to in
15:37
some way Americanize the
15:39
offering at Sky News. What does it mean to
15:42
emphasize the talent contribution more? Well, what it
15:44
means is people watch people. And part
15:46
of the value proposition is:
15:48
okay, you know, something has happened,
15:50
I want to go see
15:52
what credible voices are going
15:56
to have to say when they're covering it.
15:58
So like, if I think about, you know, the political
16:01
news report here, Beth Rigby
16:03
is our political editor. She's RTS
16:06
political journalist of the year this year. We're very
16:08
proud of that. We built a
16:10
political program around Sophy Ridge,
16:13
who now is on in prime time at
16:16
7pm each evening, doing a report
16:19
in an election year. We think that's important. We've
16:22
got, Beth's got a new podcast
16:24
with Jess Phillips and Ruth Davidson. I
16:27
enjoy it very much. Electoral Dysfunction. Shameless
16:30
plug. Yeah, big talk. Try that.
16:33
Sam Coates, deputy political editor. He has a
16:35
new podcast. This is the other thing. Nobody
16:37
has just one show anymore. Everybody has a
16:40
variety of different products, but
16:42
you have to have, I guess what
16:44
I mean in terms of emphasizing that
16:46
you have to have people who
16:48
are credible to the audience who can bring
16:50
them, in this case, the political news report.
16:53
Otherwise it doesn't work. I guess
16:55
I wanted to get into what it was
16:57
like in America back then, CBS's heyday.
17:00
And also the fact that quite
17:02
rightly presenters, anchors in
17:04
the UK aren't the boss, they're never
17:06
the boss, the editor's the boss,
17:08
but I've spoken to plenty of news
17:10
anchors over the years in the US
17:12
who have various roles as executive producer
17:14
or other job titles, which gives them
17:16
a type of power that news presenters
17:18
in the UK don't have. Ros, it
17:20
sounds like you're not complaining, by
17:22
the way. Thinking
17:25
about relocating. Absolutely not. You want to
17:27
be from the embassy? You
17:30
could always make up. Right.
17:32
Well, that is the difference though, right?
17:34
In that some news anchors will have
17:37
significant built in authority
17:39
or power, if you like, in a
17:41
way that doesn't happen here. But in
17:43
a way, you should, because, you know,
17:46
maybe speaking against my own interest here,
17:48
but you're who people are listening to
17:50
on this program. People are listening to
17:52
this program because Katie, Ros,
17:54
you're here, you're presenting this and we're
17:57
going to have an interesting discussion. But
18:00
they're not listening to it because I'm
18:02
the executive in charge of this or
18:04
anything else like that. Part of our
18:06
job here is to resource things and
18:09
send you out so that we can, you
18:11
know, gain an audience, convey
18:13
news, make it interesting.
18:16
Let's talk about your shift from the US to the UK.
18:19
You got a call from either from, was it from
18:21
Rupert Murdoch or someone on behalf of Rupert Murdoch saying,
18:23
hey, how about it? This is the
18:25
one where I initially moved here. Yeah. I
18:28
went to see him. I was in Los Angeles and
18:30
I was doing some work for the LA Times, which
18:32
at the time had a relatively new owner. He's
18:36
now still the owner, but not so
18:38
new. And now
18:40
I went to the studio
18:42
a lot and he told me about
18:45
his ambitions to get back into the
18:47
TV business here after he had sold the
18:50
company I'm now working for to Comcast. He
18:53
being Rupert Murdoch, just be clear. Yeah. And
18:56
it sounded interesting to me and that was it. And
18:58
what was he saying at that point about what he wanted to create? I
19:02
think it was more, you know, look, you'd
19:04
have to ask him about what he wanted
19:06
to accomplish with the product because I left
19:08
and, you know, they went ahead and did
19:10
it. And there's been a lot
19:12
of time since then. It's been a couple of years.
19:16
But look, I mean, I think he
19:18
missed being involved in this segment of
19:20
the business. And look, I understand why.
19:22
It was very I think there's I
19:24
think there's a really robust television
19:27
news sector here. And
19:29
he found himself outside of that looking
19:31
in after the sale. And for people
19:33
who are not understanding, essentially what you're saying
19:35
is this was when Rupert Murdoch was creating
19:37
Talk TV. But I think when you
19:39
were approached, it wasn't in
19:42
that format. Or
19:44
is that right? It was to explore
19:47
what the opportunities were here. And
19:49
look, as has been, I think,
19:51
reported, we
19:53
did a bit of an exploration and we concluded
19:56
that actually conventional traditional
19:58
television was probably not commercially
20:00
viable. I think that's the language that we
20:03
use. Eventually that was
20:05
briefed out to the press around the
20:07
time that I left for Sky. And
20:10
that proved to be the case, David.
20:13
I mean, look, that was 2021 or
20:15
so that I think we said that. So look,
20:17
but there's... Actually
20:20
you said what they created was just Talk
20:22
TV. You were suggesting that wasn't going to
20:24
work and it looks like, well, it hasn't.
20:26
Well, what hasn't worked being
20:28
on television. And that's not to say
20:30
that television isn't going to have a
20:33
very long tail. Like I think when
20:35
Sky News started 35 years
20:38
ago, it was new
20:41
and novel and had never been tried
20:43
before doing a 24 hour
20:45
rolling news channel in this country. And that
20:48
was a challenger to the BBC. Now
20:51
that's completely established, but
20:53
there's various other things that we
20:55
or anybody else in the segment
20:57
do that are new and challenging.
20:59
And that's in audio,
21:01
we're putting like 40% more resource
21:04
against digital product this year.
21:06
It's SVOD, it's fast channels.
21:08
It's a variety of acronyms
21:11
that hopefully your audience isn't
21:13
intimately familiar with. They will not be.
21:15
I'm not sure I
21:18
am. It's any number of other
21:20
things. It's not just this one channel anymore.
21:24
And after your second stint with
21:26
Rupert Murdoch and as has been documented,
21:29
you left and you had suggested that
21:31
the Talk TV plan may not be
21:33
commercially viable. You land a job
21:35
as executive chairman of the Sky News Group and
21:37
you arrive. And presumably
21:39
one of the first things that you have
21:42
to do is analyze the situation that Sky
21:44
News is in within the broader news
21:47
ecosystem. I wonder what
21:49
your conclusions were about what was going for
21:51
Sky, but also equally what needed
21:54
to change. It's a great brand. And
21:57
what I think it has going for it
21:59
is a real ability to be
22:01
agile. It's always been an innovator.
22:04
And it's
22:06
a great privilege just as a manager to
22:08
be involved in something that does that kind
22:11
of quality coverage. So for instance,
22:14
Stuart Ramsay goes to Haiti in
22:16
recent weeks, interviews Barbecue, who's the
22:18
gang leader that's rolled up all
22:21
the other gangs in
22:23
the absence of any real civil authority. And we
22:26
entered into that story on
22:28
journalistic merit. And we did so
22:31
partly too, because Stuart
22:33
had relationships there
22:35
built on that he'd been to Haiti
22:37
a year before and identified this guy.
22:39
And I think it's
22:41
important to note that I thought, all
22:43
right, we're doing this just on journalistic
22:46
merit, and it's an important story, and
22:48
we're proud of it. But it actually
22:50
found an audience principally on YouTube and
22:52
among younger people and ex-UK, where
22:54
it's had two, three million hits. Our
22:57
Italian service did a similar
23:00
conversation with him. That's
23:02
had 400,000 on Instagram. So you find your
23:05
audience in places you didn't necessarily expect
23:08
it or certainly didn't expect to find
23:10
it before. I think
23:12
we're the only people in Yemen right now
23:14
with Alex Crawford there. So the
23:18
opportunity to do that kind of quality coverage
23:20
and find that there's a commercial opportunity revolving
23:22
around it, you don't get to do that
23:24
just about any place. And David Rhodes,
23:26
just a reminder for people listening, you're the executive
23:29
chairman at Sky News Group. I guess
23:32
Sky was always seen as the outsider.
23:34
But you know, 35 years on, you're
23:36
no longer the new arrival. You
23:39
are part of the mainstream media. We're
23:42
part of the mainstream media. Actually, I was at
23:44
this Lords committee that I mentioned with Jonathan, they
23:46
said I was part of the establishment. How did
23:48
that feel? Well, I told them, I
23:51
don't know, I'm an American, I'm in
23:54
the House of Lords and I'm the establishment. That
23:56
was, that was kind of great to hear.
24:00
It's called ironic in America. Yeah,
24:02
look, I think we
24:05
are established, certainly
24:07
in some aspects, the linear
24:09
channel is that aspect and will be around
24:11
for a long time, but in a lot
24:13
of the places that we're competing now, we're
24:16
a challenger brand all over again. So
24:18
in all those audio products, like there's
24:21
huge opportunity there. But
24:25
there's opportunity of distribution,
24:27
making journalism and getting it to people
24:29
in different ways. But what about the
24:33
opportunity of the bottom line here, because you, as
24:35
I understand it, have some guaranteed
24:37
income from Comcast from when
24:40
Sky was bought by Comcast, but that
24:42
doesn't go on forever. It goes to
24:44
2028. How do you think Sky News
24:47
can fund itself beyond that? How are
24:49
you going to make money from all
24:51
of these hits that you're describing on
24:53
different social platforms? Well, first of all,
24:55
think about how you just talked
24:58
about those hits. So when
25:01
we were just a linear channel,
25:03
you would approach somebody and say,
25:05
Katie, how about two o'clock? Could
25:07
you do the two o'clock? And then you would do
25:10
a television program and that's our arrangement. And
25:12
then we would look at those ratings that
25:14
Ros mentioned we used to obsess over. Now,
25:18
if we go to someone like when
25:20
we brought Yalda Hakim in,
25:22
the conversation there is do
25:24
a nightly program about world affairs
25:26
on television. Do a
25:28
podcast once a week,
25:30
which she's going to
25:32
start doing as early
25:34
as next month co-hosted
25:36
with Richard Engel from
25:38
NBC News. And a couple of specials
25:40
a year that'll be longer-form and
25:43
more documentary in style, where we do
25:45
sort of a film. This
25:48
is the way the world works now: you
25:50
don't have one job, like come here and you'll
25:52
have six jobs. And that's
25:54
how it works out commercially because that
25:56
news report is sold
25:58
in all those different aspects. But
26:00
making news is very costly. The things you
26:02
were talking about earlier with Stuart Ramsay or
26:05
indeed what Yalda will be doing. It is
26:07
expensive. We all know news costs money. And
26:09
are there conversations being had about
26:11
what happens post-2028 in terms of perhaps not
26:13
being able to provide Sky News? Well,
26:16
you've got to take into
26:18
account the scope of this company. And that's why I
26:20
think it's actually the best ownership
26:22
that we could have at this moment in
26:24
time. So we're not by
26:26
far the only news operation that
26:29
they have, among the alphabet soup
26:31
of other brands around the
26:33
world that I mentioned. And
26:35
they've got the wherewithal to support
26:37
all of that enterprise and see
26:40
a commercial opportunity in doing it
26:42
because they've certainly found that commercial
26:44
opportunity in NBC and CNBC and
26:46
Telemundo and MSNBC and Sky and
26:48
TG24 and so on. So
26:52
Sky News will make it past 40? Absolutely.
26:55
And you've got to
26:57
also think about the cost of providing
26:59
that coverage in a positive way. And
27:01
I'll explain that just as a manager.
27:03
But that's its own kind of barrier
27:06
to entry. Not many people can
27:09
do what we do. When our people go
27:11
out, whether
27:13
it's Yemen or it's
27:15
Haiti or – by the way, it could be
27:18
– we've got pretty robust
27:20
coverage up and down
27:22
the country, and the nations and
27:24
regions here in the UK. It's
27:27
costly to provide that. Not many other
27:29
people can. We find ourselves competing often
27:31
against you here at the BBC and
27:34
maybe a handful of others. And that's about it.
27:36
It's pretty lonely out there. Well,
27:39
one of your competitors, at least in the
27:41
UK on terrestrial TV, is GB News, which
27:43
of course you'll be well
27:45
aware of. How do you
27:47
assess Ofcom's reading of the
27:49
impartiality regulations with reference
27:52
to GB News? I mean, first, I
27:54
think they're doing something different than we
27:56
do. And they say that
27:58
themselves. I think they are – trying
28:00
to program for an audience and
28:03
we're programming something very different than
28:05
them. On Ofcom,
28:07
look, we're a regulation taker, not
28:09
a regulation maker, so it's really
28:12
in the government's gift to decide
28:15
and ultimately in the public's gift as far as
28:17
they select the government what kind of regulation we
28:19
want. But it's not just what
28:21
regulation you want, it's whether that regulation or
28:23
how that regulation is implemented. You mentioned Sophy
28:25
Ridge's program at 7pm, as you'll be well
28:28
aware. She's up against Nigel Farage on GB
28:30
News and he sometimes rates above
28:32
her. That's a direct competition with you
28:34
and what GB News can do not
28:37
just in that hour but more broadly
28:39
is decided by how these regulations are
28:41
implemented. Are you
28:43
satisfied with how Ofcom's going about it? Well
28:46
first, I think actually
28:48
since you mentioned Sophy and
28:50
Nigel, they're
28:53
actually probably the best example of how
28:55
these two channels are providing something very different.
28:58
So is it a binary choice
29:00
for people between one and the other? I mean, I
29:02
think if you're watching Farage, you're
29:04
watching him for him. If
29:08
you're watching Sophy, you're watching for
29:10
a really comprehensive political report. I
29:12
think the two are, to the
29:16
degree that they're talking about British politics, you'd
29:18
suppose that they're a similar product, but they really
29:20
have a very different thing that they're setting
29:22
out each day to achieve. As far as
29:24
the regulatory regime around that, you've
29:27
got to just, in this way,
29:29
the journalism business is like any other
29:31
business. What you need are clear rules
29:33
that are consistently applied and as long
29:35
as the rules are clear and they're
29:37
consistently applied, everybody's happy or if
29:40
they're not happy, they can pick a different
29:42
government, get a different regulatory regime
29:44
and off we go. But those
29:46
are the two things that we need as a business. Clearly
29:49
there are lots of questions at the moment around elections,
29:51
how we're all going to cover them, including
29:54
of course the American election. We're going to be talking
29:56
about Truth Social and Donald Trump later, but I want
29:58
to ask you about that: for you at
30:01
Sky, how will you be advising your
30:03
colleagues to cover Trump? Well
30:05
first, we see an enormous amount of
30:07
interest in the UK and in the
30:10
other markets where our content
30:12
reaches in the US election. So
30:15
there's certainly interest in the UK election and
30:17
we think we have that whole political team
30:19
to cover that here. But
30:22
equally, I think the US election is going to be a
30:24
big night for us. We have
30:26
a really robust Washington bureau and
30:29
we plug into all of the data
30:31
and analysis that comes in. But will
30:33
you be carrying his speeches for example
30:35
or do you have a different approach to covering
30:38
Donald Trump that you may do to covering other
30:40
American politicians? Well I think everyone's having conversations aren't
30:42
they about whether to, all newsrooms are, whether to
30:44
take him live and then put someone to do
30:46
the analysis afterwards or whether you know some American
30:49
networks they've already decided they're not going to ever
30:51
show Trump making speeches live. Media
30:53
have made mistakes covering
30:56
all politicians but media have made mistakes
30:58
covering Donald Trump really back to the
31:01
beginning. I mean really to before him coming down
31:03
the escalator. What kind of mistakes? You
31:08
know in
31:10
terms of just thinking about
31:12
the approach that media
31:14
were taking, I give an example from
31:18
actually the CBS experience. Like
31:20
if people remember the Charlottesville
31:22
episode in 2017. It's the
31:24
first year of Trump's presidency.
31:27
There's this fascist
31:29
rally in Charlottesville.
31:31
There's a rally
31:34
against the rally. It becomes violent.
31:36
I mean it
31:38
was a really painful episode. And
31:42
famously he
31:44
had trouble addressing it. Trump
31:47
said it was because he was
31:49
waiting to get all the facts. There was criticism
31:51
that he delayed his response. Then there was a
31:54
response to that later in which he says look
31:56
you have to look at what was happening there
31:58
and there was blame. on
32:00
both sides. Some
32:03
aides, like Gary Cohn, suggested that
32:05
they might later leave the administration over
32:07
those comments. But a
32:10
couple of other things happened after that.
32:12
The first was media did make mistakes
32:14
in terms of how those
32:16
remarks were covered. For instance,
32:19
there were accounts that said that he
32:21
said the sides were equally to blame,
32:23
and he didn't say they were equally
32:25
to blame. And if you just leave a
32:28
hint of that kind of controversy,
32:32
and that's enough for people to drive
32:34
a truck through and just say, look,
32:36
this is the bias that we've been
32:39
talking about all along. He didn't say
32:41
that. That was a mistake. But equally,
32:43
and the reason why that event sticks
32:45
with me is at CBS, we
32:47
polled it. And
32:49
it was only by about 10 points. It
32:51
was like a 55-45 that people felt that
32:54
he was in the wrong, even though the
32:56
events had been really hard to watch, with
32:58
sort of torch-wielding neo-Nazis and all this stuff.
33:01
And when you
33:04
unpack it, though, why do people feel that way?
33:06
And you saw in the surveys people saying, you
33:08
know, I disagree with him. I don't
33:11
like what he had to say on that day,
33:13
but I hate the media so much, and he's
33:15
in opposition to you guys. And
33:18
so I'm with him. And
33:20
so you've got to, I think, as a
33:23
profession, take account of how did you
33:25
get to a place where there's at
33:27
least some aspect of 45 percent
33:29
of the American people that could actually think
33:32
that the media is that dishonest that they
33:34
would not be willing to express a certain
33:37
opinion on that. That's
33:39
like something people should take stock of. Well,
33:42
listening to you talk, David, has been
33:45
Madhumita Murgia from the
33:47
Financial Times, who we heard from a little bit
33:49
earlier. And we're all here because we're going to
33:51
talk about artificial intelligence. And David, I know you're
33:54
looking at this. Every newsroom is looking at this. But
33:57
let's start off with the lead story on the front
33:59
page of the FT today, which is
34:01
all about OpenAI
34:04
and Meta. For people who
34:06
haven't seen it, tell us what
34:08
the story is. So this is based
34:10
on having spoken to two
34:12
leaders from the two companies in
34:14
the past week, both of whom
34:16
sort of tantalizingly dropped hints about
34:19
the next models, the next AI
34:21
software that's going to be rolled
34:23
out. For most of us today,
34:25
you know, our first introduction directly
34:28
to AI systems was probably ChatGPT,
34:30
maybe DALL-E if you were kind
34:32
of playing around a lot. And there
34:34
might be lots of people listening who've never tried any
34:37
of it. Yeah, but I mean, I think ChatGPT
34:39
was really the first time that people were able to
34:41
interact with an AI system. But even before that, you
34:43
know, if you have recommended ads
34:46
and posts on social media, if you've
34:48
chosen something to watch on Netflix, if
34:50
you've ordered an Uber, you are interacting
34:52
with AI systems. So ChatGPT was not
34:54
at all the first time, but it
34:56
is the first kind of interface where
34:58
we can literally communicate with
35:00
AI generated text. But
35:03
this year, we're looking at the next kind
35:05
of wave of more sophisticated
35:07
AI technologies, which
35:09
will again be able to converse
35:11
back and forth, but also kind
35:13
of summarize information, create videos, code,
35:15
you know, doing things that we
35:18
believed to be, you know, the human
35:20
domain, creativity and, you know,
35:22
those kinds of qualities, for
35:24
decades. And the
35:26
kind of really interesting thing from what
35:29
I wrote about this morning was that
35:31
these scientists and CEOs and
35:33
so on believe that the next models
35:35
will be able to reason, will be
35:37
able to plan. And this is really
35:39
what the word means in English, they'll
35:41
be able to kind of look forward
35:43
to how to perform an action and
35:45
be able to figure out what are
35:47
the steps they need to take in order to
35:49
achieve that. So it just
35:52
makes them a lot more able to
35:54
do tasks autonomously. Essentially,
35:56
is that making them more
35:58
human? I mean, I know Elon Musk recently predicted
36:00
that AI will overtake human intelligence in the next
36:02
year. He previously said that wouldn't happen until 2029.
36:05
Yeah, I'm not sure how much store I put
36:07
by his comment. Okay, great. So you're saying
36:07
he's not right, we've got a bit of leeway.
36:11
But what would that mean for the media? Your
36:13
story, if it's true, that they're
36:16
going to be able to reason... What does
36:18
it mean for newsrooms in the media? So
36:20
it doesn't mean they're going to be human
36:22
in any way. But what it does mean
36:24
is that these companies are trying to develop
36:27
AI assistants. So think of it as just
36:29
your own individual assistant or agent.
36:31
It might have a name. Today
36:33
we have things like Alexa that
36:35
we kind of recognize by name.
36:37
And that will be how you
36:39
interface with the internet. So
36:41
today you kind of go to maybe Google as
36:44
the front page of the internet for you, right?
36:46
Or Twitter or X or Meta,
36:49
whatever, TikTok. And that's where
36:51
you kind of look through where you want to
36:53
go. Or it might be the ft.com homepage or,
36:55
you know, TV channel. But what
36:57
they want to do is create... everyone
36:59
should have their own personal AI assistant,
37:02
which will decide for you where to
37:05
go, what to read, what you need to do next.
37:07
So you think journalists will have that, newsrooms will have
37:09
that. What would it mean for the media? Well, everybody
37:12
who reads us and watches what you make
37:14
and listens to us will have that
37:16
to help. It will be the interface
37:19
and the mediator that chooses what we
37:21
watch, what we read, what we consume.
37:25
But more than that too, if I want to get
37:27
from A to B, how do I get there? If
37:29
I want a recommendation for
37:31
a restaurant, can you book it? So it just
37:33
becomes kind of our way in which we will
37:35
talk to the internet more widely
37:38
because it will be able to kind of plan
37:41
what we need next. So I think it will kind of be
37:45
the thing that filters our digital diet,
37:47
which includes all of the news that we
37:50
consume. David Rhodes at Sky, you're listening intently to
37:52
all of this. How does AI fit into
37:54
your plans for how the Sky
37:56
newsroom would be operating in the coming
37:58
years? We're optimistic about it. So
38:01
much of the coverage, not yours, because the book
38:04
actually, as much as
38:06
I've become familiar with it in the last three
38:08
hours, has really got a lot to
38:10
grab on to. But so
38:12
much of the coverage has been about it's all the zombie
38:15
apocalypse, and I think that just leaves
38:17
out a lot of really extraordinary possibilities
38:19
that come from it. Such as?
38:22
Well, first of all, it should,
38:24
AI should help
38:27
solve a bit of a productivity
38:29
crisis across the economy, and the
38:31
media is not, you
38:33
know, not out of bounds for that. But when
38:35
I mean that in terms of
38:37
a newsroom, what AI
38:40
fundamentally does in the near
38:42
term is it intensely values
38:44
information that has yet to
38:46
be revealed, and it
38:49
devalues information that's known
38:51
and sort of in the public domain
38:53
and maybe needs to be organized. Well,
38:56
so what I'm saying is it
38:58
values journalism. Like if I can
39:01
report something that has yet to
39:03
be reported, that becomes very valuable
39:05
because the AI may not be able to
39:07
tell you that. And
39:10
so what we see in our
39:12
own newsroom is that things that
39:14
already have pretty much been bid
39:16
away to search, well,
39:18
all those trends are just sort of
39:20
accelerated. So that's like, you know,
39:23
just, you know, what's the temperature going to
39:25
be? What's the stock price? That sort of
39:27
thing. But as far as being able to
39:30
tell you that, you know, two people familiar
39:32
with someone's plans say that they have decided
39:34
to do something, it can't tell you that.
39:36
And that's fundamentally the service
39:39
that we provide. Madhumita from the
39:41
FT? Well, I do feel optimistic
39:43
there will be a place for
39:45
quality journalism, but I think the
39:47
big current challenge that everybody who
39:49
runs large media organizations needs to
39:51
be thinking about is, is this
39:53
the next wave of disintermediation? So we've already
39:55
had social media. I was gonna say, what
39:58
does that mean? Yeah, well it means... that
40:00
big tech companies become the platform
40:02
through which we consume the news.
40:04
Say the word again. No,
40:07
disintermediation. We've heard that phrase before.
40:10
Disintermediation. Disintermediation, okay. We
40:14
used to distribute news through newspapers
40:16
and that was the big cost.
40:19
But then you have Facebook or Meta
40:21
and all of the social media platforms
40:23
and the internet itself which made it
40:26
extremely cheap to put something online and
40:28
reach billions of people at once. But
40:31
then they also become the pipes on
40:33
which we are now reliant. Online
40:38
media organizations are dependent
40:40
on these big tech
40:42
companies for their views. You need to
40:45
go via Google to be found. And
40:48
with AI, that risk becomes even
40:50
more extreme because not only are
40:52
they responsible for showing your website
40:54
to your viewers, they
40:56
may also provide the answer. And
40:59
that's what they're all working on at the moment,
41:01
right? Generative AI is essentially a sort of
41:03
question answer summarization system. So what it's saying
41:05
is ask me a question, I'll tell you
41:08
the answer and that's kind of the point.
41:11
And one of the reasons we've asked you
41:14
all and David just alluded to it is
41:16
that you have a new book out called
41:18
Code Dependent. And it looks at how AI
41:20
risks exacerbating existing inequalities in society. And I
41:22
wonder if I could ask you that from
41:24
the point of view of news and from
41:26
journalism, because already, and I'm sure Sky is
41:28
very focused on this, there is a risk
41:30
with news that it super serves some sections
41:32
of society and other sections of society
41:34
largely don't access the news that's being
41:36
produced. Is there a risk that we
41:39
could end up with that being exacerbated
41:41
as AI becomes more instrumental in both
41:43
the creation of content and the distribution
41:45
of it? Yeah, I think that one
41:47
of the patterns I saw again and again across
41:49
my book, and I traveled to nine countries because
41:51
I wanted to go outside of the sort of
41:54
bubble of Silicon Valley to look at really how
41:56
AI is impacting people and industries
41:59
in Argentina and Kenya and India,
42:01
and I saw it with healthcare, with public
42:03
services, with work, you're seeing a
42:05
concentration of power more than we've seen
42:07
before. And it's a very small handful
42:09
of tech companies that are amassing
42:12
that data, that knowledge, and now
42:15
these kind of algorithms that they're
42:17
running. And with the news business,
42:20
you've already seen local
42:22
news media kind of dwindling
42:25
when we had social media. And
42:27
even recently with the LA Times
42:29
and Washington Post, we've seen major dwindling
42:32
of journalists there. And the concern is
42:34
that that kind of inequality
42:37
is going to grow. David, do you think AI
42:39
is going to put journalists out of jobs? No.
42:41
I mean, some of what happened
42:44
in news organizations was there
42:46
are some which navigated the technological moment that
42:48
came just before this. And there's others that
42:50
just didn't do a particularly good job of
42:53
that. And so I think if
42:55
you look at, and you mentioned a couple of US
42:57
newspapers, some had a value proposition
42:59
where they could charge for it, and they have
43:01
a more robust news report than
43:04
they ever have before. And
43:06
others just weren't offering anything particularly unique or
43:08
different or keeping up with the pace of
43:10
change in their own
43:12
community to where people were
43:14
willing to pay for it. Look,
43:17
there are a lot
43:19
of built-in biases, algorithmic
43:21
biases, to be aware of. And then
43:24
you have to consider, I think, that, and
43:26
this would be maybe controversial, but some of
43:29
those aren't necessarily a negative. I mean, we've
43:31
been here sort of speaking up for British
43:33
journalism, for its value system
43:35
as far as going out, doing
43:37
eyewitness reporting like we do. I
43:40
mean, the fact is that the
43:43
majority of the internet's written in English. Many
43:45
of the sort of first principles of what
43:48
these language models are learning off of is
43:50
English language
43:52
and, in some cases, journalistic
43:55
content. So there's kind
43:57
of a built-in advantage here for people who are
43:59
interested in the world, involved in the kind of
44:01
activity we do in terms of the rule writing
44:03
of this whole system that we're in. David
44:05
Rhodes, Executive Chairman of Sky News Group, thank
44:07
you very much for being here. All very
44:09
illuminating. I know you've got to rush. Madhumita,
44:12
you're going to stick with us, please,
44:14
because I do want to
44:16
talk about something else, which is on Monday,
44:18
Donald Trump posted a video about abortion laws
44:20
in the US, and he did so on
44:22
his social media platform, Truth Social. He posts
44:24
there regularly. We wanted to have a look
44:26
at Truth Social in detail, and we'll do
44:28
so with the help of one of the
44:31
people who set it up. But first, let's
44:33
look at what it is. We've got Joshua
44:36
Tucker, who's a professor of politics at
44:38
New York University and co-director of the
44:40
university's Center for Social Media and Politics.
44:43
Joshua, thank you so much for being,
44:45
well, almost with us, with us down
44:47
the line. And for someone who hasn't
44:49
heard of it before, just explain what
44:52
is Truth Social. Thanks, Katie. Thanks
44:55
for having me here. Truth Social is
44:57
one of a number of new platforms that
45:00
have emerged in the US that have tried
45:02
to cater to different audiences. Truth
45:04
Social falls into the category of what we would
45:07
call Twitter clones. It kind of looks like Twitter.
45:09
It has the same basic affordances as Twitter. But
45:12
as Billy explained in the beginning, it was
45:15
primarily directed at an audience of
45:17
people who were supporters of President
45:20
Trump and who were interested in
45:22
continuing discussions that the
45:24
organizers of Truth Social thought were not being
45:26
able to have
45:28
on these mainstream platforms. So there are
45:30
a bunch of these platforms. There's a
45:32
lot of them that have sprung up
45:34
that are sort of smaller kind of
45:36
niche platforms that go after particular target
45:39
audiences, but use a similar setup to
45:41
one of the big mainstream platforms. And
45:43
hi, Joshua. It's Ross here. People
45:45
may be aware that Trump's Truth Social has
45:47
been in the news an awful lot because
45:50
it's recently gone public in the last couple
45:52
of weeks. And for a while, at least
45:54
it had an incredibly high valuation in
45:56
the region of $11 billion. Just explain
45:58
what's happened there. Right.
46:01
So, Truth Social, like a lot of
46:03
these new platforms, struggled to attract
46:05
many users. And there are reasons
46:07
why, there are sort of structural reasons why it's very
46:10
hard to break into these kind
46:12
of, to break into these information ecosystems.
46:15
All these social media platforms follow something called
46:17
network effects. They get their value from the
46:19
number of people who use them. And when
46:22
you're not introducing a kind of new feature,
46:24
and especially if you're targeting a particular niche
46:26
audience, it becomes hard to break
46:28
in and build up a large user base.
46:30
So like a lot of these other platforms,
46:33
it attracted some users, but it was quite
46:35
a small portion of the information ecosystem. But
46:37
what makes it different though, is
46:39
that it has attracted, it completed
46:42
a merger with a
46:45
company that was set up only for the
46:47
purpose of completing a merger with a company
46:49
like Truth Social and to bring an injection of
46:51
capital into it. And following
46:53
this merger, suddenly the valuation of Truth Social
46:55
skyrocketed. And the question that everyone's
46:57
been asking is, why did this happen?
47:00
And so on the one hand, you
47:02
might expect the basic financials of the
47:04
market are doing it because people see
47:06
promise in it as developing future economic
47:08
value. But the other explanation for it
47:10
is that it's become a meme stock.
47:12
It's become a way essentially for people
47:14
to bet on former President Trump.
47:16
The ticker symbol appropriately enough is now
47:18
called DJT. That's the ticker symbol that
47:21
pulls up for this new company. And
47:23
as you pointed out, the price of it skyrocketed, but I've been
47:26
pulling it up over the
47:33
course of, as we're talking here today,
47:35
it's down another 4% today. And
47:37
whereas it had peaked around $80 a
47:40
share, it's now down to about $36 a
47:42
share. So there's a huge question about whether
47:44
there's a financially viable model here for this
47:46
to go forward other than this process of
47:48
being a meme stock. Okay, well, thank you, Joshua. I'd
47:50
like to bring in Billy Boozer, who
47:52
was chief product officer
47:55
at Truth Social almost from the
47:57
very beginning. Billy, welcome to The
47:59
Media Show. I just want
48:01
to know, when did you first hear
48:04
about Truth Social, and
48:06
how did you end up getting involved?
48:08
I actually have a good friend who
48:10
was being brought into the project
48:13
because of his expertise in a specific technology.
48:16
They had an expectation
48:18
that this application would scale
48:21
extremely quickly, and
48:23
because of that, the technology
48:25
that we based Truth
48:28
Social on was called Mastodon, which is
48:30
an open-source social network. That
48:32
Mastodon instance was also based on another
48:34
technology called Ruby on Rails, which has
48:36
the ability to scale but is more
48:38
difficult to scale than other technologies. And
48:40
so they brought in an expert, who
48:42
was a good friend of mine. He
48:44
called me and said, hey, I have an
48:46
opportunity, would you be interested in coming in?
48:48
I showed up and there were four or five people
48:50
there, thinking about
48:53
and hacking on what it would look like
48:55
to create a social network for, you
48:57
know, a US president who is probably
48:59
the most controversial US president in US history.
49:02
And so I
49:04
was like, this isn't the craziest thing that
49:06
you could ever be a part of, and
49:08
there's no way to not say yes to
49:11
it. As well, I had a lot
49:13
of inclinations towards free speech and had
49:15
a lot of friends that were being
49:17
deplatformed or fired from their jobs
49:19
in the US because of their
49:21
Christian values, and so I just decided that
49:24
this was the right opportunity to address
49:26
a problem that we were seeing in
49:28
our country, where free speech was being
49:30
limited and also the media was not
49:32
necessarily distributing information that seemed accurate to
49:34
the real-world picture that we were
49:36
seeing every day. And how much interaction
49:38
did you have with DJT,
49:41
Donald Trump?
49:43
Quite a good amount. I would go
49:45
and present the application to him, for the
49:47
first time ever. We had
49:49
special devices that we would give him
49:52
so that he could have the application
49:54
and engage with it. He actually gave
49:56
me the key to the White House;
49:58
retrospectively, I'm fairly certain that it
50:00
does not unlock a door there. You haven't tried
50:02
yet. Yeah, I have not had
50:04
the opportunity to try yet. You might get shot.
50:06
I think it's not a good idea. No, no,
50:08
no, I do not think so either. But I
50:10
got the opportunity to meet with him. And one
50:12
of the things that was really interesting about President
50:15
Trump is that he's, this is I think what
50:17
engenders him with the conservative right, which is he
50:19
seems like an everyday person when you're sitting
50:21
down in front of him having conversations with
50:23
them. We've all met,
50:25
you know, highly influential people throughout
50:27
our lives. But at times, those
50:30
people seem to talk through you, not to
50:32
you. And President Trump was not like that.
50:34
He's one of those people that actually will
50:36
sit down and look you in the eye
50:38
and have a conversation with you. Now, a
50:40
lot of those conversations devolve into having conversations
50:42
about President Trump. But that's primarily because there's
50:45
a lot of gravity there. So he'll talk
50:47
to you, but primarily about himself. But
50:49
I wonder when you were talking to him, could
50:51
he use what you were building? Because
50:54
I thought, famously, he's not that
50:56
keen on using computers, is he? That's
50:59
correct. I mean, he's not necessarily a technologist.
51:01
And I mean, honestly, that was one of
51:04
the bigger outcomes that came out of building
51:07
Truth Social was we ran into a lot
51:09
of political people and people that were
51:11
very interested in that political spectrum. And
51:13
so few of them understood technology in
51:15
itself. I mean, like
51:18
we were talking earlier about artificial
51:20
intelligence and the breakdown between what
51:22
society realizes or expects out of
51:24
these technologies and how they can
51:26
communicate them. In the
51:28
AI world, they talk about tokens a lot and things
51:31
like that. And those
51:33
aren't terms that the average person
51:35
understands. And so the same goes
51:37
for politicians. The vast majority of
51:39
them have no understanding about technology
51:41
whatsoever. And because of that, it's
51:44
actually one of the things that concerns me
51:46
the most because that technology is actually what
51:48
is disintermediating society in
51:50
itself. If you look at it not from
51:53
a one-to-five-year perspective
51:55
but a 20-year perspective, you'll see
51:57
that technology is shaping the entire
52:00
order of society. So if our
52:02
politicians don't understand those technologies, they won't
52:04
understand the direction society is moving in.
52:07
And Madhumita from the Financial Times, listening
52:09
to this. I guess that very complicated word you
52:11
used is a bit like London buses: you wait
52:13
ages for them to come
52:15
along and then they come along twice in one
52:17
show. Even Billy in the meantime
52:19
has gone and said it, and everybody knows
52:21
what it means now, essentially tech getting
52:24
in the lane, taking the place of something
52:26
else. But I think it's interesting that
52:28
you say, you know, he doesn't like technology,
52:31
doesn't like using it, yet wanted to build
52:33
a social media platform that's kind of powered
52:35
by algorithms. And I'm just wondering, you
52:37
know, how does somebody who doesn't like technology
52:40
appeal to users who don't like it?
52:42
You know, why build a tech platform at all?
52:44
Billy, do stay with us while we try and
52:46
understand the impact this platform has
52:48
had; we'd love your help in
52:50
understanding its impact as well. Yini
52:52
Zhang is an assistant professor at the Department
52:55
of Communication at the University at Buffalo
52:57
and has done some of the first
52:59
academic research on Truth Social. Tell us
53:01
what you've been researching, Yini.
53:04
Thank you for the question. I
53:06
think you're right. Our research
53:08
really looks at Truth Social as a
53:10
political platform, and we
53:12
got started
53:14
by comparing Trump's ability
53:16
to drive attention using
53:19
Truth Social during the
53:21
2022
53:23
US midterm elections versus his
53:25
ability to do so with Twitter
53:27
during the 2016 primary elections
53:29
in the US. So as we
53:31
all know, during that 2016
53:33
election cycle, Trump was
53:36
very good at using
53:38
Twitter to promote his preferred messages
53:40
and to attack his opponents, and
53:43
he emerged triumphantly out
53:45
of that cycle. And I think
53:48
that cycle also cemented his
53:50
status as a Twitter celebrity
53:52
in that sense. After
53:55
Trump was deplatformed on
53:57
Twitter and after he created
54:00
the new platform, we were curious
54:02
to see if he was able
54:04
to do the same thing with Truth
54:06
Social as he was able to do
54:08
so with Twitter back in 2016.
54:10
And did he? Does he get
54:12
the same impact with the news media and
54:14
with everything through Truth Social as he did
54:16
with Twitter? When you try to
54:18
measure that, what did you find?
54:21
Well
54:23
the answer is, as I
54:25
say, more complex than
54:27
we expected it to
54:29
be. I would say there
54:31
are two sides to it.
54:34
The first, the successful side,
54:36
is that Truth Social
54:38
was almost as effective
54:40
as Twitter in terms of driving
54:42
user attention to Trump and
54:44
his social media activities
54:46
during the
54:48
2022
54:50
election cycle. But Truth
54:53
Social was not that effective compared
54:55
to Twitter in the sense that,
54:57
during the 2022
54:59
midterm election
55:02
cycle, journalists stopped directly embedding his
55:04
Truth Social posts, or his "truths",
55:06
in the news stories
55:08
on their websites, and
55:11
the number of such stories was
55:13
actually orders of magnitude smaller
55:15
than the number of stories
55:17
embedding his tweets back in
55:20
2016. Well, Yini,
55:22
we appreciate your help in
55:24
helping us understand the impact of
55:27
these posts on Truth Social. Billy
55:29
Boozer, you helped create this product but
55:31
you don't work for Truth
55:33
Social any more. How come, and
55:35
how was the process of leaving
55:37
Trump world? You
55:40
know, I believe that a
55:42
lot of the social networks have
55:44
gone through an iteration where they realize, and I
55:47
think Elon was the one that
55:49
really pointed this out really well with
55:51
X, that they've
55:53
gone through this era where
55:55
advertising has been the primary mechanism
55:58
of monetization, and
56:00
that there are other opportunities and other
56:02
options to create monetization and create
56:05
value for your underlying users. And
56:07
so I have,
56:09
for my entire career,
56:12
been an anti-advertising
56:14
person. I believe that it is
56:17
a mechanism of control for a
56:19
platform and puts the platform and
56:21
its creators in an adversarial relationship
56:24
based on algorithms and based on
56:26
their need to create more
56:28
impressions for their underlying advertisers.
56:31
And so we bifurcated in
56:33
leadership based on that idea of
56:35
just not having advertising as the predicate
56:37
for how we were going to make
56:40
money with that service. And so I
56:42
just decided it was not the right
56:44
place to be, because I didn't
56:47
feel like that in itself would allow
56:49
for a true free speech network. But
56:52
just then, very quickly Billy, I'm
56:54
wondering, Trump doesn't have a reputation of
56:57
going separate ways with people always amicably.
56:59
Did you manage to exit without a
57:01
falling out? I did. You know,
57:03
and, like, the day-to-day operations
57:06
had nothing to do with President Trump.
57:08
The day-to-day operations of that
57:10
business had to do with technologists building
57:12
technology for the purpose of free speech,
57:14
while he was co-signing and, you
57:17
know, putting his likeness on the line
57:19
for that service. It wasn't like, you
57:21
know, day-to-day operations were going
57:23
through him at all.
57:26
It really wasn't a split
57:28
on that because, I mean, even
57:30
now I would support President Trump
57:32
going into the next election, and merely
57:34
because of my values, even though
57:36
I didn't really make a ton of money
57:38
off of it. Well, we appreciate you
57:41
coming on to talk to us about your
57:43
experience of working with Donald Trump and helping
57:45
set up Truth Social. That was Billy Boozer, a
57:47
former chief product officer at Truth Social, Joshua
57:49
Tucker from New York
57:51
University, and Yini Zhang from the University
57:53
at Buffalo. Our thanks as well to
57:56
Madhumita Murgia from the
57:58
FT, and David Rhodes, the executive chairman at
58:00
Sky News Group. Now I'm off next week, because you're going
58:02
to be holding the fort. Thanks for doing that. I
58:04
will be here. But for now, neither of
58:07
us will be here, because that
58:09
is it. Goodbye. Bye-bye. I'm
58:12
Helena Bonham Carter, and for BBC Radio
58:14
4, this is History's Secret
58:16
Heroes, a new series of
58:18
rarely heard tales from World War II.
58:21
None of them knew that she'd lived
58:23
this stubborn life. They had no idea
58:25
that she was Britain's top female code
58:27
breaker. We'll hear of
58:29
daring risk takers. What she was offering to
58:32
do was to ski in over the High
58:35
Carpathian Mountains in minus 40 degrees.
58:38
Of course it was dangerous, but danger
58:41
was his friend. Helping people
58:43
was his blood. Subscribe to
58:46
History's Secret Heroes on BBC Sounds.
58:56
Hey, it's Paige DeSorbo from Giggly
58:58
Squad. High-quality fashion without the price
59:01
tag, say hello to Quince. I'm
59:04
snagging high-end essentials like cozy cashmere
59:06
sweaters, sleek leather jackets, fine jewelry,
59:08
and so much more, with Quince
59:10
being 50 to 80% less than
59:12
similar brands.
59:15
And they partner with factories
59:17
that prioritize safe, ethical, and
59:19
responsible manufacturing. I love that.
59:21
Luxury quality within reach. Go to quince.com/style
59:23
to get free shipping and 365-day
59:25
returns on your next order.
59:28
That's
59:31
quince.com/style. Do
59:33
you ever feel like your brain is on overdrive
59:35
and your mind is constantly racing? The plans, worries,
59:37
and to-do lists are never ending. Calm
59:39
can help your mind take a break from
59:42
the noise by softening anxiety symptoms in the
59:44
moment and helping you cope with day-to-day stressors.
59:46
Calm is the number one app for sleep
59:48
and meditation, giving you the power to calm
59:50
your mind and change your life. Their meditations
59:52
range to fit your needs each day from
59:54
anxiety and stress, relaxation and focus, to building
59:56
habits and taking care of your physical well-being.
59:58
They even have expert-led talks on topics
1:00:01
such as tips for overcoming stress
1:00:03
and anxiety, handling grief, improving
1:00:05
self-esteem, caring for relationships, and
1:00:07
more. For listeners of
1:00:10
the show, Calm is offering an exclusive
1:00:12
offer of 40% off a Calm premium
1:00:14
subscription at calm.com/stressless.
1:00:18
Go to
1:00:20
calm.com/stressless for
1:00:23
40% off unlimited access to
1:00:25
Calm's entire library. That's
1:00:28
calm.com/stressless.