S06E03 – The Book About Careless People That Thoughtful People Should Read

Talking About Marketing Podcast by Steve Davis and David Olney

From a Kiwi diplomat’s disillusionment at Facebook to a poignant look at our digital social contracts, we explore how the platforms that promised connection became architects of division—and what thoughtful users can do in response.

Willie Nelson once said you should “get to the heart of feelings and keep it to a minimum” for maximum effect. We wish Facebook had taken that advice before building an empire on manipulating our emotions. Sarah Wynn-Williams lifts the veil on tech’s “move fast and break things” mantra in her revealing memoir of life inside Meta’s walls.

David shares his belated Facebook awakening and the initial joy of reconnecting with students and overseas friends—before the platform’s heavy-handed manipulation became impossible to ignore.

Steve conducts a post-mortem on our collective social media naivety, tracing the path from wide-eyed optimism to the sobering reality of platforms that profit most when humanity is divided, angry, and clicking.

Get ready to take notes.

Talking About Marketing podcast episode notes with timecodes

01:15  Person  This segment focusses on you, the person, because we believe business is personal.

Sarah Wynn-Williams’ Cautionary Tale of Idealism in Silicon Valley

Sarah Wynn-Williams’ journey from diplomatic service to Facebook’s corridors of power offers a fascinating window into tech’s hollow promises. Her book “Careless People” details how her desire to make a positive difference in the world led her to Facebook—where she discovered idealism is no match for growth at all costs.

As David notes, it’s remarkable that someone so committed to values could survive within the company’s ecosystem for as long as she did. Her tenacious belief that Facebook could become a force for good provides a poignant contrast to the “move fast and break things” mindset embedded in the company’s DNA.

The hosts reflect on how many of us “drank the Kool-Aid” during social media’s early days, creating genuine connections before algorithmic manipulation became the norm. While David found accessibility benefits in Facebook’s ability to reconnect him with students and overseas friends, even these positive experiences came with hidden costs that Wynn-Williams’ book painfully exposes.

13:00  Principles  This segment focusses on principles you can apply in your business today.

Free Speech Champions Until The Speech Isn’t Free (of Criticism)

In a masterclass of hypocrisy, the tech industry’s self-proclaimed defenders of free expression reveal their true colors when the spotlight turns on them. Steve highlights the book’s uncertain future as Meta attempts to silence Wynn-Williams through legal manoeuvres—ironic for a company whose leadership constantly wraps itself in free speech rhetoric.

The discussion explores Facebook’s calculated approach to political influence, including the shocking revelation of how they embedded staff within Trump’s 2016 campaign while employing sophisticated processes for micro-targeting voters. As Wynn-Williams recounts, Zuckerberg’s reaction to learning of his platform’s role in the election outcome wasn’t moral reflection but rather fascination with his own potential political aspirations.

Most disturbing is what the hosts describe as the “absent moral dimension” throughout the company’s decision-making. From offering surveillance capabilities to authoritarian governments to designing systems that profit from societal division, the book exposes how ethical considerations consistently take a backseat to user acquisition and engagement metrics.

23:00  Problems  This segment answers questions we've received from clients or listeners.

When “Connecting People” Becomes a Weapon

The most harrowing segment delves into Facebook’s role in the Myanmar genocide, where military operatives weaponised the platform to spread misinformation and incite violence against the Muslim population.

Steve and David confront the ethical dilemma this presents to marketers and users alike. While acknowledging the platform’s continuing utility as a communication tool, they announce their decision to adopt an “organic social media only” policy, refusing to funnel client advertising dollars into Meta’s coffers.

The hosts grapple with the uncomfortable reality that no social media platform is entirely “clean,” leaving businesses and individuals to make difficult ethical calculations. As David notes, “We can’t have a pure version here, but we can certainly not contribute to it being worse.”

30:00  Perspicacity  This segment is designed to sharpen our thinking by reflecting on a case study from the past.

When Social Connection Returns to Human Scale

From the chaos of the Christchurch earthquake emerges a surprising insight about technology’s proper place in our lives. Sarah Wynn-Williams’ personal story of receiving news about her sister’s safety through Facebook demonstrates how these platforms can serve genuine human needs during crises.

Yet as Steve observes, the trustworthiness of crisis information has dramatically declined with the proliferation of fake content. The hosts suggest that social media works best when confined to Dunbar’s number—approximately 150 people we can meaningfully know and trust.

The episode closes with a call to redirect our attention from the “fake promises of connection” toward the “hard, slow, sweaty work” of maintaining relationships with people physically close to us. As David summarises, “Look after your circle of trust. If people are outside your circle of trust, they’re outside of it. And that tells you something really important.”

Transcript  This transcript was generated using Descript.

A Machine-Generated Transcript – Beware Errors

TAMP S06E03

[00:00:00] Caitlin Davis: Talking About Marketing is a podcast for business owners and leaders, produced by my dad, Steve Davis, and his colleague at Talked About Marketing, David Olney, in which they explore marketing through the lens of their own four Ps: person, principles, problems, and perspicacity. Yes, you heard that correctly. Apart from their love of words, they really love helping people.

[00:00:31] So they hope this podcast will become a trusted companion on your journey in business.

[00:00:39] Steve Davis: David, it’s 2010. In two or three words, describe Facebook to me. Uh, a mystery, because at that point it wasn’t accessible. Okay, not quite two to three words, but, uh, it’s now 2020. How would you describe it? I think I’ll pick another option and tell you about my first experience. Would you like that? And that’s what we’re on about in this episode: the book about careless people that thoughtful people should read.

[00:01:10] Caitlin Davis: Our four Ps. Number one, person. The aim of life is self-development, to realize one’s nature perfectly. That is what each of us is here for. Oscar Wilde

[00:01:27] Steve Davis: In the person segment, in this episode we are diving into, I think, one of the most important books written this century. It’s a critical book for people to read, because all of us, all of us, are connected to social media in some way, shape, or form. It’s by Sarah Wynn-Williams, a Kiwi who is an interesting human, who, uh, moved through diplomatic circles and law, uh, and had her heart set on doing something noble with her life.

[00:02:04] And after a number of different, uh, roles, including the United Nations, she made it to the Holy Grail, she thought at the time, of what was going to be so important for humanity. I’m talking about Facebook, and her warts-and-all recounting of her time at the company is riveting reading. It’s very well told. But by golly, it leaves you feeling, well…

[00:02:36] It left me feeling quite winded on behalf of humanity afterwards. David, what was your journey like in reading this?

[00:02:46] David Olney: I was amazed that someone so idealistic could survive the environment so long, because she kept thinking, but eventually they’ll get the point of how amazing Facebook could be, and how much it could matter, and how positive an impact it will have.

[00:03:03] And I’ll just keep showing them what the good version of this looks like. So, you know, 11 out of 10 points for tenacity and idealism for surviving as long as she did.

[00:03:14] Steve Davis: Well, that’s true. Now, if I go back, ’cause I ran the first ever social media marketing workshops in South Australia, and they were held towards the end of 2005.

[00:03:24] And I must say, over the next few heady years, we were all agog in the same way over these new tools emerging. We had MySpace at that point, um, YouTube, uh, we had Facebook, et cetera. They gradually came online. There were some that came and went. I remember twerking at one point. There’s all sorts that came and went: Foursquare.

[00:03:48] And we were putting ourselves out there and connecting and making amazing friendships. And I, I can’t throw the first stone at Sarah Wynn-Williams for drinking the Kool-Aid, because I think we all were. Were we just solidly naive, David, or do you think there was reason to share that naivety and optimism?

[00:04:13] David Olney: I think this is a really good opportunity for me to talk about my exposure to Facebook, which was later than a lot of people, because, you know, mobile phones weren’t accessible and apps weren’t accessible. So it was only in 2011 that I was able to get an iPhone with enough accessibility baked in to make it useful.

[00:04:31] And of course, because of that, Facebook was then available and I’d had so many students at university on Facebook for years before then using it to stay in touch with people, to be connected. I had friends all over the world using it who would email me and go, oh, is this accessible for you yet? I’m like, not yet.

[00:04:50] So I remember. You know, literally getting my iPhone, making friends with it, and then loading Facebook and setting up an account. And suddenly from my contacts, it started populating with my former students and my friends overseas. And that initial buzz of being able to stay in touch with people who, it was so nice to know they were doing well or so nice to know, I could easily send them a birthday greeting.

[00:05:16] Yeah, those first, say, two years of it were lovely, when it was about social connection and interacting in a way that seemed to be more self-guided than company manipulated,

[00:05:31] Steve Davis: even though that mechanism was company manipulated, as we discover in the book. Yeah, it was. It was like Leonard Cohen said, there’s a crack in everything.

[00:05:42] That’s how the light gets in. It’s like they had this gung-ho, growth-at-all-costs mentality, but they were still, um, bringing some light to people’s lives along the way. Even though you were looking at it as opening up your social connections, they were looking at it as more minions that we can expose advertising to.

[00:06:04] David Olney: Mm-hmm. And I guess that’s the thing, because I wasn’t gonna be affected by all the cute photos and the bright colors, I was really able to stay as someone who got useful data about people I cared about, and everything beyond that was of no interest.

[00:06:18] Steve Davis: From a person perspective, then, uh, I think many of us who’ve been around through that period will have similar emotional memories of it being net good and an opening up of doors.

[00:06:33] But it’s interesting that you mention that findability, because there’s a lovely track from the book where Sarah actually reflects on this, because, uh, Harvey, the man who was in charge of that, um, she had to look after at some point, and here’s what she had to say.

[00:06:52] Sarah Wynn-Williams: Facebook’s business model depends on its conquering new territories, expanding exponentially. The growth team is in charge of forging those new frontiers. And like most frontiersmen, Harvey and his team play fast and loose. They’re aggressive, quick to stake their claim, always looking for opportunities in the gray area created by the lack of regulation.

[00:07:18] Harvey’s team is the group that came up with the idea of importing your contacts into Facebook, so Facebook could press non-users to join the service. In the beginning, they didn’t ask permission to do this. You could tell Facebook, don’t take my contacts, but then when you open Messenger, they’d take them by default.

[00:07:40] His team is instrumental in the development of the People You May Know tool, which is described by Mashable as Facebook’s “creepy as hell” tool for its ability to make uncomfortable friend recommendations, such as when a sperm donor was recommended a biological child he had never met. It’s a growth-at-all-costs approach.

[00:08:02] To me, it seems like a very American thing. When Alexis de Tocqueville visited the US in the 19th century, he was on a rickety steamboat that hit a sandbar and capsized, and he nearly drowned. Afterwards, he found the manufacturers and asked them why they didn’t make the vessel safer. They explained that technological innovation in America happened so quickly,

[00:08:27] there was no point: by the time they made the necessary changes, the boats would be obsolete anyway. Better just to take a chance on what you have. If some drown, no need to dwell, safe in the knowledge that something better is just around the corner. That cheerful recklessness combined with passivity, that forward motion without introspection, that’s what Harvey’s team has.

[00:08:56] Steve Davis: So the growth at all costs, it’s there. But of course, that lovely sting in the tail, or pernicious sting in the tail, is move fast and break things. And I suppose that’s, I mean, it’s fun being on that rollercoaster ride. It’s when the track starts deviating and the bolts are loose, that’s when we should get nervous.

[00:09:20] And that’s what some of us didn’t see coming.

[00:09:23] David Olney: No. I think how could we see how much it was going to turn into a manipulative product? At the beginning, it was just this thing we could use in the way we wanted. Yeah. It was a bit pushy about wanting to get all your contacts, but also how else were you gonna see who was on it?

[00:09:40] We accepted some of the pushiness. But as time went on, it just became pushy about everything it wanted you to do, rather than you using Facebook to achieve the things you wanted to do. So it stopped being an empowerment or engagement tool, and instead just became a task master designed to waste your time and destroy your mental health.

[00:10:01] Steve Davis: We probably should, to round off this person segment, reflect on this from the fact of how we held Mark Zuckerberg in high esteem, uh, Elon Musk in the early days, um, others of that ilk, and now most of us look down at them as despicable, greedy, untrustworthy people. And there’s something I think we all take away from this,

[00:10:32] ’cause they, they were on a good thing. They were surfing a good wave, which was impacting people’s lives positively. And there’s a point where they cross the line. How do we maintain awareness of this point in our own worlds? We may not have a world dominating enterprise, but we would have our own circles.

[00:10:54] Um, is there a way that most of us have that innately, a sense of when we’re crossing the line, versus narcissistic humans, the tech bros, who don’t give a damn and are just focused on the numbers that matter to them? I don’t know, how would you wrap that in a bow?

[00:11:11] David Olney: As you’ve been talking about that, I’ve remembered a strange story from 2011 that I don’t think I’ve remembered for a decade. A professor from Harvard

[00:11:23] gave a guest lecture in Adelaide, a famous professor called Robert Putnam, who works on the concept of social capital. And you know, he gave his lecture on social capital, and that was nice. But he also gave a masterclass for people within the university who were interested in what he was working on.

[00:11:41] And I was one of the people that got invited to go. And it turned out that Robert was user number 25 on Facebook, when it was an experimental product run by a Harvard student. And Robert recounted the story of turning to Mark Zuckerberg, you know, the very young Mark Zuckerberg, and going, Mark, so what you’re telling me is I’m gonna be friends with someone called Hans in Germany who I’ve met once.

[00:12:10] If I’m sick, is Hans gonna make me soup and bring it to my door? And Robert described how Mark just looked at him with no sense of comprehension at what Robert was trying to get across. Human connection is a deeper thing, and this tool is good for reconnection. But don’t pretend these shallow connections have a big impact.

[00:12:31] And it doesn’t really answer your question, but the fact that Zuckerberg missed these early cues about what he could have done with Facebook, you know, kind of indicates he was just having fun coding.

[00:12:45] Steve Davis: And it tends to marry with the, um, the image of him that’s crafted through this book called Careless People.

[00:12:56] Caitlin Davis: Our four Ps. Number two, principles. You can never be overdressed or overeducated. Oscar Wilde

[00:13:10] Steve Davis: In the principles segment, we’re going to continue, um, reflecting on aspects of this book, Careless People, uh, because one way or another, David and I would love you to read this book. Um, we’ve actually also bought some, uh, paperback copies, just in case. ’Cause at the moment, here’s the interesting thing, and this gets to the heart of some of the principles we’re about to be discussing.

[00:13:34] All the tech bros out there, including Mark Zuckerberg, um, wear the big badge on their chest that says free speech. They love free speech. They’re for free speech. They’re bastions of free speech. So when Sarah Wynn-Williams writes a book that’s maybe not as flattering as possible about you, you would anticipate a free speech advocate

[00:13:56] would say, well, all’s fair in love and war, let’s let the market decide. Isn’t that what you’d expect a free speech advocate to say? It would be lovely if that was the case, but the bros are all hypocrites,

[00:14:07] David Olney: so sadly, no.

[00:14:08] Steve Davis: No. So they’ve gagged her from being able to promote her book. So this is why you’re not hearing many interviews with Sarah Wynn-Williams at the moment, and it’s left up to the rest of us to carry the flame on her behalf.

[00:14:21] Carry the torch. And that’s why we’ve got some paper copies, because if they go the next level and get the book banned from, uh, being available on Audible, eBooks, or wherever, we don’t want people to lose access to this important storytelling

[00:14:44] into really the dominant communication system of the modern world. I think we are all invested in it. We all need to know what’s happening in it. Shining the light on the shadowy workings of this, um, behemoth is where we wanna focus here, because unbeknownst to many people, and I didn’t know this until I read this book, Facebook, as part of its growth, had a rule of thumb, David, if I remember correctly, that it wanted to get into any country, every country, at any price.

[00:15:16] So there’s a lot of detail about, uh, wanting to get into China, and so developing systems just for China, so that the Chinese government would be able to, whether directly or at arm’s length through a, uh, a Chinese government company, analyze any content by Chinese citizens, if I remember that correctly.

[00:15:39] That’s what they were offering, despite saying to the US Senate and other places that no, we would never do that.

[00:15:45] David Olney: No. Well, they said they wouldn’t do it for the American system to, you know, to give the state control over that much data. And yet they were happy to do it for China to get over a billion users.

[00:15:56] Steve Davis: Hmm. It was basically: they were in at any cost. And what was fascinating is, come the 2016 US presidential election, um, it seems that Facebook offered both the Trump campaign and the Hillary Clinton campaign an offer of assistance. Uh, Hillary Clinton’s campaign turned them down, but Trump took them on board.

[00:16:24] Sarah Wynn-Williams: Over the course of the 10-hour flight to Lima, Elliot patiently explains to Mark all the ways that Facebook basically handed the election to Donald Trump. Facebook embedded staff in Trump’s campaign team in San Antonio for months, alongside Trump campaign programmers, ad copywriters, media buyers, network engineers, and data scientists.

[00:16:50] A Trump operative named Brad Parscale ran the operation together with the embedded Facebook staff, and he basically invented a new way for a political campaign to shitpost its way to the White House, targeting voters with misinformation, inflammatory posts, and fundraising messages. Boz, who led the ads team, described it as the single best digital ad campaign I’ve ever seen from any advertiser, period.

[00:17:21] Elliot walks Mark through all the ways that Facebook and Parscale’s combined team micro-targeted users and tweaked ads for maximum engagement, using data tools we designed for commercial advertisers. The way I understand it, Trump’s campaign had a master database named Project Alamo, with profiles of over 220 million people in America.

[00:17:47] The campaign used Facebook’s custom audiences from custom lists to match people in that database with their Facebook profiles. Then Facebook’s lookalike audiences algorithm found people on Facebook with common qualities that look like those of known Trump supporters. People likely to respond to “build a wall”

[00:18:07] got that sort of message. Moms worried about childcare got ads explaining that Trump wanted a hundred percent tax-deductible childcare. Parscale’s team also ran voter suppression campaigns. They were targeted at three different groups of Democrats: young women, white liberals who might like Bernie Sanders, and black voters.

[00:18:29] These voters got so-called dark posts, non-public posts that only they would see. They’d be invisible to researchers or anyone else looking at their feed. The idea was: feed them stuff that’ll discourage them from voting for Hillary. One made for black audiences was a cartoon built around her 1996 soundbite that African-Americans are super predators.

[00:18:53] In the end, black voters didn’t turn out in the numbers that Democrats expected. In an election that came down to a small number of votes in key swing states, these things mattered. Mark quietly takes it all in.

[00:19:12] Steve Davis: A bit like your story in the first part of this episode, where Zuckerberg looked at that professor and went, huh? He just didn’t understand what Robert was saying. Mm-hmm. According to Sarah Wynn-Williams’ account, he didn’t quite understand just how much Facebook was responsible for getting Trump in the first time,

[00:19:28] until it then started to dawn on him as the team unrelentingly showed all the data, all the statistics. So much so, he even then started fantasizing about running for office himself. You still think he hasn’t quite put that away? No.

[00:19:43] David Olney: I have a suspicion that that’s what the big visit to Mar-a-Lago was after Trump won, to kind of say, we are here to help.

[00:19:50] But the fact that Zuckerberg, you know, realized, oh, I just made a president, oh, I could be president. That’s the implication of, you know, Sarah Wynn-Williams’ book: that he didn’t have a realization of the enormity of what he’d done at any moral or practical level. He only had a realization of the enormity as it could serve whatever childlike desire he had,

[00:20:14] Steve Davis: and that’s the bleakest part of reading this book.

[00:20:16] The moral dimension is just absent all the way through. Unless, did I miss it? Is it hidden anywhere? No.

[00:20:26] David Olney: No. There’s Sarah’s idealism and not much else that’s positive.

[00:20:31] Steve Davis: So from a principles perspective, what we have here is a company that all of us use in some way, shape, or form, learning that basically it wins when the population is divided into splintered groups of people who hate each other.

[00:20:55] That’s how Facebook makes money. If we are all happy, they’d still be making some money, but not making the killing. And I guess that’s the, that’s the thing to share in this principles segment. I’m not sure what’s actionable about that, other than if we want to sleep at night and not feel like we’re part of it,

[00:21:16] Um, we need some sort of rule of thumb to ask ourselves before we contribute content in this space. Um, what would be a good rubric that we could use, David, so that at least, if we are using this tool, we are not falling for the stuff they do, which is basically win at all costs. They just want eyeballs so they can expose people to advertising.

[00:21:41] There might be a slightly more pure way we can use it.

[00:21:44] David Olney: Well, the two that, you know, we keep coming back to as we talk about it: try and use Facebook as Sarah Wynn-Williams imagined it could be used, as a place to connect people so they can help each other, so people have better lives. And I think the other one, which I thought of from so many years of teaching young adults, and, you know, that you resonated with from being a dad of two daughters, is: think about whatever you put on Facebook.

[00:22:12] Would you want your 10-year-old, or a 10-year-old, or a 17-year-old, doing the same thing? How would you feel, you know, if a teenager posted the thing you just posted?

[00:22:23] Steve Davis: That might be where we leave the principles segment, because I think there’s something useful in that little mental exercise: that if my father or mother or son or daughter or whomever was gonna do this too, have I been a great role model?

[00:22:44] Caitlin Davis: Our four Ps. Number three, problems. I asked the question for the best reason possible: simple curiosity. Oscar Wilde

[00:22:58] Steve Davis: In the problems segment, this is one where quite possibly you might want to, uh, not listen out loud if you’re at work, because we’re about to play something from the book that’s very hard to listen to. And I thought this sits nicely in the problems segment because it is a dilemma that we all have to face.

[00:23:23] Sarah Wynn-Williams: Since my time there in 2013, it went from having virtually no internet to everyone on mobile, totally skipping desktops. Facebook made deals with the local telecoms to preload phones with Facebook, and in many plans, time spent on Facebook wasn’t counted towards your minutes. So in Myanmar, if you are on the internet, you are on Facebook.

[00:23:50] And because of this, Myanmar demonstrates better than anywhere the havoc Facebook can wreak when it’s truly ubiquitous. The unthinkable happens in late August. The military launches a campaign of atrocities against the Muslim population that the UN later describes as genocide and crimes against humanity.

[00:24:13] At least 10,000 people are murdered. The clinical language of the UN report on this somehow makes it all seem more horrible. Children were killed in front of their parents, and young girls were targeted for sexual violence. Rape, and other forms of sexual violence were perpetrated on a massive scale.

[00:24:37] Sometimes up to 40 women and girls were raped or gang raped together. Rapes were often in public spaces and in front of families and the community, maximizing humiliation and trauma. Mothers were gang raped in front of young children, who were severely injured and in some instances killed. Women and girls were systematically abducted, detained,

[00:25:06] raped in military and police compounds, often amounting to sexual slavery. Victims were severely injured before and during rape, often marked by deep bites. They suffered serious injuries to reproductive organs, including from rape with knives and sticks. Many victims were killed or died from injuries.

[00:25:31] Others, men and women, were killed in arson attacks, burned to death in their own houses, in particular the elderly, persons with disabilities, and young children unable to escape. In some cases, people were forced into burning houses or locked in buildings set on fire. What the world will learn later is that the military had set up a massive operation, at least 700 people, to spread misinformation and hate on Facebook.

[00:26:04] This was revealed by a reporter named Paul Mozur in the New York Times. Sources in the military’s secret operation told him how they created and took over verified accounts that had huge followings, fan accounts for pop stars and celebrities, the Facebook page for a military hero, and used them to pump out false, inflammatory posts.

[00:26:27] Facebook’s response to Mozur: the company issued a statement saying it had found evidence that the messages were being intentionally spread by inauthentic accounts, and took some down. At that point, it did not investigate any link to the military. The UN report on the human rights violations in Myanmar devotes over 20 pages to the critical role Facebook played in spreading hate.

[00:27:02] Steve Davis: The reason we included this in the problems segment is it leaves us all with a problem, with a dilemma. I feel bad about putting any money into the coffers of an entity like Facebook. Uh, so Dave and I were discussing it: Talked About Marketing will now have a policy of being organic social media only.

[00:27:20] We won’t be involved in funding, or placing funds of clients into, any advertising within the Meta universe, uh, because quite frankly, for us to do that means that we are benefiting from the, the sad, um, history of pain and torture that lots of innocent people have paid the price for, and I just can’t stomach it.

[00:27:49] I have to acknowledge that it’s still a communication tool, and if we could use it within the straight and narrow, with helpful, constructive messaging, at least in some small way what we are putting out there for ourselves and on behalf of our clients is acting for the net good of society. We, we are finding this as our first way of reacting to Sarah Wynn-Williams’ book, and I’m sure it will evolve over time.

[00:28:21] But David, anything you’d add to this as people think through this themselves? How do you feel about contributing to this organization, which has been laid bare by Sarah’s work?

[00:28:35] David Olney: Really, this is how we came up with our two rules of thumb. It’s still too important to be able to communicate with people, and it is the tool that people have, and it’s the tool that people are accustomed to.

[00:28:47] But let’s use the tool in an ethical way, that, you know, we can be proud of the content, and accept that we’re using something that has been misused, but we are not misusing it. You know, we can’t have a pure version here, but we can certainly not contribute to it being worse. But at the end of the day, we need to use tools that can help people communicate with other people.

[00:29:15] And it’s crazy to think that an ethical tool is gonna pop up overnight. Once there’s an ethical tool, you know, we will try and find it. You know, we hoped, I think in 2023, Mastodon started to become a thing, and it was meant to be the ethical social media, and yet it just hasn’t taken off in a way where we could say, you know, put your social media posts there.

[00:29:37] We’d be doing you a disservice as your marketing advisors to recommend that.

[00:29:41] Steve Davis: Yeah, and same with Bluesky at the moment. Watch this space, and hopefully some food for thought for you as well.

[00:29:55] Caitlin Davis: Our four Ps, number four: Perspicacity. “The one duty we owe to history is to rewrite it.” Oscar Wilde.

[00:30:10] Steve Davis: The book Careless People has a number of very deep personal stories that Sarah shares beyond the realms of Facebook, but this one bridges both worlds. Being from New Zealand, she gives a harrowing retelling of when earthquakes struck Christchurch, at a time when her sister was working in one of the TV news stations there, in a building at the center of all this earthquake activity.

[00:30:43] Her sister actually had a very close escape, with some injury, I believe, from that natural disaster. But of course, phones to the newsroom were all cut; there was massive disruption to communication, and ultimately a Facebook post was Sarah’s first sign of hope that her sister was okay after this calamity, which made her think yet again:

[00:31:14] here I am, and there is much good about this company. And so from a perspicacious perspective, where we look at something and reflect on how it would still be relevant over time, let’s take you back, David, to those heady days: Christchurch, and Facebook being that little source of accurate, on-the-ground news.

[00:31:41] Are those days still part of its mix?

[00:31:43] David Olney: Well, I think the wonderful thing is that was people, including politicians like the Prime Minister of New Zealand at the time, using Facebook because enough cell phone towers were still working. People still had signal; their phones hadn’t gone flat yet. At the very least, people could say: go here, there’s water.

[00:32:03] Go here, nothing’s shaking anymore. Go here, this empty space is being set up for food distribution and first aid. People will use the tool available if they know how to use it and it’s available to them. So I think the wonderful thing is it shows where Facebook started: people made use of the platform.

[00:32:23] The platform at the time wasn’t particularly manipulating them, and I think there’ll be time and time again where people will use the platform well, as long as they ignore the company that created it.

[00:32:34] Steve Davis: My trust in the news from a natural disaster zone or a conflict zone is lower than it’s ever been because of how easy it is to make fake content.

[00:32:46] So X, for example: I wouldn’t trust anything I see on X, ’cause it’s all those idiot manipulators, embedded trolls, Russians and all sorts of people concocting fake rubbish to seize on any moment. Facebook would have its own share of that. So I think those days really are numbered, unless you happen to know that it’s in your local Facebook group, where you trust each other and know each other, that you’re getting some sort of feed.

[00:33:18] David Olney: Yeah, the fact that you would recognize: hey, that’s the person from the pub; hey, that’s the person from the cafe. I know that if they’re putting a message up, that’s a real person. So it works on the scale of a bigger community than the people you know a lot about, but a community in which you recognize the face or the business or the place.

[00:33:37] So it tells us something about scale.

[00:33:39] Steve Davis: I think the thing I take away from this is that no matter how much this technology like Facebook, and our wide-eyed craziness in the early days, made us think we could have it all, could have more, could have thousands of people, it doesn’t really escape the confines of the Dunbar number: the number of people, widely thought to be about 150, that we can know well enough, like a group of monkeys, to be trusted and trustworthy. Anything beyond that is artificial. It’s almost like moving into space: a rarefied atmosphere where we need the contrivance of a whole lot of artificial systems to prop things up.

[00:34:27] That’s like the intricate settings to have thousands of connections: you’re never really gonna meet these people. But it’s the ones close to us who matter, like the professor we’ve talked about, Hans, from Germany: if I’m sick, he can’t bring me bread and soup, but the people around me can. If anything, Careless People

[00:34:48] is a reminder to pull our attention away from Facebook, its growth at all costs and its fake promises of connection, and say: you know what? Do the hard, slow, sweaty work. Put that time into the community, into the relationships with people who live closer, who can be contactable physically sooner, because ultimately that’s where reality lies.

[00:35:16] That’s what I get out of this, David.

[00:35:18] David Olney: Very much so. It’s very much a case of look after your circle of trust. If people are outside your circle of trust, they’re outside of it, and that tells you something really important.

[00:35:31] Caitlin Davis: Thank you for listening to Talking About Marketing. If you enjoyed it, please leave a rating or a review in your favorite podcast app, and if you found it helpful, please share it with others.

[00:35:43] Steve and David always welcome your comments and questions, so send them to [email protected]. And finally, the last word to Oscar Wilde: there’s only one thing worse than being talked about, and that’s not being talked about.
