What is cultural anthropology? How does word association affect public perception of AVs and AI in general? Alex Roy sits down with the Queen of Culture and Founding Partner of Cultique, Sarah Unger.


Episode Transcript

Alex:                         Sarah, what is cultural anthropology?

 

Sarah Unger:             So I’ll give you an analogy, because I think that’s a good way to help people understand what cultural anthropology is. We think of our clients as building rocket ships, and our job is to study the cultural atmosphere in order to help them calibrate and meet whatever objectives they have for their brand, audience, or business. So cultural anthropology can be applied [00:00:30] in a super academic sense, meaning simply studying and researching patterns and shifts. But the way we use it is to distill it into very understandable concepts that our clients can actually act on, which is the goal, and use to effect change in their business. That could be cultural insights to help build a better movie, build a better product, connect to people in a more culturally attuned [00:01:00] or relevant way. Bespoke cultural insights are really our application of cultural anthropology, and we believe they’re mission critical for future-proofing a business. So our discipline sits at the intersection of cultural anthropology, branding, and business strategy.

 

Alex:                         That was an awesome, incredibly professional, and polished corporate explanation. And I totally buy [00:01:30] that, because I’ve been on the other side of the table. But the average person on the street often wonders how a movie gets made. A decision is made to make a movie. Hundreds of millions of dollars are spent. A marketing campaign is launched. The movie arrives. Sometimes the marketing is wrong, or the movie sucks, or both. And people wonder, didn’t anybody ask someone in the early stages what [00:02:00] they thought of the idea before they committed? And that person, I imagine, is you.

 

Sarah Unger:             We try to get in there as early as we can, frankly, because culture work certainly functions as a competitive advantage the earlier you get in. But I think an easy way to understand it is that culture operates whether your brand exists or not. Do you [00:02:30] want to use it to your advantage, or have it work against you? And I think typically organizations aren’t structured to have people putting their feelers out in this capacity. They might have people inside who are really good at this, but everyone’s doing a lot of different things, running campaigns, actually making the movie. Usually companies don’t have specified culture whisperers, for lack of a better term, who are tasked with really asking the questions you just asked. And one [00:03:00] interesting way to think about it is that culture operates on a few different wavelengths.

 

Any topic has different moments in its cultural curve. So we think of culture as sitting on a curve, which means there’s a residual manifestation of culture, there’s a dominant manifestation of culture, and then there’s an emergent manifestation of culture. So take any topic, wellness, for example. There are things in wellness that are really residual [00:03:30] right now, like dieting culture, body shaming, counting calories. Those are things that have been around for a long time. They’re veering to the background of culture. Things that are dominant are zeitgeisty things that you would talk about at a dinner party, like meditation. Everybody’s doing a meditation app. How many celebrity partnerships are there for meditation apps? And then things that are-

 

Alex:                         It makes me not want to meditate.

 

Sarah Unger:             Shawn Mendes can rock you to sleep via Headspace or Calm, one of the two. [00:04:00] And that’s amazing. That shows how zeitgeisty wellness is in culture right now. Then there are emergent manifestations, which are things that are more ahead of the curve. They’re not as dominant yet in culture. That might be conversations around where wellness and activism intersect, or wellness and inclusivity. Those are conversations that culture hasn’t fully dived into, but is headed toward.

 

Alex:                         So do you ever tell a client don’t do this, just stop?

 

Sarah Unger:             [00:04:30] One of the interesting things about our business is that some of the work is really looking far ahead, deep diving into the future of a topic, category, or audience, but some is more urgent and topical. So for clients on retainer, for whom we serve as a Batphone, talking to people and helping guide them, I think our job is to really understand, and to help people understand, the cultural implications of [00:05:00] what they’re going to do. Ultimately it’s incumbent on them to decide how they want to use the cultural information. We actually try to stay decently unbiased so that we can be that third party saying, whether you like it or not, here’s what culture is manifesting. If you take this path, path X, it’s likely going to have this implication for your business. If you take the other road, this is what could happen. Ultimately, which road a client takes is on them.

 

Alex:                         Four years ago, [00:05:30] I was just another driving enthusiast. I’d spent my whole life in cars. And self-driving entered the zeitgeist. I was convinced then, just like I’m convinced now, that a lot of the messaging around it is just off. A lot of people who I think would like it don’t get it. People like me. So I think this starts with just the words that the self-driving industry uses. [00:06:00] I want to ask you a couple of questions. All right.

 

Sarah Unger:             Yeah.

 

Alex:                         What does the word driving mean to you?

 

Sarah Unger:             So we love semiotics and studying words. So I just have to say this exercise is really, really fun to me.

 

Alex:                         For those who don’t know what the word semiotics means, would you explain it?

 

Sarah Unger:             Yeah. So semiotics, when we describe it, we’re talking about studying signs and signals in culture. Those could be visual codes or verbal codes. [00:06:30] It’s the ways we communicate and the cultural information they carry as an inherent part of them. So with semiotics, what we’re going to do right now is really debate and pick apart what each of these words means beyond its dictionary definition, which could be a very long process, but I’ll try to keep it short and not painful. So what does driving mean to me? Practically speaking, I think it means controlling a vehicle, but culturally speaking, it means being [00:07:00] in charge, pushing something forward. I honestly suspect that the longer we exist as a species on this Earth, the more meanings each word will come to stand for. So who knows what driving will mean to future generations? It could be a new slang word that we haven’t thought of yet.

 

Alex:                         All right. So what does the phrase self-driving mean? Or what do you think it means?

 

Sarah Unger:             For this term, I really think of it in relation [00:07:30] to a vehicle, meaning there is no human operating it behind the wheel. It’s a mechanism within the car itself.

 

Alex:                         So is a self-driving car a car where you have to sit in the driver’s seat and do anything?

 

Sarah Unger:             That, I think, is not innate in the definition of self-driving. Self-driving to me doesn’t necessarily imply who is in the driver’s seat. It implies a [00:08:00] mechanism by which the overall vehicle operates, if I’m getting really detailed about it.

 

Alex:                         There’s no wrong answer here, because I have heard people who work in the industry give different definitions. And if you go to, I think, Nigeria or Kenya, some of the safari parks have signs that say no self-driven vehicles. In other words, you can’t bring your own car and drive your own car, [00:08:30] but someone else can drive you in their car. So there, a self-driven vehicle is a vehicle you own, which is problematic for-

 

Sarah Unger:             Interesting. That’s worth noting: culturally, the definitions can vary based on usage. For sure.

 

Alex:                         So now driverless. What do you think that means?

 

Sarah Unger:             Okay. So this is where I think it gets fun. The first two, I’m like, okay, okay. Now we’re having some fun here. So when you compare driverless to self-driving, I feel like [00:09:00] now we’re getting into an interesting analysis that could make for a great study, because at first pass, when I hear driverless, I think about how driverless is actually phrased in the negative, indicating that something is lacking. Driverless. I would say that’s not necessarily the emotion you want to drum up when you’re trying to instill confidence in the safety of an offering. So self-driving is the positive inverse. Driverless is without driver. That’s [00:09:30] what comes across to me.

 

Alex:                         Interesting, because I think the self-driving industry would argue that driverless is when there is no human in the seat, but there might be an artificial intelligence driving the car. But the pessimist would say, clearly someone passed out in the seat, the car was out of control, and that driverless vehicle crashed. I hate the word driverless. Great. Now we get to [00:10:00] really the four-dimensional word: autonomous.

 

Sarah Unger:             Okay. So if we continue on this exercise, and frankly, it’s an interesting one, because I think what we may surface is that the definitions that are specific to an industry are not necessarily what culture or people associate with them. So translation is really important here; it’s a key communication issue with all these words. I’m riffing as if I had no background knowledge, just what a consumer would think. So continuing on this exercise, [00:10:30] autonomous reads slightly differently to me than self-driving, because now autonomous taps into a whole philosophical term about self-governance and being independent from the whole. We’re hearing autonomous a lot right now in culture, in relation to workers wanting autonomy. Autonomy, notably, is a step up from flexibility. Full autonomy, we hear a lot. And so, take it for what you will, it means to me a sovereign [00:11:00] vehicle.

 

Alex:                         That is the purest definition. Of course, the last thing we want is a sovereign vehicle, because it would not go where we ask it to, which would be a big problem. I have a three-year-old, and I always say she’s very autonomous, but she’s not safe. She makes decisions that are terrible all the time. I don’t want a fully autonomous vehicle. I want one that’s safe, which is not necessarily the same thing.

 

Sarah Unger:             You might want a little governance in there.

 

Alex:                         Exactly. Definitely want some. All right. So then [00:11:30] the final word I’m going to ask you about: AI, artificial intelligence. What do you think that means?

 

Sarah Unger:             So, machine intelligence, as opposed to human intelligence. To me, AI is a factor in allowing for self-driving, autonomous, driverless, whatever-you-call-them vehicles. I think it’s definitely a term that people know and [00:12:00] probably have more clarity around than the three we just discussed, frankly, because those are nuanced, as we just distilled. I think AI is also a more loaded term than robotic, computerized, other digital-lens terms. I don’t think it’s quite metaverse level in terms of, I have no idea what this means, but I still think-

 

Alex:                         Metaverse.

 

Sarah Unger:             I think people get AI maybe more than the metaverse, but it’s still [00:12:30] something that can be tough for people to nail down in a super tangible way. With all these terms, frankly, it’s great that you’re asking these questions, because the assumption that we have a shared definition around them is probably wrong.

 

Alex:                         So AI has been the bad guy in so much pop culture. I’d say in 95% of depictions of anything with AI in it, it’s bad, it’s always bad. [00:13:00] And yet AI is constantly added to pitches for fundraising and companies and stories and headlines. And it actually seems to be a pretty negative term. What do you say to a company that wants to use AI in the messaging of anything?

 

Sarah Unger:             There’s a fear that AI will replace us. And that’s not generally a welcome principle for many people who are worried about job security. The thing about change [00:13:30] is people generally don’t embrace it [inaudible 00:13:35] especially when it’s dangled well in advance of the change. So think about it this way. And it’s really funny, actually, because people adapt really well when presented with changes. It’s just the concept of change, especially far in advance, that’s tough to get our heads around. So if we had all been told years out that in 2020 there would be a great big pandemic and it would change everything, [00:14:00] and you’d either be an essential worker or you’d be locked up and have to wear masks, I don’t think people would have reacted well and been like, cool, when the time comes, we’ll adapt. No way.

 

So I think that replace is a scary term. I think in general there is a buzz factor when it comes to these emergent concepts, like machine intelligence. And so it makes sense that companies who want [00:14:30] to present as innovative and forward-looking would tap into them from a cultural relevance perspective. But once again, how things are phrased is really key. If AI were phrased as a way to free up our minds to do more valuable, human-only tasks, that sounds far less scary, far less doomsday, than replace. And either way, as I said, people are better at actually adopting the change than [00:15:00] responding to predictions of it. I think talking heads, executives, it seems like they’re trying to emphatically express the importance of it, in some ways to justify investment in it.

 

Alex:                         You’re being very diplomatic. You said something to me earlier about things changing, and CNN has an interstitial ad that runs seven times an hour. It makes me so [00:15:30] angry, and it’s very well produced. It’s this English-accented man, and he’s like, the world is changing faster than it ever has before. And anytime I hear someone telling me things are changing faster than they ever have before, I know that person is selling me something I don’t need. I know it, because that just doesn’t make any sense to me. And this notion that everything is changing faster, [00:16:00] that we have to decide right now, that we should trust what we’re hearing. I just don’t buy it.

 

Sarah Unger:             Well, first of all, I did often wish that I could speak as a British man would, just to imbue my cultural predictions with more gravity. That was when I was younger. I’ve since embraced my own voice. But I think at this point, change has become a very [00:16:30] polarizing topic, even politicized in many ways. Traditional value sets versus innovation. We have recently leaned into segmenting things along ideological lines versus demographic lines. And that’s helpful, because demographics aren’t nuanced enough to reflect cultural divisions in the world today, and certainly not people and their [00:17:00] preferences and opinions.

 

And so when we look at ideology, it’s interesting: a big hallmark of what characterizes different ideologies is openness towards change and interest in it, versus considering yourself a hallmark of traditional values and trying to get things back to the way they were. So I think change is referenced so frequently, or things are said to bring about change, because it’s really the [00:17:30] foundational backbone of culture. I study broad paradigm shifts so that people can exist in this world better. So I’m not going to knock change. It’s how I make my money.

Alex:                         That’s a very interesting thing, because the ad I was referring to is on CNN, which many would say is left of center, and clearly part of the context of what they’re saying is: change is happening, it’s fast, [00:18:00] it’s here, watch our channel, we’ll tell you about the change. And so are you suggesting the cultural continuum is open to change versus resistant to change?

 

Sarah Unger:             I think there are a few different continuums that you can look at. It’s not just one axis. I think you have people who are ideologically more interested in, actually, I would say [00:18:30] you can go back to the cultural curve we talked about prior. Some people are more interested in residual concepts in culture, and residual isn’t bad, by the way. Residual manifestations of things can make a lot of money and provide a lot of comfort and familiarity to people. But I think people are interested in different elements of the cultural curve depending on the topic. It’s very contextual. Someone could be [00:19:00] very interested in change on a certain topic and lean back on more residual concepts for a different one. And so I think context is really key in characterizing any continuum.

 

Alex:                         This is very interesting, what you’re saying. If you look back on the history of transportation issues, things like owning a car, building a highway, driving a big truck are generally associated [00:19:30] with freedom, power, strength, catharsis, and those things fall, I would not say universally, but generally on the right side of the political spectrum. And then things like transit, buses, bicycles, scooters, walking fall on the left side of the spectrum, generally. These are not universal. And when I talk to friends who are really into cars and driving, if they come into the conversation thinking [00:20:00] that self-driving vehicles are going to be ubiquitous, forced on them, and eliminate their freedom to choose to own a car, they’re against it. If you say to them, you can take one if you feel like it, you won’t have to drive in traffic, they love it. The same product.

 

Sarah Unger:             I think that people can’t help but put their own narrative over an arc of change. And that is because, [00:20:30] when people take in information, the way our brains process it is by adding narrative and story. So it’s hard for anybody, myself included. Even though when I do it from a business perspective I try to do it in as unbiased a way as possible, of course I’m going to add my own narrative and set of values and beliefs over any arguments. So of course information would be [00:21:00] put through the filter of one’s own value system as the primary reaction. I think that’s what you’re getting at.

 

Alex:                         I’ve heard you describe what you do as reading the cultural tea leaves. And I imagine your job distilled is to read those leaves and help stir the tea in the direction of your client.

 

Sarah Unger:             You’ve taken the metaphor one step further.

 

Alex:                         All right. So, [00:21:30] the autonomous vehicle sector is in its infancy as a product. Looking at the tea leaves, coming at it as an outsider, what do you see?

 

Sarah Unger:             Yeah. So, well, thank you for the tea leaf analogy. As you know, and for your kind listeners, I’m not a fortune teller, I’m not a navel gazer, I’m not a soothsayer, but I’m looking at what’s happening in culture and [00:22:00] doing semiotic analysis on it. And what I see here is that a lot of money has been invested in this space, as you know. You’ve been working heavily on some of the most interesting pilots that are launching, which is momentum for sure. That said, self-driving cars are still an emergent conversation. As I referenced in relation to the metaverse, it’s something that may seem futuristic and far off to people. [00:22:30] However, there’s an on-ramp. I think a lot of people might not realize that elements of the sensory systems within increasingly smarter cars are actually bridging the gap, that it’s already creeping into our driving experience.

 

Most people really haven’t dug deep into the layers of self-driving cars and the implications, implications that are intersectional and maybe not specifically [00:23:00] residing within the vehicle industry, so things like implications for systemic injustice, small businesses, economic stabilization, wanderlust and travel, productivity. That’s not the level of depth culture is at yet when it comes to self-driving. And I think the conversation needs to move in that direction in order to normalize it in dominant culture rather than emergent culture.

 

Alex:                         Okay. I want to [00:23:30] ask you about the examples you gave, because I’ve thought about this a little bit, but I don’t want to give away the inferences I’ve made. You said that you thought that self-driving vehicles could potentially address injustice. How?

 

Sarah Unger:             Self-driving and injustice, I think, is a question of access. For example, who has access to cars? Cars are not an inexpensive purchase [00:24:00] for people. So when we talk about who’s able to use them, how people get from A to B, I think there’s a question there that comes into play. I also think a lot of interesting social engagements happen within the structure and construct of a car, whether it’s police stops, police interactions. When you remove people from the driver’s seat, there are likely all sorts of ripple effects that [00:24:30] we probably don’t know about, because it’s so emergent that we haven’t even studied them properly yet.

 

Alex:                         You also mentioned small businesses. There’s probably an infinite number of ways a self-driving car could help small businesses. Which way were you thinking?

 

Sarah Unger:             I’m thinking the infinite number of ways, Alex.

 

Alex:                         Are you saying self-driving vehicles could basically be the Shopify for small businesses, the way Shopify has [00:25:00] allowed small businesses to compete with Amazon?

 

Sarah Unger:             I think that we as a society are now more open, post-pandemic, to looking at things ecosystemically and understanding that if we have an itch on one half of the planet, the other half of the planet might feel it. Connections that are not always obvious. So what I believe would help advance the self-driving [00:25:30] vehicle conversation in a more dominant direction is if we start to look at all the intersectional issues that we have as a society, like climate change, like systemic injustice, like allowing small businesses to deliver things in a much more accessible, safe, and cost-effective way, for example. By taking the conversation there, we can get excited as a society about the possibilities [00:26:00] that this technology and intelligence can have.

 

Alex:                         It seems like people love carrots and hate sticks. And yet a lot of the messaging around self-driving cars is sticks. You’re a bad driver. The roads are dangerous. We’re safer than you. Trust us. Get in. I don’t think that makes sense. And it starts with the word safety [00:26:30] itself, which is vague. What does safety mean to you?

 

Sarah Unger:             Yeah. Yeah. So I think safety overall reads to me as being protected from risk and danger, especially unnecessary risk and danger. That said, when I think really hard and deep about safety, I can absolutely think myself into a rabbit hole here, [00:27:00] because nothing is 100% safe. You could argue that being human is a very unsafe activity.

 

Alex:                         Living in New York. You’re a New Yorker, right?

 

Sarah Unger:             I’m actually in Los Angeles, but I was a born and raised New Yorker. So I have dealt with the mean and magical streets of New York for many years. Safety is really risk minimization, but risk minimization [00:27:30] semiotically is a lot scarier than the word safe, so that’s why we frame things in the positive and proactive versus the negative and reactive. And I think over the course of my life as a human, I’ve realized that my ability to assess safety is subject to all of the biases that I have as a flawed human being.

 

[00:28:00] I think of this example often: when I have been on scarier hikes, like when I was hiking in Nepal on the Everest Base Camp trek, I remember seeing donkeys, horses, mules, yaks, and thinking, wow, this is such steep terrain. I would feel very unsafe on the back of a donkey. I would much rather rely on my own two legs to carry me. But how wrong I [00:28:30] was. Those animals are naturally adapted for steep terrain, versus me, much more prone to slipping and sliding. So I think safety is very subject to all the biases that we have as humans.

 

Alex:                         Did you have to trip and fall before realizing you should get on the back of one of those things or did someone just say do it?

 

Sarah Unger:             Well, I think first you see people on them and you’re like, wow, those animals are [00:29:00] killing it. Second, when you’re trekking, and granted you’re doing it in the most risk-minimized way you possibly can, on a journey like that you’re at altitude, you’re tired, you’re on ledges, and you’re swaying back and forth, just barely taking one step after another at points. And so I think I felt in my body that [00:29:30] a four-legged animal might be slightly more stable at that point in time.

 

Alex:                         But what you’re saying gets to the essential tension inside the autonomous vehicle industry today, because there are companies that are saying, right now this is safer than you, you can just get in, don’t worry about it, trust us. And then there are other companies that are saying, it’ll be ready when it’s ready, in time. And if you [00:30:00] define these two poles, it is: are kids today just going to age into trusting these things? My three-year-old thinks this is cool. Versus trying to compel people who are already in their thirties, forties, fifties to change their definition of trust. Waiting is not palatable to some investors, but it certainly would guarantee success if your timeline is long enough.

 

Sarah Unger:             Yeah, I think when you [00:30:30] think about safety as a selling point, it’s somewhat table stakes, because we’re in a time where we know technology to be very powerful, but also glitchy. If you have any awareness of Silicon Valley, you know the move fast and break things mantra, which feels terrifying when it comes to cars. So I think it’s incumbent on the [00:31:00] autonomous vehicle industry to contextualize these concerns surrounding safety for all audiences. But I think making safety an exclusive selling point, putting all of your eggs in the safety basket, is, in my opinion, an undersell, especially when it comes back to the broad cultural ripple effects as a whole, when the implications are so much bigger. Safety is one, and [00:31:30] there are also infinite others.

 

Alex:                         That response pleases me so much, because I also agree that safety can’t be a distant goal. It has to be table stakes. The great mobility philosopher, and I don’t know how else to describe him, Josh McManus once said, no one goes to a restaurant that has a sign outside saying no salmonella here. No one does [00:32:00] that. The assumption is that a great restaurant gets an A, and then you look at the menu.

 

Sarah Unger:             That’s it. And then you look at the menu. It’s a table stakes process. I also want to add that safety should be thought of in a well-being sense, in light of mental health, which is: what is safe for society as an ecosystem? Right now I study the mental health crisis a decent amount. [00:32:30] And so I really think for safety, we should be asking, what are the mental health benefits of autonomous driving? Because driving is stressful, probably up there with divorce, death, moving, in terms of all the negative effects it has on our bodies. So you can think about and contextualize safety in terms of more than just accident data. It’s also about our happiness as a whole.

 

Alex:                         All right. So [00:33:00] I took a shared Uber before the pandemic, and I think I was just curious. Maybe I was lonely. I wondered who was going to be in the car. And someone else got in who, for whatever reason, didn’t know they were getting in a shared vehicle. It was a young lady. She was very upset. She did not want to share the car with three men. And I don’t know [00:33:30] her. We didn’t talk. But I get this. It seems to me like a very underestimated selling point of autonomous vehicles is that there’s no one else. Not only that, there’s no driver. That this really matters. And so I asked my girlfriend about it. She was like, absolutely, that matters to me. Do you think that is a much broader want than is apparent?

 

Sarah Unger:             I think ride sharing is something that [00:34:00] the pandemic has had a decent impact on, simply because of the health implications of it. I think people are complex. Some people will want to share rides. Some won’t. Some people will want to sometimes. And some won’t. I think it’s very variable, dependent on public safety, the environmental footprint, the affordability factor. So I think people will contextualize it when making their decisions. [00:34:30] Sometimes I’m introverted. Sometimes I want to blast music with my dog and boyfriend in the car. I think the question that you’re really getting at is about riding with people we don’t know, outside our circles, and speaking long term, about the future, it’s hard to predict at this moment. Right now, because we’re in the middle of a pandemic, I think there’s an appetite for sticking to your pod when you can, [00:35:00] for lack of a better term, but five years out, 10 years out, that could 100% change.

 

Alex:                         I’m just old enough to remember when New Coke came out, and before I tasted it, I was pretty young, I knew it was bad. I knew. I didn’t have to taste it. I didn’t want New Coke. I liked old Coke. I wanted as much old Coke as I could get. They didn’t read the room. No tea leaves in the Coke. They just got it all wrong. What is [00:35:30] the most expensive lesson a company ever learned, other than New Coke, where they failed to read the tea leaves, didn’t read the room, and got it all wrong? Because looking at autonomous vehicles, I’m always looking for historical examples that one might draw a lesson from. It’s tough. Cold fusion never went anywhere. It never even worked, so you can’t point at that. Blimps, zeppelins. Well, [00:36:00] you didn’t need to be a rocket scientist to know that if that thing gets pierced, it’s not going to be good.

 

Sarah Unger:             Yeah. It’s a good question. I in general try to avoid trash talking [inaudible 00:36:17] [crosstalk 00:36:18] but I think what you’re getting at in culture, maybe inadvertently, is a little bit surrounding cancel culture.

 

Alex:                         Great businesses [00:36:30] survive mistakes. It happens.

 

Sarah Unger:             I think when I listened to your very smart question, what I really took away is, look, if people are smart, they see plenty of examples out there in culture. But I actually wrote an op-ed earlier this year in Newsweek on how to apologize. So rather than focus on the inevitable point in time at which brands step in it, because that’s going to happen, [00:37:00] culture is confusing. Tea leaves can hopefully help you not step in it. But if you do, I applaud those who have the humility to admit wrongdoing and commit to doing better. So I really think, as a brand, learning how to apologize and preparing that process is really key as part of your risk management. If you misread the room, at least read the room correctly to do an apology properly.

 

Alex:                         [00:37:30] So virtual reality, the metaverse. Metaverse is [inaudible 00:37:37] Virtual reality has been promised to us for decades. The first time I learned of it was when the movie Lawnmower Man came out, which I think was the ’90s. That’s 25, 30 years ago. Did you see that movie?

 

Sarah Unger:             I did not see Lawnmower Man.

 

Alex:                         That’s good because if you had, you wouldn’t want virtual reality. Okay. [00:38:00] So VR has been on the horizon and approaching for 30 years. It’s still not here. What’s the problem?

 

Sarah Unger:             I actually think, if we’re talking the metaverse and even elements of VR, it’s like autonomous vehicles: we’re already engaged in components of it. On the broadest level, the metaverse [00:38:30] specifically is a persistent virtual space in which services, companies, individuals, platforms can be interconnected and interact with each other. But that’s just a very macro evolution of subnarratives like crypto, NFTs and digital art, open-source games. It’s emergent, but there’s momentum. So perhaps your question judges it a little more harshly than you would autonomous vehicles. VR specifically, [00:39:00] I think it’s interesting. I find it fascinating to keep one eye on that industry. I think the cost of the technology and of implementing it, the access issue, is certainly a barrier that I would hope we can get past. Cardboard VR options just never did it for me.

 

Alex:                         It’s the cardboard that was the problem, not the VR.

 

Sarah Unger:             Cardboard. And I don’t feel like springing for an Oculus. [00:39:30] So where’s the mass market there? So I think that’s a question that we’ve been asked. That said, there’s momentum towards all these things; in the pandemic, as everyone knows, we became much more open to doing things digitally that we had never done before. So I think that’s momentum in a dominant direction. I’m personally not ready to live my life entirely online. I’m actually working to spend less time online. So that’s me in my personal life. [00:40:00] As a cultural anthropologist, I’m totally here for it, ready to study the evolution, ready to get us into the next phase.

 

Alex:                         I generally feel, and I think this has been true since 1995, that whatever is on the cover of Wired magazine three times in one year will not be cool for 10 years.

 

Sarah Unger:             It’s like the song that we’ve just heard too much on the radio. We need a bit of a break.

 

Alex:                         So how might [00:40:30] the autonomous vehicle sector go wrong? Everyone says, oh, there’ll be a crash. The whole industry will sink [inaudible 00:40:38]. I’m not entirely sure because I’m hoping people can distinguish between the good actors and bad actors, but from a cultural standpoint, how might it go wrong?

 

Sarah Unger:             Interesting that you asked that versus how it might go right. I think [00:41:00] I’m going to go back to what we were talking about when you first asked me my definitions of words. I think that was a great way to enter into that headspace: communication will be really key. People fear what they don’t know. And so transparent, proactive communication will help. This is a scenario where you don’t want people to jump to conclusions or fill in the gaps when they don’t have proper information. But I also think in general, [00:41:30] you had mentioned beating people over the head with a safety message, and I don’t know if you meant it as much, but I’ll give a watch-out.

 

Typically, fear or shaming is really not the best way to engender warm feelings. I’m not saying it’s never effective, to the parents out there, but it’s the same way with animals. You talk about positive reinforcement [00:42:00] training. I got a puppy this year. It’s all about positive, not negative, reinforcement. Rewarding things, not punishing. And that’s really true. I think the same goes for how you communicate these messages. You want to frame benefits in a way that appeals to people’s value sets, so that people are open to learning versus shutting down.

 

Alex:                         Without naming names, [00:42:30] what is the biggest disaster suffered by one of your clients who ignored your advice?

 

Sarah Unger:             Not to be overly self-complimentary, but I think we haven’t really had too many disasters here. If the stakes are high and people hire us, they’re pretty open to taking the advice. We don’t have contentious relationships with our clients. We genuinely [00:43:00] like our clients. I love my work, and I love our business for that reason. I would say the worst case scenario is probably something lower stakes, like a TV show that’s not as good as it could have been, or feels a little copycat or has-been, but no one’s dying over that.

 

Alex:                         Is there anything else that you’d like to add, words of wisdom, rules for life or for business you want to share?

 

Sarah Unger:             [00:43:30] In these times, a lot of brands and people are coming to terms with the notion of being radically flexible. We mentioned that our ethos as a company is be like water, which reflects the fluidity with which we need to operate, especially when we’re studying something as chaotic as culture. And we’re bespoke, which means we alter everything depending on the situation at hand. So I think when we look at radical [00:44:00] flexibility, that means that pivoting is a really important skillset that’s going to prove very valuable to companies who desire longevity. And I think the importance of not resting on your laurels in that situation is key. So part of why I believe in culture, and in adapting your message and your business to fit the changing course [00:44:30] of culture, is really communicating with people and helping them ride these waves.

 

That is why I think, in what we’ve talked about, helping paint a big picture for people, so they can get a head start on flexing and adapting their lifestyles to the eventual shift away from only human-driven vehicles, [00:45:00] is probably an interesting way to go about it. And I would also add, if anyone’s interested in learning more about our cultural insights, you can follow us on Instagram at @cultique, C-U-L-T-I-Q-U-E, C-O. What we post there is what we’ve been studying. So it’s a good window into our minds.

 

Alex:                         What you said about be like water is the Zen translation of finding product-market [00:45:30] fit. If you’re unwilling to pivot, you’ll never find it, ever.

 

Sarah Unger:             I think we mean it really as an ethos of not resting on your laurels and staying adaptable in a world that’s changing. It’s a zen-like meditative mantra. So I don’t think I could run a business about culture without it.

 

Alex:                         Sarah, it was great talking to you.

Sarah Unger:             Thanks, Alex.