The Future of Social Media with Ross Dawson

In this episode, Don MacPherson is joined by Futurist Ross Dawson to discuss the future of social media. They explore the growth of social media over the last two decades, potential forms of regulation, and the role of social media in the coming decades. Ross provides some potential solutions for the negative effects of social media use including nonprofit networks, user ownership of data, and AI technologies for regulating harmful content.

Season Four of 12 Geniuses is dedicated to exploring the future and how life is sure to change over the next decade. This episode explores trends in social media that are reshaping how and why we engage with others over the internet.

Ross Dawson is globally recognized as a leading futurist, keynote speaker, entrepreneur, and authority on business strategy. He is Founding Chairman of the Advanced Human Technologies group of companies and the bestselling author of four books, including the acclaimed Living Networks. Strong global demand has seen him deliver keynote speeches to business and government leaders in over 30 countries, while his frequent media appearances include CNN, Bloomberg TV, Sky News, ABC TV, the Today and Sunrise shows, The New York Times, and many others.


The Future of Social Media with Ross Dawson

Don MacPherson: (00:08)
Hello, this is Don MacPherson, your host of 12 Geniuses. I have the incredible job of interviewing geniuses from around the world about the trends shaping the way we live and work. Today, we explore the future of social media. With nearly 4 billion users worldwide, we've seen social media used in toppling dictators, in creating celebrities and communities, and in assisting some of the most divisive politics seen in generations. To help us understand where social media is headed, Australian futurist Ross Dawson joins us. We discuss censorship, regulation, and the risks and benefits of how social media will be used in the years to come.

This episode of 12 Geniuses is brought to you by the think2perform Research Institute, an organization committed to advancing moral, purposeful, and emotionally intelligent leadership.

Ross, welcome to 12 Geniuses.

Ross Dawson: (01:04)
It's fantastic to be on the show.

Don MacPherson: (01:06)
Why don't you tell us where you are today and what you do for work?

Ross Dawson: (01:09)
I'm in Sydney, Australia, and I'm a futurist and entrepreneur. So I help leaders and organizations to think about the future so they can act upon it today. And I have too many ventures on because I just can't hold myself back. There are lots of exciting things to do.

Don MacPherson: (01:34)
Let's talk about what the state of social media is today.

Ross Dawson: (01:37)
So many of my peers and I were incredibly excited about the potential of connection, and that's what I've always believed in. Back in the nineties, to put some context on that, there were some very, very early manifestations of social media, but I thought of technology as a way to connect people. Most people at that time thought technology was the thing for, you know, geeks in basements who cut themselves off from the world; that technology was about cutting yourself off from the world. And I said, well, no, technology is about connecting to people. That's what the potential is. And so that's why I saw it was inevitable that we were going to have social media. This is a way to connect. Humans want to connect to other people. And this was a time of promise and potential, and people were saying this is a way in which we can discover like-minded people.

We can have interest groups; we can learn from each other. Twitter was the place where, wherever you were in the world, you could sit in on Silicon Valley conferences. So I think when we're looking at the state of social media today, we have to look at that context of the promise and the potential and the dream. It was around five years ago that I and some of those other people who had been dreaming of the possibilities and the potential started to say, "Oh, oops, something's gone wrong here." So over the last four or five years we've started to see, okay, all right, there's a downside. There are a lot of negatives coming to bear. And that has come about not least through how the large social media companies have been approaching this, but also, I think, because this is all a grand experiment in discovering who we are as humans.

Back in 2006 or whenever it was, I described this idea of latent humanity. Look at all these people getting onto Facebook: a few years earlier, they said, why would I ever do that? And then they got onto Facebook and discovered aspects of themselves that they had never even seen in themselves. It's not as if we were being inhuman; we were just expressing parts of ourselves that never had the chance to be expressed before. And so we were discovering who we were as human beings, given the technology to connect in these ways and to share (oversharing, of course, from way back). And now we're starting to see this latent humanity, as it were, play out in negative ways. There's this polarization, this tendency to cluster and to self-affirm your views, the echo chamber, which of course was a topic of discussion from the very beginning. But now it seems to be something where people say, well, I only want to hear from people that have the same views as me. I don't want to hear from all of these people that have different opinions. Why would I? And we're discovering these aspects of who we are, individually and collectively, that are quite negative.

Don MacPherson: (04:53)
What were the first indications that we were moving down a dangerous path?

Ross Dawson: (05:00)
It really was the political polarization in 2015/2016 in the US that was the most evident. This was where we could see that there was this amplification of very pointed opinions. And of course, there were a few different slices to this idea of fake news; you can come to that phrase from a lot of different directions, but essentially, there was a lot of patently false information flowing on social media, which was shaping people's opinions and the way in which they voted. In fact, there have been studies demonstrating that untrue media is most often amplified more than true media, because it can be designed precisely to be amplified. So it was 2015 and 2016, particularly, when we started to really see that crystallization. Obviously, aspects of it had come to bear before, but that was the time when we started to see that shift in the frame for social media.

Don MacPherson: (06:19)
Whose responsibility is this level of divisiveness? Because you talked about, I think, latent humanity was the term that you used. And then there are also the social media platforms as well. Is it human nature, or is it the social media platforms that should be held accountable for this level of division that we're seeing?

Ross Dawson: (06:30)
As a futurist, I believe in the first instance that it is possible for us to create an incredibly positive future for humanity. And I think that's a starting point for all of us. If we believe it is possible to create a better future for humanity, then what we need to work out is how we do that. Humans have gone to war since we've been humans, and that hasn't stopped. Obviously, humans are not angels, but we can imagine a world where, individually and collectively, we are living rich, full lives, benefiting each other. Whatever the positive future is you imagine, we can imagine that. So the question becomes: all right, whatever we are as humans, we can imagine that better future, so how do we create it?

So we can look at the role of individuals. We can look at the role of corporations, the role of government, the role of not-for-profits. We all play a role. Everybody is creating the future as individuals, through what they do and the choices they make. Corporations need to be making those choices too. I think one of the fundamental shifts we've seen, as more and more studies have shown, is that we as a society expect companies to be not just making profits, but actually making the world a better place. And we give our business to the companies that do that. The role of government is, hopefully with a light hand rather than a heavy hand, to put into place the policies, the guidelines, the rules, whatever it is that can shape that positive future. And of course, there is a role for not-for-profits and other sectors as well: bringing ideas to light, influencing people, shaping the debates, contributing, potentially setting up non-commercial social media platforms, for example, as another approach. So it's all of the above. Yes, I believe that we can have a wonderful, positive future for all of us in society. The question is simply how we make that, and there are many, many aspects to the answer.

Don MacPherson: (09:08)
You mentioned the potential of not-for-profit social media organizations. Let's talk about business models for a moment. How do you see social media companies generating revenue in the future? Right now it's through advertisement, but can you see a future where there's going to be a subscription model or maybe two models where it's free for some people and they share their data or it's a subscription for others who want to keep their data more private?

Ross Dawson: (09:36)
Yes. The idea of subscriptions is real. In fact, Twitter is beginning to consider some variations on subscription models, or other ways in which people can pay for access to TweetDeck or other tools or particular levels of access. Facebook is predicated on scale, so it is certainly not going to make subscription part of its model; it makes plenty from advertising. Facebook is also looking at other frames for providing the inputs to social media, for example through augmented reality, virtual reality, and thought interfaces, which essentially are ways to amplify the scope of the social media model. So yes, there is scope for some kinds of subscription models, but those are never going to scale in the same way. And the other challenges, I suppose, are around the value of data and who gets to use the data about individuals.

Doc Searls, one of the co-authors of The Cluetrain Manifesto, came up a while ago with an idea called VRM, vendor relationship management, which is the opposite of CRM, customer relationship management. At the moment, companies manage their customers: they gather lots of data about them and say, okay, how do we maximize the lifetime value of our customers? From a customer's point of view, it's the reverse: actually, I've got lots of valuable data, I have things I want to buy, insurance or bedclothes or whatever it may be, and I'm prepared to share my data in specific instances if that gives me value. So it turns the whole thing around. There have been many, many initiatives over the years trying to do that, and the fact that we haven't yet had something at scale suggests that it's going to be very difficult to make it happen, but the possibility is still there.


We can have systems where individuals engage in social media or in commerce in a way where they own the data, control the data, and engage on the terms of their choice, creating a very different landscape. The role of social media in that case would be very different, and it might need to define different models to serve it well. Facebook, for example, and Google of course, make as much money as they do from advertising because they serve exactly the right ads to the right person. I think just about everybody has had that uncanny experience of asking, "how could there be an ad for that when I had just been thinking about it?" The ability to turn that around, I think, could change the entire landscape of what social media is and create the need to find new revenue models.

Don MacPherson: (12:46)
Are there any competitors to Facebook or any social media companies out there that have this model right now, this vendor relationship management?

Ross Dawson: (12:54)
There was Diaspora, which was essentially an open-source platform. They were very ambitious, and you have to be very ambitious, of course, to compete with Facebook on the basis of sharing protocols and platforms. That has not succeeded greatly. Back in 2006 or so, I was involved in organizations around open data exchange, and there have been protocols established so that, essentially, the information in your social media profile becomes something that is exchangeable and exportable between different social media platforms. Of course, the giants have never really warmed to that, because it means that people can leave and take their profiles and their connections with them. But there are very well-established frameworks that enable one to build a social network with an open-source structure where people own their profile. At this point, though, no, there's nothing which I would see as a viable competitor to Facebook that could get people to leave.

Don MacPherson: (14:17)
How do you see the potential for regulation shaking out? Are there places around the world where social media companies are regulated and how would that look?

Ross Dawson: (14:26)
Well, yes. Obviously, there is regulation of social media all around the world, at a whole variety of different levels. One, which is certainly not just about social media but applies across the board, is the European Union's General Data Protection Regulation, GDPR, which essentially has global impact, as any organization touched by anybody at all from the EU has to comply with it. That has really impacted social media globally. Of course, there are also various regulations and guidance from different authorities in the United States. And many people might be aware there's been an interesting experiment in Australia recently, where legislation was making Facebook and Google pay to provide links to news. Now, that's a whole hour or more of discussion on its own, right?

I suppose the very short story is that some pretty ill-thought-out legislation was brought to bear to say that there is value being taken from media companies by the social media companies, so they should pay for it. Essentially there was some negotiation, and they've come to a conclusion where, in fact, Facebook and Google are paying for news in some ways under the modified legislation in Australia. This was preceded by legislation in Spain, for example, where they tried to force Google to pay for links to news sites, and some other European Union countries also established similar legislation. It's interesting that the governments of a number of countries around the world made positive noises about Australia's experiment, which was essentially a super tax on the social media companies, or a change in how they behave. So this has heightened the debate, particularly on the financial aspects of social media. Many countries have specific legislation around social media, with the European Union probably being the most advanced.

Don MacPherson: (16:45)
Do you feel that GDPR is strong enough in terms of regulation?

Ross Dawson: (16:52)
I think at this point, GDPR in total has proven to be a positive step. But there's also this other point to make: value for individuals can be created through data. Back in the day, I always used to say, okay, let's look at the scenarios for what will happen with data. If data is used to exploit consumers, then it's going to be regulated out; we're not going to use it. If data is used to create value for individuals and consumers, then people aren't going to object. And I guess the industry as a whole took the exploiting frame, as opposed to the helping frame. Now when I talk to organizations, essentially we're saying, look, the way to think about data is not "how do you maximize your marketing value?" The way to think about data is, "how do you create value for your customer?"


The way I frame fintech is that it's not just that you're fed up with your banks and financial institutions that are stuck two centuries ago; the fintechs have said, we've got all this data, and we are going to use it to create value for our customers. A lot of the fintechs, and the reason why people are going with them, are framed around creating value for the customer, which a lot of the banks have never quite thought of in that way. So the point about GDPR, or other similar data privacy legislation, is that there are risks it cuts off value to individuals, by limiting how data can be used to benefit individual customers.

And yes, with GDPR there have been a lot of positive benefits in being able to stop the onslaught on data. Part of the context for this is that for the last two decades I've been following the unfolding of privacy, and for most of that time I've been astounded at how open people have been to just letting their data be used for no value to them. Now, finally, in the last few years, we've come to a point where people are starting to say, whoa, this is getting a bit too much, and we are starting to get the legislation to control that. We might start to get more and more legislation around data privacy, but I do think we also need to recognize there is a risk that this stops companies, sometimes with good intent, from creating value for their customers.

Don MacPherson: (19:37)
Just getting back to what you said a moment ago about creating value for the customer. Wouldn't the social media companies say, we are creating value for the customer by using their data? And the consumer often is going to say, no, you're exploiting me by overusing the data. Isn't that the argument that the lobbyists and executives on the social media side would make?

Ross Dawson: (19:59)
Yeah, absolutely. But what matters is what the individual thinks.

Don MacPherson: (20:03)
Yeah. And like I said, I am surprised at how willingly people are sharing their data, sharing their locations. It's mind-boggling to me. I think a lot of it has to do with ignorance, with just not knowing exactly what's being stored and how powerful it is in the aggregate, you know, if it is being sold or shared. We haven't read those user agreements. And so that puts us in a very vulnerable position, I feel, as consumers.

Ross Dawson: (20:43)
Yes. And that's part of the education; education is a significant part of how we get to a better outcome here. To a certain degree that has happened now, as people are waking up. There are two aspects to that. One is simply, over time, seeing what's going on and how things are working. The other is the sheer scope of the data. We are literally looking at, depending on which pieces you're taking, certainly hundreds-fold more data about us individually than there used to be.

Don MacPherson: (21:17)
And it's continuing to grow. I heard something recently: I think in 2020 we created 44 zettabytes of data, and we're on our way to about 12,000 zettabytes, I think in an eight or ten year period. So incredible growth of data, and how that data will be used or manipulated is of great concern. And it should be, because we've seen companies, and not just social media companies, use data to manipulate and exploit. Imagining that there might be 500 times the data that currently exists about human beings means that the ability to manipulate and influence is greater than ever.

Ross Dawson: (22:09)
Yeah. And I'd just like to take this in a slightly different direction by saying that social media are a commons. We're all participating; we have common value. The potential for value creation for everybody comes through the breadth of participation, and how it's managed determines how that value is allocated. Now, look at the world of health, where there are literally billions of times more data than there was before, and the ability to have data about individuals and to aggregate it, to understand how it can be used to help other people. Case in point: if you had a full genetic profile, behavioral data, and every aspect of the ailments of every individual on the planet, the boon to medical science would be unimaginable. We could understand, for an individual with a particular genetic profile, a particular history, and particular behavior, by looking at a whole pool of other people, what the interventions would be that would make them as healthy as possible. Yet that is fraught, of course, given the extent of personal data that would be required to make it happen.

And so this is one of the big challenges for us moving forward: how do we take this aggregation of immense amounts of individual data, which people rightly feel is very personal, and use it to create common value for society? There is no simple answer to this. But I think this is an analog, in a way, to social media.

Don MacPherson: (24:03)
How do you see social media companies managing fake news going forward?

Ross Dawson: (24:08)
My next book is about how individuals can best make sense of a world, a universe, of information, knowing that some of the information out there is not going to be accurate. In the case of social media organizations, I suppose the crude approach is to use a whole variety of AI-based and other technology tools to assess the veracity of a particular piece of news or an article posted on the site, to determine whether it should be allowed on the site or whether it should be flagged as contentious. But the thing is, there start to be judgment calls. Is this debatable? Okay, maybe it's just presenting one side of an argument, but that is one arguably valid side of the argument, for example.

So you have to get down to cases. Some things are clear: this is a fact they state in this article, and that fact is incorrect. You can say such-and-such politician did this, and you can go back and say, well, actually, no, they didn't. And so you can say, all right, that article is not appropriate to be put out. But there's a lot more which is far more vague. And so this is where social media organizations face the challenge of how you scale this judgment: you apply various degrees of everything from banning, to flagging, to not making something very visible in people's streams, to a whole array of other measures. How do you scale that? It requires, of course, many people who have good judgment, which is pretty hard to get.

And then there are also the political aspects of this, in the sense of: what do politicians think about these policies? Do they believe we are biased in the way in which we are filtering these things? And if the government in power is threatening to do things to them (actually, this sounds like a real-life story!), then they start to shape the ways in which they filter those things. These are realities; what I've just described is what has been happening. I can't see a way in which that's not going to happen. Politicians will sound off, and companies are subject to those politicians' regulations. There is no purely objective way to filter this information correctly, so it will be subject to social aspects, and ultimately the aggregate of that is political. This is deeply unfortunate, but it is essentially the situation we're in.

Don MacPherson: (27:08)
I like the idea of some sort of AI scoring. That's something that I've been thinking about for a while. So something gets posted and it's not exactly true, but it's not an outright lie, so it gets a score of, say, 72 out of 100. An absolute truth gets 100; an outright lie gets a 1. But there's potential for bias in that, and something that's accepted as a fact may be proven to be an untruth later on. So there are some issues with it, but I've been thinking about the same thing as well.
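[As a rough illustration of the 1-to-100 scoring idea Don describes, here is a minimal, purely hypothetical sketch in Python. The claim list and the fact-check lookup table stand in for what a real system would get from trained claim-extraction and fact-verification models; every name here (score_post, fact_checks) is invented for illustration, not taken from any real product.]

```python
# Toy sketch of the 1-to-100 "truthfulness score" idea from the conversation.
# A real system would use trained claim-extraction and fact-verification
# models; here a simple lookup table stands in for the verifier.

def score_post(claims, fact_checks):
    """Score a post from 1 (outright lie) to 100 (fully verified).

    claims      -- list of factual claims extracted from the post
    fact_checks -- dict mapping a claim to a verdict in [0.0, 1.0],
                   where 1.0 means verified true and 0.0 means refuted
    """
    if not claims:
        return None  # nothing checkable: no score, flag for human review
    verdicts = [fact_checks.get(c, 0.5) for c in claims]  # unknown -> 0.5
    mean = sum(verdicts) / len(verdicts)
    # Map the 0-1 average onto the 1-100 scale Don describes.
    return max(1, min(100, round(mean * 100)))

checks = {
    "the politician voted for the bill": 1.0,   # verified
    "the bill doubled the deficit": 0.0,        # refuted
}

print(score_post(["the politician voted for the bill"], checks))   # 100
print(score_post(["the bill doubled the deficit"], checks))        # 1
print(score_post(["the politician voted for the bill",
                  "the bill doubled the deficit"], checks))        # 50
```

[Even this toy version surfaces the issues Don raises: the verdicts themselves can carry bias, and a claim scored 1.0 today may be revised later, which would change the score retroactively.]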

How do you think that these companies can better manage hate speech and cyberbullying without impeding free speech?

Ross Dawson: (27:55)
I think we can get, let's call it academic, input from people who study the impact of these kinds of bullying: understanding, in the broadest sense, what it is, the way it manifests itself, and how we can mitigate it. This is where there is actually plenty of social research which is very valuable in framing these kinds of choices and discussions. And I think this is an area where we tend to be, in a way, harsher than in some other domains, because we are impacting people's psyches. It has been demonstrated that engagement in social media has had many negative effects on people's feelings of wellbeing, and the ways in which bullying can be scaled make this particularly important, obviously most of all for younger and more vulnerable people.

There is no easy way to draw the line, but I do believe it's possible to ascertain whether something is having, or could have, a negative impact on people, and to filter that out. Very early on, of course, we had the ability to block people on social media, but as people point out, there are things which are said which, once you've seen them, you can't unsee. They leave an impact on you. You can block the person afterwards, but the words have still had their impact. There are certainly ways in which AI can be a significant part of the picture for predicting bullying, as it were. The first step is to provide as many tools as possible for individuals to control and protect themselves, to put up layers of protection, protected accounts, or whatever is appropriate for them. Then there can be some degree of flagging, through AI or other systems, of things which would be a negative for an individual or which cast aspersions on sectors of the population. And then there's the human judgment role again, which is the same issue as with fake news: the challenge of scaling the number of people with good judgment who can promptly get on top of what is happening is a very deep one. But we can design systems which are as good as they possibly can be at addressing these issues.

Don MacPherson: (30:32)
The de-platforming of President Trump while he was a sitting president, I think, gave people a real clear understanding of just how powerful these social media companies are, and particularly the social media executives. I'm curious to know what concerns you have over that power, because if it can be used against a conservative politician, it can be used against a liberal politician. It illuminated the incredible amount of power placed in the hands of these unelected people: Mark Zuckerberg, Jack Dorsey, and others. I'm just curious how you see that playing out in the future.

Ross Dawson: (31:18)
As you say, it is a massive amount of power, particularly in the case of Trump, where much of his impact was through his Twitter, and that was taken away. As you say, a corporate executive decision had massive political implications. So I suppose the way to frame this is to ask: how can we build in accountability for those decisions? In terms of financial impact, the impact on the company; in terms of social impact; in terms of how politicians or political structures are guided. This is the center of how we need to be framing these debates moving forward; it is a critical juncture, and that question is part of it. I don't have an answer, and I don't know anybody who has a really good answer to that question either. So, in a way, I've been answering all these questions by saying: these are the potential pieces of the answer which we need to consider in order to get to the vision of me and my peers who were there from the very outset of social media. That vision is that these are a commons, that they are not owned, that they are something created by and managed by all of us collectively.

Don MacPherson: (32:56)
Somebody suggested to me that this is like a publishing platform and that the companies need to be accountable for what's published there, like The Washington Post or The New York Times or any other newspaper. That just seems absolutely impossible to me. Rather than holding these companies accountable at that level, the individuals have to be held accountable, or maybe the advertisers are held accountable if they're advertising on something that is clearly fake. And then if the advertisers push back on the platform, maybe there is self-regulation, versus having politicians regulate. These are just some of the things that I've been thinking about. But it definitely is not a newspaper or a television channel. It's not possible.

Ross Dawson: (33:54)
Absolutely. They can't manage that, however much politicians would like them to do so. They are platforms. You can post; you can't moderate everything. There do need to be levels of moderation, but you can't make a technology company responsible for everything that goes up on it. They have responsibilities and accountabilities, but that is not to say they are liable for everything that goes up.

Don MacPherson: (34:23)
We're recording this in March of 2021. And you've been following social media since its very early days. In your opinion, has social media benefited humankind more than it's hurt us?

Ross Dawson: (34:36)
I would say yes, in all. I think it's hard to remember what it was like before social media in terms of the ability to connect. Everyone has reconnected with people they went to school with or people they've worked with in the past. We're able to keep in touch with people, to share our lives with family and with others. These are all, I think, fundamentally good things. Humans connect; that's what we do. And I think we've been able to do that in ways where everyone would miss it if we did not have these tools to connect with other people. There are many downsides, including, I think, many people getting sucked into the vortex of following the feeds, et cetera. There are a lot of mental health issues in terms of comparing yourself to others, and I could go on for a long time about the negatives. But in answer to your question, comparing never having had social media to having had it, yes, I think overall it has been a positive. The negatives have become extremely apparent, but again, how do we create a positive future? We need to look at how to augment the positives and minimize the negatives.

Don MacPherson: (35:58)
I saw a television interview that you did back in 2012, and you said that social media was going to move to the center of how things work: how governments work, how people find jobs, the center of our lives. Nine years later, it's clear that you were absolutely right. What does the future of social media look like when you think out five years, ten years? What do you see?

Ross Dawson: (36:20)
Firstly, just putting on the futurist hat as to where it will go: yes, our desire to connect is unquenchable, and it is going to become richer and richer. We will have more and more virtual reality connections; we'll have augmented reality used to bring other people into rooms and have conversations with them; and yes, we will have thought interfaces, not just to devices but to other people. We will be sharing more, and potentially sometimes with smaller groups of people. So there are a lot of technologies that will deepen the extent of that connection. But back to the themes of our conversation, in terms of what this looks like as an industry: I think I wrote a blog post maybe 10 years ago asking, what scenarios are there for Facebook not to be the dominant platform five years from now?

I looked through a number of possibilities and said, well, no, it's not very likely. Now I think that's again a great question to ask. LinkedIn is relatively benign, and I think it will continue to exist, as it's hard to compete with on its own ground. We're starting to see a lot of younger-generation social media which is far more about fast-paced sharing, and I think we're going to see a lot more of that: a lot more visual, a lot more video, a lot more use of these deeper forms of engagement for very rapid-fire, short-form exchanges, while those who are not teenagers engage more in tools such as what are currently Facebook and Twitter. I think we can start to ask the question: what goes beyond this? One of the key things is going to be still looking to that dream, that potential, of an agnostic third-party platform which is open, where people can engage and go beyond that. That's not likely, but I think something which is possible is that we start to see viable alternatives to the current giants.

Don MacPherson: (38:31)
Ross, where can people learn more about you?

Ross Dawson: (38:34)
The easiest place is RossDawson.com or @RossDawson on Twitter. Or you can follow me on LinkedIn at Ross Dawson.

Don MacPherson: (38:44)
Wonderful. We'll put that in the show notes. Ross, really appreciate your time today, and thank you for being a genius.

Ross Dawson: (38:51)
It’s been a real pleasure to be with you.

Don MacPherson: (38:54)
Thank you for listening to 12 Geniuses and thank you to our sponsor, the think2perform Research Institute. In the next episode, we'll explore the future of policing with Jim Burch, president of the National Police Foundation. That episode will be released on April 20th, 2021. Thank you to our historical consultant, Brian Bierbaum and a very special thank you to 12 Geniuses producer, Devon McGrath. This will be Devon’s last episode as a member of the 12 Geniuses team. For two and a half years, Devon has been largely responsible for the quality of the guests and the show itself. If you're listening to this show, there's a good chance Devon had a hand in bringing 12 Geniuses to your attention. Devon, we wish you the best of luck in the future. You're going to continue doing great things.

To subscribe to 12 Geniuses, please go to 12Geniuses.com. Thanks for listening, and thank you for being a genius.

[END]