
Episode 141 - How Social Media Censorship Affects Your Election Information Diet

As the US presidential elections approach, Big Tech companies are clamping down and enforcing social media censorship policies in a seemingly ad hoc fashion.  

In this episode, Max dissects how the New York Times covered the social media blackout of a New York Post article. He also examines Big Tech's guidelines on censorship, or lack thereof, and what this means for us. Lastly, he explores multiple scenarios for how social media will evolve over the next few years and identifies potential error sources in polling.

Listen to the full episode to learn how social media censorship influences the information you consume.      

Here are three reasons why you should listen to the full episode:

  1. Examine the growing role of Big Tech in social media censorship.

  2. Explore multiple scenarios of how social media will develop after the election.

  3. Appreciate the math that goes into electoral polling.

Episode Highlights

New York Times’ Bias

  • Max discusses a New York Times article about how Twitter reversed a move to block a New York Post article on Hunter Biden. 

  • The New York Times’ bias harms the Democrats’ chance of securing the election. 

  • Biden’s victory might be in jeopardy because of the Streisand effect: the more you try to conceal something, the more it gains traction. 

  • The Times article casts doubt on how the Post obtained the information. 

  • Max feels that online platforms do not apply censorship standards evenly. 

Censorship and Disinformation

  • The article cited a Facebook spokesperson who said that they support free expression while implementing guidelines to minimize harm. 

  • These are opposing ideas, since Facebook has become the de facto arbiter of what counts as “harmful” content.

  • Censorship is getting more brazen but might ease after the election. 

  • The hot topic has shifted from regulating hate speech to controlling disinformation. 

Evading Censorship

  • Internet activist John Gilmore said, “The net interprets censorship as damage and routes around it.”

  • Max explains that centralized or linear information topologies are more vulnerable to censorship, while lattice topologies are less vulnerable.

  • The Internet's lattice topology has changed — Facebook and Twitter act like centralized information hubs.

  • New services might spring up to deliver censored information.

The 2016 Elections

  • Twitter became prominent during the 2016 elections. 

  • The public criticized Twitter and YouTube for having algorithms that sent people toward extreme content. 

  • Max explains that algorithms hijack your mind to keep you scrolling. 

  • The backlash in 2016 was that social media platforms didn’t do enough to stop Trump content.

Back to 2020 

  • Social media giants never came up with a solid set of information policies. 

  • Blockages and censorship remain rampant. 

  • Big Tech firms only respond when there's pressure from a particular interest group.  

What to Expect in 2024? 

  • Ideally, Twitter and Facebook will lose some of their influence. 

  • Realistically, the status quo will remain unchanged until there’s a significant paradigm shift. 

  • Mega-trends take around a decade or two to develop. We might have to wait for several elections before things change. 

Bear and Bull Scenarios

  • Max envisions a scenario where Big Tech firms end up even more entrenched. 

  • Another possibility is that an internal revolution happens, and one of these companies changes dramatically. 

  • Another, less likely possibility is that Big Tech firms lose their power as alternative platforms rise.

Overcompensating for 2016 

  • Max dissects another Times article about Facebook and Twitter overcompensating for what happened in 2016. 

  • Max observes that the Times uses the word "cyberattack" not in the traditional sense but in the context of disinformation.

  • Engineers have a considerable influence on content since they create the tools. 

  • Max feels that the Times is deliberately using emotional and suggestive words.

  • The Times article argues that Facebook and Twitter are targeting powerful people, but Max counter-argues that Facebook and Twitter are the ones in power. 

Complexities in Polling

  • Max discusses the Trafalgar Group, the only pollster that puts Trump ahead of Biden. 

  • Trafalgar founder, Robert Cahaly, says that the only people who answer polls have either extremist views or a lot of free time.

  • Cahaly raises the possibility that some respondents may be lying. 

  • Listen to the episode to learn about the many difficulties in electoral forecasting!

5 Powerful Quotes

“I could see standards being applied personally. But how can anyone think that's being applied evenly?”

“The algorithms and all of that are not designed to work in your interest as the consumer. They're designed mainly to work in the interest of the business, which is to keep you on there for longer and longer.” 

“It doesn't seem like they have any set rules to do this. It's all kind of like ad hoc, and it almost seems like it's driven by pressure from certain interest groups, whoever they're talking to.”

“What's happening now is simply that as these companies move to rid their platforms of bad behavior, their influence is being made more visible, rather than letting their algorithms run amok, which is an editorial choice in itself. They're making high stakes decisions about flammable political misinformation in full public view with human decision-makers who can be debated and held accountable for their choices. That's a positive step for transparency and accountability.”

“[The New York Times suggests that] the people who are in charge of Facebook, the people who are in charge of Twitter, they're not the powerful people who are used to getting their way. But it seems like it is the reverse. It seems like the people who are in charge of Facebook and Twitter are the powerful people who feel or who are used to getting their way and they're continuing to get their way.”

Enjoy the Podcast? 

Are you hungry to learn more about how social media giants are controlling your information diet? Do you want to expand your perspective further? Subscribe to this podcast to learn more about A.I., technology, and society.

Leave us a review! If you loved this episode, we want to hear from you! Help us reach more audiences to bring them fresh perspectives on society and technology.

Do you want more people to understand social media censorship policies? You can do it by simply sharing the takeaways you've learned from this episode on social media! 

You can tune in to the show on Apple Podcasts, Soundcloud, and Stitcher. If you want to get in touch, visit the website, or find me on Twitter.

To expanding perspectives,

Max

Transcript

Max Sklar: You're listening to The Local Maximum Episode 141.

Time to expand your perspective. Welcome to The Local Maximum. Now, here's your host, Max Sklar.

Max: Welcome, everyone! Welcome! You have reached another local maximum. Today is going to be what I think is a pretty quick solo show. I always say pretty quick in the beginning, but you never know if I can go on for an hour, but I'm pretty sure I'm not going on for an hour today—maybe 20, 30 minutes. And we are going to talk about some of the news stories—yes, related to the election—so my apologies, I know some of you are sick and tired of the election.

But look, we've got two weeks left until the election, and then you won't have to hear—well, we might have to talk about the fallout—but you won't have to hear this stuff for a while: about the polling and about, you know, social media. But you will hear about social media content moderation and censorship, of course, because that's something we've talked about on The Local Maximum, but not in such a politically charged, you know, environment. So, alright, why don't I just get to it?

Today's first story is going to be the social media censorship of the New York Post story on Hunter Biden's emails. You all know about this; some ground rules for me talking about it. I know a lot of you are following these stories in other news and political podcasts, so I try not to go after the big headline of the day. I'm kind of breaking that rule a little bit, so I'm just going to summarize it, because I know you guys are following the news. It's big news everywhere.

And then maybe we could talk a little bit about where our social media platforms might be headed, like what this really means. And also, I'm not going to get into the story itself in terms of like, you know, what did Hunter Biden do and all that. I'll let that develop. Or is this even real, I'll let that develop over the next few days. Because I just, I don't know, I'm not a journalist, but I do observe. I do observe companies like Twitter and Facebook. And I do have some, let's say personal experience in kind of moderating these large user-generated content systems. So I just want to talk about that aspect.

So we're going to start with a couple of New York Times articles from this weekend—this is the Times not the Post. And even though I know that the Times is biased as we'll see, I don't want to focus on the Post email itself. So unfortunately, it's getting harder and harder to see the New York Times as attempting to be objective. I can understand when you're making the attempt to be objective, and then kind of falling short, but it doesn't even look like they're attempting anymore.

And this is harming—this is harming the Democrats—this is harming their side. I think that the expected election win for Biden might be in jeopardy, not from the emails, but from the response here, due to the Streisand effect—the effect that if you try to hide something, more people share it, and it looks even worse when it comes out. But let's talk about this New York Times article; this is from October 16th.

And they say that this is after Twitter blocked a New York Post article linking to Hunter Biden's emails, which were apparently found on some laptop that he dropped off at some store. And then Rudy Giuliani picked it up. I'm not even going to go into the whole thing. But the headline is that, in a reversal, Twitter is no longer blocking the New York Post article, because they got a lot of pushback on that. You see, last Wednesday, Twitter blocked the links to the news article so you couldn't tweet it. It's not like they blocked individual people who were tweeting it—you couldn't tweet the link to the newspaper article. And the New York Post is, you know, a pretty major newspaper. You couldn't DM it to someone, you couldn't message it to someone. So that's really interesting. I think that kind of takes things to a whole new level.

What I find so crazy is what insane things people say on Twitter that don't get any response. And so, look, if you think the link to the Post, maybe, lacks journalism standards or something—I don't see that as any kind of bar they're setting that filters that. They would have to filter out a lot of other stuff. But anyway, the Times—I mean, that's journalists checking on other journalists; they're basically setting the journalistic standards there. So the Times is careful to say—the New York Times is very careful to say, "Many questions remain about how the New York Post obtained the emails." But again, I can't see how anyone can see that as an even-handed response.

You know, there has been, remember the dossier on Trump and all that? If you listen to—well, look, there's all sorts of things that come out and sometimes some organizations are happy to put it out early. And some want to wait. But why start now? They also—the New York Times is very—they make sure that they have certain phrases in here like, Twitter is doing this “under pressure” from the Republicans. It's always, you know, under the pressure.

So, anyway, the Times says that this is kind of Trump’s fault, and the Republicans are like these rabid dogs ready to pounce on Twitter's every move. “Other misinformation experts said Twitter and Facebook have had little choice but to make changes on the fly because of the norm-breaking behavior of Mr. Trump, who uses social media as a megaphone.”

Okay. I could see standards being applied personally. But how can anyone think that's being applied evenly? I'm going to continue to read, “From the start, the New York Post article was problematic. It featured purported emails from Hunter Biden, a son of Joe Biden, and discussed business in Ukraine. But the provenance of the emails was unclear,”—it's like kind of where they came from— “and the timing of their discovery, so close to the election appeared suspicious.”

And then, there's kind of a contradictory claim by Facebook spokesman Andy Stone, who said, "We remain committed to free expression, while also recognizing the current environment requires clearer guardrails to minimize harm." Those are two opposing ideas—it's very hard to make that claim unless you're going to say how you're going to link those two ideas, how you're going to adjudicate between them. Again, a pretty simple statement for me to make.

But hey, it's just we've got to make it over and over. Because that's how it goes these days. It seems like this stuff, this censorship stuff, it only gets more brazen with every passing month. So I do want to talk a little bit about, what does this mean for the future? I think it actually may cool down a little bit after the election. I think a lot of it is election-related. But I think that's only temporary because we've been talking about this for ages.

You know, a couple years ago, Mark Zuckerberg was worried about hate speech. I think I talked about that in episode nine. And he said he basically had a five-year plan to use AI to filter out hate speech. But now we're not talking about hate speech anymore. It's interesting. Is hate speech a problem on Twitter and Facebook? Still, I would argue that it is, they haven't done a very good job filtering it out.

But it's in terms of—now, they're concerned about disinformation more than hate speech. So the narrative has shifted quite a bit over the last four years—over the last two years, really—since I started the podcast, which I find very interesting. And yeah, there are other people saying, you know, really over-the-top inflammatory things about the emails. I mean, one Republican here, I think, said something like, you know, "My sources just told me, they saw Hunter Biden raping and torturing little Chinese children." This is just—this is a Republican with a hundred followers.

Unbelievable. Like, look, I mean, if you're just gonna censor someone for putting out false information, why not censor that person instead? We're talking about just the New York Post saying, oh yeah, we found these emails, which are probably correct. Anyway, this whole thing reminds me of a quote from John Gilmore. I'm going to go back in time here. John Gilmore is an internet activist from a past generation. I mean, he's still around, but he said this in 1993, so that really was a past generation of the internet. He said, "The net interprets censorship as damage and routes around it." That is a really interesting statement to me.

And I kind of almost started to think about my episode on topology with Aaron—let me see, what episode was that? It's funny that I started with a political topic and now I'm getting back into topology, which, he said, might not have a point. But anyway, we talked about pointless topology in Episode 133. And I think the idea is that certain network topologies are more amenable to censorship than others.

So for example, a line topology is kind of like a game of telephone between me and someone else at the end, where the message has to go through each person. That's very easy to censor, because each person can censor the message on its way through. Another network topology is kind of like a hub-and-spoke. So for example, I might put my message out and give it to someone important, who gives it to a centralized service, who gives it to someone important on the other end, who gives it to my recipient. And, you know, the centralized service, since everything goes through them, can just censor messages and information and the like. But when you have something like a lattice—say a two-dimensional lattice, kind of like a grid, with each point being a node—if a few nodes are not willing to share information, you can route around them. So it's very interesting.

And then, of course, if you want to talk about a continuous topology—it's just a surface. If I want to get a message from one point on the surface to another, and there are blotches on the surface, I can easily go around them. So I think that's what he means: the net interprets censorship as damage and routes around it. Unfortunately, now, the topology has changed. We're not talking about the more, I guess, lattice-like topology of the internet. We're talking about the centralized, hub-and-spoke topology of these centralized services like Facebook and Twitter.
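Max's topology point can be made concrete with a toy sketch (my own illustration, not code from the episode): model each topology as a small undirected graph, mark one node as censored, and use breadth-first search to check whether a message can still get from sender to recipient. The node names and graph sizes here are made up.

```python
from collections import deque

def can_route(edges, src, dst, censored):
    """BFS from src to dst over an undirected graph, skipping censored nodes."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    if src in censored or dst in censored:
        return False
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, ()):
            if nxt not in seen and nxt not in censored:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Line topology (game of telephone): A - B - C.
# Censoring the middleman B cuts A off from C.
line = [("A", "B"), ("B", "C")]
print(can_route(line, "A", "C", censored={"B"}))   # False

# Hub-and-spoke: every message passes through the hub H.
hub = [("A", "H"), ("C", "H")]
print(can_route(hub, "A", "C", censored={"H"}))    # False

# 3x3 grid lattice: censoring one interior node leaves other paths intact.
lattice = [((r, c), (r, c + 1)) for r in range(3) for c in range(2)] + \
          [((r, c), (r + 1, c)) for r in range(2) for c in range(3)]
print(can_route(lattice, (0, 0), (2, 2), censored={(1, 1)}))  # True
```

In the line and hub-and-spoke cases, a single well-placed censor severs the route; in the lattice, traffic simply routes around the blocked node, which is exactly Gilmore's claim.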

But the interesting thing to think about is they are built on top of the internet. So is the internet going to respond by, say, maybe having new services that route the information around? We've seen that starting to happen. And does that matter? I think that remains to be seen. I think the question to ask is, what does this all look like in 2024?

First, let's go back to 2016, because 2016 was the first election where Twitter was that big. I was on Twitter in 2012—and Facebook—but by 2016, people were very used to it, and it was a very different medium. And so in 2016, these companies were criticized—remember, the ideology of the people who started Twitter was free speech to begin with, don't forget that. But they were criticized not just for allowing too much open debate, but for using their algorithms to get people hooked and addicted and to send them to more extreme content.

And I tend to think that's probably correct—that they did that. I don't know if it was necessarily all right-wing content. Some people say the extreme content was right-leaning—and therefore we must get rid of all right-leaning content. They kind of put everyone in this one box of extreme content. I think that's wrong. But was it getting people hooked and sending them to more extreme content, or more addicting content? Yes, yes!

I think that was—especially YouTube—but the algorithms and all of that are not designed to work in your interest as the consumer. They're designed mainly to work in the interest of the business, which is to keep you on there for longer and longer. And it's kind of sad if you think about it. Like it's sort of—they're in the business of like, hijacking our minds and getting us to do things that are kind of useless to us but helpful to them. Which is not, I mean, that's kind of a parasite. I don't like that idea.

Okay, and then the criticism in 2016—from the left, I guess you could say—was also that they didn't do enough to stop Trump content, as if that's the only problem. But okay, they took that to heart, and now they've kind of had to overcorrect. And where do we land four years later, in 2020? We have just blatant blockages and censorship, and they never wrote down a very clear set of guidelines—or at least a clear set of guidelines they can amend—that says what can be taken down and what can't.

I mean, even if they want to say, okay, now we're going to write down a set of journalistic standards that we're going to hold everyone to, and the New York Post did not meet our journalistic standards in that one article—well, they never even wrote down those journalistic standards. So how do we know that the censors over at Twitter are actually applying this evenly and fairly? It doesn't seem like they have any—and Facebook as well, all of them.

It doesn't seem like they have any set rules to do this. It's all kind of like ad hoc, and it almost seems like it's driven by pressure from certain interest groups—whoever they're talking to. And it's not necessarily, oh, the Democrats. I think it's more like certain journalistic factions, certain groups that, for some reason, have the ear of social media who have gotten in there.

I don't know exactly how, but that's how it seems. So if you want these guys to get more and more power because you like what they're censoring—I mean, I don't think that the people who are in control are necessarily the people you like. And you know, what happens when someone you don't like has that power? So where does this land us in four years? Where does this land us in the year 2024? I was thinking about that. And my hope is that these organizations—Twitter and Facebook—have much less power. We can come back in a few years and check on that. That's one possibility.

It's always a possibility that we kind of stay on the road that we're on. And basically, these organizations kind of freeze up and keep doing what they're doing, and we sort of need to wait for the next paradigm shift. There would be a lot of, let's say, precedent for that outcome. You know, it's sort of like, hey, we've got newspapers, there are a few major ones. We've got cable channels, there are a few major ones—it's very difficult to start a new cable channel; people do sometimes, but, you know. Look, the hegemony of CNN, MSNBC, and Fox News and a few others is kind of set in stone.

And the only thing that really disrupted that is the rise of the Internet and the rise of social media. So maybe the next paradigm—which will probably be internet-based as well—will have to route around that, as that quote says. But, of course, it might take a long time: these large trends, these large platforms, take a decade or two when you look at emerging technologies. So in this case, we'd have to wait several elections to see any difference.

I think that this outcome—this calcification, this freezing up of what we have now—is probably most likely to happen if we see a large amount of regulation on these companies, which they're actually calling for. Because then, it will make it even more difficult for competitors to come in, and it will make it more difficult to change things. So you'd kind of have the lawyers in the front seat—which they already are at a lot of these companies—but it would be very difficult to come in with a new startup.

So another possibility is that you somehow have an internal revolution in these companies. That could happen if their current actions are deemed unprofitable. If they take a look at what they're doing and say, look, you know—well, first of all, one thing that could happen is they could say, look, we're all colluding. All these big socials are colluding; we're kind of doing the same thing.

So one of them might say, you know what, we're gonna do it a little differently—and they all kind of do it a little differently—but one could really veer off and try to differentiate itself in order to pursue profit. Maybe they could decide that, hey, this certain thing that we're trying to do here—this is all driven a little bit ideologically, and it's really not in the interest of the business. So they might start doing that as well. That pressure could come from public markets, because a lot of these are publicly traded companies.

Another possibility is they could just lose power much more quickly. That's kind of wishful thinking. I mean, these things have ingrained themselves. They're pretty new, but they've ingrained themselves pretty hard into our life and it's very difficult to have consumers change behavior that quickly. But there are alternatives that are rising. 

I don't think that any of these, you know—okay, some conservatives are going to like these conservative alternatives. I don't think that's going to stick, because that will never appeal to the general public. It would probably be something that appeals more to young people, who are not as political, maybe, and want to talk about something else. And maybe people who have interests that are far outside the mainstream—that might lead to the rise of different social organizations. But who knows, all this remains to be seen.

I think I'll come back to my list—I don't have to come back in four years; I can come back in six months or a year, and we'll see how it holds up. All these ideas are kind of born out of another Times article that I want to go over real quickly. And I want to quote from—this is from the 15th, the day before. The title of the article is—let me see what it says—“Facebook and Twitter Dodge a 2016 Repeat, and Ignite a 2020 Firestorm.” So again, they are overcorrecting for the criticisms that they received in 2016. I could have easily said that was going to happen.

“Since 2016, when Russian hackers and WikiLeaks injected stolen emails from the Hillary Clinton campaign into the closing weeks of the presidential race, politicians and pundits have called on tech companies to do more to fight the threat of foreign interference. On Wednesday, less than a month from another election, we saw what doing more looks like.”

So you can look at the parallels—the kind of history rhyming here. Hillary Clinton's emails go out; now, with the Hunter Biden emails, they're doing things very differently—it's kind of interesting—both situations involved emails being leaked. Although it looks like the Hunter Biden stuff is not a foreign intervention. I don't even know if the Hillary one is proven foreign intervention. That's just conjecture, I think. But okay, the article goes on later.

“It's true that banning links to a story published by a 200-year-old American newspaper—albeit one that is now a Rupert Murdoch-owned tabloid”—that's an interesting Times dig at the Post—“is a more dramatic step than cutting off WikiLeaks or some lesser-known misinformation purveyor. Still, it's clear that what Facebook and Twitter were actually trying to prevent was not free expression, but a bad actor using their services as a conduit for a damaging cyberattack or misinformation.”

A lot of these words are kind of—like “cyberattack.” I think of a cyberattack as actually attacking the website—making it dysfunctional, or taking over somebody's account and stealing money—not putting out information. You know, but the Times calls it a damaging cyberattack. The Times also says that if the social media companies did nothing, they risked “getting played again by a foreign actor seeking to disrupt an American election.” They didn't know that it apparently was not a foreign actor.

So the ending to this article is kind of amazing. So I'm just gonna read the whole thing, and maybe react to it real fast. The Times goes on—and then I'm done with quotes—the Times goes on:

“The truth, of course, is that tech platforms have been controlling our information diets for years, whether we realized it or not. Their decisions were often buried in obscure community standards updates, or hidden in tweaks to the black-box algorithms that govern which posts users see. But make no mistake: these apps have never been neutral, hands off conduit for news or information. Their leaders have always been editors masquerading as engineers.”

So this is very interesting. I feel like the Times has given away the game here, which, you know, I've seen. I have not seen that happen at Foursquare but I totally could see how that happens.

Because engineers, like myself, are given a lot of power over what happens with the content, because we make all the tools that control the content. And then, you know, people just say, all right, good enough, basically. So anyway, the Times goes on.

“What's happening now is simply that, as these companies move to rid their platforms of bad behavior, their influence is being made more visible, rather than letting their algorithms run amok, which is an editorial choice in itself. They're making high-stakes decisions about flammable political misinformation in full public view with human decision makers who can be debated and held accountable for their choices. That's a positive step for transparency and accountability even if it feels like censorship to those who are used to getting their way.”

It's interesting—“feels like censorship” is kind of an interesting term the Times uses here. This is why I feel like the New York Times isn't actually—they're using terms that are very emotional and sort of suggestive, which they've been doing for many years—but it's so obvious here. And finally, the Times says, “After years of inaction, Face—” I wouldn't say years of inaction, but anyway—“After years of inaction, Facebook and Twitter are finally starting to clean up their messes. And in the process, they're enraging the powerful people who have thrived under the old system.”

So that's a really interesting narrative. So it's no, the people that they are getting rid of are the powerful people, the people who are in charge of Facebook, the people who are in charge of Twitter—they're not the powerful people who are used to getting their way. But it seems like it is the reverse, it seems like the people who are in charge of Facebook and Twitter are the powerful people who feel or who are used to getting their way and they're continuing to get their way.

So basically, I don't know—that's all you need to know. Have we been boiled alive here? Have we been kind of, you know, set up by these companies to just take control—control what we see and what we don't see? I mean, I've never been happy with my information diet on these socials, and I'm thinking very hard about how I can exit them. I don't know exactly if it's possible. I advertise the podcast on Twitter and Facebook—many people find it there—but I'm seriously considering finding a way to exit them. I know my co-host, Aaron—he's not on Facebook, he's not on Twitter. So I don't know, maybe this is a resolution for 2021, to figure out if I can do that. Let me know what you guys think.

So all right. One more article that I want to give—and this is kind of another election article. Also kind of right-leaning, so sorry. Apologies for those of you who get annoyed by this stuff. But this is about the polling—the polling episode that we did last week in Episode 140, with Alex Andorra.

And I got a lot of very positive feedback on that episode, because polling is a very interesting thing. And so I read this article from the National Review about the Trafalgar polls. Trafalgar is a very controversial pollster, because it's the only one that has Trump ahead. So everyone else is saying, you know, Biden's ahead by a lot, while Trafalgar thinks Trump's ahead. So it's interesting to think, all right, well, what does Trafalgar think? What are they going for here? So I want to see what they have to say. This is an interview—well, basically an interview that was turned into an article—with Robert Cahaly, who founded Trafalgar.

One thing he said that I find interesting is that it's very difficult to gauge the self-selection of the people you poll. You end up with people who are willing to sit around answering political questions in the evening; people with kids and jobs have other things to do and are very hard to reach. Or podcasters like me: if a pollster called me right now, I wouldn't answer, because I'm in the middle of recording a podcast. So they're not getting the podcaster vote.

Anyway, he says that the people who want to answer political questions are either very left, very right, or very bored. I like that. Trafalgar sticks to under ten questions, but I still think self-selection is going to be an issue. He also says there's a possible Bradley effect. Bradley was the last name of a candidate who ran for governor of California, I think in 1982, and underperformed his polls. The theory is that it felt socially acceptable to say you would vote for him, so a lot of people lied to the pollsters.

And they're saying, well, that's probably what's happening now. Trafalgar also reports a much larger refusal rate among Republicans and conservatives, and all pollsters notice this: Republicans and conservatives aren't answering the polls. What pollsters normally do, and what I would do in an observational study, like the advertising studies I did at Foursquare, is include a corrective factor for that. So if half of the Republicans refuse to answer, you take the ones who did answer and weight their responses by two. That's an oversimplification, but you can see how it works.
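To make that corrective-factor idea concrete, here's a minimal sketch of reweighting by known population shares. All the numbers are made up for illustration; this is a toy version of the post-stratification that real pollsters do, not any particular pollster's method.

```python
# Toy non-response correction: weight each party group so its share of
# the weighted sample matches its (assumed known) share of the electorate.

respondents = {"D": 400, "R": 200}   # R's refused at twice the rate
support_x   = {"D": 40,  "R": 180}   # respondents in each group backing candidate X

# Assumed population shares, e.g. from party registration data
population_share = {"D": 0.5, "R": 0.5}

n = sum(respondents.values())
# Weight = target share / observed share, per group
weights = {g: population_share[g] / (respondents[g] / n) for g in respondents}

raw = sum(support_x.values()) / n
corrected = sum(weights[g] * support_x[g] for g in respondents) / n

print(f"raw support for X:       {raw:.1%}")        # 36.7%
print(f"corrected support for X: {corrected:.1%}")  # 50.0%
```

Here Republicans answered at half the Democrats' rate, so each Republican response gets weight 1.5 and each Democratic response gets weight 0.75, pulling the estimate from 36.7% back up to 50%.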

And by the way, it's not like pollsters don't know that; everyone corrects for it. They're not stupid. But Cahaly thinks that anti-Trump Republicans are far more likely to answer than pro-Trump Republicans. In other words, he thinks the polling data within the small number of Republicans who do answer is very skewed, and that this skews other polls.

I'm not saying that's exactly what's happening; I'm just saying that's what he thinks, and it's another way to think about it. Another issue that came up is that pollsters need to find people who have a high propensity to vote but a low propensity to be polled. I really noticed this in the work I did measuring ads for Foursquare attribution, where we had to look at people's propensity to see an ad, and their propensity to visit the store the ad was trying to get them to visit.

And this is a very legitimate concern: if a group of people has a very low propensity to be polled and you neglect to poll them, your results can be way off in that subset, which could affect the election forecast. So, to conclude: this is fascinating stuff. As we learned last time, polling is not the open-and-shut calculation many assume it is, and we'll just have to wait until Election Day to find out. Election Day is now just two weeks away, which means I have two episodes before the election. Are both of them going to be about the election? Probably not.
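The propensity idea above can also be sketched in code. This is a hedged illustration of inverse-propensity weighting, the general technique behind that kind of correction: a respondent who had only a 10% chance of being reached stands in for ten similar people. The propensity values here are invented for the example, not taken from any real poll.

```python
# Inverse-propensity weighting: weight each respondent by 1 / P(being polled).

respondents = [
    # (candidate supported, assumed propensity to answer a poll)
    ("X", 0.50), ("X", 0.50), ("X", 0.50),  # easy-to-reach X supporters
    ("Y", 0.10),                            # one hard-to-reach Y supporter
]

# Unweighted share for Y: 1 of 4 respondents
raw = sum(1 for cand, _ in respondents if cand == "Y") / len(respondents)

weights = [1 / p for _, p in respondents]   # [2, 2, 2, 10]
weighted_y = sum(w for (cand, _), w in zip(respondents, weights) if cand == "Y")
ipw = weighted_y / sum(weights)

print(f"raw Y share: {raw:.0%}")   # 25%
print(f"IPW Y share: {ipw:.0%}")
```

The single hard-to-reach respondent carries weight 10, so the weighted estimate for Y jumps from 25% to 62.5%. The catch, of course, is that you have to estimate those propensities well, which is exactly the hard part Cahaly is describing.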

So you might only get one more pre-election episode; I'll probably invite Aaron on. That'd be great. Another thing before we go: if you go to the website, localmaxradio.com, some of you have noticed, and I think I've mentioned this before, that I have a cheat sheet on Bayesian inference you can get if you sign up for the email list. It gives you a quick way to calculate Bayes' rule and solve problems.
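For anyone new to this, a typical Bayes' rule problem of the kind such a cheat sheet covers boils down to one calculation. Here's a quick sketch with made-up numbers (a 1% base rate, a 90%-sensitive test, a 5% false-positive rate):

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E),
# with P(E) expanded over the two hypotheses H and not-H.

prior_h          = 0.01  # P(H): base rate of the hypothesis
p_e_given_h      = 0.90  # P(E|H): chance of seeing the evidence if H is true
p_e_given_not_h  = 0.05  # P(E|not H): false-positive rate

# Total probability of the evidence
p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)

posterior = p_e_given_h * prior_h / p_e
print(f"P(H|E) = {posterior:.3f}")  # 0.154
```

Even with a positive result from a 90%-accurate test, the low prior keeps the posterior around 15%, which is the classic surprise these exercises are designed to build intuition for.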

But what I'm trying to build on top of that is something even more useful. So check it out if you're into Bayesian inference. I'm also speaking at a Bayesian conference at the end of the month, PyMCon. Let me make sure I get the details right: yes, you can actually register for the conference and hear what I have to say. It's on October 31st. PyMCon is the conference for PyMC, which is a Bayesian modeling framework, but it's also for Bayesians in general.

And I will post that on the page for this episode, which is localmaxradio.com/141. So that's PyMCon 2020, and you can register for it. My particular talk is about the question of what probability is, and what I learned through asking that question on the podcast and interviewing a lot of people. Afterwards, I might turn it into a podcast episode, if I can. So, go online and get your cheat sheet on Bayesian inference.

Go to localmaxradio.com and sign up for the newsletter. I don't actually send out a newsletter yet, but I will; for now it's just my email list. I'll only send good stuff, because I don't want to spam you, so that's not going to happen. One thing I do want to build, as an extension of the cheat sheet, is a much bigger worksheet or pamphlet, maybe 10 or 11 pages, on using Bayesian inference to solve problems.

Because Bayesian inference really is one mathematical formulation of the scientific method: it's the scientific method with a particular point of view on how to carry it out. As I look through the steps I take in solving problems, I notice that I often ignore my own advice, to my own peril. When I do, I usually go back and say, oh, if only I had taken my own advice, I would have been able to solve this problem. And it's not just problems at work, not just numerical or data problems; I think this stuff can be used for personal problems as well.

And so I'm coming up with eight steps, who knows, I might change that number, for problem solving, or for gaining new knowledge, based on the scientific method and on Bayesian inference.

I think I can write a good take on that, a good pamphlet. What I'm thinking is that maybe I'll do a step a day on the podcast and talk about each individual step. Maybe I won't do it every day; maybe I'll just do it with Aaron when he comes on. And at the end, we'll have a nicely collated pamphlet for any of us to use when we're solving difficult problems. Sometimes having that framework makes it easier to solve your problems in life.

So I'm looking forward to that. All right, next week: there's been a whole lot of interest lately from people who want to be interviewed on The Local Maximum. So if you're interested in being interviewed, or in just doing a five-minute call-in, let me know. I'm trying to wade through who I should interview and who I shouldn't, so it takes a little while, but I might get a few fascinating people on here. So, great. Like I said, a quick solo show today; glad I kept it at 30 minutes. Have a great week, everyone!

That's the show. Remember to check out the website at localmaxradio.com. If you want to contact me, the host, or ask a question that I can answer on the show, send an email to localmaxradio@gmail.com. The show is available on iTunes, SoundCloud, Stitcher, and more. If you want to keep up, remember to subscribe to The Local Maximum on one of these platforms, and follow my Twitter account, @maxsklar. Have a great week!

Episode 142 - The Newest App for Augmented Reality from Foursquare, Marsbot for AirPods


Episode 140 - Why the US Election Polls Are Tricky with Alex Andorra
