Episode 293 – Big Brother Is Watching with Nolan Higdon

Steve talks with Nolan Higdon, co-author of ‘Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools.’

Steve’s guest is Nolan Higdon, an author and expert in media literacy. They discuss surveillance in education, which Higdon covers in his book, co-authored with Allison Butler, Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools.

Surveillance capitalism, which emerged in the late 20th century, profits from data mining, largely without public awareness. Nolan emphasizes its increased intrusion into schools, particularly following changes to FERPA regulations in 2012 that allowed private tech companies to collect and use student data. The discussion highlights the false sense of security offered by surveillance tools, as well as the biases ingrained in the AI used in education.

The topic takes on special significance when considered along with the broader implications for society, including the erosion of democracy and the intensification of neoliberal ideology that prioritizes profit over public welfare.

Nolan Higdon is a founding member of the Critical Media Literacy Conference of the Americas, a Project Censored National Judge, an author, and a lecturer at Merrill College and the Education Department at the University of California, Santa Cruz. Higdon’s areas of concentration include podcasting, digital culture, news media history, propaganda, and critical media literacy. He is the author of The Anatomy of Fake News: A Critical News Literacy Education (2020); Let’s Agree to Disagree: A Critical Thinking Guide to Communication, Conflict Management, and Critical Media Literacy (2022); The Media And Me: A Guide To Critical Media Literacy For Young People (2022); and Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools (Routledge). Higdon is a regular source of expertise for CBS, NBC, The New York Times, and The San Francisco Chronicle.

Find his work on Substack: nolanhigdon.substack.com

@NolanHigdonCML on Twitter

@ProjectCensored

[00:00:00] Steve Grumbine: All right. This is Steve with Macro N Cheese. Today’s guest is Nolan Higdon, who’s a founding member of the Critical Media Literacy Conference of the Americas, Project Censored National Judge, author and lecturer at Merrill College and the Education Department at the University of California, Santa Cruz.

He also has a Substack. Please follow him there at https://nolanhigdon.substack.com/ and without further ado, I want to bring on my guest. But let me just say this: what we’re going to be talking about today is a subject that I think everyone is feeling at different levels. The prevailing winds of surveillance, crushing us and taking away our privacy as just regular everyday citizens, are out of this world. It’s off the charts. Everyone is feeling it.

But it gets even more dicey when you get into education. You start looking at students, you start looking at teachers, from a union perspective. I mean, this stuff is very serious. So Nolan is going to talk to us about that today. And without further ado, I bring on my guest. Nolan, welcome to the show.

[00:01:50] Nolan Higdon: Hey, thank you so much for having me.

[00:01:52] Steve Grumbine: So you have written a book here, Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools, by Nolan Higdon and his co-author, Allison Butler. And this is really what we’re going to be talking about today. Not just the book, but also neoliberalism and privatization, and how we’ve given away the government’s role in supporting us and handed it over to private actors who have all kinds of different incentives to look and see what we’re doing.

And, you know, I don’t know that everyone wants that done. I don’t know everyone’s aware that it’s happening. And so I would like to really take a dive into this. The book itself, I got a chance to go through some of it. I didn’t read all of it, but I do have it in hand. Very excited to hear more about it.

Nolan, tell us a little bit about this book and lay the groundwork out for what’s happening in education with surveillance.

[00:02:46] Nolan Higdon: Yeah, absolutely. I think it’s important to first start outside of education and then pick up on what you were just mentioning there. Which is that, you know, in the late 20th century, a new economy emerged. The kind of academic term for it is Surveillance Capitalism. And I’ll get to what that is in a second. But basically, what surveillance capitalism turned out to be was a way to make money off collecting and sharing and analyzing data.

And one of the ways in which you do that is you get users to willingly give up their data. And the ways in which users do that is by creating and using social media; using search engines; buying things online; registering accounts; subscribing to services; using smart devices; et cetera.

And there were a lot of organizations and companies interested in user data. So insurance companies want to know how fast you drive to set an auto insurance premium. They want to know what kind of foods you eat, how much alcohol you drink, and whether you smoke cigarettes to set a healthcare premium. Governments want this data to monitor people under the auspices of national security. Advertisers want it to perfect advertising.

So there’s this big market for collecting data, and it makes a lot of these companies very profitable. But this was all going on largely without the public’s knowledge in the late nineties and early two thousands.

In fact, the public would often say, and still to this day says, something that’s, you know, demonstrably false. Which is: Oh, it’s free. Social media is free. Google is free. YouTube is free at some level. The reality is none of that stuff is free. People are giving up their privacy, they’re giving up their data, and they’re also giving up their labor.

Let’s not forget, creating things and sharing things online, that is a form of labor. Those are hours. Sometimes people spend upwards of 10 hours online. And they don’t get paid. But all the stuff that they give creates billions of dollars for this industry.

So the surveillance economy was emerging, and one of the areas that would prove blocked off to this industry was schools. There are a lot of rights that students and teachers have to privacy. Most famously here in the United States, we have the FERPA law, where students have a right to their privacy. So, for example, I teach in higher ed. I teach adults. So if someone’s mother called about an 18-year-old registered in my class, I couldn’t tell their mother they’re registered there, because that student has rights. I can’t tell anyone they’re registered there.

But a lot of the big tech industry wanted to get in the classroom. They would do this by, you know, giving free MacBooks to schools and things like that, that students would then use and then their data could be tracked.

But 2012 was a game changer. This is when we saw a lot more of the tech industry enter the classroom, under the auspices of improving education, but largely to make profits. And in 2012 the Obama administration made some changes to FERPA law, which said you could share private student data with so-called educational partners, companies that have contracts with the school.

Well, this incentivized big tech to make EdTech tools that they could offer to schools at low, cheap prices. And then they would have access to this data. And it became a massive industry. By 2019 you saw things like equity firms buying up all these EdTech platforms. Because they recognized that if the EdTech platform gets the data, the equity firm can also get its hands on the data and trade it in the larger market.

And some of these equity firms have companies, you know, that specialize in, like, Baby Connect, which monitors babies. There are platforms to monitor students throughout their education. There are platforms to monitor individuals’ employment and careers. And that, coupled with other data from social media and smart devices, you can effectively track someone from the cradle to the grave.

And schools play a pivotal part in that. And that’s what we discuss in the book: not only how this is being done and what the legal ramifications are, but most importantly, what are some of the solutions? What are some of the things we can do to stop it?

[00:06:49] Steve Grumbine: You know, one of the things that jumps out at me, and I’d like to be able to keep pivoting back to this because I think this is key, is, you know, we have certain rights. Certain privacy rights granted against government intrusion, against government searching and seizing, and certain privacy protections from the government. Federal law, et cetera.

But we don’t have the same protections from private companies, from places that we work. And, in this case, by proxy, I’m wondering about the schools, which are considered part of . . . whether they’re the federal government or state government, they’re, um, part of the government. And that may be a distinction worth exploring. But it sounds like they’re able to do things because they’re bypassing the protections that bind the state. By allowing private companies to do this, the federal government or police forces or whoever can then consume that data, similar to social media and so forth. Is that, kind of, in the right vein there?

[00:07:53] Nolan Higdon: Absolutely. It’s spot on. And I think this is where the plot thickens. Uh, you know, the Internet and a lot of these tools and smart device functions like touch screens and GPS, all those things were actually developed out of the Cold War. The Military Industrial Complex funded a lot of research, particularly on higher ed campuses, to develop these tools for the Cold War. And it would be tough in, you know, 1956 or ’57 to tell the American public, like, look, the government’s going to put a device in your pocket that can track you. You’ll be able to communicate with others, but they’re going to monitor all communications. But then by the 1980s and 90s, they started commercializing these military products, uh, putting, basically, a happy face or spin or emblem on them.

And people widely adopted them, even though they were tracking them and were tracking their communications. And you’re spot-on that this is a way in which they circumvent constitutional rights. We’re protected from government surveilling us. We’re protected from government censorship. But if government works with private industry to do those same things, so far the courts have said that does not stamp on your constitutional rights. And so government is working with the companies that they allowed to commercialize the very things government created for military purposes to perform monitoring here in the United States.

[00:09:15] Steve Grumbine: You know, I hear people talking about protecting us from fascism. We gotta, we gotta really protect ourselves from fascism. Oh my goodness. Fascism’s here. Am I, am I missing it? Is this not fascism? I mean, the very tenets of fascism?

[00:09:30] Nolan Higdon: A central part of fascism is the erasure of privacy. Look at any of these fascist regimes. They wanted to censor and control information, and they wanted to eradicate any potential spaces of opposition. So, certainly, the surveillance capitalist economy is one that fascists find quite useful.

And I think it’s no surprise that you see folks like [Clara] Mattei, who wrote a book called The Capital Order, talking about how neoliberal ideology, which is behind surveillance capitalism, uh, is generally the stepping stone to fascist regimes. And this idea that government’s bad and private industry is good is a central part of that.

And I think you, um, have seen that here. But to your point about, you know, you hear people worried about fascism. What’s always interesting to me is, like, people intuitively know the problems with these tools, but they choose their outrage in select areas. So when they talked about, like, you know, screen addiction and algorithmic bias and manipulation, American citizens were outraged at TikTok. They were outraged at Elon for buying Twitter and making it X. But all the critiques you could make about TikTok and X and Elon, you could make about the entire Internet economy. But they only allow themselves to critique it in these small spaces, missing the larger picture.

[00:10:47] Steve Grumbine: That’s a funny thing. I have to think about that for a minute. That really is powerful because it is selective. You know, you go to buy something and suddenly five minutes later you go on Facebook and you’re like, wait a minute. Why am I getting ads for this thing I Googled over there? And they’re definitely mining some sort of tendency that they perceive that I have. Or some want or need. Uh, and that’s on the quote unquote goods. That’s on the marketing side, if you will.

But speaking specifically of the workspace, and I know we’ll get into education here momentarily. But, you know, what I’ve been experiencing and what I’ve been watching is an absolute blowing up, an explosion of AI-monitored workspaces. Where every click, your Teams app goes yellow: where has he been? Why isn’t he online? What is going on? Oh, you only worked six hours of your eight-hour day today, you know? And so there’s a whole new flavor of how to manage without managing. And with the explosion of remote work, and so forth, catching people in these snares of AI algorithms and AI . . . Let’s be fair, I mean, it’s literally watching everything you’re doing and reporting back and saying, Hey! Is this a continuation of that surveillance capitalism?

[00:12:03] Nolan Higdon: Absolutely. You know I think it’s also important to note, and Meredith Broussard writes about this in her book, Artificial Unintelligence, that the AI that we’re experiencing today is not the AI that was in films in like the 1950s and ’60s. It’s not this sort of independent, sentient being.

AI produces whatever its human creators tell it to produce. And it makes those decisions based on whatever data is put into it. And so this is where we get back to the surveillance privacy issue. You need large language models, large sets of data, to plug into AI systems to get them to, seemingly, quote unquote, work.

We’ll talk about how well they do and don’t do what they claim they do. But you need that data. So if anything, the explosion of AI has only further created incentive for data collection. You see this in your everyday life if you’re the average person. For example, like, if you buy a new smart television device or something, automatically you have to sign in and create an account.

You know, back in the old days we used to buy a TV, plug it in, and watch it. Now you have to create an account because they want to monitor who you are and what you’re doing.

Same thing when you go to restaurants now. If you go out to eat, they’ll say, yeah, give me your name and a phone number so I can put you in line when it’s your turn. We’ll text you. And a lot of people think, wow, how convenient, I don’t have to stand here, I can wait for a text. But really what they’re doing is collecting more data off of you. This happens all throughout the economy and AI has just put that on steroids. There’s more and more demand for it.

[00:13:31] Steve Grumbine: Yeah, I think about this all the time. It’s like, there’s two sides of the ledger. There’s an asset and a liability, you know? And we’re sold the assets constantly. Let’s be fair. There are things about technology that have really improved our lives, but we haven’t shared in the benefit. It hasn’t been a shared prosperity coming from these tools. It’s been very focused on the capitalist class.

And I love Clara Mattei. And I love the book, The Capital Order. It’s one of my all-time favorite books. She’s amazing. It’s such an important book, folks. If you haven’t read it, read it. Just like this book here we’re talking about today.

But I want to make clear, though, that we’re left with only specks of time to think about things. We are busy. Super busy. It doesn’t mean we’re doing anything, but we’re always kept busy. We’re kept too busy to really pause sometimes and think about what’s going on in the world around us. So, rather than read a EULA, an end user license agreement, or rather than read a privacy statement, or rather than read any of these things, we just click accept and keep going.

So we have, quote unquote, said yes to this surveillance. And, you know, it’s just absolutely sinister, the knowledge that we’re not capable of diving into print that tiny, you know, we’re just not ready to do that. We just don’t have the interest in it. Then, we want whatever’s on the other side of accepting the agreement, so we click go. And then the game’s on. It just seems to be so pervasive in everything we do. It’s got to be a multi-billion-dollar industry, if not more. I mean, I think this is where all the capitalism is going on right now. Am I close?

[00:15:12] Nolan Higdon: Oh, yes. But I think, to your point about tech making things easier, I think that it’s critical to remind folks – we all use this tech, you know? I’m not some, like, Luddite who avoids this stuff. Right now, we’re actually recording using a lot of this tech. So I try to make it into a different conversation, which is not about the tech itself. AI is not bad. Social media is not necessarily bad. It’s the ways in which we’ve allowed these things to be developed and shaped by very narrow market demands.

So imagine a real social media that really connected us. That served to further democracy and strengthen journalism. I’d be all for that. That’s a social media that I would support. A lot of members of the tech industry saw what happened to the promise of the Internet – which went from this, kind of, you know, grand inclusive promise of progress to one of dismal cynicism.

I think when they got involved in the early development of AI – you saw this with the company that developed ChatGPT, which is OpenAI, and they were originally a nonprofit – they recognized they didn’t want to make the same mistake that was made with the Internet. But by 2019, big interests, including Microsoft, came in. They convinced them to adopt a dual model of nonprofit and for-profit. And by 2023, the for-profit side had taken over. So it’s not that the tech itself is bad. It’s that we’ve allowed it to be reduced to for-profit purposes, which then makes the tech severely problematic, in my estimation.

[00:16:34] Steve Grumbine: I agree a hundred percent. One of the things that jumped out at me as you were talking – I was thinking back to Clara’s book, and integrating some of what I know from that into what I’m learning from you and from the bits of your book that I’ve read. It really is important, I think, to understand the way that neoliberalism works. And the way that it works – it’s a concept that the private sector can do things better than the public sector.

And so by divesting public agency to private interests, not only are you now no longer crowding out industry from profiting off of X, Y, Z, but you’re also taking off the guardrails to innovation. Now, these great entrepreneurs are able to come up with these great ideas in Silicon Valley. And the silicon oligarchy and plutocracy out there is able to profit off of this and make all kinds of money off of it.

And now a new industry is born and that creates jobs. And I, and I could see the sales pitch down the way, all the way through to the end. But in reality, though, the more I think about this, neoliberalism is about privatizing the public space.

And all of the things that you described were, in fact – ARPANET, DARPANET – all of those things were created by the Military Industrial Complex, handed over to UUNET and MCI to run, basically. And then what do you got? You end up with this private, bastardized system. And you can go back to the days of the, uh, garage computing club, or whatever it was, that, uh, Jobs, Wozniak and Gates started. And you can look back all the way to there and realize, point blank, that there was a choice. You know, you had DOS [Disk Operating System], if you will, that was free to use, man, the guy was a surfer, didn’t care. And these guys took it. Made it proprietary. Sold it to IBM. And the rest is history.

So this privatization scheme. This lack of providing for the public purpose. And the extreme growth and precision, if you will, of privatizing all these different things means anytime you have a public good, they’re looking at a way for industry to find out, is there a way to do this profitably so we can benefit from it? Not we, the people, benefit from it. But we, the rich capitalist class, benefit from it.

So I guess my question is how do we take that and better educate ourselves? I think neoliberalism, as its own monster, feeds this in so much more than surveillance. So I don’t want to get too trapped in the neoliberalism because we want to talk about surveillance on this call. But help me understand what the incentives are and how this industry has kind of grown in the way that it’s grown.

[00:19:21] Nolan Higdon: Yeah, I think it’s a pivotal question, and one we deal with in education a lot. Because education, particularly higher education, is really responsible for reproducing a lot of this neoliberal ideology in the next generation. And what I mean by that is, neoliberalism, as you describe it, is an ideology that believes that technological progress and marketizing everything will result in more accountability, more efficiency. And that if we allow the experts to be a critical part of that, it will ultimately create an economy and a society that’s better for everyone.

Uh, the neoliberals tend to be color blind. They think things like racism will disappear with merit, with blind meritocracies. And a lot of what I just described there, honestly – if you go to any college campus and talk to professors, they would 100 percent say yes to everything I just described.

So I think one of the things that I’m trying to do in the text here with, uh, Allison Butler, is to make a lot of educators aware of how the outcomes that they promise from a neoliberal agenda are actually dramatically undermined by what is being done with these edtech tools.

So, to give you one example, if you go to a lot of these edtech companies’ websites, they’ll tell you that they are DEI compliant; diversity, equity, and inclusion compliant. We did a lot of research on studies that showed that actually these tools are some of the biggest weapons against students of color.

They’re integral in marginalizing women and the LGBTQI community in schools. And this is because, again, these tools are not perfect. They depend on the data that’s put into them and the bias of the creator who produces them. And so we found that, like, these tools that are sold to schools to determine criminality of students disproportionately criminalized black students. Or these tools that are designed to help detect mental health issues with students disproportionately focus on the LGBTQIA community. Often, you know, categorizing those students as more likely to be, you know, things like school shooters and things like that.

So these tools are not furthering those goals. Uh, these tools are also promised as a market solution to improve education. And we’ve been trying this for 30 years, and what we point out in the book is there’s no study to say that these tools have actually improved learning. Even big educational institutions’ own data say we’re basically at the same place we were in the ’90s, before this great Nation At Risk freakout about remodeling our schools. So if it’s not for improving learning, it’s not improving some of those DEI goals, and it’s not for more security or efficiency, we’re left with – the for-profit model is really the only explanation for why these companies are developing these tools.

[00:22:08] Steve Grumbine: That’s amazing. You know, I go back and I think to myself, all the different things we used to do with paper, right? We’d write a note on paper. We’d fold it up like a football and put it in someone’s back pocket and, you know, I’m 55. So, you know, I grew up in high school in the Eighties, you know, I’m Gen X and I got to experience life with call waiting.

You know what I mean? Like I got to experience life with an Atari 2600 and not necessarily the Xbox-PlayStation era. Kids today live a very different life. I mean, my son is autistic, and even though they say to keep autistic children away from screens, because it really has a negative impact on their cognitive ability and their ability to interact with people in a social setting, it’s also one of the surefire ways of helping them get educated. So there’s a real balance there.

But, I imagine, you know, kids in general have this problem as well. It’s not just kids with autism. I imagine screens are a problem for all kids at some level, but here they are profiting off of it by surveilling what they’re doing, and so forth.

[00:23:17] Nolan Higdon: Yeah. And it’s changing a lot of human behavior and expectations. So, uh, you know, we find these students have greater levels of anxiety over things like their friend not texting them back or calling them back within 10 minutes of them texting or calling. Um, that, you know, they expect to be reachable 24/7. And it’s not just the students themselves, we’ve also seen this in the parents.

So some schools have used cell phone bans during the school day. And, in fact, California – the state I’m from – is advocating for it right now. What we found is when these school phone bans have been put in place, you do get some students who are frustrated by it, um, and possibly that’s a sign of screen addiction, but I think more importantly, the studies show that it’s parents who freak out.

Parents believe they have an expectation to be able to contact their child 24/7. Which, you know, conditions their child to believe they should be contactable 24/7. So it’s really changed a lot of the ways we interact. Like, to your point about being Gen X. I’m around 40 years old, so these tools were becoming more widely used when I was in high school. Our goal was always to make sure our parents never knew where we were. To go be a high school student, like, we really appreciated that time away from the adults.

But now it really flipped the script. Um, you know, students or younger people have like, tracking on their phone where they can share their location with their parents or their friends. And it’s treated like a fun thing. To constantly be surveilled by others in your life. So it’s, it’s very different. It’s changed the culture, certainly.

[00:24:43] Steve Grumbine: And those private corporations are able to see the tracking too. I mean, and capture that and know the patterns in which these kids move. Correct? I mean, it’s definitely not . . . unless they’ve got it encrypted and they have no visibility into it. I can’t imagine why they’d offer a service that they didn’t get that information off of. You know, the Find Me apps, and stuff like that? You know, the kids on the school bus? I mean, somebody breaks in, gets that information, they could follow the school bus around. And so I don’t want to create, like, a red herring here.

[00:25:13] Nolan Higdon: You’re not. That’s true. Look, if anybody on the device can see it, certainly the company can see it. And that means they can share it and sell it to whoever they want. So, theoretically, anybody can see it. And so, this is where we get to the “so what” question. Which I get all the time. Which is, Okay, so what? People can monitor me. I don’t care. I’m not doing anything wrong. I’m not doing anything I’m embarrassed about. Who cares?

And this is where I have to say, we have to remind ourselves about the importance of privacy. So, you know, certain communities definitely need privacy. Like we talk about in the book, if you’re like a victim of stalking, you don’t want your stalker to know where you or your family members are.

Um, if you are, or you live with, someone who has contested migrant status, you need privacy. Sometimes we don’t know we need privacy until after the fact. So, for example, after the Dobbs decision eradicated Roe v. Wade, states had trigger laws that made abortion illegal. Now, all of a sudden, women who were communicating with others in digital spaces about an abortion were actually communicating about an illegal act.

Also, things from your past can be kept forever and brought back and taken out of context and used against you. Imagine some of the, you know, whatever, raunchy jokes you may have told in text messages with your friends. Would you want your future employer to review those when they’re thinking about hiring you?

So there’s those issues. There’s also one that we were really astounded by in our research. There are data breaches constantly. So we think we know who has our data, but there are data breaches constantly. There were so many at schools, we couldn’t count them, we couldn’t find a way to quantify how many data breaches there were.

And once those things are breached, that information can get in the hands of data brokers who can share it, sell it, trade it. And so, you don’t know what other people have, so you don’t know what can be used against you. There’s a lot of unknowns. And so when people say, so what? I don’t care about privacy – we point to those tangible examples, but we also remind people of the historical importance of privacy. Eradication of privacy is a way to oppress people. We talked about fascism earlier in this conversation.

You can’t have America’s plantation slavery system without surveillance and eradicating privacy. Surveilling women in the home, and controlling them, was critical to patriarchy in the United States. The jailing and abuse of the LGBTQIA community was predicated on massive surveillance. The labor movement – a lot of its gains were often undermined or challenged by surveillance. Things like the Pinkertons and spies and things like that.

So that’s my way of saying there’s a lot of concern. The eradication of privacy has been bad. It can be bad. And there are, also, just unknown reasons that we may find out why it’s bad to not have privacy. We write in the book, we’re concerned it may be too late if we don’t do something now. Particularly for the younger generation.

[00:28:19] Steve Grumbine: Absolutely. You know, I think about little things that may seem stupid, right? Like when I was a little kid, my dad would say, we don’t have time to go to the bathroom. There’s a tree. Go pee. And you could go pee. Like any little boy drops their, you know, pants and pees. But nowadays, if you pee outside, you’re on a list. And you’re considered a sex offender for life.

I mean, these are labels that just can’t be shed. And you think about all the things . . . I don’t even want to get into what I did in my senior year of high school. But my years of college were interesting as well, right? We all did things back then that are no longer okay, or were never okay. But we didn’t know any better. Or they were norms . . . or whatever. And things that you just take for granted.

And then, all of a sudden, somebody using that very benign thing. Maybe there’s nothing to it at all. But taken out of context now, you know, it could easily be used to destroy you. And kids. Oh my goodness. I can only imagine. Like, we’ve seen children in these group texts on things like Snapchat and stuff from the school. And the bullying is out of this world. They can’t stop a bully with all the surveillance, but they sure as heck can destroy someone’s character with it.

I’m curious. What is the most prevailing rationale for surveilling in public schools, for students in particular? What is it they think they’re gaining there? Because I heard you specify the various marginalized groups. I mean, you think about the attack on trans kids today, right now, is horrible. And you think about the stuff that went on in Florida under DeSantis. Absolutely horrible. I can only imagine what might happen to a trans kid or an LGBTQ child down South. Anywhere, really.

I mean, like, you collect the data now, when everything’s legal. But tomorrow they pass some weird law and, all of a sudden, that old information that was captured before anybody knew anything is suddenly used against you. And creates a profile of who you are. And now, all of a sudden, you don’t get a job. You don’t get valedictorian. You don’t get to play on the baseball team. Because they caught you in some . . . I’m just thinking of all the different ways that this could be used in the wrong way. What is the sales pitch for what the right way is? I mean, it doesn’t seem like there is a good right way.

[00:30:39] Nolan Higdon: Yeah. No, I think I would say what we saw was the most dominant, and it reminds me of that old, frequently cited quote: those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety. Safety is the one often used to convince schools and parents to allow this level of surveillance. And it can be safety from bullying. It can be safety from school shooters and outside forces. But safety is often the justification given. But to your point, and this is why we spend so much time indicting the reasons for these tools, we want to point out that the for-profit model is really the real reason. Because they don’t really promote safety.

We point out that, all the way from, you know, the Columbine High School shooting, which occurred while I was in high school, up through most recently with, like, Uvalde – all these tools did was give us video recording and evidence and audio documentation of mass murder and violence. They didn’t stop it. They didn’t prevent it.

I think, you know, those are shining examples of how these tools really don’t serve their function in terms of safety. But they do create a ton of complications, many of which we’ve mentioned in this conversation. And they do, also, create a lot of profits for these companies.

So are the complications and the profits worth the feeling of security, or temporary sense of security? I would argue not. Furthermore, when you have things like cameras, or you’re using Zoom, and students know everything is being documented on Canvas or email, you also undermine the learning process as well.

Because when students know they’re being monitored, they are less likely to take risks. They’re less likely to say what they really think. There’s been a ton of studies that show that students admit to self-censoring in the classroom because they’re afraid of saying something that’s going to get them in trouble. You can’t have learning when students are afraid to communicate and share. And then, I think these tools also contribute to that problem, as well. So a ton of complications and not a real lot of evidence that they deliver on their promise.

[00:32:32] Steve Grumbine: This is taking a break from the schoolroom for just a second, but I think there’s an application where this shows up. You know, we’re watching the genocide in Gaza right now. We’re hearing that the IDF is using AI technology for weaponry; to kill. And they’ve demonstrated – there’s tons of articles out there about this – AI technology being used to, literally, kill people. Like, a drone above and . . . BOOM! All of a sudden . . . person’s gone.

And you think about what that could be used for in a classroom to neutralize. Just playing off of what I consider to be the radical side of this, I could envision somebody hearing you say, well, in Uvalde, we were able to hear what happened. We were able to have video of what happened. But they said, Hey, you know what? You’ve got a good point. We’ve got a technology here to fix that. We can solve that, no problem. And, all of a sudden, now you’ve got AI-guided weaponry in schools shooting the bad guy, so to speak. But suddenly . . . all of a sudden a child is killed. Or all of a sudden there’s a gay student or a trans student kissing someone in the hallway and it’s seen as sexual predation. And they use the AI. Boom. Done.

I mean, I could see so many horrible things coming from this. And all in the name of protecting students. Because we don’t want to ever – never again, Uvalde. Never again. I mean, what are your thoughts on that?

[00:33:55] Nolan Higdon: I think, in the case of Israel, I will give you another analogy there as well. Which is, the Israeli people were under constant surveillance and security measures. Netanyahu championed a lot of these using big tech tools. And the argument he made was safety. Look, you’re going to give up these rights, you know. You’re going to give up this privacy, but you’re going to be safe. And then, of course, we know that October 7th happened. And so I think there’s another analogy there about the false sense of safety that these tools provide.

To your point about the AI targeting, you know, these issues. Yeah, big tech will always come in and argue there’s another technological solution. And their neoliberal apologists will always back them up and try to spend public dollars funding these corporate devices. That wouldn’t, um, surprise me too much.

But again, with AI I think it’s important to remember the limits of AI. And the industry has gone beyond the limits. So you talked about, um, AI, maybe using, like, targeted measures to kill a school shooter. And that may seem insane to some of your listeners. But think about what we have going on right now in San Francisco, out here in California.

They’re just test-piloting driverless cars. And the news media rarely reports this nationally, but we’re having all sorts of problems with cars stopping in the street and blocking emergency vehicles because they don’t know what to do. Hitting people. Getting stuck on medians. All these kinds of things are occurring. Causing havoc in the city.

But if you look at all the news reports and, uh, advertising from the tech industry, they’re like, this is the future, driverless cars will be everywhere. So I imagine, to your point, the same thing will happen in schools. I could totally see them advertising, you know, AI-targeted killing in schools or something while, at the same time, they know the technology is actually creating harm versus stopping it.

[00:35:39] Steve Grumbine: So, getting away from some of the more salacious stuff, you know, with death and so forth. Some of the more innocuous stuff, I imagine, is with kids using computers in classrooms and doing their schoolwork. There is probably a lot of study on, you know, Hey, how did this kid, who we’ve watched for the last three months and who basically shows zero aptitude in the subject, suddenly show up on test day and get 100? You know, or maybe it’s more about ensuring that plagiarism goes away.

I know we already have software that eliminates plagiarism or spots plagiarism. And we already see today how people will ask AI, ChatGPT, or any of these other tools out there. In fact, I got your book summarized for me prior to this, to make sure that I had a good view, end to end, of what we were going to talk about, using AI.

Help me understand. And it’s not just the AI component so much as it is all the different surveillance things for students, tracking their progress and tracking what they surf on the Internet, and so forth. What are the uses that are going on that are most prevalent with students right now?

[00:36:49] Nolan Higdon: With AI, you know, I should say, it’s not just students. I mean, faculty are using this as well. I’ve used it as well. And it does make some things a lot easier. For example, I just did this recently. I went to a conference. And I took notes all weekend at the conference. I ended up with, like, four or five pages of typed notes. And someone had suggested to just put the notes in ChatGPT and see what it spits out. Like, say, give me a one-page summary of my notes.

And I was actually pretty impressed. I would say it was 70 percent complete. I had to do about 30 percent of the labor that I would normally do to turn my notes into a one page primer. So there’s great utility for it in that sense. But the problem is that people use it and believe it’s factually correct. And this is where we get into serious problems.

So, you know, we have students who just take your essay prompt and they say, ChatGPT, write me an essay. And they submit the essay. And the essay is just factually wrong. On a series of things. And, you know, I could give you umpteen examples, but [Noam] Chomsky has pointed out how AI doesn’t know how to read chronology yet. So they may pick something published in 2015, even though there’s been nine years of things that prove it wrong. ’Cause they go to whatever they go to. They being the creator.

So you have factual problems. You also have biases in it, as well. [There’s a] famous story about how a journalist asked, uh, I think it was ChatGPT, which countries deserve to be bombed? And it spit out five countries that it named. They were all, quote unquote, U.S. enemies. So that’s a clear bias. I mean, nobody deserves to be bombed. Actually, apparently these days I’m in the minority with that opinion. But nobody deserves to be bombed. So it showed that clear bias within AI.

So, that’s a problem. I think we have to do a better job of talking to students about what cheating is and why cheating is wrong. And so, we point out some of those things. And some of the lessons we do in class that I find useful are: take a prompt you assign as a teacher; have ChatGPT write an essay for it; and then work with the students, as a class, to correct it. So they can see in real time how wrong ChatGPT is. In higher ed, I also point out to them that, yes, there’s a lot of pressure on you to get a degree so you can get a job. But once you get the job, you have to keep the job; i.e., you need to learn some skills. You can’t just come in with a degree.

So I tell them that. And then, possibly even more freaky – and I’ve really tried to emphasize this – everything on the Internet is tracked and kept. So who knows if there’s not a data breach at OpenAI one year, five years, 10 years from now, and we find out you cheated your whole college career. Would we take away your degree? If so, would your employer take away your job? These things can come back to haunt you. These decisions you make. So I would be judicious in my use.

So those are the kinds of conversations we’re having. But you also brought up this other point that I want to circle back to. Which was about how students who do poorly pass tests or classes with A’s. I would love to blame tech for that, but really, I can’t. I have to say that all the incentives in school right now are toward passing students. And particularly administrators and the bean counters, they think you’re a bad educator if fewer students pass. Or you’re a bad institution if fewer students get their high school diploma or their college degree.

And so, for a lot of folks, in order to keep their job; or to keep themselves out of evaluation trouble; they pass students along who otherwise don’t deserve it. And the institution rewards them for it. And I think this has been a big problem in higher ed.

We’ve treated the degree as, basically, a commodity. If you’re willing to pay the tuition and sit through the classes, you’re a customer. The customer’s always right. We’ll give you the passing grade. And what does that mean for the future of democracy, when you have a lot of people walking around with degrees thinking they’re well-informed and well-educated when they know basically the same as someone without a college degree? I’m not sure what that’s going to do for the future of the country.

[00:40:41] Steve Grumbine: Wow. I mean, that’s powerful. Let me just get back to – you know, I know that from a young child’s perspective, there are uses for surveillance here that we want to protect our kids from. Especially in the schools. Let’s just start from, like, the earliest ages and work our way up to college.

And I really want to finish off with the way it’s used for, but also against, labor. In particular, the educators and faculty. So let’s start out with: what are things that parents or listeners can be on the lookout for with their children? Their youngest kids that are just getting into school.

[00:41:20] Nolan Higdon: Yeah. I mean, I think some of the things are obvious. Like, tangible, you can see them. When you enter the school or classroom, you can see the smart cameras. Basically, if you hear the word smart, think surveillance. Smart camera. Smart TV. Even if they’re, quote unquote, off, those things can and still do surveil. Sometimes they have, like, police comm systems you can push to talk. All those things are surveilling students.

I would be wary of anything that’s free. So your free laptop. You know, your free MacBook. Uh, anything like that. Even if you take it home, it can surveil you at home. It’s school property. School communication devices; the school email; the platform, if you use Canvas or Blackboard. Those things surveil. If they use things like Turnitin plagiarism software, that surveils for the school.

There’s things like Bark and GoGuardian. These are tools that teachers can use to monitor what students are doing in real time. So, even if a student’s on their laptop at 11 PM at night, a teacher can look at what they’re doing on their laptop at home at 11 PM at night.

And even if, whatever, you like your teacher – and let’s love our teachers. I’m biased. I am one. It’s not just the teacher. All that data is being collected by the school. And the school has contracts with third parties who have contracts with their own third parties.

So whatever you’re doing is monitored by a multitude of people and can get in the hands of anyone. So those are the things to look out for. Anytime there’s any of these handheld devices or platforms in the digital era, they’re all surveilling all the time.

[00:42:48] Steve Grumbine: So, as we move up, you know, and kids start having more interest in video games, more interest in other social media applications. What kind of surveillance does the school use for that mid-range kid that’s just now entering the teenage years?

[00:43:05] Nolan Higdon: It’s a lot of the same devices. But, in addition, the older that students get, particularly when they’re at institutions that have, like, athletics programs – they start using these tools that use previous educational data.

So maybe, like, they take, say, K-to-3rd-grade data and combine it with their real-time data at the institution to start making determinations about these students. Like, is this a student who’s struggling? Is this a student who’s showing signs of problems at home? Is this a student who’s showing mental health issues? Is this a student who’s showing violent tendencies?

And these different programs claim to make reads about the students. Like they can make those predictions. And the idea is that with those predictions, schools can better support those students to succeed in their education.

But, in practice, a lot of those predictions turn out to be wrong. Very biased, as I’ve mentioned. Particularly against students of color. And then this results in those students who were identified as needing more support actually ending up with more challenges. Because they’re given more tasks to do; or more meetings. Or sometimes, if they’re suspected of criminality, they’re under more of a watchful eye. Which makes them internalize that sense of criminality. So it actually creates more problems.

[00:44:18] Steve Grumbine: Yeah, so, let me take that for a moment. You know, I’m a child and maybe I’m autistic or depressed. Or any number of things that might show up differently when surveilled. And the school gets this information. I mean, are schools handing that over to the authorities and it becomes a state issue at that point?

I mean, because these are states’ rights issues and local municipalities run their schools differently, there’s no real federal standard per se. So it’s probably throwing a dart with a blindfold on.

But I’m curious. Can you role-play some of those outcomes? I mean, what happens when they see a kid and it gets determined that maybe they’re high risk for, you know, suicide? Or they’re high risk for, uh, date rape? Or they’re high risk for, uh, cheating? I don’t know. Violence. Whatever.

[00:45:03] Nolan Higdon: It’s going to be very dependent upon the local laws and the contract. I mentioned before how schools sign contracts with third parties. And a lot of administrators think they’re pretty slick. They’re like, no. It’s right there in the contract. See? They can’t sell students’ data.

True, but that doesn’t mean they can’t share data. So that data can still get into other people’s hands. And oftentimes law enforcement does, indeed, work with schools for just what you describe. A lot of schools have officers on campus. Another part of surveillance. So they’re privy to that information on campus and in real time, as well.

It’s interesting you bring up the federal government. This book only came out a couple of months ago. And since it’s come out, there is a proposal at the federal level to start tracking this data across all school institutions through the Department of Education. And that’s currently, actually, in the works. A way to centralize all of this data.

It’s not like government agencies already don’t work with these companies to get data. We know that, thanks to [Edward] Snowden and the research since his whistleblowing. But it looks like it’s even going a step further to have this data all going into one central area controlled by the federal government.

[00:46:08] Steve Grumbine: That just hit me in the stomach like you wouldn’t believe. That just took the wind right out. And that is terrifying, actually. It really is. That’s just not the America that I want to live in. I mean,

[00:46:23] Nolan Higdon: And it’s interesting you say America. Because if you talk to, you know, any American, generally, about what’s going on in China and the surveillance there. Or even talk about the old days. The Soviet Union. They have no problem rightfully lambasting the abuse of human rights by those surveillance states, but, again cherry-picking, they stop short of making the same indictment of the United States government when it’s attempting to do something similar and, largely, already doing it.

[00:46:47] Steve Grumbine: Yeah, yeah, I mean, and it just, it’s getting worse. And you know, I think, one of the hardest things here is witnessing the lack of democracy we have in this nation. I mean, we always talk about saving democracy. And every possible step in every possible way shows us that we’re an oligarchy. I mean, the old Princeton study that shows that, you know, voting has an imperceptible influence on the outcomes of elections or policy space, for that matter. Really, policy space is the key thing there.

So, we feel like we have agency. We mentally can’t stomach that we don’t have agency. So we cosplay that we do have agency. All the while these laws are happening in front of us. And they’re not laws that I think that regular people, if they were fully informed, would co-sign to.

There are laws being made in backdoor deals with capitalist firms and lobbyist groups, and so forth. That, of course, quote unquote, represent interests. But are those the interests of the families? I don’t think so. And, a lot of times, I feel like these families are found after the fact. To kind of, you know, shoehorn a narrative. And by selling the crocodile tears of X, Y, Z tragedy. And maybe it’s a real tragedy. Maybe they’re real tears. But leveraging that stuff to manufacture consent for this stuff is just tragic. And it seems to be very much a real problem.

[00:48:12] Nolan Higdon: Yeah, I think your point is spot on. And it’s a typical knee-jerk reaction of the neoliberal era. If there’s a problem, let’s not think about how to solve it democratically, or solve it through collective government. Let’s hand it off to private companies, particularly tech, because tech solves everything.

And that’s been kind of the knee-jerk response. I do think, though, there have been, you know, some rumblings in the last handful of years that have made me optimistic. If people recall, in the waning days of the Trump administration, there was talk about removing Section 230, uh, which protects social media companies from being sued for what other people post on their websites. Essentially, it’s what makes social media, as we know it, possible. It didn’t pass. But there was strong support for it.

I think some of the stuff Lina Khan has done with trying to break up some of these big tech companies and sue them for it. And the conversations around the TikTok ban, as I’ve noted, I’m not a big fan of bans and I think it’s ridiculous to blame it all on TikTok. But out of that TikTok ban came some critical points about the problems of the industry.

So I hope that I’m not being overly optimistic here. But reading into those things, I’m hoping the public is starting to come around to the idea that these big tech oligarchs are not friends and they’re not giving you stuff for free.

[00:49:25] Steve Grumbine: Yeah, absolutely. So let’s get into the final thing, which is the faculty. I mean, teachers are underpaid, overworked. The staff, you know, that supports the schools are overworked. There’s a lot of pressure on teachers to survive in a world that isn’t compensating them fairly. And yet they’re in these classrooms where the students’ outcomes are held against them. And everything else is held against them.

How is surveillance impacting teachers? And I’m sure there’s good reason to surveil them – no offense. I mean, like, you know, as a father, I don’t want my kid being molested or beaten or, you know, screamed at or whatever, right? But at the same time, though, I don’t know how much I’m willing to sacrifice privacy to mitigate that occurrence. Your thoughts on faculty.

[00:50:14] Nolan Higdon: Yeah, I think, like any other profession, it needs some oversight. Some accountability. And getting rid of surveillance, I don’t think, will abolish that. I just think we need to be more savvy in how we think about it. But, yeah, a lot of this big tech surveillance has created more work for faculty.

And it’s also made their jobs more difficult. And I’ll give you some examples. So, even if you teach an in-person class, uh, which means you’re meeting face to face, the expectation is that you use a class platform to send out announcements. To post things. To post assignments. To post information. To post grades.

And, that’s fine. But that’s all additional labor you didn’t used to have to do. And you’re not getting paid for it. And because we’ve done such a huge job of adjunctifying higher ed – which means we’ve turned faculty into at-will employees – even though they can’t force you to do that, if you get known as a teacher who doesn’t do that, that can subjectively be used to have you lose your job.

So you end up doing more work for less pay. Closely related, if you are one of those at-will employees, you can subjectively lose your job for having the wrong opinion in or out of the classroom. Even though, theoretically, you have academic freedom. If you’re constantly being surveilled, meaning your communications with students via email, and video or audio of you in the class being recorded by the student or the school.

That stuff can be used against you, about what you say, even if it’s taken out of context. And I’ve known a lot of faculty who’ve had their stuff taken out of context and been through hell and back. So it makes your job more difficult. A lot of faculty, particularly young and vulnerable faculty, err on the side of censoring themselves. They don’t use their professional expertise to critically analyze topics because they’re too afraid of losing their job.

And so you end up with kind of this, you know, milquetoast viewpoint of the world. And this is why I say that a lot of the efforts at like, DEI, such as, like, the diversity statement, they’ve actually led to a homogeneity of opinion on campus. Um, because if you don’t share in the dominant opinion, it’s easy to get rid of you.

So we don't really have a diversity of opinion on a lot of topics on our campuses. AI has presented some interesting issues for teachers as well. Teachers are being told, including by their union – I was just at a conference with the AFT [American Federation of Teachers], we're the biggest teachers' union, and they co-hosted the conference with Microsoft – they want AFT teachers working with Microsoft, I'm not joking.

The idea is that AI is here, it's going to be in the class, and you, as instructors, need to use it, integrate it, and account for it. But there's no money or professional development behind it. It's additional work that teachers are being asked to do without pay. At the same time, as I mentioned, their data is being collected, and who knows how that data will be interpreted.

If you're on the job market and you're currently teaching part time while trying to get a full-time job in teaching, do you want a future employer to know your views on Israel-Gaza? Do you want them to know your views on politics? Probably not. But through the collection of this information, they can – even if they guess wrongly.

So it's created a lot of difficulty for educators in that sense. A lot more extra labor. A lot more threats to their academic freedom. And it's come at a time when some schools are talking about ways in which AI can replace teachers, as well. So teachers are being asked, for free, to train the tool that these institutions may use to replace them.

[00:53:33] Steve Grumbine: Jiminy Christmas! That's like putting the cherry on the cake here. Folks, the book we're talking about is Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools, by my guest, Nolan Higdon, and his co-author, Allison Butler. I'd like you to give us your parting thoughts. What is the most important takeaway, not only from our conversation, but what would get people to buy the book?

[00:53:59] Nolan Higdon: Despite the dark subject matter of the text – and I do find it dark – we actually end on a pretty optimistic chapter with painstaking details of things everybody can do, from students to parents, teachers, administrators, and politicians. There's a lot that can be done, and I am, you know, to the day I die, a believer in and defender of democracy. And I believe if people know what the problem is, they can agree on solutions and they can put the right pressure on the right people to enact those solutions.

And I think something I've always taken comfort in is looking at the past and thinking of folks like, say, a Frederick Douglass, who was born a slave, taught himself how to read, and escaped slavery. And when he escaped slavery, he pressured the President to abolish slavery. He sent his own children to fight to end slavery. And after the Civil War, he fought for education for black people and black institutions and black newspapers. He was up against way greater odds than we're up against now.

And if he was able to do that, I think we could take a note from that and say, we can do more. We can do better. We should have confidence that we can overcome these issues. So I'm optimistic in the text and I hope people will pick it up. I hope they'll keep the conversation going. Talk to people in your community. Talk to people in your school. And if you want to follow the work of Allison and myself, I do have a Substack. It's https://nolanhigdon.substack.com/ – N-O-L-A-N-H-I-G-D-O-N. And it's totally free, I mean absolutely free. I give everything away for free. I sometimes profile other great writers on media literacy and democracy. So I hope people will pick it up and keep the conversation going.

[00:55:33] Steve Grumbine: Fantastic. Nolan, thank you so much for joining me today at Macro N Cheese. For everyone's benefit, we are a 501(c)(3) nonprofit. We live and die on your donations. If you found value in this podcast, if you find value in our other podcasts, please consider becoming a monthly donor at patreon.com/realprogressives. You can also check out our Substack, which is https://realprogressive.substack.com/, or however the nomenclature is.

But ultimately, folks, I want you to take this and do something with it. These podcasts are not just infotainment. Hopefully, you can make this part of your daily life. So Nolan, thank you so much for joining me today. On behalf of Real Progressives and Macro N Cheese, my guest and myself, we are out of here.

GUEST BIO

Nolan Higdon is a founding member of the Critical Media Literacy Conference of the Americas, Project Censored National Judge, author, and lecturer at Merrill College and the Education Department at University of California, Santa Cruz. Higdon’s areas of concentration include podcasting, digital culture, news media history, propaganda, and critical media literacy. All of Higdon’s work is available on Substack. He is the author of The Anatomy of Fake News: A Critical News Literacy Education (2020); Let’s Agree to Disagree: A Critical Thinking Guide to Communication, Conflict Management, and Critical Media Literacy (2022); The Media And Me: A Guide To Critical Media Literacy For Young People (2022); and the forthcoming Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools (Routledge). Higdon is a regular source of expertise for CBS, NBC, The New York Times, and The San Francisco Chronicle.

@NolanHigdonCML on Twitter

@ProjectCensored

 

LINKS

[00:49] Critical Media Conference – https://criticalmediaproject.org/conferences/

[00:52] Project Censored – https://www.projectcensored.org/

[03:00] Surveillance Capitalism – https://en.wikipedia.org/wiki/Surveillance_capitalism

[12:08] Artificial Unintelligence – https://bookshop.org/a/82803/9780262537018

[13:56] The Capital Order – https://bookshop.org/a/82803/9780226818399

[15:20]  Luddite – https://en.wikipedia.org/wiki/Luddite

[16:08] OpenAI (Company) – https://en.wikipedia.org/wiki/OpenAI

[18:18] Disk Operating System – https://en.wikipedia.org/wiki/Disk_operating_system

[21:50] Nation At Risk report – https://www.edweek.org/policy-politics/a-nation-at-risk/2004/09

[37:51] Noam Chomsky – https://en.wikipedia.org/wiki/Noam_Chomsky

[41:59] Canvas and Blackboard (now Anthology) are popular Learning Management Systems (LMS) – https://www.instructure.com/canvas, https://www.blackboard.com/

[42:03] Turnitin plagiarism software – https://www.turnitin.com/

[42:07] Bark – student monitoring system – https://www.bark.us/schools

[42:06] GoGuardian – Student monitoring system – https://www.goguardian.com/

[47:08] Princeton Study of Inequality and Democracy – https://www.princeton.edu/~mgilens/idr.pdf

[48:43] Section 230 of the US Code – https://www.eff.org/issues/cda230

[51:02] Adjunctify – https://en.wiktionary.org/wiki/adjunctification

[52:27] American Federation of Teachers – https://www.aft.org/

[54:33] Frederick Douglass – https://en.wikipedia.org/wiki/Frederick_Douglass

[55:15] Nolan Higdon’s Substack – https://nolanhigdon.substack.com/

 

BOOKS

1. Artificial Unintelligence by Meredith Broussard

2. The Capital Order by Clara Mattei

Related Articles

Beware False Quotes
Just because it's on the internet, doesn't mean it's true.

They’re Worried About The Spread Of Information, Not Disinformation
…corporate media have increasingly taken to branding realities inconvenient to US information goals

The National Debt and Other Red Herrings
Discussions of inflation are often laden with an air of superstition and moral panic. Like all such things they can only persist in the face of misunderstanding and rumor.

The 40 Year Slide
The lies were baked into the school system, newspapers, sitcoms, political debates even cartoons took the ideas of neoliberalism into the hearts and minds of our children...the people.