Scott: Welcome to the AI Show. Today, we're asking the question: What does the AI dystopia look like?
Susan: Oh, man, we are going down the tubes. It's going to be terrible.
Scott: Let's take it to negative town. The world is over.
Susan: The world, as we knew it, ends basically every year and a half as the next revolution hits, but this is the last one.
Scott: There's a law against that, isn't there?
Susan: There is?
Scott: Everything has a life expectancy of about twice its current age, right up until it abruptly dies.
Susan: Oh, yeah, that's cool. I've got to look that one up.
Scott: Yeah, so the abrupt death is coming. Everything looks like we're going to live twice as long.
Susan: Well, exactly. Well, everybody keeps saying, “Hey, the pendulum's going to swing back, and technology is going to help us more than hurt.”
Scott: Hey, we're still alive, right?
Susan: That's all true up until the very last time. Then, that last time, people are like, “Well, I guess it didn't swing back that time.”
Susan: Man, there's a lot to be pessimistic about.
What's the first thing that's going to go?
Susan: Oh, the first thing is privacy.
Scott: Privacy's number one?
Susan: Everybody knows that their privacy is not nearly what it may have been in the past.
Scott: What is privacy anyway?
Susan: Now, we're going to take the loss of privacy to the next level. Not only will they have your data, but they'll have the computing power and the algorithms to actually do something with it.
Scott: This isn't like drones looking in your bedroom window.
Susan: Who cares how many data points you have, if you can't actually make sense of them? But you know, we can actually listen to every single phone call and record it. But, if you can't actually do anything with all that audio, who cares?
Susan: Now we can take all the surveillance cameras, and we can analyze video.
Susan: We can put together your entire human history by taking pictures from 4000 different things.
Scott: Your browser history.
Susan: The smarts are finally there to analyze multiple terabytes of data and come up with you.
Scott: Does Google become a state-owned company?
Susan: Well, no, Google owns the state.
Scott: They become the state. Google's the state.
Susan: The United States of Google?
Scott: The United States of Google!
Susan: How about you, Scott? Where do you think the dystopian future of AI goes?
Scott: Well, I think we should bring this to self-driving cars. Every inch of your life is known.
Susan: Is this the trolley problem?
Scott: You're going to start driving around, or you're not driving around anymore. You've got mainly machines that drive for you.
Susan: Of course.
Scott: But now, what are they going to do? They're going to drive you around and take you by the billboard that's also AI powered.
Susan: I love it! It's like why are we taking this route? Oh, what's that billboard there?
Scott: Everybody always has the Uber driver experience. “Why are we going this way?” Well, this is going to have a monetary reason behind it.
Susan: From up front, a voice says, “This is faster. Trust me.” Why are we stopped in the middle of nowhere, with nothing but advertisements around me?
Scott: Is the AI that drives the car like a humanoid? It turns around. (robotic voice) “Trust me. It's faster.” Yeah.
Susan: Oh, well, there's a privacy bent to this, too. Just think of it.
Scott: No more sexual relations in the back of the Uber?
Susan: No, it'd be creepy if you did, but no. Now, every last inch of your entire life is ... Your position is known. You get in the car, and it knows where you're at, but more than that, it's just a big data collection device. It's built for it. All those LIDARs are constantly going, scanning every single thing around them, all that stuff.
Scott: Before, it just knew your position. Now it knows your total surroundings.
Susan: The natural outgrowth of self-driving car technology is really fantastic image recognition and classification. Not only is it going around recording every last square inch of visual detail, but man, it's saying, “That is a flower pot from Ikea.”
Scott: You could buy this.
Susan: As you're in the car, you're sitting here. You're buying stuff.
Susan: As you're in your car, you're looking around over there, and a little advertisement pops up. It's like, “You can also have this flowerpot for $9.99. Just press here.” It sees your eyes, and there you go. The attention economy.
Scott: Well, have you ever seen the show Black Mirror?
"Fifteen Million Merits" Series 1, episode 2 of Black Mirror.
Susan: First of all, skip the first episode… Go to the second or third episode. The first one will just...
Scott: Yeah, don't watch the first.
Scott: Yeah, but the attention economy is discussed there in one of the episodes a little bit, right?
Susan: There's an episode where they're on bikes, just pedaling aimlessly, just to pay off their credits, and they're being forced to watch TV advertisements. Oh my. That's dystopian.
Scott: If they start to fall asleep, it'll jolt them back awake, because AI's watching them. It knows what they're looking at, what they're taking in.
Susan: No! Yeah. They're forced to watch a commercial, and it knows whether they're paying attention or not.
Scott: You can skip the commercial, but you'll have to pay.
Susan: How evil is that? I've got my headphones on now, and a commercial comes on, I just take them off, but then the commercial will pause, and when you put them back on, it's right back where it was.
Scott: Right back there.
Susan: I was like, what? You're just like, oh, I'm going to get smart and switch channels, and it's still there—the commercial does not go away!
Scott: Yeah, it's the same one.
Susan: Until you listen to that commercial and actually pay attention.
Scott: Then you give in. You're like, okay, my AI overlord.
Susan: Oh, whew. That is tremendous.
Are they going to take our jobs?
Susan: If you drive something, you can forget about it. Cars, trucks, planes, which ... Man, get in an aircraft. There's no cockpit.
Scott: This is the easiest job ever, right? Why not have an AI pilot?
Susan: Yeah, exactly, and by the way, most accidents happen up in front, just saying. Maybe it'll be a lot safer, not that airlines are unsafe right now. The autonomous revolution. I mean, drones are going to start delivering your food.
Scott: They could poison you. Or they just selectively, “Oh, you're not a Trump supporter? Your food is going to be cold.”
Technology is high-tech mediocrity
Susan: I was just thinking the whole drone revolution there, delivery revolution, suddenly you're starting to get the seconds stuff. You've got to pay for Prime to get the fresh eggs, right? These are still technically good, but they're one day away from expiration. It's like, magically, all the food that gets delivered to your house, unless you pay the premium, is one day away from expiration.
Scott: But it's been managed very well, like the warehouse is near you. It's been stocked just for you, because they know.
Susan: Oh yeah, yeah. It knows exactly that cutting point, you know? We call this Susan's Law here.
Scott: Susan's Law.
Susan: Susan's Law: Technology allows you to make something just good enough. The better your technology is, the finer you can cut that line to be just good enough for the customer that they'll pay for it ... I think we've seen this.
Susan: As technology's gotten more and more capable, we've gotten to the ... We're always on the verge of saying, “This is so bad! If one more thing happened, I'd get rid of it. If just one more thing ...”
Scott: Yep, but you won't. You just keep paying.
Susan: You won't, because it'll be so perfectly honed to you that you'll never actually be happy. You'll be on the verge of being so unhappy that you'd get rid of it, but you won't actually get rid of it.
Scott: AI will optimize the frustrations. AI knows your dreams, and it's going to make sure that you never achieve them, but you're going to be very, very close, always.
Scott: Just one more little bit, that's it. You're so close.
Susan: You will be addicted to things, because of AI, that are ridiculous. It's like they will have honed the rewards system on whatever to be, well, if you just click this one more time-
Scott: Just one more, just one more, what's the big deal.
Susan: Just one more ... Eight hours later, you're sitting there in a pile of filth, and you're like, just one more click, and I'll go to sleep.
Scott: Well, that might already happen on YouTube.
Doctors and pilots are now unemployable
Scott: Yep, what about doctors?
Susan: Oh, doctors, geez ... Not to offend any doctors out there.
Scott: You're gone. You're a goner.
Susan: A subjective opinion is probably not a good one.
Scott: Are you saying doctors are subjective?
Susan: Sometimes, sometimes. They're professionals. They're well-trained, but they're still people.
Scott: Yeah, they get tired. They're trying to get their Medicare bill/Medicaid bill paid.
Susan: After you've seen the 90th simple cold come in and act like they're about to die of the plague and ask for all the wrong medicines, and you have to tell them, “It's just a cold. Drink some fluids. Get some sleep. Tomorrow, you'll feel fine.” AI will take over that.
Scott: I mean, are you saying that's kind of an easy job?
Susan: I'll say this: we train doctors and pilots for that last little one percent. AI's going to cut that down to half a percent, and then a quarter of a percent. Take away that 99, that big huge chunk of stuff that is all normal, right?
We understand the things that they go through. Here's the lifecycle of the flu. Here's the whatever. Here's the things to look for. It is this.
Scott: Doctors, all your jobs, they're gone.
Susan: Doctors, pilots.
Scott: Pilots gone.
Susan: Oh, did you see the ticket bot?
Scott: Ticket bot?
Susan: Trying to get you out of tickets, law programs.
Scott: Well, now it's an arms race, right? To give you tickets and to get you out of tickets. Funding the AI technology boom. The war on tickets.
Old laws become asphyxiating
Susan: Another interesting thing is that, as technology gets better, we're able to enforce laws at a level they were never intended to be enforced at when they were put in place.
Just think about speeding tickets and the like. The idea of speeding and the resources put into catching people and all that were from before we had technology — like cameras and stuff like that. The laws were put in place back then.
Scott: Yeah, a while ago.
Susan: Now, we get better and better and easier and easier enforcement, and we enforce disproportionately to how the laws were initially put in place.
Susan: This enforcement allows for new realms of ... I don't want to say abusing the law, but making it very easy to be in violation of the law and get caught, to the detriment of society, you know?
Scott: It might go sour, if you can be caught for everything.
Susan: Literally. I mean, you walk out the door and you get in your car, you've probably broken four laws.
Scott: Maybe the laws will get better defined now. Couldn't that be a good thing? No, probably not. That's not going to happen.
Susan: An AI-enforced legal system, oh man!
Scott: Things might go quicker. No, they'll just frustrate you to extract more money from you.
Susan: Pretty much, yeah.
Scott: Isn't that the Government's job, roughly, to protect you just enough to extract value out of you?
Susan: Just enough? Ooh, I think we're in a different territory there.
Scott: Well, they're investing a lot in AI.
Susan: Maybe I want some AI in my government.
Creatives join doctors and pilots in the homeless shelters
Scott: Well, for now, until they get too efficient. Creative jobs, writers, anything like that? What's that? What's going to happen to them?
Susan: Well, I mean, doesn't Facebook already have like a snippet writer for articles?
Scott: Yeah, rather than a clickbait title, how about a little summary? Here's two sentences that, hey, if you just knew this one weird trick about swimming pools...
Susan: There's been a lot of research into actually generating well-formed text and well-formed software. We're seeing code assistance inside of, again, Facebook, trying to flag little snippets where things might be going wrong. Coders, look out. Writing copy, actually summarizing articles, these are all areas where AI is making real progress. I mean, let's be honest here. It's not there yet, but does that mean six months from now, a year, five years?
Scott: Give it a couple years, stir the pot.
Susan: Even if it's ... We'll give it 10 years. 10 years away from taking away what we thought was deeply creative work? That is a staggering thought right there, you know?
Susan: That's deeply into interesting job territory there.
Scott: You mean, pop music is formulaic?
Susan: Oh, geez, of course. Pop music ... Oh man, we've got to write a...
Scott: A pop music bot?
Susan: A pop music bot.
Scott: I love it.
Susan: Well, we've got to give it a cool, cool name, too, like Electon, elector ... I don't know. Let's see what. I'm going to make up a word and see how it gets transcribed.
Scott: Yeah, there you go.
Susan: That's what it'll be, Eclector.
Susan: Yeah, DeadElectron! I like it, DeadElectron.
This company really does make AI music.
Scott: There we go, yeah. Is anything going to get better? Is my life going to be easy now, because AI dictates my life? Are you going to get the Jetson's finger cramp from pressing the button?
Susan: From pressing the button? It was a long day at work! My finger is so cramped from pushing that button all day long!
Scott: Now you can have a to-do list, generated from your friendly AI, telling you what to do.
Susan: That's a cool thing. Tell me another good thing, Scott?
Scott: Yeah, do you think that that's actually a good thing, though?
Susan: Of course, it's going to be shaping the way you do your day. I mean, yeah, AI is saying, hey, you had these four tasks, and here's a fifth one that you don't quite remember putting on there.
Scott: You're supposed to do this. You're supposed to walk by this billboard.
Susan: I don't really remember saying I had to do this.
Scott: You're supposed to buy the no-egg mayo, okay?
Susan: We're basically going to be well-trained humans. I mean, really, people want to be lazy, right?
Scott: It's the human condition.
Susan: It's the human condition. If you take that cognitive load off, it's like, “Yours; you can take it.” Then, you just follow along like the herd. As long as it's not too far away from what you expect, you're like, ah, whatever. Here's the list of things I'm supposed to do today to be a functioning adult. Oh, on there is buy such-and-such a brand mayonnaise. Oh, of course, that's what I wanted, click.
What about AI dating?
Susan: What do you think about AI dating? Will your soulmate be picked out?
Scott: For you, yeah, so you have my AI talk to your AI. They'll figure things out.
Susan: Ooh, they can go on a pre-date.
The press seem to believe that we will be dating robots, when indeed, it's the robots who'll date in our stead—proxy dating.
Scott: Yeah, a pre-date to figure out if the ... It's like blind dating, but you blame the AI now, if it doesn't go well.
Susan: In a blink of an eye, will the AIs go through the whole thing, the whole date a couple times?
Scott: The whole thing. Yeah, a second later, they decided.
Susan: They'll decide to shack up together. They'll get married. They'll have little AI children. They'll have some messy fights. Then they'll decide to get divorced, and it'll all break up, and by the end of it, it comes back to you, and it says, "Stay away from this one."
Scott: It happens in the span of a second.
Susan: It's not going to end well.
Scott: It's going to take all the spice out of life.
Susan: Yeah, but what if it comes back and says, “This is the one.” Now, you sit down and you're like, “My AI says you're the one.” The other one says, “Yeah, my AI says you're the one,” so how should we act at that point?
Scott: Now you're entitled, right? You're stuck with me. We have to be together.
Susan: We have to be together. Should I even try?
Scott: You just start letting it go right then, right? There's no salad for dinner. It's two T-bone steaks and some Indian food afterward.
Susan: Your first meal, you're sitting there, you eat, you belch, you undo the top button. It's like, it doesn't matter. You're my one. You're not going to leave.
Scott: It's decided.
Susan: The AIs have determined what's going to go on here.
Susan: Wait. Let me check how many children we're supposed to have. Oh, two.
AI is your new boss
Scott: They'll be watching you at work, too. It's like this is some little Santa stuff here.
Susan: Do you think they're going to be watching at work? What are they going to do?
Scott: Yeah, performance measurement, man. It's like that thing everybody gets ads for on Instagram now, the one for slouching. You tape it to your back, and it tells you if you're leaned over, and it zaps you, right? Now, at work, it'll be cameras watching you.
Susan: A performance review will come into your email, completely crafted by some sort of machine learning algorithm.
Scott: Every day.
Susan: It'll be brutal. Here are the 10 things you did well, and here's the 10 things you need to work on, you know?
Susan: The 10 up at the top? They're kind of fluff.
Scott: It's like we have to give you a compliment sandwich.
Susan: Yeah, we've got to make you think you're any good but...
Scott: A perfectly crafted visual representation of your day to make you react and like, damn, I've got to work harder.
Susan: Here's the box you need to be in. If you're not in this box, you are a problem.
Susan: There's huge promise in that. Don't get me wrong, but there are dangers of fitting too much to the mean there.
The machines use us for only one thing
Scott: What's the value of a human anymore? I mean, this is just going to be The Matrix soon, where we're just a power source for AI overlords. What's the deal?
Susan: I never quite got the power source of the Matrix anyway.
Scott: I don't know the power source thing either, but some other thing ... I don't know.
Susan: That was weird.
Scott: Are we actually going to be more creative?
Susan: Oh, is creativity the only thing left?
Scott: Our only job ... Yeah, that's it, and you take in all the inputs and you're like, “No, it should be this way.”
Susan: You do that one creative thing a year. That's it. There's this exact one moment where you add randomness to the system.
Susan: You do some irrational thing.
Scott: "No, it shouldn't be this way!"
Susan: Suddenly, you get $100,000 because that was your job. That one creative thing.
Scott: We needed that. We needed that random thought there. Yeah, everything ... We're too logical.
Susan: Thanks, you're our random number generator.
Scott: Yeah. What ... We already know the answer.
Scott: Random number generator.
Susan: They don't need us for power. They need us for random number generation.
Scott: We figured it out, yeah, yeah. Well, this is the value of children, right?
Susan: Talk about randomness. You never thought the things that could happen would happen. Yeah, 2:00 in the morning, what is that sound? Why is there paint everywhere? Please, please, I told you, not the cat! I'm sorry, is that what we were talking about, Scott?
Scott: We'll just all be children in the AI world now. You know, just banging pots and pans together.
Susan: I'm looking forward to that.
Scott: Yeah, it sounds like a pretty sweet existence, right?
Susan: Nap time, you know?
Scott: With milk, warm milk, and you have your blanket.
Susan: It's going to be awesome. Yeah, the lights slowly go down.
"Napercise", as David Lloyd Clubs in London calls it, is not far off. Soon all humans will do is be randomly creative and drink juice from sippy cups.
Scott: Is this something you did at school back in the day? This is something that they do at school now in California, at least. You have your own blanket. You get your milk. You take a nap.
Susan: I love it.
Scott: You're like 10.
Susan: Whoa, 10?
Scott: I mean, 9, 8-
Susan: Great. I remember doing it from kindergarten.
Scott: Yeah, I know. I remember it way back. Did you have your special mat?
Susan: The most valuable skills from kindergarten ... We completely forget those. Everybody should have nap time.
Scott: Well, because we get trained to be more like robots, but now the robots, the rightful owners of those tasks, are finally going to take them over, and we just get to be children all the time.
Your insurance premiums go up
Susan: They're going to tell you when you're not doing the optimal thing each day, right? This is when you should be napping. Oh, man, oh, you didn't nap. For that day, your insurance premium went up by a dollar. You are not living the optimally healthy life.
Scott: Yeah, for randomness. They need you to be healthy for the randomness.
Susan: Yeah, but I'm just saying that AIs are going to come in and judge every second of your life, and you're going to be charged more based off of you not living the right way. That beer you wanted? It didn't just cost a couple bucks. It also got reported back to your insurance company.
Scott: Well, maybe human lifespan doesn't need to be as long, now, because we lose our mojo by the time we're 30, 20, 15, you know?
Scott: It's like, eh, screw you, after a while.
Susan: Ooh, it could be like Logan's Run. At 25, you’re dead—only they trick us. They say, “Your brain is being uploaded into the cloud.” In reality, no.
Scott: Bye, everybody. I can't wait to see all my ancestors.
Susan: Yeah, some quick little chatbot puts up a fake that says it's you for about a day, until everybody forgets that you exist.
Scott: That's true. Let's say, a week, right? Yeah.
Susan: A week, for a week your loved ones are typing to you.
Scott: "Oh, look at you. Hey ..."
Susan: "Oh, it's so great in the cloud." They say, “Yes, you will love it here.” A week later, "Got to go now. There's so many exciting things. I can't pay attention to chat anymore."
Scott: Yeah, "I can't wait to see you."
Susan: Then erase that from the system.
Scott: Then they ghost you, yeah. All right, well, we have some worst-case scenarios. Maybe in the future, we'll have some best-case scenarios. I mean, the best?
Scott: At least something.
Susan: Do you think AI could actually be good for the world?
Scott: Nah, I don't think so.
Susan: You know what? I have a sneaking suspicion that for everything we said was bad, there might be a couple good things.
Scott: Are there some good things?
Susan: There just might be.