Will There Ever Be Too Much AI?

Technology. It’s advancing faster, and being adopted more widely, than ever before. It took roughly 10,000 years to go from writing to the printing press, but only about 500 more to get to email.

Now it seems we’re at the dawn of a new age, the age of A.I… Artificial Intelligence. Please define. [automated voice speaking] Uh-huh, okay. There you have it. What does it mean? I don’t know.

Tons of folks are working on it, right? Most people don’t know that much about it, and of course, there’s no shortage of data or opinions. Anyway, I’ve heard it said that the best way to learn about a subject is to teach it, but to level with ya, I have a wildly incomplete education.

…Not in my day job, where I’ve been A.I.-adjacent for over a decade. Anyway, I figured now would be as good a time as any to catch up on the state of things regarding this emerging phenomenon. My sense is it kind of feels like Pandora’s box, maybe…

…ish? Much of my understanding of this topic has come from sci-fi stories, which usually depict us heading toward Shangri-La or dystopia. Like most things, I suspect the truth is probably somewhere in the middle.

Now, along the way, we’ll demystify some common misconceptions about things we thought we understood but probably don’t. Terms such as “machine learning,” “algorithms,” “computer vision,” and “Big Data” will be conveniently unpacked to help us feel like we know what we’re doing. Kinda.

By the way, Pandora’s box… wasn’t a box. It… was a clay jar. How about that? Demystified. A.I. is teaching the machine, and the machine becoming smart. Each time we create a more powerful technology, we create a bigger lever for changing the world.

[computer] Autonomous driving started. [Downey] It’s an extraordinary time, one of unprecedented change and possibility. To help us understand what’s happening, this series will look at innovators pushing the boundaries of A.I.… No, stop! [Downey] …and how their groundbreaking work is profoundly impacting our lives… Yay! [laughing] [Downey] …and the world around us. In this episode, we’ll meet two different visionaries exploring identity, creativity, and collaboration between humans and machines.

Intelligence used to be the province of only humans, but it no longer is. We don’t program the machines. They learn by themselves. Mm. Ah. That’s good. All right. [Sagar] My background’s always been a mixture of art and science.

I ended up doing a PhD in bioengineering, then I ended up in the film industry, working on everything from King Kong to Avatar, simulating faces. I’d gotten to a point in my career where I’d been, you know, lucky enough to win a couple of Academy Awards, so I thought, “Okay, what happens if we actually try to bring those characters to life… characters you could actually interact with?” [toddler crying] Baby…

…Ooh. [toddler fusses] What can you see? So “Baby X” is a lifelike simulation of a toddler. Hey. Are you excited to be here? She’s actually seeing me through the web camera, and she’s listening through the microphone.

Woo… yeah. Baby X is about exploring how we would build a digital consciousness, if that’s even possible. We don’t know if it is, but we’re chipping away at that problem.

Hey, Baby. Hey. [Downey] “Problem” is an understatement for what Mark’s chipping away at. His vision of the future is one where human and machine cooperate, and the best way to achieve that, he thinks, is to make A.I. as lifelike as possible. Peek-a-boo! [Baby X giggling] [Downey] Which is why he began where most life begins… a baby… modeled after his own daughter. So if we start revealing her layers, she’s driven by virtual muscles, and the virtual muscles, in turn, are driven by a virtual brain.

Now, these are radically simplified models of the real thing, but nevertheless, they’re models whose workings we can explore, because we have a real template that exists: the human brain. So, these are all driven by neural networks.

[Downey] A “neural network” is a virtual, much simpler version of the human brain. The brain is the most complex system in our body. It’s got 85 billion neurons, each of which fires non-stop, receiving, processing, and sending information.

Baby X’s brain is nowhere near as complex, but that’s the goal. Instead of neurons, it’s got nodes. The more the nodes are exposed to, the more they learn.
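As an aside, here’s what “nodes that learn from exposure” can look like in code: a minimal two-layer network, written from scratch, that nudges its connection weights a little each time it sees an example until it gets a simple task right. This is a generic toy sketch, not Baby X’s actual architecture.

```python
# A toy neural network: "nodes" whose connection weights adjust with
# every example they are exposed to. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# A tiny learning task: XOR, four examples with binary labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of nodes: 2 inputs -> 4 hidden -> 1 output, plus biases.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(10000):
    # Forward pass: each node sums its weighted inputs and "fires".
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: nudge every weight to shrink the error a little.
    grad_out = (out - y) * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ grad_out; b2 -= 0.5 * grad_out.sum(axis=0)
    W1 -= 0.5 * X.T @ grad_h;   b1 -= 0.5 * grad_h.sum(axis=0)

print(out.round(2))  # converges toward [0, 1, 1, 0]: the nodes "learned" XOR
```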

[Sagar] What we’ve learned is it’s very hard to build a digital brain, but where we want to go with it is we’re trying to build a human-like A.I. which has a flexible intelligence that can relate to people.

I think the best kind of systems are when humans and A.I. work together. One of the biggest misconceptions of A.I. is that there is a super-intelligent being, or what we call a generalized A.I., that knows all, can do all, smarter than all of us put together.

That is a total misconception. A.I. is built on us. A.I. is mimicking our thought processes. A.I. is basically an emulation of us. [Downey] Like visionaries before him, Mark’s a dreamer. The current state of his moonshot, however, is a little more earthbound.

[computer] Thank you for granting access to your microphone. It’s good to hear you. [Downey] Today, most avatars are basically glorified customer-service reps. [service avatar] Rest assured, your health is my primary concern.

[Downey] They can answer simple questions and give scripted responses. I love helping our customers, so I’m keen to keep learning. [Downey] Beats dealing with automated phone lines, for sure, but it’s a far cry from Mark’s ultimate vision…

[Sagar] Hey, Baby. Hey. [Downey] …to create avatars that can actually learn, interpret, and interact with the world around them, like a real human. What’s this? Spider.

So we’re starting to get a spider forming in her mind here, she’s starting to associate the word with the image. So, Baby… spider. Spider. Spider… Good! Okay, what’s this? [Baby] Spider.

No. This is a duck. Look at the duck. [Baby] Duck. [Sagar] Yeah. [Downey] Baby X uses a type of A.I. called “object recognition.” Basically, it’s how a computer sees… how it identifies an object, like a spider, or tells the difference between a spider and a duck.

It’s something that you and I do naturally… …but machines, like Baby X, need to learn from scratch, by basically sifting through enormous piles of data to search for patterns, so that eventually, they can drive a car, or pick out a criminal in a crowded photograph, or tell the difference between me and… that guy.
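That sifting-for-patterns step can be sketched in a few lines: label a pile of examples “spider” or “duck,” let a model find the pattern, and ask it about something new. The two numeric “features” below are invented stand-ins for what a real vision system would extract from raw pixels.

```python
# A minimal sketch of object recognition: learn from labeled examples,
# then classify a new one. Features and data here are made up for
# illustration; a real system learns from millions of images.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

# Pretend feature vectors extracted from images (say, leg count and
# body roundness). In practice these come from a neural network.
spiders = rng.normal(loc=[8.0, 0.2], scale=0.5, size=(50, 2))
ducks = rng.normal(loc=[2.0, 0.8], scale=0.5, size=(50, 2))

X = np.vstack([spiders, ducks])
y = ["spider"] * 50 + ["duck"] * 50

model = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# A new image arrives: many legs, not very round. What does the model see?
print(model.predict([[7.5, 0.3]]))  # -> ['spider']
```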

[Sagar] But now I’m gonna tell her that spiders are scary. Look out! Rawr! Scary spider! Rawr! [crying] Hey, hey. Don’t cry. It’s okay. Hey… [Baby crying] Hey, it’s okay.

Now she’s responding emotionally to me as well, so we’ve gone all the way down to virtual neurotransmitters, hormones, and so forth, so Baby X has a stress system. If I give her a fright… Boo! So we’ll see basically some noradrenaline was released then, and she’s gone into a much more vigilant state of mind.
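To make that concrete, here’s a toy version of the stress system Mark describes: a fright releases a burst of virtual noradrenaline, which decays over time and shifts the avatar between calm and vigilant states. The threshold and decay rate are invented for illustration; Baby X’s real neurochemical model is far richer.

```python
# A toy "stress system": a fright releases virtual noradrenaline,
# which decays each time step and determines the avatar's state.
class StressSystem:
    def __init__(self, decay=0.9):
        self.noradrenaline = 0.0
        self.decay = decay

    def startle(self, intensity=1.0):
        # A "Boo!" releases a burst of virtual neurotransmitter.
        self.noradrenaline += intensity

    def tick(self):
        # Levels decay back toward baseline over time.
        self.noradrenaline *= self.decay

    def state(self):
        return "vigilant" if self.noradrenaline > 0.5 else "calm"

baby = StressSystem()
print(baby.state())   # calm
baby.startle()        # Boo!
print(baby.state())   # vigilant
for _ in range(10):
    baby.tick()
print(baby.state())   # calm again
```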

[Downey] What Mark is working on is known as “affective computing,” A.I. that interprets and simulates human emotion. I believe that machines are gonna interact with humans just the way we interact with one another, through perception, through conversation.

So as A.I. continues to become mainstream, it needs to really understand humans, and so we want to build emotion A.I. that enables machines to have empathy. Hello, Pepa. -Hello. -[man] Hello. -Hello.

-Hello. -Hello. -[laughing] Oh, dear. -We can do this forever. -I know we could. [laughs] [Howard] They’ve shown, for example, that older adults who have A.I. aides at their nursing homes are happier with a robot that emotes and is social than with no one there.

That’s really the enhancement of human relationships. [Sagar] Hey… Hello. You know, human cooperation is the most powerful force in human history, right? Human cooperation with intelligent machines will define the next era of history.

Using a machine which is connected with the rest of the world through the Internet, that can work as a creative, collaborative partner? That’s unbelievable. [will.i.am] Jessica. Jessica. One more time, one more time.


We’re gonna go from just the first two verses, and the first two verses will take us to three minutes, okay? I love music. The whole concept of music is collaboration, so if there are some people that see me as a musician, that’s awesome.

I first became interested in A.I. because A.I. is a very fruitful place to create in. It’s a new tool for us. I dream, and make my dreams reality, whether the dream is a song or the dream is an avatar of myself.

One time, a friend was like, “Well, you can’t clone yourself. You can’t be in two places at once.” That’s the promise of the avatar. I left it over there. All right, here we go. [Sagar] So, you’re about to enter the Matrix.

I’m gonna sort of direct you through just a bunch of poses. [will.i.am] The team from Soul Machines is here to create a digital avatar of myself. They had to put me in this huge contraption with these crazy lights.

What do you want me to do? [Sagar] Your face is an instrument. All the wrinkles on the face are like a signature, so we want to get the highest-quality digital model of you that we can. Okay. [chuckles] [Sagar] Yeah, that’s perfect.

Okay, go. [rapid shutters snapping] [Sagar] So we have to capture all the textures of their face. The geometry of their face… Big, gnashy teeth. How their face deforms to form the different facial expressions.

And how about a kiss? You could do… With my eyes closed? ’Cause I don’t kiss with my eyes open. Every once in a while, I peek. [cameras snapping] I wanted to have a digital avatar built around the idea of “Idatity,” and that’s the marriage of my data and my identity.

Everyone’s concerned about, like, identity theft. Meanwhile, everybody’s giving away all their data for free on the Internet. I’m what I like and what I don’t like, I’m where I go, I’m who I know.

I’m what I search. I am my thumbprint. I am my data. That’s who I am. You pull your eyelids down like that. We want to get that… yup. [will.i.am] When I’m on Instagram and I’m on Google, I’m actually programming those algorithms to better understand me.

Awesome. In the future, my avatar’s gonna be doing all that stuff, because I’m gonna program it. Get entertained through it, get information through it, and you feel like you’re having a FaceTime with an intelligent entity.

[laughing] “Yo, check out this link.” “Oh, wow, that’s crazy.” “Yo, can you post that on my Twitter?” [laughter] -Hey. -Hey. All right, I’m the Soul Machines lead audio engineer.

Hopefully we’ll be able to build an A.I. version of your voice. After creating Will’s look, we now have to create his voice. For that, we actually have to capture a lot of samples of how Will speaks, and that’s actually quite a challenging process.

-Shall we kick off? -Yeah, let’s kick off. -A’ight, boo, here we go. -Yeah. I’m Will, and I’m happy to meet you. I’m here to bring technology to life, and let’s talk about Artificial Intelligence.

Oops. Really? Whoa. That’s dope! So there’s so many ways of saying “dope,” bro. Yeah, yeah. Now, how realistic is it going to be? This will sound like you. The sentences can be divided up into parts so that we can create words and build sentences, like LEGO blocks.

It will sound exactly like you. Well, maybe we don’t want to have it too accurate. So you don’t freak people out, maybe I don’t want it accurate. Maybe there should be some type of… “That’s the A.I.,” ’cause this is all new ground. -Yeah. -Like, we’ve… we are at an intersection of a place that we’ve never been in society, where people have to determine what’s real and what’s not.
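The “LEGO blocks” idea the engineer describes is, in essence, concatenative synthesis: recorded speech is cut into reusable units, and new sentences are assembled by splicing those units together. Here’s a deliberately bare-bones sketch; real systems work with phonemes or sub-word units and smooth the joins, and this word-level library is invented for illustration.

```python
# Concatenative synthesis, LEGO-block style: a library of recorded
# units, and new sentences assembled by splicing them together.
# The filenames are hypothetical placeholders.
unit_library = {
    "here's": "heres.wav",
    "check": "check.wav",
    "out": "out.wav",
    "the": "the.wav",
    "weather": "weather.wav",
    "forecast": "forecast.wav",
}

def synthesize(sentence: str) -> list[str]:
    """Return the sequence of recorded units to splice together."""
    units = []
    for word in sentence.lower().split():
        if word not in unit_library:
            raise ValueError(f"No recording of {word!r}; capture more samples")
        units.append(unit_library[word])
    return units

print(synthesize("Check out the weather"))
# ['check.wav', 'out.wav', 'the.wav', 'weather.wav']
```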

[Downey] While Mark jets back to New Zealand to try to create Will’s digital doppelganger, Will’s left waiting, and wondering… can Mark pull this off? What does it mean to have a lifelike avatar of you? A digital replicant of yourself? Is that a good idea? How far is too far? [Domingos] We’ve been collaborating with machines since the dawn of technology.

I mean, even today, in some sense, we are all cyborgs already. For example, you use OKCupid to find a date, and then you use Yelp to decide where to go, you know, what restaurant to go to, and then you start driving your car, but there’s a GPS system that actually tells you where to go.

So the human and the machine decision-making are very tightly interwoven, and I think this will only increase as we go forward. [Downey] Human collaboration with intelligent machines… A different musician in a different town with a different approach is giving the same problem a shot.

[Gil Weinberg] People are concerned about A.I. replacing humans, and I think it is not only not going to replace humans, it’s going to enhance humans. I’m Gil Weinberg. I’m the founding director of the Georgia Tech Center for Music Technology.

[plays piano] Ready? In my lab, we are trying to create new technologies that will explore new ways to be expressive… to be creative… Shimon is a marimba-playing robot. [playing marimba] What it does is listen to humans playing, and it can improvise.

Shimon is our first robotic musician that has the ability to find patterns, so, machine learning. Machine learning is the ability to find patterns in data. So, for example, if we feed Shimon Miles Davis, it will try to see what note he is likely to play after what note, and once it finds those patterns, it can start to manipulate them, and I can have the robot play in a style that maybe is 30% Miles Davis, 30% Bach, 30% Madonna, and 10% my own, and create a morphing of music that humans would never create.
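“What note after what note” is the classic Markov-chain view of melody, and the 30/30/30/10 blend can be modeled as a weighted mixture of per-artist transition tables. Here’s a hedged sketch of that idea; the note sequences are invented placeholders, not real transcriptions, and Shimon’s actual models are more sophisticated.

```python
# Learn per-artist note-transition probabilities, then sample new notes
# from a weighted blend of the styles. Sequences are placeholders.
import random
from collections import defaultdict

def learn_transitions(notes):
    """Count which note follows which, normalized to probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(notes, notes[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

styles = {
    "davis":   learn_transitions(["D", "F", "A", "C", "D", "F", "E"]),
    "bach":    learn_transitions(["C", "E", "G", "C", "G", "E", "C"]),
    "madonna": learn_transitions(["G", "G", "A", "G", "F", "G", "G"]),
    "mine":    learn_transitions(["A", "C", "E", "A", "D", "E", "A"]),
}
weights = {"davis": 0.3, "bach": 0.3, "madonna": 0.3, "mine": 0.1}

def next_note(current):
    """Sample the next note from the weighted blend of all styles."""
    blended = defaultdict(float)
    for name, table in styles.items():
        for note, p in table.get(current, {}).items():
            blended[note] += weights[name] * p
    if not blended:
        return random.choice("ABCDEFG")
    notes, probs = zip(*blended.items())
    return random.choices(notes, weights=probs)[0]

phrase = ["C"]
for _ in range(8):
    phrase.append(next_note(phrase[-1]))
print(" ".join(phrase))  # a short phrase no single artist would play
```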

[band playing tune] [Downey] Gil’s groundbreaking work in artificial creativity and musical expression has been performed by symphonies around the world… …but his innovation also caught the attention of another musician.

…Okay. [Downey] …a guy who unexpectedly pushed Gil beyond enhancing robots to augmenting humans. [Weinberg] I met Jason Barnes about six years ago, when I was just about finishing one phase of developing Shimon, and I was starting to think, “What’s next?” [Barnes] I got my first drum kit when I was 15, on Christmas, and when I lost my limb, I was 22, so I was kind of used to having two limbs.

I started trying to fabricate prosthetics to try and get me back on the kit, which eventually led me to working and collaborating with Georgia Tech. [playing drums] [Weinberg] He told me that he lost his arm, he was devastated, he was depressed, music was his life, and he said, “I saw that you develop robotic musicians. Can you use some of the technology that you have in order to allow me to play again like I used to?” So that’s the prosthetic arm that we built for Jason. When he came to us, he just wanted to be able to use sensors here so he can hold the stick tight or loose.

I suggested, “Let’s do that, but also, let’s have two sticks. One stick can operate with a mind of its own, understanding the music and improvising. One stick can operate based on what you tell it with your muscle, and also, each one of the sticks can play at 20 hertz… faster than any human, and together, they can create polyrhythms, create all kinds of textures that humans cannot create.” All right. I think we’re ready to play. [all playing tune] [Downey] In some ways, the robotic drum arm allows Jason to play better than he ever has, but it still lacks the true function, or feeling, of a human hand.

[Weinberg] They don’t provide the kind of dexterity and subtle control that would really allow anything. [Downey] This revelation drove Gil to his next innovation… the Skywalker Hand.

Inspired by Luke Skywalker from Star Wars, and created in collaboration with Jason, the revolutionary tech brings what was once the realm of sci-fi a little closer to our galaxy.


[Barnes] This is just like a 3D-printed hand that you can, like, download the files for online. [Downey] Currently, most advanced prosthetic hands can’t even give a thumbs-up or flip you the bird.

They can only open or grip, using all five fingers at once. Most of the prosthetics that are available on the market nowadays, um, actually use EMG technology, which stands for “electromyography,” and essentially what it does is there are two sensors that make contact with my residual limb, and they pick up electrical signals from the muscles.

…So again, when I flex and extend my residual limb, it will open and close the hand, um, and I can rotate as well, but the problem with EMG is it’s a very vague electrical signal… basically just zero to 100%.

It’s not very accurate at all. The Skywalker Hand actually uses ultrasound tech. Ultrasound provides an image, and you can see everything that’s going on inside of the arm. [Downey] Ultrasound uses high-frequency sound waves to capture live images from inside the body.

As Jason flexes his muscles to move each of his missing fingers, ultrasound generates live images that visualize his intention. The A.I. then uses machine learning to predict patterns, letting a man who’s lost one of his hands move all five of his fingers individually, even if he’s as unpredictable as Keith Moon.
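The contrast between the two control schemes comes through even in a toy sketch: EMG yields roughly one coarse activation value, enough to choose between open and close, while an ultrasound frame carries enough detail for a classifier to predict which finger is intended. The feature vectors below are made-up stand-ins for processed ultrasound frames; the real Skywalker Hand pipeline is far more involved.

```python
# Coarse EMG control vs. per-finger classification from richer
# ultrasound-like features. All data here is simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
rng = np.random.default_rng(2)

# --- EMG-style control: one scalar, thresholded into open/close. ---
def emg_command(activation: float) -> str:
    return "close hand" if activation > 0.5 else "open hand"

# --- Ultrasound-style control: classify a frame into a finger intent. ---
# Fake "frames": 16 features per frame, one cluster per intended finger.
frames, labels = [], []
for i, finger in enumerate(FINGERS):
    center = np.zeros(16); center[i * 3] = 3.0
    frames.append(rng.normal(loc=center, size=(40, 16)))
    labels += [finger] * 40
model = LogisticRegression(max_iter=1000).fit(np.vstack(frames), labels)

print(emg_command(0.8))             # -> close hand (all or nothing)
test = np.zeros(16); test[3] = 3.0  # a frame resembling "index"
print(model.predict([test + rng.normal(scale=0.3, size=16)]))  # -> ['index']
```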

[Howard] The work that Gil is doing is really important. Gil comes from a non-engineering background, which means that his technology and the way he thinks about robotics is actually quite different than, say, the way I would think about it, since I come from an engineering background.

And the commonality is that we want to design robots to really impact and make a difference in the world. [Weinberg] We were able to create a proof of concept with Jason Barnes. Once we discovered that we can do this with ultrasound, immediately I looked at, “Hey, let’s try to help more people.”

[Jay Schneider] That’s okay, just leave me hanging, holding it. It’s not heavy or anything. [Barnes] It’s safe, if you want to slide it back… No, no. I’m messing with you. So I met Jason Barnes at an event called “Lucky Fin Weekend.” They’re a foundation that deals with limb difference. There we go. -Ah, all right. -And it’s out. [Schneider] Do you ever work on your car without the hook? Not really. It’s just way easier and more efficient for me to… The hook, the hook really trips me out, though, man.

[Schneider] When I lost my hand, it was close to 30 years ago, and prosthetics were kind of stuck in the Dark Ages. [rock drums and bass playing] In general, they didn’t really do a whole lot, and even if they moved, they seemed to be more passive than actually worthwhile to use.

I don’t like to talk about my accident, because I don’t feel it defines me. The narrative on limb-different people has been the accident. “This is what happened, and these are these sad things,” and it becomes inspiration porn.

For me, for example, right, if I do something, I have to, like, smash it out of the park, because otherwise I feel like there’s gonna be this, “Oh, well, he did it good enough because he’s missing his hand.” -Yeah, yeah. -And I’m like, “F that!” Like, I want to… I’m gonna be as good or better than somebody with two hands doing whatever I’m doing, you know? Prosthetics, at this point in my life, don’t really seem like something I would want or need.

[Weinberg] Manual robotic prosthetics have not been adopted well. Amputees try them, and then they don’t continue to use them. [Barnes] Yeah, man, you stoked to check out the lab? Yeah, yeah, for sure.

Right now, I’m the only amputee that’s ever used the Skywalker Arm before. Did you have… were you right-handed? No, I was born left-handed, actually. Oh, you lucky bastard. -Yeah, I know, right? -I was right-handed.

[Barnes] It was extremely important to get as many different people as we can in there, including other amputees. It’s hard to find people that are amputees in general, and then, like, upper-extremity amputees is the next thing, and then finding people who are willing to step out of their comfort zone -and then do this.

-Right. [Schneider] When I met Jason, I found it really interesting that we had a lot in common, because we were both into cars, we were both into music. -Hi, Gil. -Hey. What’s up? -Jason. Nice to meet ya.

-Nice meeting you. He’s a step or two ahead of me with the technology stuff. [Barnes] The way this hand works is it essentially picks up the ultrasound signals from my residual limb, so when I move my index finger, it’ll move my index… ring…

[Schneider] Wow, for the first time, prosthetics are finally getting to the point where they’re getting pretty close to an actual human hand. You know, it got me excited. I was like, “This is the type of thing that I’ve been waiting for.” If I was ever going to try one again, this would be the type of stuff that I would want to check out. When I move my thumb… [laughter] I know from experience that it’s not always working perfectly.

It’s very interesting for me to have someone else who comes and tries our technology to see if it can be generalized. Is my arm getting warmer because you’re wrapping it, or does that have heat in it? -It does have heat in it.

-Oh, okay. First thing we need, if we’re gonna get Jay to try the hand, is a custom-fit socket for his arm that’s comfortable and fits nice and snug. You comfortable when they do this? This is the most awkward part for me.

-Nah, it was kinda weird. -Ah, yeah. Yeah. I was 12 years old when I lost my hand and had a prosthetic for six months, and pretty much ever since then, I haven’t used it, and it’s been close to 30 years now.

And there’s the impression of your arm. That’s way easier than I thought it was gonna be. That’s wild, yeah! It may not be right for me, but this is something that could really, really help people’s lives.

It would be really cool to have a hand in helping to develop the technology. All right. All right, ready? Just slide it in. Turn this… tighten. [knob ratcheting] How tight? As tight as you can before it really hurts… -Oh, really? -…because the tighter it is, -the better reading we’ll see. -Okay. -Now we apply the probe… -Okay. …so it can read your movements. Now we also have to work on the algorithm and the machine learning, and for this, we will need you to train.

Okay. For an able-bodied person… when you move your finger, you’re not thinking about moving your finger, you just do it, because that’s how we’re hardwired, but, honestly, I don’t really remember what it was like to even have that hand.

[Weinberg] Even though an amputee doesn’t have a thumb, they still have the muscle. You still have some kind of memory of how you moved your fingers, and you can think about moving your phantom fingers, and the muscles would move accordingly, and that’s exactly what we use in order to, uh, recreate the motion and put it in a prosthetic arm.
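That training session might look something like the following sketch: prompt the wearer to think about moving each phantom finger in turn, record the muscle readings, and fit a model to those labeled samples. The capture function is a hypothetical stand-in for reading the actual probe.

```python
# A calibration-session sketch: collect labeled readings per phantom
# finger, then fit a classifier. Simulated data throughout.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
rng = np.random.default_rng(3)

def capture_ultrasound_frame(finger_idx: int) -> np.ndarray:
    """Stand-in for reading the probe while the user imagines one finger."""
    center = np.zeros(8); center[finger_idx] = 2.0
    return rng.normal(loc=center, scale=0.4)

# Training session: prompt for each finger, record repeated samples.
X, y = [], []
for idx, finger in enumerate(FINGERS):
    print(f"Think about moving your {finger}...")
    for _ in range(30):
        X.append(capture_ultrasound_frame(idx))
        y.append(finger)

model = KNeighborsClassifier(n_neighbors=7).fit(X, y)

# Later, live frames can be decoded finger by finger.
live = capture_ultrasound_frame(0)  # user imagines moving the thumb
print(model.predict([live]))        # -> ['thumb'], if calibration worked
```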

But does Jay still remember how to move fingers that he hasn’t had for, I believe, 30 years? Now we’ll run the model, and you’ll be able to control the hand. [chuckles] You’re optimistic.

I’m crossing my fingers. Can I cross these fingers? [laughs] Is that… is that an option yet? Having Jay here for a day and hoping to get him to a point where he controls finger by finger, I’m a little concerned that it will not work in such a short period of time.

Okay. And… -Ready? -Yeah. You should try each of the fingers. All right, that’s the thumb… -Oh, shit! -Unbelievable. All right, index… Yay! Wow, I’m surprised. Middle… [Barnes] Dude. Five for five? -[all cheering] -All five of them! -Whoa.


-That’s wild. All right, let me do it again. You’re a natural, man. Doesn’t that feel crazy? -Yeah! -Feels wild. -I didn’t think it’d be as good. -I didn’t either. He hit me in the back after it worked, so… That’s the first time. [Schneider] It’s like a game-changer, even in its infancy, which is kind of insane, because it can only get better from there. And it’s really cool to play a small part in that.

[Weinberg] Now we have two main goals. First, you need to be able to move your muscle or your phantom finger and immediately see a response, so this is one direction of research. The other direction is to make it more accurate.

Being able to type on a keyboard, use a computer mouse, uh, open a water bottle, things like that that most people take for granted. It’s kind of like a… you know, sci-fi movie, soon to be written.

-[laughter] -Give us five, right? That’s awkward… oh, robot to robot hand. Nice! -That’s… that was real, right? -Yeah. If I find out you guys had a button under that desk… No, nah, I promise.

I promise. [Downey] What began as one man’s pursuit to innovate music through A.I. and robotics unexpectedly became something much greater. A human body cooperating with a bionic hand is one thing… but is it possible to humanize a machine to the point that it truly seems lifelike? Or is that still sci-fi, and far, far away? [Greg] How did things go with Will? [Sagar] You know, one of the real challenges there was just getting enough material that we could actually come back with.

We can’t possibly capture somebody’s real personality, you know, that’s impossible, but in order for it to really work, it’s really important to capture a feeling of Will. Right, so…

[Downey] Will’s avatar is actually Mark’s first go at creating a digital copy of a real person. Wow, that’s looking pretty good. [Downey] He’s not trying to clone a human, by any stretch, but to create an artificial stand-in that’s somewhat believable.

Still, like most firsts, it’s bumpy, and it’s a cautious road into the unknown. [tech] A big challenge that I’ve found while I’ve been looking through a lot of the images is it seems that Will was moving a lot during the shots.

[Colin Hodges] Okay. When we’re building digital Will, we have about eight artists on our team who come together and pull all of the different components into this real-time character, driven by artificial intelligence to behave like Will behaves.

One of the big challenges we’ve got is how we create Will’s personality. Yeah. Like, the liveliness and the energy that he generates, and the excitement. The facial hair was a challenge. Because it’s so sparse, it’s quite tricky to get the hair separated from the skin.

[Sagar] We have to be able to synthesize the sort of feel that you’re interacting with Will. So, Teah, I’ve got some stuff to hear. We’ve got 16 variations. -16 variations? -Yeah. [Sagar] We take the voice data that we’ve got, and then we can enable the digital version of Will to say all kinds of different things.

[digital Will] Here’s the forecast. Yo, check out the forecast. Yo, check out the weather and shit. Here’s the weather. Check out the weather. Yah, ’bout to make it rain! Kinda.

[Sagar] That’s fantastic… the words, the delivery, emphasis… It shows you just how complex people’s reactions are. [will.i.am] It’s awesome where we are in the world of tech. Scary where we are, as well.

My mind started thinking, like, “Wait a second here. Why am I doing this? What’s the endgame?” Because, eventually, I won’t be around, but it would be. [Downey] Will’s endgame is more modest than Mark’s: a beefed-up Instagram following, a virtual assistant, anything that might help him expand his creative outlets or free up time for more philanthropic pursuits.

Okay, so, here we go. That’s looking really different. It’s gonna be really interesting, because, you know, it’s not every day you get confronted with your virtual self. Right. Does he feel that this is like him? If it’s not representative of him or if he doesn’t think it’s authentic, then he won’t want to support it.

-What up, Mark? -Oh, hey, how are you? -You can see me, right? -Yes. Yo, wassup? This is will.i.am. [laughing] [Sagar] This is the new version of you. We can give him glasses there.

[will.i.am laughs] That’s awesome. I remember I had a pimple on my face that day. You captured it. The good thing is, it’s digital, and we can remove it really easily. How come you didn’t remove that? [laughs] [Sagar] You can make him do a variety of things.

Let’s play “Simon Says.” Say, “I sound like a girl.” I sound like a girl. Say that with a higher pitch. [high voice] I sound like a girl.

Raise your eyebrows. Poke out your tongue. [Will laughs] [will.i.am] Tell me about growing up in Los Angeles. I was born and raised in Boyle Heights, which is west of East Los Angeles, which is east of Hollywood.

Just east of downtown. [will.i.am] Should it sound exactly like me? Nope. Should it sound a little bit robotic? Yes. It should. For my mom. My mom should not be confused. What’s your name? [in Spanish] Mi nombre es Will.

[in English] You speak Spanish? I don’t know. [laughing] I know it needs some fine-tuning, but the way it’s looking so far is mind-blowing. Thanks, Mark. Yeah, no worries. [Sagar] How far do you go down that path until you can label it a living… a digital living character? This raises some of the deepest questions in science and philosophy, actually, you know, the nature of free will. How do you actually build a character which is truly autonomous? Peek-a-boo! [Baby X giggles] What is free will? What does it take to do that? [Weinberg] Artificial Intelligence is crucial to the work we are doing, to inspire, to surprise, to push human creativity and abilities to uncharted domains.

[all cheering] Unbelievable. [playing drums] [Downey] Free will… …it’s something we’ve been grappling with for thousands of years, from Aristotle to Descartes, and will continue to grapple with for a thousand more.

Will we ever be able to make an A.I. that can think on its own? A second, artificial version of me that is truly autonomous? A Robert that can actually think and feel on his own, while this Robert here takes a nap? [engines roaring] Impossible? Well, when you consider what human cooperation has already accomplished… a man on the moon… decoding the human genome… discovering faraway galaxies… I’d put my money on dreamers like Mark and Gil over the “Earth is flat” folks any day.

Until then… nap time. [man 1] Look at our world today. Look at everything we’ve created. Artificial Intelligence is gonna be the technology that takes that to the next level. [man 2] Artificial Intelligence can help us to feed the world’s population.

[man 3] The fact that we can find where famine might happen, it’s mind-blowing. These are conflict areas; this is an area that we need to look at protecting. Then launch A.I. [man 4] We are going to release the speed limit on your car.

Tim, can you hear me? [man 5] With A.I., ideas are easy, execution is hard. [Domingos] What excites me the most about where we might be going is having more super-powers… [firefighter] I got him! [Domingos] …and A.I. is super-powers for our mind. [man 6] Even though the limb is synthetic materials, it moves as if it’s flesh and bone. [woman 1] You start to think about a world where you can prevent disease before it happens.

[man 7] A.I. can give us that answer that we’ve been seeking all along… “Are we alone?” Bah! [man 8] I love the idea that there are passionate people dedicating their time and energy to making these things happen.
