I know this was already reviewed by Leon last year. I watched it for the first time this March. It has bothered me ever since, so I decided to purge it from my system by writing my own review of the film. Can you tell I didn’t really like it?
The movie opens up and we meet our main character Caleb (Domhnall Gleeson). He just won the “Staff Lottery”.
Whatever gets him out of having to program a search algorithm in C++ is a good thing. C++ is not the most fun language to program in, which is why I assume he switches to Python later in the film. If you think that’s going to lead to Caleb asking the robot about undecidable problems such as the Halting Problem, then you’re going to be very disappointed. These are problems that no algorithm can solve for every possible input, which means there are problems that cannot be solved by machine intelligence. That naturally leads to a debate about what exactly natural intelligence has that machine intelligence doesn’t, and what that says about whether a machine has consciousness. All things that will not be brought up. These are things I learned in a course I took before a basic ground-level Computer Science course. A remedial CS course. Caleb not knowing them is like making a movie about racing where the main character, a mechanic, can’t change a tire.
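Since the Halting Problem comes up: the best any real program can do is an approximation that gives up after a budget of steps, because Turing proved no exact, fully general halting checker can exist. Here’s a minimal sketch of that limitation (all the function names here are my own invention, purely illustrative):

```python
# You can only *approximate* a halting check by running a program for a
# bounded number of steps; no exact halts(prog, arg) can exist in general.
def still_running_after(prog, arg, budget=10_000):
    """Run prog(arg) step by step; report whether it is still going after
    `budget` steps. prog is a generator that yields once per step."""
    steps = prog(arg)
    for _ in range(budget):
        try:
            next(steps)
        except StopIteration:
            return False          # halted within the budget
    return True                   # still running -- which proves nothing

def counts_down(n):               # halts for any n
    while n > 0:
        n -= 1
        yield

def spins(_):                     # never halts
    while True:
        yield

print(still_running_after(counts_down, 5))       # → False
print(still_running_after(spins, 0))             # → True
print(still_running_after(counts_down, 20_000))  # → True, yet it DOES halt eventually
```

The last line is the whole point: the checker times out on a program that does halt, and no amount of cleverness fixes that for every case.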
Now we are off to where Oscar Isaac’s character lives.
I know his name is Nathan, but I just kept referring to him as Beard while watching the film. When Caleb shows up to enter his place I had to pause the movie.
I love watching movies on the iPad with the Amazon Prime app because it not only tells you the characters, but also drops in trivia about the current scene. This is how I know that this takes place at the Juvet Landscape Hotel in Alstad, Valldal, Norway. Beard has a pretty nice place. It comes with its own Kubrickian hallways…
The Shining (1980, dir. Stanley Kubrick)
and Oldboy (2003) prison rooms.
Oldboy (2003, dir. Chan-wook Park)
Beard gives Caleb a key that will only open rooms and let him use devices that he is allowed to use. It’s kind of like a computer game. In fact, that’s how you could describe the whole movie in a nutshell. It’s a game composed of cutscenes with the robot and Beard, except you don’t get to run around in between, and there are no dialog trees.
After showing Caleb to his room, Beard tells him he needs to sign a Non-Disclosure Agreement. Again, I had to pause the movie here because I was taking care of my dog while a new ceiling fan was being put in.
Okay, if you say so: Beard saying his home is his research facility is apparently a reference to It’s A Wonderful Life (1946). Now we get the Ex Machina definition of the Turing Test, which isn’t what the Turing Test actually is. Caleb says:
“I know what the Turing Test is. It’s when a human interacts with a computer. And if the human doesn’t know they’re interacting with a computer, the test is passed”
Actually, the Turing Test has a “human interrogator” separated by a barrier, with an isolated interface that lets the interrogator interact with two sources that are also separated from each other and from the interface. One source is a human being who has never met the interrogator. The other source is the machine being tested. If the interrogator cannot distinguish the two sources from each other, then the robot has passed the test.
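For concreteness, that three-party setup can be sketched in a few lines of Python. The sources, questions, and the naive interrogator strategy below are all stand-ins I made up, not anything from the film:

```python
import random

# Toy model of the actual Turing Test: an interrogator sees only text
# replies from two hidden terminals (one human, one machine) and must
# say which terminal is the machine.
def imitation_game(interrogator, human, machine, questions, rng=random):
    labels = ["X", "Y"]
    rng.shuffle(labels)                       # hide which terminal is which
    terminals = dict(zip(labels, [human, machine]))
    machine_label = labels[1]
    transcript = {label: [source(q) for q in questions]
                  for label, source in terminals.items()}
    return interrogator(transcript) == machine_label   # did the machine get caught?

# Stand-in sources: the "human" varies its answers, the "machine" doesn't.
human = lambda q: f"hmm, '{q}'... that depends"
machine = lambda q: "INSUFFICIENT DATA"

# A naive interrogator: flag whichever terminal repeats itself verbatim.
def interrogator(transcript):
    for label, answers in transcript.items():
        if len(set(answers)) == 1:
            return label
    return "X"

print(imitation_game(interrogator, human, machine, ["q1", "q2", "q3"]))  # → True
```

Note that the whole thing only works because of the separation: the interrogator never sees either source, which is exactly the part the film throws away.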
Such a test is never really performed in this film, unless you count the moment when Caleb cuts himself because he believes he might be a robot, i.e. indistinguishable from one. It’s not the same thing, but it may have been stuck in there as an allusion. Regardless, it would be literally impossible for the Turing Test to even be performed, since it would require three humans (a source, an interrogator, and someone to set up the experiment), and there are only two humans at Beard’s home. Here’s a nice diagram from the Second Edition of Introduction to Artificial Intelligence by Philip C. Jackson Jr.
It doesn’t really prove that the computer has “artificial intelligence” either, only that a human being can’t tell the difference. That simply means it can pass for human in this controlled environment. That’s of course why the film will unceremoniously toss it aside in favor of a setup that allows for a lot of face-to-face engagement between actors.
Thus begins the “test”.
We see Beard’s office first which is covered with post-it notes. I haven’t seen that since I think either the remake of Oldboy or that episode of Beverly Hills, 90210 where Brandon covered a professor’s office with them. Caleb also sees foreshadowing about previous robots who got a little cabin fever when he enters the interrogation room.
Yes, that is glass. Does he turn right around and walk out because he has obviously been lied to about taking part in a Turing Test? Nope. We just meet Ava played by Alicia Vikander.
I have to admit that I feel a little sorry for her. She really was cast in some terrible movies in 2015. You have this movie of course. You have her in The Man from U.N.C.L.E. where she is just there to make a reference to Anita Ekberg in La Dolce Vita (1959) by standing in a fountain while the male leads all but start making out. She was in The Danish Girl (2015) that will have more people asking children about their genitals since it makes being transgender all about bottom surgery. I loved how it didn’t tell you she had a uterus transplant at the end. Probably because leaving it in would mean the movie is actually equating having babies with identifying oneself as a woman like people think Avengers: Age of Ultron (2015) did. Then she was also in Burnt (2015), but I’ll be damned if I could find her in it while watching it.
No, I’m not kidding about The Danish Girl having that effect. People ask transgender children all the time about whether they plan to have surgery when they get older and that movie doesn’t help. Neither do other things, but I refuse to review that movie right now.
After some initial pleasantries, he asks how she learned to speak. She says she “always knew how to speak.” She also says she thinks she is clever, since “language is something that people acquire,” but she can apparently do it out of the box. He responds that language is inborn in the human mind, and that it is only the attachment of words to this built-in structure that allows us to speak. You could say it’s an explanation for why everyone gets a free language. This confuses Ava, but I’m more confused why rebooting Tomb Raider in 2018 with her as Lara Croft is a thing that’s happening. This conversation is also foreshadowing, because we will get an explanation later that, surprise, surprise, again shows that Caleb really doesn’t know Computer Science.
Now you’d think they would leave the Turing Test thing behind here, but no. They feel the need to rub our noses in it some more.
He says that “if I hid Ava from you, so you just heard her voice, she would pass for human.” Maybe, except the Turing Test uses terminals, not voices, because a voice would be a test of speech synthesis rather than of human intelligence. Take talking to Siri or a similar intelligent agent: the very fact that it speaks instantly makes people start to think of it as human. That’s an example of Weak AI rather than Strong AI, which is what Ava is supposed to be. They’ll bring up Strong AI later, but will conveniently leave out Weak AI because it would open up holes in the film.
Beard wants to show Caleb Ava, then see if Caleb feels she has consciousness. What he is actually asking is whether he has effectively recreated the superficial aspects of a humanoid robot with a reasonably passable intelligent agent controlling those parts. That’s not even close to the same thing. That would be like saying you have proven somebody actually works for immigration because illegal immigrants run when they see someone in pressed pants, a white shirt, and a clipboard enter a factory. You’ve simply proven that you can make something that can socially engineer a person. This is why the separation has to go missing: a movie like this can’t have its ending unless it ignores these things.
I don’t know why Beard couldn’t just say that he already had some people run her through a proper Turing Test, and that now he wants Caleb to do this other thing, instead of just claiming that of course she would pass. Oh, well.
Since this movie isn’t very well written, they now have Caleb spout some jargon about her language abilities. Beard quickly shuts this down by saying he isn’t going to explain how she works. By that, he means not until later, when he decides to do just that in order to remind us of a real-world event that happened a few years ago. The power soon goes out after this too, foreshadowing that somebody is causing the outages, and it changes which rooms and devices Caleb’s keycard will open.
Caleb is now woken up by some speechless Asian lady robot.
That makes two female robots. There must be a male one around here somewhere, right? Of course not. This would lead any reasonable person to wonder if he is building a brothel for straight men. The actual reason the movie tries to subtly slip in is that part of them being human is sexuality and gender. I’m assuming that means he has to test to make sure their vaginas work, and since they are both straight, then the robots must be women. Nope, still comes across like he is building a brothel.
Now Beard and Caleb have another conversation about how to test her. Caleb basically breaks out more jargon, which Beard dismisses as unimportant because too much thinking gets in the way of the drama and the building of tension. Seriously, before it cuts to the image below, he says: “How does she feel about you?”
It starts off with Ava showing him a picture, but they already want to turn the tables and have Ava ask Caleb questions as if we really are interested in him. But first we find out that Caleb works for Beard’s own version of Google. Then the funniest thing in the movie happens.
He says that he is an advanced programmer. Sure, Caleb. Sure, and the majority of people knew what a race condition was when they started up Steve Jobs (2015).
Ava goes on to brag about how Beard wrote the code for not-Google when he was 13. She then asks him if he likes Mozart, to which he responds that he likes Depeche Mode. What are you trying to say here, Garland? Maybe that people are people? It doesn’t matter, because she doesn’t want to listen. She has other priorities, like setting up the ending. The movie actually has very little to do with robots. It’s about a woman who is imprisoned and uses the dumbest guy she can find to manipulate her way out. That’s the real movie in a nutshell. You won’t be asking yourself interesting questions here. It’s all sleight of hand dressed up as intelligent writing. Then the power goes out again, and she takes credit for it before trying to drive a wedge between Caleb and Beard.
Then the power comes back online.
I have an idea. Ask her if on a hot summer night would you offer your throat to the wolf with the red roses. It’s totally random. No matter what she says, you respond that you bet she says that to all the boys. See how she responds. Of course not. This game has you on strict rails and doesn’t give you a dialog tree.
Blah, blah, blah. Now Beard says he is going to show Caleb where he created Ava, even though he said earlier that he wasn’t going to tell him how she works because it would ruin the test. Remember that they have established that Caleb is an advanced programmer who has at least a degree in Computer Science. Beard asks him if he knows how he got her to “read and duplicate facial expressions.” That’s easy: you simply get a lot of training data and use it to train something like a neural network or some other statistical model. Basic stuff for someone with a Computer Science degree, and something people interact with every day in the form of a spam filter.
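To make “get training data, fit a statistical model” concrete, here is about the simplest possible version: a nearest-centroid classifier over made-up facial-feature numbers. The feature names and values are invented for illustration; a real system would train a neural network on actual images:

```python
# Hypothetical sketch of "collect labeled data, fit a model, classify":
# average each class's feature vectors, then assign new samples to the
# nearest class average.
def fit_centroids(samples):
    """samples: list of (feature_vector, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [s / counts[label] for s in acc]
            for label, acc in sums.items()}

def classify(centroids, vec):
    def dist(a, b):                      # squared Euclidean distance
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], vec))

# Toy features: (mouth_curve, brow_height) -- purely invented numbers.
train = [((0.9, 0.5), "smile"), ((0.8, 0.6), "smile"),
         ((-0.7, 0.2), "frown"), ((-0.9, 0.1), "frown")]
model = fit_centroids(train)
print(classify(model, (0.85, 0.55)))  # → smile
```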
Moving onward, Beard reminds us of that incident a few years back when it was discovered that Apple was using people’s iPhones to do war driving in order to improve its location services. In Beard’s specific case, he simply turned on the cameras and microphones of cellphones to get a bunch of data and used it to train whatever he is using. He says the search engine itself, but that doesn’t make sense unless you want to say that his Google can search by image and sound bite, allowing him to build a language of sorts between the collected data and the meanings humans assign it. That probably is what they were going for, given the explanation of language earlier. It still doesn’t explain why they just had Caleb throw up his hands and say he has no idea how it was done.
Next we find out that writer/director Alex Garland is probably a fan of Quantum Leap. I say that because that is Ava’s brain, which Beard refers to as wetware. What that means is that it should be a combination of physical aspects of the human brain interconnected with mechanical parts to create an artificial intelligence. This is what Ziggy on Quantum Leap has as its “brain.” That’s why they refer to Ziggy as a hybrid computer with aspects of Sam in her, since the brain cells are his. We also find out that the software running on it is his version of Google’s search engine.
After showing Caleb another picture, she decides to manipulate him more by putting on hair and clothes. I had to pause the movie again here.
In other words, they told her to just walk rather than walk while swinging her hips. I’m not sure why that was a thing they bothered with honestly. Without that bit of trivia popping up it would have just come across as someone who was shy rather than someone who was starting to conform to the gender forced upon them by form and/or programming.
Now we get Caleb explaining that AI doesn’t need a gender, so that they can get into a conversation that amounts to explaining how things such as neural networks work by using sexual preference as an analogy. A neural network is a graph of vertices and edges that is loosely based on the way the human brain works. Each vertex has some sort of function that acts on the inputs sent into it in order to spit out a value, either as a final result or as input to other vertices. Each edge has a weight assigned to it, which is multiplied by the value being sent along it to produce the value that enters the vertex it connects to. Here is a very simple example of a neural network taken from AI Application Programming by M. Tim Jones.
What this all means is that using techniques such as backpropagation, you can send information through such a network that in turn adjusts the weights, changing what it will output and therefore how it operates. In the context of his explanation, it means that if you pass a bunch of black girls through your brain, and you respond in a certain way, then your brain’s own neural network adjusts toward an attraction to them, or toward a sexual dislike of them. It also depends on the initial structure of said brain and how it is formed during your early years. To go back to the mathematical example, that would mean how many vertices you have, how they are linked, and the functions at each vertex.
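In code, the weight-adjusting idea looks roughly like this: a single logistic “vertex” learning the OR function by gradient descent, which is the one-neuron case of what backpropagation does across a whole network. This is a toy sketch of the general technique, not anything from the film:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One "vertex" with two weighted input edges plus a bias, trained by
# gradient descent: each prediction error nudges the weights, which
# changes what the vertex outputs next time.
def train(data, epochs=2000, lr=0.5):
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = sigmoid(w1 * x1 + w2 * x2 + b)
            err = out - target        # gradient of the loss wrt the pre-activation
            w1 -= lr * err * x1
            w2 -= lr * err * x2
            b  -= lr * err
    return w1, w2, b

OR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, b = train(OR)
predict = lambda x1, x2: round(sigmoid(w1 * x1 + w2 * x2 + b))
print(predict(0, 0), predict(0, 1), predict(1, 0), predict(1, 1))  # → 0 1 1 1
```

Backpropagation proper is the bookkeeping that pushes that same `err` signal backwards through many layers of vertices instead of just one.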
Caleb doesn’t seem to understand because he isn’t very good at Computer Science. Beard takes Caleb into a room with a Jackson Pollock painting.
He tries to explain what Automatic Art is to Caleb. It sounds to me like he is referencing Fuzzy Logic, which is when, instead of using simple true and false, or 0 and 1, you make decisions based on everything in between 0 and 1. It’s not a fixed algorithmic if-this-then-that, but something more human.
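A minimal sketch of that idea, with membership thresholds I have made up: degrees of truth anywhere in [0, 1], combined with min as a fuzzy AND instead of a hard yes/no:

```python
# Fuzzy logic sketch: truth values live in [0, 1], and AND is min.
def warm(temp_c):
    """Membership in 'warm': 0 below 10°C, 1 above 25°C, linear in between."""
    return min(1.0, max(0.0, (temp_c - 10) / 15))

def humid(rh):
    """Membership in 'humid': 0 below 40% RH, 1 above 80%."""
    return min(1.0, max(0.0, (rh - 40) / 40))

def muggy(temp_c, rh):
    return min(warm(temp_c), humid(rh))   # fuzzy AND

print(muggy(30, 90))   # → 1.0 -- definitely muggy
print(muggy(18, 60))   # → 0.5 -- partly true, not a hard yes or no
```

The second result is the point: instead of forcing “muggy or not,” you carry the in-between value forward into the next decision.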
Then Beard references Star Trek by telling Caleb to “Engage intellect”. I’m still thinking he is making a brothel, so I’ll go with Mudd’s Women as the episode of the original series he is referencing. Amazon Prime says it’s a reference to the episode Requiem for Methuselah. I also think of The Measure of a Man from Star Trek: TNG where Data’s designation of being sentient or property is put on trial.
Beard now asks him to reverse the challenge of doing something partially deliberate and partially unconscious. He asks Caleb what would happen if Pollock refused to make a single mark unless he knew exactly why he was making it. Caleb says that he never would have made a single mark. In other words, he is describing the difference between humans operating with fuzzy logic and operating with a strict rule system. I have a feeling that somebody told Alex Garland about how AI tried things like theorem-proving software and knowledge systems before Strong AI research collapsed and we switched to research into Weak AI. Weak AI is what we have been enjoying at an ever-growing rate since the 1980s in the form of speech recognition, image identification, and even programs that can write new songs in the style of Bob Dylan. Those things operate on probabilities: attach an incoming example such as speech, an image, or lyrics written by Bob Dylan, and they spit out another probability that is used to make a decision, while also producing a model that will make the kind of decisions you want based on the samples you gave it. Thus, it isn’t a strict rule, but something in between the structure of the model and the current state of the probabilities in that model being used to generate the result. The Bob Dylan example would be feeding his lyrics into a chain that builds a series of probabilities between words, so that if you picked a starting word, it would generate the rest of the words based on the actual lyrics Bob Dylan wrote. You can do this with music as well. In both cases these are called Markov Chains/Hidden Markov Models.
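The lyric-chain idea above fits in a dozen lines. The corpus lines below are generic stand-ins, not actual Dylan lyrics:

```python
import random

# Markov chain sketch: record which word follows which in the corpus,
# then generate text by walking those transitions from a seed word.
def build_chain(lines):
    chain = {}
    for line in lines:
        words = line.split()
        for a, b in zip(words, words[1:]):
            chain.setdefault(a, []).append(b)
    return chain

def generate(chain, seed, length=8, rng=random):
    out = [seed]
    while len(out) < length and out[-1] in chain:
        out.append(rng.choice(chain[out[-1]]))   # pick a recorded successor
    return " ".join(out)

corpus = ["the answer is blowing in the wind",
          "the times they are changing"]
chain = build_chain(corpus)
print(generate(chain, "the"))
```

Words that appear more often as successors get picked more often, which is exactly the “probabilities between words” described above; a Hidden Markov Model adds unobserved states behind the words, but the transition idea is the same.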
This is all meant to make you think that if the model is built correctly, then the more information you give to Google, the more all that data created by human beings will train the model to operate as a human does by becoming predictive of correct human behavior. Unfortunately, that’s not Strong AI. It’s simply Weak AI used on a large scale that impresses us the way a shiny object does a small child.
Then Caleb comes right out and tells Ava he took a semester in AI. Yeah, sure you did Caleb. I took one and a half of them at Cal. Trust me when I say that if Anastasia Steele from Fifty Shades of Grey (2015) is the worst English major in recent film, then Caleb is the worst Computer Science major in recent movies.
Why does he say this? He says it because the movie wants to get arty by showing black and white shots. No really, there is no other reason. Caleb brings up a famous thought experiment (Mary’s Room) about a person living in a black and white room who has perfect knowledge of color without ever having stepped out of that room. That means the person in question would never have experienced color. However, Caleb screws this up by saying:
Hmmm…you mean like the colors in the image I just posted that would show up on a black and white monitor, Caleb? He breaks the thought experiment simply so the film can show shots like that. He says that it gives her experience, which perfect knowledge about something doesn’t give you. That difference between pure knowledge and how something makes you feel is supposed to be the difference between human and machine intelligence, according to Caleb. That’s why, at one point in the movie, she mentions that she wants to go to a busy intersection if she is ever allowed to leave. Of course, the movie realizes it needs to move the plot along instead of getting too smart, so Ava messes with the power again to try and guilt Caleb some more. Boring.
Time for another session with Beard for Caleb. Nothing of consequence really happens here. Then more artsy shots and Beard banging the Asian lady robot. This is followed by the Asian lady and Beard disco dancing to Oliver Cheatham’s Get Down Saturday Night. I’m guessing Alex Garland also played Grand Theft Auto: Vice City, since it’s on that soundtrack for the Fever 105 station hosted by Oliver “Ladykiller” Biscuit. It’s also there so that Lisa could have a scene from this movie to include as a dance scene that she loves.
Unfortunately, Caleb doesn’t want to cool off on Saturday night, or any other night right now. He’s pissed off because the movie can’t really decide whether it wants to be smart about the technical stuff, or whether it wants to focus on that other boring prison break plot.
There are numerous questions batted back and forth here, but the only one that mattered to me is that, according to Ava, she has an off switch like Data.
So, why do they fight her before the Ms. 45 (1981) backstabbing ending when they could have simply turned her off? No matter. The conversations with her in this movie really aren’t the actual sessions. The sessions are with Beard. It’s time for Beard to think that Strong AI is right around the corner, and misunderstand what the singularity means.
I love how Beard makes sure to mention that the bodies are kept around after he builds the next model. He does this not for any in-movie reason, but for the horror factor of a bunch of bodies and so that Ava has a place to get skin later in the movie.
Beard brings up the singularity now and he seems to be confusing it for Terminator 2 (1991). The singularity is not when robots take over and replace us. The singularity is a point at which progress begins to outstrip our ability to fully comprehend the changes it creates till we essentially can do whatever we want. It’s like the Krell in Forbidden Planet (1956). Another example would be The Ancients in Stargate SG-1 who shed their material form and ascended into the freedom of pure energy. Not exactly something you whip out the Oppenheimer quote about being the destroyer of worlds when discussing. It’s an evolution, but not at the expense of the existence of humans. However, at this point the film is on autopilot towards its very unsatisfying conclusion, so who cares.
With Beard knocked out from drinking, Caleb decides to write a prime number generator.
He is supposedly working to free Ava, but he will not use that prime number generator to keep the power system busy like they did in Real Genius (1985). Also, why the comments? It’s almost as funny as in Blackhat (2015) when they didn’t seem to know that comments disappear when you compile code, which means they wouldn’t be present when you decompile something like a virus.
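For reference, a prime number generator like the one Caleb types is only a few lines of Python. This is a simple trial-division version of my own, not a reproduction of the film’s on-screen code:

```python
# Generate all primes up to a limit by trial division: a candidate is
# prime if no previously found prime up to its square root divides it.
def primes(limit):
    found = []
    for n in range(2, limit + 1):
        if all(n % p for p in found if p * p <= n):
            found.append(n)
    return found

print(primes(30))  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Not exactly the kind of thing that requires a genius-level programmer, which is the movie’s point about Caleb, I suppose.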
More boring stuff that has very little to do with anything, but looks semi-impressive and maintains the atmosphere if you haven’t already given up on the film like I had long before this point.
Ava is sitting in a corner to further guilt Caleb along with saying some more stuff and shutting off the power again.
Caleb and Beard talk some more before Ava finally breaks out from her prison. Beard is killed in the process.
She locks Caleb up and gets away. I was honestly hoping that it would turn out Caleb was a robot, but they made sure to shoot that down by having him cut open his arm at one point. Then they juxtapose that with scenes later in the movie of the actual robots pulling off their skin.
I am part of a social network called Letterboxd. It isn’t a place where I write proper reviews. That’s what this site is for, where I can really think about a film and include screenshots, which I believe are crucial. That’s why I use that network for initial gut reactions to what I watch. I try not to bring that over here, but really step through the film. This film isn’t as bad as I thought it was initially. I’m still not a fan, though. The film boils down to somebody trying to socially engineer two people in order to make a prison break, while we get pseudo-intellectual stabs at real tech stuff without bothering to maintain consistency throughout the film. It’s not as awful as I thought during and after watching it the first time, but it hardly deserves the ridiculous amount of positive critical attention it has been getting since its release in 2015.
My ultimate conclusion is this: Watch Sneakers (1992), Real Genius (1985), and WarGames (1983) instead. Also, if this movie sparked an interest in AI for you, then run with it, because it is a fascinating subject that is not done justice by this film. My actual semester at Cal in AI was amazing. I had already read several books on the subject prior to taking the class, and it still was some of the most fun I had while at UC Berkeley.
Side notes: The reason for the race condition at the beginning of Steve Jobs is that a race condition is when two or more things operate on the same resource at the same time, causing unwanted results. Since the film uses Jobs’ brain as a metaphor for end-to-end control, a race condition is a perfect bug that doesn’t jibe with what Jobs wants to achieve.
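The classic demonstration of that definition is two threads doing an unguarded read-modify-write on a shared counter. A sketch (the class and counts are my own toy setup):

```python
import threading

# Race condition sketch: two threads increment a shared counter without a
# lock, so the read and the write of one thread can interleave with the
# other thread's, and some increments get lost.
class Counter:
    def __init__(self):
        self.value = 0

    def increment_unsafely(self, times):
        for _ in range(times):
            v = self.value        # read
            self.value = v + 1    # write -- the other thread may have run in between

counter = Counter()
threads = [threading.Thread(target=counter.increment_unsafely, args=(100_000,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # can be less than 200000 when updates interleave
```

Wrapping the read-modify-write in a `threading.Lock` makes the result deterministically 200000; the bug is precisely that nothing forces the two steps to happen as one.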
It is interesting that the film has exactly seven sessions with Ava, since a byte comprises 8 bits. It makes one wonder whether Garland wants you to think the test was cut short; that there was a zeroth session before the official ones, since computers count from zero; that the 8th session is what happens at the very end; or that the missing session that would make it a byte represents that she is truly human and no longer really a computer. Just something I thought I would pass on.