ANNOUNCER: The following content is provided by MIT OpenCourseWare under a Creative Commons license. Additional information about our license, and MIT OpenCourseWare in general, is available at ocw.mit.edu.

PROFESSOR: Good afternoon.

AUDIENCE: Good afternoon.

PROFESSOR: So there I was in my car this morning as the pouring rain started, thinking if I make a dash for it -- I've got to take the computer. I need the coffee, because otherwise I'm going to fall asleep. I don't need anything in that bag, do I? So I took off without the bag. And it was true that I didn't need most of what was in the bag. But the lecture notes for today's lecture would have been a useful thing to take with me. On the other hand, if there was ever going to be a day where I forgot the lecture notes, this is probably the one to do it. Because I'm going to talk about attention today. And attention research is what I do for a living. If there's anything that I should be able to just stand up and lecture about, this is it. Now of course, that means that you should find this to be the most gripping topic in the entire course, and that you should decide that you want to do this for a living.
Well, when you decide you want to do it for a sort of a living, like a $10 an hour living, you can come and be a subject in attention research in my lab. I would again advocate that you sign up -- I saw there are still these notes around about signing up to be a subject generally in BCS. My lab is separate from the BCS business because I'm technically Brigham and Women's Hospital. But you can sign up with us, too, and we'll pay you $10 an hour to do visual attention research. What could be better than that? Is Kristen here? I don't see Kristen. Kristen, one of the TAs, is also in my lab, and I was going to point her out. Anyway, send me an email, we'll sign you up. Talk to me. We'd love to have you. And you can do Where's Waldo experiments for $10 an hour. You think I'm joking.

Let me try to explain why it is that I'm putting an attention lecture in between a sensation lecture and a perception lecture. It's not terribly typical. More typically, if people talk about attention, they go off and do it later, after doing sensation and perception. But why am I putting it in there? The core reason is that you simply cannot process all of the information that you take in from the world.
You're taking in a vast amount of sensory information. Your perceptual capabilities -- for instance, those that allow you to recognize specific objects -- are limited. You cannot recognize all of the objects in the world that you are looking at, all at the same time. It simply doesn't work. And so, roughly speaking, there's the situation. You've got a lot of stuff coming in from the outside. And you've got a box here that does -- let's say, let's call this one a recognition box. And only one thing at a time gets to go in and come out of that box, basically. So this is like the basic MIT metaphor about drinking from the firehose. Well, if you're really going to drink from the firehose, it's a very useful idea to restrict the flow in some fashion, and let some of that water just [SPLAT] and get you wet, or whatever. And so there's a severe constriction, sometimes called a bottleneck -- I think I've got some slides that call it a bottleneck later -- that takes all of this and only lets some of it through. And that bottleneck is governed -- it's not just random what gets through -- it's governed by mechanisms of selective attention that allow some things to get through and leave other things on the floor.
And so if you think of this as sort of sensation and perception -- which is a little bald, but -- then that's why you put attention in the middle there. Now to motivate this a bit further, let me do a demonstration. Actually, this is the demonstration of why reading the Tech while listening to my lecture may not be a brilliant idea. Well, it may be a brilliant idea. It just depends on your particular goals in life. I need a couple of volunteer type people who wish to read here. All right, there's a volunteer person, and there's a pink volunteer person. Yes, you, MIT person. But you have to come up here. So kick a few people on the way by and stuff like that. You have to come up here and do a dramatic reading.

What I'm going to do is have these people both read to you at the same time. You're going to read from here, from where it says "Catherine." And you're going to read from here. And you're both going to read nice and loudly and steadily. At the same time, yes. That's the interesting part. And what you're going to do is you're going to listen to her -- for, actually, to "her" specifically. Listen for the third instance of the word "her." When you hear "her" for the third time, raise your hand.
OK? Yeah, we got this? Yeah, all right. This her, not that her.

NINA: I could tell you my name.

PROFESSOR: That would help. You are?

NINA: Nina.

PROFESSOR: That's Nina. This is?

ZAINA: Zaina.

[LAUGHTER]

PROFESSOR: Right. OK. Zaina or Zena?

ZAINA: Zaina.

PROFESSOR: OK. At least it's not just one letter.

NINA: My surname's [? Navarre, ?] if that helps.

PROFESSOR: No, this is not going to help at all. Her. When Nina says "her" for the third time, raise your hand. OK? You got it? You got it? On your mark, get set, read.

[OVERLAPPING VOICES]

PROFESSOR: OK, thank you. All right, that was excellent. All right, so what was she talking about? Yeah, something. No, no, somebody raise their hand. Hand, hand. What was --? A letter, thank you. That sounds good. She'd gotten a letter. Was it a nice letter? Who knows? It didn't sound too good.
Her countenance wasn't doing good things. What was she talking about? Zaina? What? Oh, uh. She was talking about uh. OK. Was she talking?

AUDIENCE: Yes.

PROFESSOR: So what's your problem? What you were doing was -- so how many people -- well, obviously the hands suggested that everybody could manage to do the task. What could you pick up from -- Zena?

ZAINA: Zaina.

PROFESSOR: Zaina. I'm not going to -- otherwise it's going to turn into warrior queen and stuff like that. What could you pick up about Zaina's speech? Anything? No content. How many people knew she was talking? All right, so you can pick up something. What else did you know about her?

AUDIENCE: The tone she was speaking in.

PROFESSOR: The tone she was speaking in. If it'd been a male voice, if she had switched to a male voice, you would've noticed. Anything else, you think?

AUDIENCE: Was she reading from Heart of Darkness?

PROFESSOR: Was she reading from Heart of Darkness? No, actually what she was reading from was Lucretius, On the Nature of the Universe. A wonderful book.
De Rerum Natura in Latin. He's a Roman author. This is sort of the first intro psych book. It's also the first intro physics book, intro everything. In those days, you could write a book called On the Nature of the Universe, in verse. This is a prose translation. She was actually reading Lucretius' theory of vision. And even she may not have noticed that, because it's all about thin films and cool stuff like that.

AUDIENCE: [INAUDIBLE] video.

PROFESSOR: A video. Yeah, well, it's an ancient Roman video. But only a very limited amount of stuff got in. So there was a certain amount of stuff that was getting in. But at some point your auditory system gave up on processing that stream. And in terms of extracting meaning, understanding the words, it went with Nina, because that was the job. You can't do both of them. We'd better let them go sit down. Thank you for being --

What would have made the task easier? What would make it easier to pay attention to one and not the other, do you think?

AUDIENCE: Amplifying one of them.

PROFESSOR: Amplifying one of them. Yes, if the warrior queen would have just been quiet, it would've been no problem at all.
AUDIENCE: If they read the same thing.

PROFESSOR: If they read the same thing. No, that's -- that's probably true, but not a deeply interesting true. Well, for instance, if she was male, it would be easier to segregate the two voices. If we moved them apart further, it would be easier to segregate the two voices.

AUDIENCE: If one was singing.

PROFESSOR: If one of them was singing, it would've actually probably been easier to segregate them. So if you change the sort of low-level sensory information, it would be easier for you to decide which one to pay attention to. This is something that happens. Oh, so if you're sitting there reading the newspaper while you're trying to listen to this lecture, odds are you are missing one of the two messages. It's sort of dealer's choice there. But it's also not desperately polite, in case anybody was wondering. If you want to read the paper, you might as well go somewhere else. But this happens in the real world all the time. There's a version of it known as the cocktail party effect. You go to a party and you're talking to someone, and you hear, typically, what? Like, your name, over there. So you do this selective attention thing.
You listen to that conversation. You seem to be paying attention to this guy who's talking to you, but you're actually listening over there. The problem is eventually, this guy stops talking. And you realize, oh yeah, I'm supposed to say something now, right? I wonder what we're talking about. It can lead to a certain amount of embarrassment. Now this happens ubiquitously in sensory systems and across sensory systems. So for example, right now, until I mention it, you are not particularly aware of the pressure of your posterior on the seat. If I direct your attention to that, you say, oh, yeah, there it is. It was presumably there all along; I wasn't floating a moment ago. But until I direct your attention to it, it doesn't rise to the level of current conscious awareness. And it shows up in vision, because the visual world is far too rich for you to process everywhere at once. And that's what makes these sort of Where's Waldo problems interesting and fun. If there was not a bottleneck like this, the Waldo man would not have gotten rich. Right? Yeah, where's Waldo? There he is. Big deal. Have you found him?

AUDIENCE: No.
PROFESSOR: Oh look, I have a little laser today. Isn't that nice? So does it work? That's Waldo up there. So now you say, oh, that's really stupid, because I can't even see him now -- oh, and we decided to exploit the technology by having it on three screens. There's no added information there; it's just that it was too cute not to do it. But if I say, where is the elephant spraying a car? You can find it. You might have noticed it before if you had been scrutinizing it. It was certainly visible all along, right? It wasn't that there was a black hole here before. It's just that only when you had the desire to go in search of it did you manage to direct your attention to it in a way that allowed you to recognize these couple of objects. And it's that ability to constrict your processing that's really the focus, at least of the first part of today's lecture.

Let me show you the equivalent of the talking example, but now switched to reading. What you want to do here is to look at the little asterisks. And I'll put up two streams of text, columns, one on the left, one on the right. Nice and big so that you can read them. But what you should notice is -- keep your eyes moving down from asterisk to asterisk.
What you should notice is you can read one or the other; you just can't read both at the same time, even though they're nice and big. Right? It just doesn't work. It's not a visual restriction. It's a central -- it's a capacity limitation later on in the system. So this is by way of an answer to question one on the handout: what's the problem that attention is solving? Attention is solving this problem of having too much going on. Oh, and attention is a grab-bag term. I'm going to be talking about visual selective attention. Attention isn't one thing, like my laser pointer here. There are attentional mechanisms, selective mechanisms, all over the place in the nervous system. So when you are attending to the pressure of your posterior on the seat, you are selecting, probably using a different set of neural circuitry than when you're selecting one of these words. It's the same basic idea, but it's not like there's a single attention box in your brain somewhere. OK. Some things, as we saw in that auditory demo and the reading demo, some things escape the bottleneck. Some things can be appreciated everywhere, all at the same time. Well, question two is, what is that set of things?
And the answer is not babies. The answer is that there is a limited set of basic features that can be processed across the entire visual field at one time. Or, you could do it in auditory space. There'd be a set of basic features in auditory space, too, that could be processed at the same time. But I'm going to stick with vision. So all these babies look alike. It doesn't take much to figure out that now there is -- da da da, where'd Mara go? Oh, there's Mara. If the baby turns green, you do something about it. Right? It's a highly salient stimulus. Or if the baby's head gets squashed, you know. So there's a collection of simple, basic features, like color, size, orientation, that are not bottleneck-limited in the same kind of way. If there's a single red thing in the field, you can find it anywhere without having to go hunting around. Other things that you might think would be pretty obvious are not anywhere near so obvious. So as you look around here, you may notice that most of these baby heads are upside down and two of them are right way up. But it's not like the green baby head. You have to go hunting for upright versus upside down.
Even though that's a very salient thing in the real world -- whether or not you're upright, or whether your baby's upright or upside down. So there are about, by last count -- last count was done by me, as it turns out -- 12 to 18 of these things that seem to escape the bottleneck. And that's probably about it. And they are a bunch of simple things -- well, seemingly simple things -- like color, orientation, and size. Things that you could imagine, for instance, the earliest stages of visual cortical processing doing. And then there are some other, more elaborate things that also escape this bottleneck. And they're things like -- well, if you believe my friend Chen from China, this would be an example of the importance of topology. He thinks that the distinction here is that this has a hole and this doesn't have a hole. The other possibility is that this has line terminations and this doesn't. These are the sort of things you can fight about in this field. But anyway, it's easy to find that among that. Curvy things among straight things are easy. Orientation in the third dimension works. So that cube is pointing up in this direction; these cubes are pointing down over here. That turns out to be easy.
Other examples would include motion. Though actually, motion makes an interesting point. It's easy to detect the presence of something, but not so easy to detect its absence. So imagine the following. I didn't make a demo of this; I could have. Imagine you're looking at the ground and there's one little ant moving around. He's pretty easy to find, right? Because motion is one of these features that you don't have to go hunting for; it's just sort of there. On the other hand, imagine you're looking at an ant's nest, and there's one dead ant. How easy is it to find one ant who's not moving? Not easy. So the absence of a feature can be hard to detect. The presence of a feature, one of these 12 to 18 basic features, can be easy to detect.

Now, how do you actually go about establishing that something is easy to find or hard to find? I've been doing this in very qualitative terms. But now let me explain how you actually go about studying this. What we would pay you $10 an hour for if you show up in the lab. What we would do is show you a computer screen full of stuff, and ask you a question. A simple-minded question like: on the next one, is there a tilted line?
And what you would be doing is sitting there with a couple of computer keys. Bang one key if the answer is no, bang another key if the answer is yes. Do it as fast and accurately as you can, and we're going to measure your reaction time: the amount of time from the onset of the stimulus to the onset of your response. How fast can you do it? Well, I don't have keys for everybody here, so let's just do it verbally. Say yes or no as fast as you can in response to these guys. Tell me, is there a tilted line present? Ready?

AUDIENCE: Yes.

PROFESSOR: Ready?

AUDIENCE: No.

PROFESSOR: Ready?

AUDIENCE: Yes.

PROFESSOR: Ready?

AUDIENCE: No.

PROFESSOR: OK, that's pretty straightforward. What's the next thing? OK. What you should have heard is that your answers were given crisply, in unison, and it didn't make any real difference whether there were lots of vertical lines on the screen or a few vertical lines on the screen.
So if we were to collect real data and plot the reaction time in milliseconds -- thousandths of a second -- as a function of the set size -- the number of items on the screen -- what you would get for any of these 12 to 18 features, if you did the experiment right, is an essentially flat line here. This would be the line for saying yes; it always, or at least typically, turns out to take a little longer to say no, but neither is dependent on the number of items on the screen. So: is there an L, is there a green thing, is there an X among these pluses? All those things would produce similar-looking results, where the slope of this reaction time by set size function would be essentially 0.

Not all tasks behave that way. So let's do a different one. In this case you're looking for the letter T. It can be rotated by 90 degrees left or right, or -- maybe it can also be upside down; I don't remember what I put in. But it may not be an upright T. But it'll be a T. The distractor items are all L's. And I just want you to say as fast as you can, is there a T present? Ready?

AUDIENCE: No.

PROFESSOR: Ready?

AUDIENCE: Yes. Yes.
[INTERPOSING VOICES]

PROFESSOR: OK, ready?

AUDIENCE: Yes.

PROFESSOR: Ready?

AUDIENCE: Yes.

PROFESSOR: You also heard the speed-accuracy tradeoff there. This is a known phenomenon in reaction time studies, which is: one can respond very quickly if you don't sweat the accuracy things. And people do that routinely. When people do that a lot in our studies, we call them bad subjects. And we don't invite them back. But what you should have heard there, and should have felt yourself, is that the responses were faster when there were fewer items present. And that the responses of the group, particularly for these larger set sizes, were spread out. Why were they spread out? Well, some people got lucky. This thing came up and their attention happened to be around here. Oh look, there's a T. Some people were unlucky -- oh dee do dee dee, oh yeah, there's a T. And some people were trying to psych out the professor and said, there was a yes, there was a no, there was another yes. I know about this: there's going to be a no. And they said no without doing anything so boring as to actually look at the display.
519 00:24:37 --> 00:24:43 So what you get for data in an experiment like this would 520 00:24:43 --> 00:24:44 look much more like this. 521 00:24:44 --> 00:24:48 As you increase the set size, now the reaction 522 00:24:48 --> 00:24:54 time increases in a fairly linear kind of a way. 523 00:24:54 --> 00:24:59 The slope on these is quite fast. 524 00:24:59 --> 00:25:02 I mean, this is 20 to 30 milliseconds, thousandths of a 525 00:25:02 --> 00:25:05 second, for each additional item to say yes, and about 526 00:25:05 --> 00:25:08 twice that amount to say no. 527 00:25:08 --> 00:25:12 Depending on how one exactly models this, this suggests that 528 00:25:12 --> 00:25:16 you're running through 20 to 40 of these letters a second. 529 00:25:16 --> 00:25:19 So you're going through it quickly, but you're 530 00:25:19 --> 00:25:21 having to search now. 531 00:25:21 --> 00:25:23 It's not simply obvious that there's a T there; you've 532 00:25:23 --> 00:25:25 got to go and hunt for it. 533 00:25:25 --> 00:25:28 Over here you can look for the 5 -- that's another typical sort of 534 00:25:28 --> 00:25:39 task that would produce results like that. 535 00:25:39 --> 00:25:41 I wanted to say one other thing about that, but now I don't 536 00:25:41 --> 00:25:41 remember what it was. 537 00:25:41 --> 00:25:42 Oh yes. 538 00:25:42 --> 00:25:45 What I wanted to say was that the speed of this tells you 539 00:25:45 --> 00:25:51 that you're not looking at the rate of fixation 540 00:25:51 --> 00:25:53 on each letter. 541 00:25:53 --> 00:25:55 If you're doing this in the lab, you make sure that your 542 00:25:55 --> 00:25:58 stimuli are big enough that you don't have to move your 543 00:25:58 --> 00:26:00 eyes to look at each one. 544 00:26:00 --> 00:26:04 If you have to move your eyes, your eyes only move at a 545 00:26:04 --> 00:26:07 rate of about 4 per second. 
546 00:26:07 --> 00:26:11 And so if you have to fixate each one of the items before 547 00:26:11 --> 00:26:13 you can tell if it's a T or an L -- so if you used little 548 00:26:13 --> 00:26:17 teeny letters -- this slope would be more like 250 549 00:26:17 --> 00:26:24 milliseconds per item, not 40 or 50 or something like that. 550 00:26:24 --> 00:26:26 Attention can move much more quickly than the eyes. 551 00:26:26 --> 00:26:32 One of the things that tells you is that you can attend 552 00:26:32 --> 00:26:33 where you're not looking. 553 00:26:33 --> 00:26:35 Something that basketball players know very well. 554 00:26:35 --> 00:26:38 When you hear that a basketball player has great peripheral 555 00:26:38 --> 00:26:43 vision, what that really means is that he can be looking here 556 00:26:43 --> 00:26:46 and he can be paying attention to his teammate over there, and 557 00:26:46 --> 00:26:48 throw the ball and fake out the opposition. 558 00:26:48 --> 00:26:50 Because the usual assumption is that you're attending 559 00:26:50 --> 00:26:52 where you're looking. 560 00:26:52 --> 00:26:54 Most of the time that's true. 561 00:26:54 --> 00:26:58 But OK, so now I'm looking at this guy wearing red up there. 562 00:26:58 --> 00:27:02 And he thinks that I'm actually paying attention to him. 563 00:27:02 --> 00:27:04 But I'm not, actually. 564 00:27:04 --> 00:27:06 Because of acuity limitations, I have no idea what I'm paying 565 00:27:06 --> 00:27:09 attention to here, but I think it's a woman person, and 566 00:27:09 --> 00:27:11 I think she just moved. 567 00:27:11 --> 00:27:13 Oh yeah, look, it is a woman person. 568 00:27:13 --> 00:27:15 569 00:27:15 --> 00:27:18 I can move my attention away from the point of fixation. 570 00:27:18 --> 00:27:21 And I can move my attention much more rapidly than 571 00:27:21 --> 00:27:25 I can move my eyes. 
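The reaction-time logic just described can be sketched as a toy model. The 500 ms base time is an invented placeholder, not a figure from the lecture; the 25 ms/item slope and the roughly 2:1 absent-to-present slope ratio are the lecture's ballpark numbers.

```python
# Toy model of reaction time (RT) in a serial, self-terminating visual
# search: RT grows linearly with set size. The "yes" (target-present)
# slope is the per-item cost; the "no" (target-absent) slope is about
# twice that, because on "yes" trials you stop, on average, halfway
# through, but to say "no" you must check every item.

def predicted_rt(set_size, yes_slope_ms, target_present, base_ms=500):
    """Predicted RT in milliseconds for one search display."""
    if target_present:
        return base_ms + yes_slope_ms * set_size
    return base_ms + 2 * yes_slope_ms * set_size

# Feature search ("is there a red thing?"): slope ~0, so RT is flat.
# T-among-L search: slope ~25 ms/item, i.e. tens of items per second.
for n in (6, 12, 18):
    print(n, predicted_rt(n, 25, True), predicted_rt(n, 25, False))
```

With small, well-spaced letters the same model applies, just with a much steeper slope (the ~250 ms/item of eye movements rather than the ~25 ms/item of attention).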
572 00:27:25 --> 00:27:32 Now, the find the red thing among green things is a case 573 00:27:32 --> 00:27:36 where the property of the target is one of these basic 574 00:27:36 --> 00:27:40 features and immediately gets your attention. 575 00:27:40 --> 00:27:45 The find the 2 among 5's, or the T among L's is a case where 576 00:27:45 --> 00:27:49 everything in the relevant display is essentially the same 577 00:27:49 --> 00:27:51 as far as the early visual system is concerned. 578 00:27:51 --> 00:27:54 T's among L's, it's a vertical and a horizontal line among 579 00:27:54 --> 00:27:56 other vertical and horizontal lines. 580 00:27:56 --> 00:27:59 There's nothing in this early processing that tells those 581 00:27:59 --> 00:28:01 apart, it turns out. 582 00:28:01 --> 00:28:05 Most real world searches are not like that. 583 00:28:05 --> 00:28:10 In most real world searches, oh let's see, what do I 584 00:28:10 --> 00:28:11 feel like looking for? 585 00:28:11 --> 00:28:14 I'll look for glasses. 586 00:28:14 --> 00:28:20 If I'm looking for eyeglasses -- there are some right there, 587 00:28:20 --> 00:28:21 and there's some more. 588 00:28:21 --> 00:28:24 589 00:28:24 --> 00:28:28 There's no process early in my visual system, you know, some 590 00:28:28 --> 00:28:32 huge chunk of cortex devoted to eyeglass detection. It 591 00:28:32 --> 00:28:33 just doesn't happen. 592 00:28:33 --> 00:28:37 At the same time, I don't search around randomly. 593 00:28:37 --> 00:28:39 No glasses there, no glasses there, no glasses there, 594 00:28:39 --> 00:28:39 no glasses there. 595 00:28:39 --> 00:28:42 I'm searching in an intelligent fashion. 596 00:28:42 --> 00:28:43 Here's how you do that. 597 00:28:43 --> 00:28:46 Let's do one more basic search. 598 00:28:46 --> 00:28:50 What you're looking for here is a red horizontal line. 599 00:28:50 --> 00:28:53 Tell me as fast as you can whether it's present. 600 00:28:53 --> 00:28:54 AUDIENCE: Yes. 
601 00:28:54 --> 00:28:57 PROFESSOR: Now how you do that is not by having a chunk of 602 00:28:57 --> 00:29:00 your brain devoted specifically to red horizontals. 603 00:29:00 --> 00:29:02 Oh, remind me later; I've got to check whether you still 604 00:29:02 --> 00:29:02 have a McCollough 605 00:29:02 --> 00:29:05 effect, speaking of red horizontals. 606 00:29:05 --> 00:29:06 We'll check that out later. 607 00:29:06 --> 00:29:14 The way you do that is, you use those 12 to 18 basic features 608 00:29:14 --> 00:29:17 to guide your attention around in an intelligent fashion. 609 00:29:17 --> 00:29:20 So if you're looking for red horizontals, you've got 610 00:29:20 --> 00:29:22 something that can do red. 611 00:29:22 --> 00:29:24 You know, give me all the red things. 612 00:29:24 --> 00:29:27 You've got something that can do vertical. 613 00:29:27 --> 00:29:29 Was I looking for red horizontals or red verticals? 614 00:29:29 --> 00:29:31 Well, anyway. 615 00:29:31 --> 00:29:32 This is a red vertical. 616 00:29:32 --> 00:29:34 You've got something that can do vertical. 617 00:29:34 --> 00:29:36 So I've got the red things, I've got the vertical things. 618 00:29:36 --> 00:29:39 I can do that early on in the system. 619 00:29:39 --> 00:29:42 All I need is something that will do something like an 620 00:29:42 --> 00:29:44 intersection operation. 621 00:29:44 --> 00:29:48 And if I were to guide my attention to the intersection 622 00:29:48 --> 00:29:51 of the set of all red things and the set of all vertical 623 00:29:51 --> 00:29:53 things, that'd be a really good place to look for 624 00:29:53 --> 00:29:54 red vertical things. 625 00:29:54 --> 00:29:57 Oh look, there it is. 
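The "give me all the red things, give me all the vertical things, intersect them" idea can be sketched directly as set operations. The location labels and feature vocabulary here are invented for illustration.

```python
# Guided search sketched as set intersection: preattentive feature maps
# ("all the red things", "all the vertical things") are computed in
# parallel, and attention is guided to locations in their intersection.

display = {
    "loc1": {"color": "red", "orientation": "vertical"},
    "loc2": {"color": "red", "orientation": "horizontal"},
    "loc3": {"color": "green", "orientation": "vertical"},
    "loc4": {"color": "green", "orientation": "horizontal"},
}

def feature_map(display, feature, value):
    """Locations whose item shows the given basic feature."""
    return {loc for loc, item in display.items() if item[feature] == value}

red = feature_map(display, "color", "red")
vertical = feature_map(display, "orientation", "vertical")
candidates = red & vertical  # good places to look for a red vertical
print(candidates)  # -> {'loc1'}
```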
626 00:29:57 --> 00:30:02 So what you've got is a front end that collects information 627 00:30:02 --> 00:30:06 that can be used to control this bottleneck to guide your 628 00:30:06 --> 00:30:08 attention around, to feed sensible things to the 629 00:30:08 --> 00:30:15 back end of the system. 630 00:30:15 --> 00:30:17 I think that's sort of pictured there. 631 00:30:17 --> 00:30:20 And the result is that a search for something like a red 632 00:30:20 --> 00:30:23 vertical line, it's not as easy as finding a red thing among 633 00:30:23 --> 00:30:26 green things, but it's pretty easy. 634 00:30:26 --> 00:30:30 It's easier than finding a 2 among 5's or a T among L's, 635 00:30:30 --> 00:30:31 or anything like that. 636 00:30:31 --> 00:30:35 Now this sort of guidance comes in two different forms. 637 00:30:35 --> 00:30:38 Or you can think of it as coming in two different forms. 638 00:30:38 --> 00:30:44 There's a bottom-up form that's sort of stimulus driven. 639 00:30:44 --> 00:30:48 And then there's a top-down form that's user driven 640 00:30:48 --> 00:30:50 by your desires. 641 00:30:50 --> 00:30:54 Let me illustrate that with a couple more searches for a T. 642 00:30:54 --> 00:30:57 Tell me as fast as you can whether or not there's a 643 00:30:57 --> 00:30:59 T in the next display. 644 00:30:59 --> 00:31:02 Ready? 645 00:31:02 --> 00:31:03 AUDIENCE: Yes. 646 00:31:03 --> 00:31:04 PROFESSOR: That was pretty crisp. 647 00:31:04 --> 00:31:06 How did you do it? 648 00:31:06 --> 00:31:08 Muhmuh. 649 00:31:08 --> 00:31:10 That's what I thought. 650 00:31:10 --> 00:31:15 Most people probably found their attention sort of 651 00:31:15 --> 00:31:19 automatically grabbed by this one oddball, which conveniently 652 00:31:19 --> 00:31:21 enough turned out to be the T. 653 00:31:21 --> 00:31:25 And so rather than having to search around, your attention 654 00:31:25 --> 00:31:31 was grabbed bottom-up to this item. 
655 00:31:31 --> 00:31:36 Top-down is based on what you know, or what you've been told, 656 00:31:36 --> 00:31:41 or instructions that you've somehow given to yourself. 657 00:31:41 --> 00:31:43 So I'm going to tell you, if there's a T in the 658 00:31:43 --> 00:31:46 next display, it's red. 659 00:31:46 --> 00:31:48 What happened out there? 660 00:31:48 --> 00:31:50 Oh, that was another -- that was also grabbing attention. 661 00:31:50 --> 00:31:53 It works in the auditory domain, too. 662 00:31:53 --> 00:31:55 If we set off an explosion, unsurprisingly, 663 00:31:55 --> 00:31:57 you would notice. 664 00:31:57 --> 00:31:58 All right, you ready? 665 00:31:58 --> 00:32:01 Is there a T in this next display? 666 00:32:01 --> 00:32:02 AUDIENCE: Yes. 667 00:32:02 --> 00:32:05 PROFESSOR: Whoever said no was another speed-accuracy 668 00:32:05 --> 00:32:10 tradeoff, trying to smoke out the professor who had a yes on the 669 00:32:10 --> 00:32:12 last one and therefore must have a no on this one. 670 00:32:12 --> 00:32:13 Look at the display! 671 00:32:13 --> 00:32:17 Anyway, that's not as easy as the previous one. 672 00:32:17 --> 00:32:23 But if you searched around, you probably noticed, or you may 673 00:32:23 --> 00:32:26 have noticed, that you were searching through 674 00:32:26 --> 00:32:27 the red items. 675 00:32:27 --> 00:32:29 You're not going to bother searching through the black 676 00:32:29 --> 00:32:31 items if you know the T is going to be red. 677 00:32:31 --> 00:32:31 Right? 678 00:32:31 --> 00:32:40 So let us suppose we did an experiment where the T could 679 00:32:40 --> 00:32:43 be either black or red. 680 00:32:43 --> 00:32:46 And I show you a bunch of displays like this, and I vary 681 00:32:46 --> 00:32:51 the set size, the number of items on the screen. 682 00:32:51 --> 00:32:53 Measure your reaction time. 683 00:32:53 --> 00:32:57 Let's suppose that the slope of that function was 30 684 00:32:57 --> 00:33:01 milliseconds an item. 
685 00:33:01 --> 00:33:04 If that's the case, and half the items are red in this 686 00:33:04 --> 00:33:08 display -- or on average half the items are red -- what's the 687 00:33:08 --> 00:33:12 slope going to look like if I tell you that the T is 688 00:33:12 --> 00:33:15 always red if it's present? 689 00:33:15 --> 00:33:17 AUDIENCE: [INAUDIBLE] 690 00:33:17 --> 00:33:17 PROFESSOR: Less steep. 691 00:33:17 --> 00:33:18 Yeah. 692 00:33:18 --> 00:33:20 Specifically how less steep? 693 00:33:20 --> 00:33:21 AUDIENCE: [INAUDIBLE] 694 00:33:21 --> 00:33:22 PROFESSOR: Very less steep. 695 00:33:22 --> 00:33:23 That's not specific. 696 00:33:23 --> 00:33:24 I want a number. 697 00:33:24 --> 00:33:25 AUDIENCE: 15. 698 00:33:25 --> 00:33:26 PROFESSOR: 15. 699 00:33:26 --> 00:33:28 Good number. 700 00:33:28 --> 00:33:29 Right? 701 00:33:29 --> 00:33:35 If you can eliminate half the items, the effective rate of 702 00:33:35 --> 00:33:37 search is going to be twice as great. 703 00:33:37 --> 00:33:38 So the slope will drop in half. 704 00:33:38 --> 00:33:41 And that's exactly what you get in experiments like this. 705 00:33:41 --> 00:33:42 They work very nicely. 706 00:33:42 --> 00:33:46 If you have only half the items on the screen relevant, 707 00:33:46 --> 00:33:50 subjects behave as though they are only looking through 708 00:33:50 --> 00:33:52 half of the items. 709 00:33:52 --> 00:33:57 So by now I have answered question two: what escapes 710 00:33:57 --> 00:33:58 the bottleneck of attention? 711 00:33:58 --> 00:34:04 Well, there are these 12 to 18 basic properties or features of 712 00:34:04 --> 00:34:11 the world that seem to escape the bottleneck. 713 00:34:11 --> 00:34:15 We can study this by measuring reaction time. 714 00:34:15 --> 00:34:17 There are other methods, too, of course, but I was telling 715 00:34:17 --> 00:34:21 you about the reaction time methods. 
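The slope-halving arithmetic from the exchange above can be written down in one line; the numbers are the ones the professor used.

```python
# Subset-search prediction: if only a fraction of the items can be the
# target (e.g. "the T is always red" and, on average, half the items
# are red), subjects search as if only that subset were present, so the
# observed RT-by-set-size slope scales by the relevant fraction.

def effective_slope(base_slope_ms, relevant_fraction):
    """Observed slope when search is restricted to the relevant items."""
    return base_slope_ms * relevant_fraction

print(effective_slope(30, 0.5))  # a 30 ms/item slope drops to 15.0
```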
716 00:34:21 --> 00:34:24 Oh, I see I put Anne Treisman and feature integration 717 00:34:24 --> 00:34:27 theory on there. 718 00:34:27 --> 00:34:29 Don't worry about the feature integration part. 719 00:34:29 --> 00:34:35 That's simply to allow me to give honor to Anne Treisman who 720 00:34:35 --> 00:34:40 really founded the modern study of visual attention, after 721 00:34:40 --> 00:34:45 having pioneered an awful lot of the auditory things. 722 00:34:45 --> 00:34:48 The auditory demo at the beginning was a classroom 723 00:34:48 --> 00:34:50 version of what's called dichotic listening. 724 00:34:50 --> 00:34:55 Typically what you do is put on a pair of headphones, and you 725 00:34:55 --> 00:34:58 would have one stream of speech in one ear and one stream of 726 00:34:58 --> 00:34:59 speech in the other ear. 727 00:34:59 --> 00:35:02 And you ask questions about, if you're attending through 728 00:35:02 --> 00:35:06 this ear, what can you still pick up through this ear? 729 00:35:06 --> 00:35:09 Anne was doing those things in the late '50s, early '60s. 730 00:35:09 --> 00:35:13 She went on in the '70s and '80s to really invent this field of the 731 00:35:13 --> 00:35:21 study of visual search, and is still doing great stuff, 732 00:35:21 --> 00:35:21 now at Princeton. 733 00:35:21 --> 00:35:23 She was not at Princeton when I was an undergraduate 734 00:35:23 --> 00:35:27 there, but she's there now. 735 00:35:27 --> 00:35:29 All right, so I answered question three. 736 00:35:29 --> 00:35:32 And question four I answered by saying -- 737 00:35:32 --> 00:35:34 oh, conjunction search. 738 00:35:34 --> 00:35:38 That search for a red vertical thing is a conjunction 739 00:35:38 --> 00:35:40 of two basic features. 740 00:35:40 --> 00:35:43 It's not adequate to know that it's red; it's not adequate 741 00:35:43 --> 00:35:44 to know that it's vertical. 
742 00:35:44 --> 00:35:47 The conjunction of those two sources of information 743 00:35:47 --> 00:35:53 is adequate, is what defines the target. 744 00:35:53 --> 00:35:57 And you can use this basic feature information, the basic 745 00:35:57 --> 00:36:00 attributes of the stimulus, to guide your attention around 746 00:36:00 --> 00:36:02 in an intelligent fashion. 747 00:36:02 --> 00:36:04 So that guidance comes in two forms. 748 00:36:04 --> 00:36:10 It can be bottom-up stimulus driven or top-down user driven. 749 00:36:10 --> 00:36:15 All right, so what is that attention actually doing? 750 00:36:15 --> 00:36:24 Why is it that you need to have this -- what is attention 751 00:36:24 --> 00:36:28 making possible here that wasn't possible before? 752 00:36:28 --> 00:36:30 Oh look, it says that right there. 753 00:36:30 --> 00:36:31 Or what were those features doing before 754 00:36:31 --> 00:36:33 attention shows up? 755 00:36:33 --> 00:36:38 Well, here is an answer to that. 756 00:36:38 --> 00:36:42 The answer is that you've got all those features. 757 00:36:42 --> 00:36:46 And in fact, early processes in the visual system seem to cut 758 00:36:46 --> 00:36:49 the scene up into what you might consider to 759 00:36:49 --> 00:36:52 be proto-objects. 760 00:36:52 --> 00:36:55 But those features are just sort of bundled together 761 00:36:55 --> 00:36:59 with an object. 762 00:36:59 --> 00:37:03 So before your attention arrives, something like this 763 00:37:03 --> 00:37:08 would be red and green and vertical and horizontal, and 764 00:37:08 --> 00:37:11 it's got points on it, or something. 765 00:37:11 --> 00:37:17 What attention does is to bind those features together in a 766 00:37:17 --> 00:37:21 way that makes it possible for you to know that the greenness 767 00:37:21 --> 00:37:23 goes with the verticalness here, and the redness goes 768 00:37:23 --> 00:37:25 with the horizontalness. 
769 00:37:25 --> 00:37:27 And those points are arranged, the whole thing's 770 00:37:27 --> 00:37:28 arranged into a plus. 771 00:37:28 --> 00:37:35 The argument is that, OK, I need attention in order to 772 00:37:35 --> 00:37:37 recognize any given individual. 773 00:37:37 --> 00:37:40 Before attention arrives on that individual, that 774 00:37:40 --> 00:37:44 person isn't, you know, a black hole in space. 775 00:37:44 --> 00:37:47 That person is a loose bundle of features. 776 00:37:47 --> 00:37:50 That attention allows me to bind those features together 777 00:37:50 --> 00:37:54 in a way that allows me to understand how they interact, 778 00:37:54 --> 00:37:57 and what that recognizable feature might be. 779 00:37:57 --> 00:37:58 So oh, there's Kristen. 780 00:37:58 --> 00:38:01 Hey, Kristen, stand up and wave. 781 00:38:01 --> 00:38:03 No really, I was plugging you before. 782 00:38:03 --> 00:38:07 So if you want to do this for $10 an hour, go find Kristen. 783 00:38:07 --> 00:38:11 So all right, now we can make fun of Kristen. 784 00:38:11 --> 00:38:14 So before Kristen arrived -- no, before Kristen arrived, 785 00:38:14 --> 00:38:16 she was not visible. 786 00:38:16 --> 00:38:19 Before I attended to Kristen, there was presumably a 787 00:38:19 --> 00:38:25 proto-Kristen object out there that was a bundle 788 00:38:25 --> 00:38:27 of Kristen bits. 789 00:38:27 --> 00:38:31 Only when I got my attention to her -- even though she'd been 790 00:38:31 --> 00:38:33 visible all along, and I've looked over there a bunch of 791 00:38:33 --> 00:38:35 times -- even though she'd been visible all along, only when I 792 00:38:35 --> 00:38:39 got my attention to her could I bind those features together 793 00:38:39 --> 00:38:44 and make her into a recognizable Kristen. 794 00:38:44 --> 00:38:47 Let me see if I can illustrate that to you 795 00:38:47 --> 00:38:49 with another demo here. 
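The binding claim can also be put in code as a toy model: preattentively, an object is just an unordered bag of its features, and only binding recovers which feature goes with which. The feature vocabulary below is invented for illustration.

```python
# Feature binding as a toy model: without attention, a "proto-object"
# is just the bag of its features, with the color-orientation pairings
# lost. Two pluses with opposite pairings are then indistinguishable.

def unbound_features(parts):
    """Preattentive description: the bag of features, pairings lost."""
    bag = set()
    for color, orientation in parts:
        bag.update((color, orientation))
    return bag

plus_a = [("red", "vertical"), ("green", "horizontal")]
plus_b = [("green", "vertical"), ("red", "horizontal")]

print(unbound_features(plus_a) == unbound_features(plus_b))  # -> True
print(set(plus_a) == set(plus_b))  # bound descriptions differ -> False
```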
796 00:38:49 --> 00:38:52 797 00:38:52 --> 00:38:56 And the way that's going to work is -- OK, so what you want 798 00:38:56 --> 00:39:04 to do in the next slide is to look for red verticals again. 799 00:39:04 --> 00:39:05 You ready? 800 00:39:05 --> 00:39:10 So tell me if you find a red vertical. 801 00:39:10 --> 00:39:11 AUDIENCE: Yes. 802 00:39:11 --> 00:39:11 PROFESSOR: Yeah. 803 00:39:11 --> 00:39:15 In fact, you might have noticed there are two of them. 804 00:39:15 --> 00:39:16 Very easy. 805 00:39:16 --> 00:39:17 What's the point? 806 00:39:17 --> 00:39:19 Well, this is a standard guided search kind of thing. 807 00:39:19 --> 00:39:21 Give me all the red things; give me all the vertical 808 00:39:21 --> 00:39:24 things; look at the intersection of those two sets, 809 00:39:24 --> 00:39:27 and oh lookie, there's two red verticals up there. 810 00:39:27 --> 00:39:33 Now what I'm going to do is to simply take the horizontal here 811 00:39:33 --> 00:39:36 and jump it up to the middle of the vertical bit. That's why 812 00:39:36 --> 00:39:37 this is in this sort of odd arrangement. 813 00:39:37 --> 00:39:38 I'm going to jump it up here. 814 00:39:38 --> 00:39:41 So I'm going to make a plus, like those pluses 815 00:39:41 --> 00:39:43 that we just saw. 816 00:39:43 --> 00:39:46 The reason for doing this is, I'm going to keep all the 817 00:39:46 --> 00:39:48 same pixels on the screen. 818 00:39:48 --> 00:39:49 Right? 819 00:39:49 --> 00:39:51 I'm just going to rearrange where the reds and greens are. 820 00:39:51 --> 00:39:54 And of course I'm going to change the location of the red 821 00:39:54 --> 00:39:56 vertical, because it's really boring if I keep it 822 00:39:56 --> 00:39:57 in the same place. 823 00:39:57 --> 00:39:59 But you're looking for red vertical again. 824 00:39:59 --> 00:40:02 Ready? 825 00:40:02 --> 00:40:04 AUDIENCE: Yes. 826 00:40:04 --> 00:40:05 PROFESSOR: Who said no? 827 00:40:05 --> 00:40:06 AUDIENCE: I said woah. 
828 00:40:06 --> 00:40:07 PROFESSOR: Oh, woah. 829 00:40:07 --> 00:40:08 OK. 830 00:40:08 --> 00:40:10 Woah's good, woah is good. 831 00:40:10 --> 00:40:12 832 00:40:12 --> 00:40:14 Particularly by the time it says find the two 833 00:40:14 --> 00:40:15 red vertical lines. 834 00:40:15 --> 00:40:20 Anyway, you should have found both of them. 835 00:40:20 --> 00:40:23 Let's check intuition here. 836 00:40:23 --> 00:40:27 How many people vote that it was easier to find the red 837 00:40:27 --> 00:40:30 verticals when they were in pluses? 838 00:40:30 --> 00:40:31 How many vote that it was easier when they 839 00:40:31 --> 00:40:33 were ripped apart? 840 00:40:33 --> 00:40:36 That is the correct intuition. 841 00:40:36 --> 00:40:38 Actually, I think I put the data -- I think I realized 842 00:40:38 --> 00:40:41 earlier that I put half the data on a slide. 843 00:40:41 --> 00:40:43 This is the data for looking for the pluses. 844 00:40:43 --> 00:40:47 Quite steep slopes of about 50 milliseconds an item 845 00:40:47 --> 00:40:49 here and about 140 here. 846 00:40:49 --> 00:40:54 Just looking for the red verticals when they were in the 847 00:40:54 --> 00:40:57 disassociated pluses would have been down here, with 848 00:40:57 --> 00:40:59 a slope of about 10. 849 00:40:59 --> 00:41:03 But I somehow left it off the slide. 850 00:41:03 --> 00:41:05 Why is this? 851 00:41:05 --> 00:41:09 Why are the pluses so much more difficult? 852 00:41:09 --> 00:41:13 The answer is that before attention arrives on the 853 00:41:13 --> 00:41:18 object, these two pluses are essentially the same thing. 854 00:41:18 --> 00:41:21 They are red and green and vertical and horizontal. 855 00:41:21 --> 00:41:26 And without attention, you just don't know the 856 00:41:26 --> 00:41:28 difference between them. 857 00:41:28 --> 00:41:34 This thing, this square has red and green and vertical and 858 00:41:34 --> 00:41:37 horizontal in it, but it's in two objects. 
859 00:41:37 --> 00:41:40 And so since you direct your attention to objects -- to 860 00:41:40 --> 00:41:45 things that are objects; I've got too many "to"s in there -- 861 00:41:45 --> 00:41:48 this is not a problem in the way that these guys 862 00:41:48 --> 00:41:49 are a problem. 863 00:41:49 --> 00:41:51 In fact anything that you do -- I don't think I brought the 864 00:41:51 --> 00:41:55 demo, but anything that you do to make this less like a single 865 00:41:55 --> 00:41:57 object makes the task easier. 866 00:41:57 --> 00:42:01 So if I was to put a little shadow on here, so that it 867 00:42:01 --> 00:42:04 would look like this thing, the vertical piece was sticking out 868 00:42:04 --> 00:42:09 in front of the horizontal piece, it would get easier. 869 00:42:09 --> 00:42:13 Because now you could direct your attention separately to 870 00:42:13 --> 00:42:18 different planes in depth. 871 00:42:18 --> 00:42:22 So attention is directed to objects, and objects are 872 00:42:22 --> 00:42:24 available ahead of time as just sort of these 873 00:42:24 --> 00:42:28 loose configurations, constellations of features. 874 00:42:28 --> 00:42:33 Once attention gets there, they get glued together 875 00:42:33 --> 00:42:34 into recognizable objects. 876 00:42:34 --> 00:42:36 All right. 877 00:42:36 --> 00:42:43 So what happens when you move away from an attended object? 878 00:42:43 --> 00:42:46 879 00:42:46 --> 00:42:51 That's not an unreasonable question in this framework. 880 00:42:51 --> 00:42:52 So let's see. 881 00:42:52 --> 00:42:54 I need -- Rachel. 882 00:42:54 --> 00:42:54 There's Rachel. 883 00:42:54 --> 00:42:55 I thought I recognized her. 884 00:42:55 --> 00:42:58 All right, I have now recognized Rachel. 885 00:42:58 --> 00:43:00 Limited number of people who I actually recognize 886 00:43:00 --> 00:43:01 by name in here. 887 00:43:01 --> 00:43:04 And they come to regret it. 888 00:43:04 --> 00:43:05 But anyway, all right. 
889 00:43:05 --> 00:43:07 So she was here all along. 890 00:43:07 --> 00:43:12 I happen to have attended to her and bound Rachel into a 891 00:43:12 --> 00:43:13 recognizable Rachel object. 892 00:43:13 --> 00:43:17 Now, without moving my eyes in fact, I'm 893 00:43:17 --> 00:43:18 attending elsewhere. 894 00:43:18 --> 00:43:21 And somebody's up there, again, my peripheral vision's lousy, 895 00:43:21 --> 00:43:23 but I can see that somebody was moving up there. 896 00:43:23 --> 00:43:25 They waved a piece of white paper a moment ago. 897 00:43:25 --> 00:43:29 The question is, when I moved my attention elsewhere, 898 00:43:29 --> 00:43:31 what happened to Rachel? 899 00:43:31 --> 00:43:34 Did she remain bound, or did she collapse into 900 00:43:34 --> 00:43:36 Rachel bits again? 901 00:43:36 --> 00:43:38 AUDIENCE: [INAUDIBLE] 902 00:43:38 --> 00:43:40 PROFESSOR: What? 903 00:43:40 --> 00:43:42 She collapsed into Rachel bits. 904 00:43:42 --> 00:43:43 How could you tell? 905 00:43:43 --> 00:43:48 AUDIENCE: [INAUDIBLE] 906 00:43:48 --> 00:43:49 PROFESSOR: That's why I was deliberately still 907 00:43:49 --> 00:43:52 looking at her, to avoid the issues of blur. 908 00:43:52 --> 00:43:56 But the way to do this is not to continue picking on 909 00:43:56 --> 00:44:02 Rachel, but to switch to dancing chickens here. 910 00:44:02 --> 00:44:04 There we have -- you can tell we're back in the 911 00:44:04 --> 00:44:05 realm of my artwork. 912 00:44:05 --> 00:44:07 913 00:44:07 --> 00:44:10 Oh, I like this, with the chickens on three screens. 914 00:44:10 --> 00:44:11 This is so good. 915 00:44:11 --> 00:44:15 916 00:44:15 --> 00:44:17 Anyway, I like those a lot. 917 00:44:17 --> 00:44:20 Now, so you know what you're looking at here. 918 00:44:20 --> 00:44:22 You're looking at a bunch of chickens, right? 919 00:44:22 --> 00:44:27 And they're doing this little leggy thing. 
920 00:44:27 --> 00:44:31 You would think that, having recognized that there's a bunch 921 00:44:31 --> 00:44:34 of chickens there who are doing this little dance, that if one 922 00:44:34 --> 00:44:39 of those chickens fell apart into chicken bits, that 923 00:44:39 --> 00:44:41 you would notice, right? 924 00:44:41 --> 00:44:42 Seems reasonable. 925 00:44:42 --> 00:44:43 How many of you noticed? 926 00:44:43 --> 00:44:46 927 00:44:46 --> 00:44:48 Ooh, ooh, very slow group here. 928 00:44:48 --> 00:44:50 It should be -- how many chickens are there 929 00:44:50 --> 00:44:51 here, about 20? 930 00:44:51 --> 00:44:53 It should be about one in 20 of you happen to be -- you 931 00:44:53 --> 00:44:58 have all seen that already. 932 00:44:58 --> 00:45:04 933 00:45:04 --> 00:45:10 So one of these chickens fell apart. 934 00:45:10 --> 00:45:12 Well, if you think, quite apart from the fact that the artwork 935 00:45:12 --> 00:45:16 is a little lame, the implications are non-lame. 936 00:45:16 --> 00:45:20 The implication is, all right, I'm looking at you guys. 937 00:45:20 --> 00:45:25 I think I'm looking at a bunch of humanoid life forms. 938 00:45:25 --> 00:45:28 They're moving a little bit, stuff like that. 939 00:45:28 --> 00:45:32 And you would think that if one of you just went to pieces 940 00:45:32 --> 00:45:36 here, that I would notice. 941 00:45:36 --> 00:45:40 The data strongly suggests that that's not the case. 942 00:45:40 --> 00:45:43 That I would eventually notice, as my attention roves around 943 00:45:43 --> 00:45:46 the room, if it turned out that, oh my god, not only has 944 00:45:46 --> 00:45:51 that person not dozed off, but her head fell off, I would 945 00:45:51 --> 00:45:57 notice that and react with according shock and amusement. 946 00:45:57 --> 00:46:03 The way this experiment is actually done is not with the 947 00:46:03 --> 00:46:04 cute little dancing bits. 
948 00:46:04 --> 00:46:09 You'd be looking at a screen like this, and 949 00:46:09 --> 00:46:10 you'd hear, beep. 950 00:46:10 --> 00:46:13 And the question would be, is there a destroyed chicken? 951 00:46:13 --> 00:46:15 952 00:46:15 --> 00:46:18 Yeah, it's there, right? 953 00:46:18 --> 00:46:20 Beep. 954 00:46:20 --> 00:46:20 Beep. 955 00:46:20 --> 00:46:21 AUDIENCE: Yes. 956 00:46:21 --> 00:46:22 PROFESSOR: Beep. 957 00:46:22 --> 00:46:23 AUDIENCE: No. 958 00:46:23 --> 00:46:23 PROFESSOR: Beep. 959 00:46:23 --> 00:46:24 AUDIENCE: Yes. 960 00:46:24 --> 00:46:24 PROFESSOR: And 961 00:46:24 --> 00:46:26 so on. 962 00:46:26 --> 00:46:27 You can do it. 963 00:46:27 --> 00:46:29 It's not a difficult task at all, particularly 964 00:46:29 --> 00:46:32 with a few big chickens. 965 00:46:32 --> 00:46:34 But you have to search. 966 00:46:34 --> 00:46:37 You have to search through the chickens each time. 967 00:46:37 --> 00:46:41 And you're no better with a display that's got the same 968 00:46:41 --> 00:46:44 fixed number of chickens up there all the time, compared to 969 00:46:44 --> 00:46:47 a display which has de novo chickens popping up out of 970 00:46:47 --> 00:46:51 nothingness each time. 971 00:46:51 --> 00:46:53 Oh, the feet are moving around. 972 00:46:53 --> 00:46:56 For the demo, why are the feet doing this 973 00:46:56 --> 00:46:58 little chicken dance? 974 00:46:58 --> 00:47:00 Remember I said that motion is one of these things you 975 00:47:00 --> 00:47:03 can pick up automatically? 976 00:47:03 --> 00:47:07 If you don't have something like the little moving feet, 977 00:47:07 --> 00:47:11 then when you have a chicken fall apart -- boink -- the 978 00:47:11 --> 00:47:14 movement of the contour, compared to all the ones that 979 00:47:14 --> 00:47:16 aren't moving at all tips you off that there's 980 00:47:16 --> 00:47:17 something there. 
981 00:47:17 --> 00:47:20 And that tells you that motion's important, but 982 00:47:20 --> 00:47:23 it doesn't tell you the interesting fact that you're 983 00:47:23 --> 00:47:28 not aware when an otherwise coherent object falls to bits. 984 00:47:28 --> 00:47:33 By the way, it turns out you're also not aware when previously 985 00:47:33 --> 00:47:36 incoherent material coheres into a chicken. 986 00:47:36 --> 00:47:39 We did the classic chicken soup experiment. 987 00:47:39 --> 00:47:45 We had a screen full of chicken bits like this, and you heard 988 00:47:45 --> 00:47:47 beep, and you had to figure out whether or not there was 989 00:47:47 --> 00:47:48 now a chicken present. 990 00:47:48 --> 00:47:50 And you had to search for that, too. 991 00:47:50 --> 00:47:53 So chickens emerging from the chicken soup, which you might 992 00:47:53 --> 00:47:57 think would be striking, don't turn out to be striking either. 993 00:47:57 --> 00:47:57 All right. 994 00:47:57 --> 00:48:02 Well the chickens are kind of ugly and complicated. 995 00:48:02 --> 00:48:06 How bad is this problem? 996 00:48:06 --> 00:48:10 So let's get basic here. 997 00:48:10 --> 00:48:14 No more trying to fool you. 998 00:48:14 --> 00:48:15 Well, of course I'm trying to fool you. 999 00:48:15 --> 00:48:19 No more dancing around chickens, and then oh, did you 1000 00:48:19 --> 00:48:21 see -- after the fact I ask you whether you saw something 1001 00:48:21 --> 00:48:23 that fell apart. 1002 00:48:23 --> 00:48:26 These are what? 1003 00:48:26 --> 00:48:28 Red and green dots. 1004 00:48:28 --> 00:48:32 If you weren't sure about that, it says so at the top. 1005 00:48:32 --> 00:48:39 All I'm going to do is, I'm going to cue one dot -- I 1006 00:48:39 --> 00:48:40 don't care about any of the other dots. 1007 00:48:40 --> 00:48:46 All I want to know is, did that one dot change color? 1008 00:48:46 --> 00:48:49 Say yes or no. 1009 00:48:49 --> 00:48:50 Whoops. 1010 00:48:50 --> 00:48:50 Where'd it go? 
1011 00:48:50 --> 00:48:56 AUDIENCE: [INAUDIBLE] 1012 00:48:56 --> 00:48:59 PROFESSOR: Well, the answer turns out to be no. 1013 00:48:59 --> 00:49:05 1014 00:49:05 --> 00:49:11 This is such a great exercise in applied statistics, right? 1015 00:49:11 --> 00:49:18 How many -- he can't really be -- he said the last one, so no. 1016 00:49:18 --> 00:49:19 AUDIENCE: Yes. 1017 00:49:19 --> 00:49:21 PROFESSOR: Oh yeah, but he can't possibly be doing 1018 00:49:21 --> 00:49:25 three in a row, right? 1019 00:49:25 --> 00:49:27 AUDIENCE: No. 1020 00:49:27 --> 00:49:29 PROFESSOR: That does turn out to be a no. 1021 00:49:29 --> 00:49:32 Look, you can hear people going both ways. 1022 00:49:32 --> 00:49:35 People are terrible at this. 1023 00:49:35 --> 00:49:38 They're just barely above chance. 1024 00:49:38 --> 00:49:42 And the barely above chance is consistent with them sort of 1025 00:49:42 --> 00:49:47 sitting on two or three dots. 1026 00:49:47 --> 00:49:49 Because you're not just doing a couple of these, you're doing 1027 00:49:49 --> 00:49:53 hundreds of these, for $10 an hour. 1028 00:49:53 --> 00:49:58 So you can sit on a couple of them and say, if I get really 1029 00:49:58 --> 00:50:00 lucky and he cues the one I'm looking at, I'm going 1030 00:50:00 --> 00:50:01 to get this right. 1031 00:50:01 --> 00:50:04 And if he doesn't, I'm clueless. 1032 00:50:04 --> 00:50:05 I mean, it's red and green. 1033 00:50:05 --> 00:50:08 It doesn't get more basic than that. 1034 00:50:08 --> 00:50:08 Yup? 1035 00:50:08 --> 00:50:10 AUDIENCE: I have a question. 1036 00:50:10 --> 00:50:12 Do people's reaction times change? 1037 00:50:12 --> 00:50:17 Because red and green, they have the same after 1038 00:50:17 --> 00:50:19 color, or afterimage. 1039 00:50:19 --> 00:50:20 PROFESSOR: They'd better not have the same after -- 1040 00:50:20 --> 00:50:21 they have the opposite. 1041 00:50:21 --> 00:50:21 Yes. 1042 00:50:21 --> 00:50:22 AUDIENCE: No, no. 
1043 00:50:22 --> 00:50:25 But the opposite of red is green, and the opposite 1044 00:50:25 --> 00:50:25 of green is red. 1045 00:50:25 --> 00:50:29 So if you do yellow and blue or something else --? 1046 00:50:29 --> 00:50:31 PROFESSOR: Well, yellow and blue are also opposite 1047 00:50:31 --> 00:50:31 in the same sense. 1048 00:50:31 --> 00:50:33 But it doesn't matter. 1049 00:50:33 --> 00:50:34 The color does not matter. 1050 00:50:34 --> 00:50:37 In fact, we can do another one with different colors. 1051 00:50:37 --> 00:50:40 Look at this new one. 1052 00:50:40 --> 00:50:42 More cool colors. 1053 00:50:42 --> 00:50:46 But maybe I was just being nasty to you. 1054 00:50:46 --> 00:50:49 Because there were a lot of dots up there for 1055 00:50:49 --> 00:50:50 you to choose among. 1056 00:50:50 --> 00:50:52 So I'll tell you the relevant dots. 1057 00:50:52 --> 00:50:56 What I'm going to do here is I'll ask you about the 1058 00:50:56 --> 00:50:58 color of specific dots. 1059 00:50:58 --> 00:51:00 I won't change them. 1060 00:51:00 --> 00:51:03 I'll just put them up there and ask you about particular dots. 1061 00:51:03 --> 00:51:06 And what I want you to do is tell me the color. 1062 00:51:06 --> 00:51:10 So if I say, what color is that dot, the answer is -- 1063 00:51:10 --> 00:51:11 AUDIENCE: Purple. 1064 00:51:11 --> 00:51:13 PROFESSOR: Good. 1065 00:51:13 --> 00:51:16 If I happen to cover it up with a black blob, tell me what 1066 00:51:16 --> 00:51:18 color it was before I covered it up. 1067 00:51:18 --> 00:51:20 OK? 1068 00:51:20 --> 00:51:21 Ready? 1069 00:51:21 --> 00:51:22 All right, here we go. 1070 00:51:22 --> 00:51:24 You'll see how this works. 1071 00:51:24 --> 00:51:26 Where'd it go? 1072 00:51:26 --> 00:51:27 There we go. 1073 00:51:27 --> 00:51:29 AUDIENCE: Red. 1074 00:51:29 --> 00:51:30 Yellow. 1075 00:51:30 --> 00:51:33 Blue. Green. 1076 00:51:33 --> 00:51:34 Green. 1077 00:51:34 --> 00:51:37 PROFESSOR: Good. 
1078 00:51:37 --> 00:51:40 See, you're not -- I put this in because at this point 1079 00:51:40 --> 00:51:44 you might be sitting there saying, I'm so hopeless! 1080 00:51:44 --> 00:51:46 And I wanted to prove to you that you're not. 1081 00:51:46 --> 00:51:48 Well, you are, but not that hopeless. 1082 00:51:48 --> 00:51:49 All right, ready? 1083 00:51:49 --> 00:51:51 AUDIENCE: Purple. 1084 00:51:51 --> 00:51:52 Red. 1085 00:51:52 --> 00:51:53 Blue. 1086 00:51:53 --> 00:51:54 Yellow. 1087 00:51:54 --> 00:51:55 Red. 1088 00:51:55 --> 00:51:56 Green. 1089 00:51:56 --> 00:51:58 Yellow. 1090 00:51:58 --> 00:51:59 PROFESSOR: Ooh, 1091 00:51:59 --> 00:52:00 a few people actually got it. 1092 00:52:00 --> 00:52:02 A bunch of people did the, urp. 1093 00:52:02 --> 00:52:04 But yes indeed, that was yellow. 1094 00:52:04 --> 00:52:08 It was cued before, so we know you paid attention to it. 1095 00:52:08 --> 00:52:11 But it was cued about five items back. 1096 00:52:11 --> 00:52:14 And so you'd paid attention to it. 1097 00:52:14 --> 00:52:17 It didn't take much binding to say, that's yellow. 1098 00:52:17 --> 00:52:20 You'd already done all the work on it. 1099 00:52:20 --> 00:52:23 Five blobs later, by the time your attention is somewhere 1100 00:52:23 --> 00:52:27 else -- it wasn't invisible during that time, right? 1101 00:52:27 --> 00:52:28 You don't really know what it is. 1102 00:52:28 --> 00:52:29 All right, try this. 1103 00:52:29 --> 00:52:31 AUDIENCE: Red. 1104 00:52:31 --> 00:52:32 Green. 1105 00:52:32 --> 00:52:32 Red. 1106 00:52:32 --> 00:52:33 [MURMURING] 1107 00:52:33 --> 00:52:37 PROFESSOR: A couple of people caught on. 1108 00:52:37 --> 00:52:39 He changed it. 1109 00:52:39 --> 00:52:40 This is what happened here. 1110 00:52:40 --> 00:52:42 Whoops, not that way. 1111 00:52:42 --> 00:52:43 Go back. 1112 00:52:43 --> 00:52:43 OK. 1113 00:52:43 --> 00:52:46 So this makes a useful and important point. 1114 00:52:46 --> 00:52:48 So, red. 
1115 00:52:48 --> 00:52:52 While your attention was diverted, I changed the color. 1116 00:52:52 --> 00:52:54 1117 00:52:54 --> 00:52:57 Why is that important? 1118 00:52:57 --> 00:53:02 What that tells you, with a very basic sort of stimulus, 1119 00:53:02 --> 00:53:06 is that the following ought to be true: I attend to 1120 00:53:06 --> 00:53:08 Rachel, I attend away. 1121 00:53:08 --> 00:53:11 While I've attended away, Rachel is replaced 1122 00:53:11 --> 00:53:13 by a kangaroo. 1123 00:53:13 --> 00:53:16 I am now asked, what was there? 1124 00:53:16 --> 00:53:19 I say, you know, it was Rachel. 1125 00:53:19 --> 00:53:22 The fact is that, even though she's still visible in the 1126 00:53:22 --> 00:53:24 visual field and everything, until I attend back, I 1127 00:53:24 --> 00:53:27 would simply not know that something had changed there. 1128 00:53:27 --> 00:53:31 So in fact, if you're worried that -- the trick here, 1129 00:53:31 --> 00:53:34 obviously, since there are 300 of you or so, you want to 1130 00:53:34 --> 00:53:37 convince me that you're paying attention in this class, you 1131 00:53:37 --> 00:53:42 draw my attention early in the class, and then you 1132 00:53:42 --> 00:53:44 subtly sneak out. 1133 00:53:44 --> 00:53:46 And presumably I think you're here attending the whole time. 1134 00:53:46 --> 00:53:50 Because how often do I get back to each individual person? 1135 00:53:50 --> 00:53:52 Well, actually, it's not that good. 1136 00:53:52 --> 00:53:55 Because at 30 to 40 people per second, I can get back 1137 00:53:55 --> 00:53:56 to you pretty quickly. 1138 00:53:56 --> 00:53:59 So forget it. 1139 00:53:59 --> 00:54:03 But don't forget the basic point here, which is that 1140 00:54:03 --> 00:54:08 you're only aware, you're only updating your knowledge about 1141 00:54:08 --> 00:54:14 the world, through this narrow bottleneck of attention, for 1142 00:54:14 --> 00:54:16 the current object of attention. 
1143 00:54:16 --> 00:54:21 Everything else, you're basically working on your 1144 00:54:21 --> 00:54:25 hypothesis based on the last time you checked up on it. 1145 00:54:25 --> 00:54:27 So here is actually what the data for an experiment 1146 00:54:27 --> 00:54:32 like this look like. 1147 00:54:32 --> 00:54:35 So if you didn't pay attention to the colored dot, right? 1148 00:54:35 --> 00:54:37 If I never asked about it at all. 1149 00:54:37 --> 00:54:37 Here's chance. 1150 00:54:37 --> 00:54:40 50% in this particular experiment. 1151 00:54:40 --> 00:54:42 Because this is a two color version of it. 1152 00:54:42 --> 00:54:43 Is it red or is it green? 1153 00:54:43 --> 00:54:45 You've got about a 50-50 chance of getting it. 1154 00:54:45 --> 00:54:47 You do a little bit better than that. 1155 00:54:47 --> 00:54:51 If it was recently cued -- if I just asked you whether it was 1156 00:54:51 --> 00:54:53 red or green -- you do pretty well. 1157 00:54:53 --> 00:54:57 But as soon as it's four items ago, or eight or 12 ago, you're 1158 00:54:57 --> 00:55:00 back to being pretty pathetic. 1159 00:55:00 --> 00:55:03 So you don't keep a good record of this. 1160 00:55:03 --> 00:55:13 You're only updating in the current object of attention. 1161 00:55:13 --> 00:55:16 This suggests that your memory is pretty small here. 1162 00:55:16 --> 00:55:18 We'll talk about memory more extensively later. 1163 00:55:18 --> 00:55:20 But let me illustrate that your memory is 1164 00:55:20 --> 00:55:22 actually fairly small. 1165 00:55:22 --> 00:55:23 Here what we're going to do is, I want you to 1166 00:55:23 --> 00:55:25 remember these guys. 1167 00:55:25 --> 00:55:27 Got them? 1168 00:55:27 --> 00:55:30 OK, take them away. 1169 00:55:30 --> 00:55:31 Are these the same? 1170 00:55:31 --> 00:55:32 AUDIENCE: No. 1171 00:55:32 --> 00:55:34 PROFESSOR: OK, well your memory isn't that small. 1172 00:55:34 --> 00:55:35 That's good. 1173 00:55:35 --> 00:55:36 How about these guys? 
1174 00:55:36 --> 00:55:37 AUDIENCE: No. 1175 00:55:37 --> 00:55:37 PROFESSOR: No, no, no. 1176 00:55:37 --> 00:55:41 This is a new set. 1177 00:55:41 --> 00:55:41 [LAUGHTER] 1178 00:55:41 --> 00:55:42 Ready? 1179 00:55:42 --> 00:55:44 Boink. 1180 00:55:44 --> 00:55:46 AUDIENCE: Yes. 1181 00:55:46 --> 00:55:49 PROFESSOR: Whoops. 1182 00:55:49 --> 00:55:51 Sadly, I can't remember. 1183 00:55:51 --> 00:55:53 Remember these. 1184 00:55:53 --> 00:55:54 AUDIENCE: [INAUDIBLE] 1185 00:55:54 --> 00:55:57 PROFESSOR: They look the same, don't they? 1186 00:55:57 --> 00:55:57 AUDIENCE: Yes. 1187 00:55:57 --> 00:55:59 PROFESSOR: OK. 1188 00:55:59 --> 00:56:00 So, well. 1189 00:56:00 --> 00:56:03 1190 00:56:03 --> 00:56:04 How about these? 1191 00:56:04 --> 00:56:04 AUDIENCE: No. 1192 00:56:04 --> 00:56:09 PROFESSOR: No, this is a new set. 1193 00:56:09 --> 00:56:11 AUDIENCE: Yes. 1194 00:56:11 --> 00:56:13 PROFESSOR: Yes, something changed. 1195 00:56:13 --> 00:56:17 So this time I transposed the red and the yellow. 1196 00:56:17 --> 00:56:19 That's a little more difficult, because I didn't 1197 00:56:19 --> 00:56:22 introduce a new color. 1198 00:56:22 --> 00:56:25 How about this? 1199 00:56:25 --> 00:56:27 AUDIENCE: Yes. 1200 00:56:27 --> 00:56:29 Yes. 1201 00:56:29 --> 00:56:31 PROFESSOR: People aren't quite sure. 1202 00:56:31 --> 00:56:33 The answer is that the capacity of this sort of 1203 00:56:33 --> 00:56:35 memory is about four. 1204 00:56:35 --> 00:56:37 1205 00:56:37 --> 00:56:41 And some of you will have gotten the fact that there was 1206 00:56:41 --> 00:56:43 another transposition, right? 1207 00:56:43 --> 00:56:44 Of the yellow and the green? 1208 00:56:44 --> 00:56:44 Whoops. 1209 00:56:44 --> 00:56:45 The yellows and the greens. 1210 00:56:45 --> 00:56:45 Yeah. 1211 00:56:45 --> 00:56:51 The yellow and green guys are -- whoops! -- switching there. 
1212 00:56:51 --> 00:56:53 Some of you will have gotten it and some of you will not have 1213 00:56:53 --> 00:56:55 gotten it, because some of you were sitting on the right 1214 00:56:55 --> 00:56:57 four and some of you were sitting on the wrong four. 1215 00:56:57 --> 00:56:59 But it's only about four. 1216 00:56:59 --> 00:56:59 Four what? 1217 00:56:59 --> 00:57:01 It turns out to be four objects. 1218 00:57:01 --> 00:57:02 Look at this. 1219 00:57:02 --> 00:57:03 Tell me if anything changes. 1220 00:57:03 --> 00:57:05 So here we have at least color, shape, and 1221 00:57:05 --> 00:57:09 orientation going on. 1222 00:57:09 --> 00:57:09 AUDIENCE: Yes. 1223 00:57:09 --> 00:57:12 PROFESSOR: Yeah, most people will know here that the red 1224 00:57:12 --> 00:57:16 thing flipped from pointing up to pointing down. 1225 00:57:16 --> 00:57:19 That would seem to suggest that you can keep track of 12 1226 00:57:19 --> 00:57:21 things, because there are four colors, four shapes, 1227 00:57:21 --> 00:57:23 and four orientations. 1228 00:57:23 --> 00:57:26 But if I spread those out across 12 objects, 1229 00:57:26 --> 00:57:27 you'd be very bad. 1230 00:57:27 --> 00:57:30 It's that you can keep track of about four objects. 1231 00:57:30 --> 00:57:33 You can keep track of multiple features of each of those 1232 00:57:33 --> 00:57:36 objects, but it's only about four objects that you 1233 00:57:36 --> 00:57:39 can keep track of. 1234 00:57:39 --> 00:57:43 Now let's see. 1235 00:57:43 --> 00:57:45 How are we doing in question land? 1236 00:57:45 --> 00:57:49 OK, so the answer to question six, at least the first part 1237 00:57:49 --> 00:57:53 of it, is that the objects don't seem to stay bound. 1238 00:57:53 --> 00:57:58 That you need to continuously update the visual world in 1239 00:57:58 --> 00:58:02 order to have some idea of what its current state is, and that 1240 00:58:02 --> 00:58:06 you're only updating the current object of attention. 
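The numbers in this part of the lecture lend themselves to a small sketch. Assuming (my illustration, not the lecture's actual analysis) an observer who perfectly retains k of the n objects on screen and guesses at chance on the rest, both the barely-above-chance performance with many dots and the near-perfect performance with four objects fall out of a single capacity parameter of about four. The hit and false-alarm figures below are made up for illustration.

```python
# Toy capacity model (an illustration, not the lecture's actual analysis):
# an observer perfectly retains k of the n objects on screen and guesses
# at chance whenever the probed object is one she missed.

def expected_accuracy(k, n, chance=0.5):
    """Expected proportion correct in a yes/no change-detection task."""
    p_held = min(k, n) / n           # probability the probed item is in memory
    return p_held + (1.0 - p_held) * chance

# Holding ~3 items out of 20 dots leaves you "just barely above chance":
print(expected_accuracy(3, 20))

# Holding 4 of 4 objects is perfect; with 8 objects, accuracy drops:
print(expected_accuracy(4, 4))
print(expected_accuracy(4, 8))

# Running the same logic backwards gives Cowan's K, a standard capacity
# estimate for single-probe change detection:
# K = n * (hit rate - false-alarm rate).
def cowans_k(hit_rate, false_alarm_rate, n):
    return n * (hit_rate - false_alarm_rate)

# e.g. 80% hits and 30% false alarms with 8 objects -> a capacity near 4
print(cowans_k(0.80, 0.30, 8))
```

Cowan's K in the last lines is how capacity is commonly estimated from observed performance in change-detection experiments of exactly this kind.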
1241 00:58:06 --> 00:58:11 After a brief break, we will establish what the Sistine 1242 00:58:11 --> 00:58:14 Chapel has to tell us about that fact. 1243 00:58:14 --> 00:58:19 But those of you who wish may study this image for the next 1244 00:58:19 --> 00:58:21 couple of minutes or so. 1245 00:58:21 --> 00:58:23 And everybody else can just sort of stretch. 1246 00:58:23 --> 00:58:25 And then we'll come back. 1247 00:58:25 --> 00:58:28 1248 00:58:28 --> 00:58:30 While I apologize to Rachel for picking on her. 1249 00:58:30 --> 00:58:34 You're not traumatized for life or anything? 1250 00:58:34 --> 00:58:35 OK, good. 1251 00:58:35 --> 00:58:38 1252 00:58:38 --> 00:58:39 1253 00:58:39 --> 00:59:13 [CROWD NOISES] 1254 00:59:13 --> 00:59:14 AUDIENCE: 1255 00:59:14 --> 00:59:17 Have you seen this video they have where it's a bunch of 1256 00:59:17 --> 00:59:18 people bouncing balls to each other? 1257 00:59:18 --> 00:59:19 PROFESSOR: Yeah. 1258 00:59:19 --> 00:59:24 That's now gotten to be so common that I'm not using it. 1259 00:59:24 --> 00:59:27 1260 00:59:27 --> 00:59:34 [PRIVATE CONVERSATION] 1261 00:59:34 --> 00:59:36 AUDIENCE: Do you know who did that? 1262 00:59:36 --> 00:59:37 PROFESSOR: Yes, Dan Simons. 1263 00:59:37 --> 00:59:43 Then at Harvard, now at the University of Illinois. 1264 00:59:43 --> 00:59:45 1265 00:59:45 --> 00:59:48 I will describe a different Dan Simons experiment in a minute. 1266 00:59:48 --> 00:59:50 OK, let's get back together here. 1267 00:59:50 --> 00:59:58 1268 00:59:58 --> 01:00:05 All right, to briefly review: 1269 01:00:05 --> 01:00:09 the story I have been developing thus far is that 1270 01:00:09 --> 01:00:14 even though you are looking at this scene from the Sistine 1271 01:00:14 --> 01:00:17 Chapel, and this is the expulsion from Eden, there's 1272 01:00:17 --> 01:00:21 Adam and Eve, and this very cool snake. 
1273 01:00:21 --> 01:00:24 And there's Adam and Eve getting chucked out, with the 1274 01:00:24 --> 01:00:27 angel poking them in the head and stuff like that. 1275 01:00:27 --> 01:00:29 Even though you are looking at this, you know what you're 1276 01:00:29 --> 01:00:39 looking at, that at any given moment the only thing that's 1277 01:00:39 --> 01:00:43 really coming through from the world to recognition is 1278 01:00:43 --> 01:00:46 whatever is currently being fed through the bottleneck, the 1279 01:00:46 --> 01:00:48 current object of attention. 1280 01:00:48 --> 01:00:54 And that maybe three or four objects, the recent status of 1281 01:00:54 --> 01:00:56 three or four objects is currently held in this 1282 01:00:56 --> 01:00:58 visual short term memory. 1283 01:00:58 --> 01:01:03 The implication here is that I could change this scene 1284 01:01:03 --> 01:01:05 and you wouldn't notice. 1285 01:01:05 --> 01:01:08 So let's find out. 1286 01:01:08 --> 01:01:11 What did I change? 1287 01:01:11 --> 01:01:13 AUDIENCE: [INAUDIBLE] 1288 01:01:13 --> 01:01:14 PROFESSOR: I need a hand or two here. 1289 01:01:14 --> 01:01:18 1290 01:01:18 --> 01:01:19 Yeah, sure, what? 1291 01:01:19 --> 01:01:22 AUDIENCE: [INAUDIBLE] 1292 01:01:22 --> 01:01:23 PROFESSOR: Oh, the fig leaf. 1293 01:01:23 --> 01:01:24 The fig leaf, yes. 1294 01:01:24 --> 01:01:27 The originator of change blindness, which is what this 1295 01:01:27 --> 01:01:31 phenomenon is known as, is Ron Rensink, now at the University 1296 01:01:31 --> 01:01:33 of British Columbia. 1297 01:01:33 --> 01:01:40 And he refers to what he calls "areas of interest." If you 1298 01:01:40 --> 01:01:43 change something that people are paying attention 1299 01:01:43 --> 01:01:45 to, they notice that. 1300 01:01:45 --> 01:01:46 But of course I knew that. 1301 01:01:46 --> 01:01:50 And so how many people picked up the other three changes? 1302 01:01:50 --> 01:01:51 AUDIENCE: [INAUDIBLE] 1303 01:01:51 --> 01:01:54 PROFESSOR: Oh, some. 
1304 01:01:54 --> 01:01:58 We have a few people picked -- what did you get? 1305 01:01:58 --> 01:01:59 I can't hear you. 1306 01:01:59 --> 01:02:01 AUDIENCE: [INAUDIBLE] 1307 01:02:01 --> 01:02:03 PROFESSOR: The stick thing. 1308 01:02:03 --> 01:02:04 And what? 1309 01:02:04 --> 01:02:05 Sorry? 1310 01:02:05 --> 01:02:07 AUDIENCE: [INAUDIBLE] 1311 01:02:07 --> 01:02:09 Something showed up at the top that's funny. 1312 01:02:09 --> 01:02:11 The stick thing moved, and something showed up at 1313 01:02:11 --> 01:02:12 the top that's funny. 1314 01:02:12 --> 01:02:16 So now with that information, we can go -- whoops. 1315 01:02:16 --> 01:02:18 AUDIENCE: Right there. 1316 01:02:18 --> 01:02:20 1317 01:02:20 --> 01:02:22 PROFESSOR: You got the stick. 1318 01:02:22 --> 01:02:25 See, the reason for the blank is the same as the moving 1319 01:02:25 --> 01:02:28 chicken legs, which is that you don't want to have motion 1320 01:02:28 --> 01:02:30 transience giving stuff away. 1321 01:02:30 --> 01:02:36 But if you have motion transience -- do do do do do -- 1322 01:02:36 --> 01:02:41 you would think that if you were in the Garden of Eden and 1323 01:02:41 --> 01:02:46 the branches were moving from tree to tree, or for that 1324 01:02:46 --> 01:02:52 matter Eve's foot was moving to Adam's body, you would notice. 1325 01:02:52 --> 01:02:56 But if you're not attending to it, you don't notice. 1326 01:02:56 --> 01:03:00 1327 01:03:00 --> 01:03:06 So this is part of a large set of phenomena that come 1328 01:03:06 --> 01:03:08 under the general heading of change blindness. 1329 01:03:08 --> 01:03:11 At the break, somebody was reminding me of one that you 1330 01:03:11 --> 01:03:14 may have seen because it's made it onto Nova and 1331 01:03:14 --> 01:03:15 things like that. 
1332 01:03:15 --> 01:03:20 Done by Dan Simons, where you're watching people 1333 01:03:20 --> 01:03:24 apparently play a weird game of basketball in front of the 1334 01:03:24 --> 01:03:29 elevators, it turns out in the psych department at Harvard. 1335 01:03:29 --> 01:03:33 And while you're doing that, a guy in a gorilla suit -- 1336 01:03:33 --> 01:03:36 actually, Stan reminded me, a woman in a gorilla suit. 1337 01:03:36 --> 01:03:39 It's hard to tell; she's in a gorilla suit -- walks in, 1338 01:03:39 --> 01:03:45 walks into the middle of the game, waves, walks out. 1339 01:03:45 --> 01:03:48 And then afterwards you ask -- oh, and you're doing 1340 01:03:48 --> 01:03:48 a demanding task. 1341 01:03:48 --> 01:03:52 You're supposed to count how many passes there are, 1342 01:03:52 --> 01:03:53 or something like that. 1343 01:03:53 --> 01:03:56 And you're asked, did you notice the person in 1344 01:03:56 --> 01:03:58 the gorilla suit? 1345 01:03:58 --> 01:03:59 Well, first you're asked, did you notice anything weird? 1346 01:03:59 --> 01:04:00 Eh, no, very boring. 1347 01:04:00 --> 01:04:02 Notice the person in the gorilla suit? 1348 01:04:02 --> 01:04:02 Yeah, right. 1349 01:04:02 --> 01:04:03 What person in a gorilla suit? 1350 01:04:03 --> 01:04:05 Show them the video again. 1351 01:04:05 --> 01:04:09 Oh my -- 1352 01:04:09 --> 01:04:13 Another great Dan Simons experiment was done when he 1353 01:04:13 --> 01:04:14 was at Cornell, actually. 1354 01:04:14 --> 01:04:20 You're on the street in Ithaca, New York, and some guy walks up 1355 01:04:20 --> 01:04:22 to you and asks you for directions. 1356 01:04:22 --> 01:04:24 Actually it's Dan Simons who walks up to you and asks 1357 01:04:24 --> 01:04:25 you for directions. 1358 01:04:25 --> 01:04:28 And so, since you are a nice person, you start 1359 01:04:28 --> 01:04:29 giving Dan directions. 
1360 01:04:29 --> 01:04:32 Now you're standing there on the street and, who knows why, 1361 01:04:32 --> 01:04:35 but these two guys are carrying a door 1362 01:04:35 --> 01:04:36 down the street. 1363 01:04:36 --> 01:04:38 And they walk between you and Dan. 1364 01:04:38 --> 01:04:41 Which is kind of rude. 1365 01:04:41 --> 01:04:44 And then they're off down the street somewhere. 1366 01:04:44 --> 01:04:48 And the question is, do you continue to give directions 1367 01:04:48 --> 01:04:51 once you see Dan again? 1368 01:04:51 --> 01:04:55 Of course, the real question is, did you notice that when 1369 01:04:55 --> 01:05:00 the door went by, Dan Simons ducked down and left with the 1370 01:05:00 --> 01:05:07 door, and his then-student Dan Levin popped up in his place? 1371 01:05:07 --> 01:05:11 And it's a different guy. 1372 01:05:11 --> 01:05:17 50% of the subjects in this study kept talking. 1373 01:05:17 --> 01:05:20 1374 01:05:20 --> 01:05:24 A surprisingly large number of these, on being debriefed 1375 01:05:24 --> 01:05:28 later, claimed to have noticed a change. 1376 01:05:28 --> 01:05:30 Which is a little strange, right? 1377 01:05:30 --> 01:05:33 I'm talking to this guy and the door, and now I'm talking -- 1378 01:05:33 --> 01:05:35 there's another guy here, but what the heck? 1379 01:05:35 --> 01:05:40 He probably wants the answer to the same question. 1380 01:05:40 --> 01:05:42 I don't know what that's about. 1381 01:05:42 --> 01:05:46 But the important finding there is that 50% of the people 1382 01:05:46 --> 01:05:49 behaved as though they hadn't noticed the change from one 1383 01:05:49 --> 01:05:52 person to another, who they were talking to. 1384 01:05:52 --> 01:05:53 What's going on here? 1385 01:05:53 --> 01:05:55 Now people aren't completely stupid. 
1386 01:05:55 --> 01:05:58 The experiment has not been done, but we kind of absolutely 1387 01:05:58 --> 01:06:06 know that if I'm talking to Dan Simons, short white guy, and 1388 01:06:06 --> 01:06:09 now the door goes through, and a tall black woman is standing 1389 01:06:09 --> 01:06:13 there -- hm, you know? 1390 01:06:13 --> 01:06:17 Probably that's, again, the sort of front-end stuff that 1391 01:06:17 --> 01:06:19 people tend to pick up on. 1392 01:06:19 --> 01:06:24 But if what you're doing is, I don't know this guy, but 1393 01:06:24 --> 01:06:26 I've got a sort of a model of this guy. 1394 01:06:26 --> 01:06:29 I'm talking to kind of a short, white guy person. 1395 01:06:29 --> 01:06:32 And da da da, I'm still talking to a short, white guy person. 1396 01:06:32 --> 01:06:37 It's not the same one, apparently, but that turns 1397 01:06:37 --> 01:06:38 out not to be a problem. 1398 01:06:38 --> 01:06:51 This has given rise to a notion that perception is what Kevin 1399 01:06:51 --> 01:06:55 O'Regan has called a grand illusion. 1400 01:06:55 --> 01:07:01 That the only thing that you actually see is the current 1401 01:07:01 --> 01:07:04 object of attention. 1402 01:07:04 --> 01:07:12 That I think I'm seeing all of you, but all I'm really doing 1403 01:07:12 --> 01:07:14 at the moment is paying attention to the guy with the 1404 01:07:14 --> 01:07:16 grey stripe on up there. 1405 01:07:16 --> 01:07:17 Yeah, there he is. 1406 01:07:17 --> 01:07:20 And now that he's riveted my attention by waving at me, the 1407 01:07:20 --> 01:07:22 rest of you are just not there. 1408 01:07:22 --> 01:07:26 You are just some sort of grand illusion floating 1409 01:07:26 --> 01:07:29 around in my head. 1410 01:07:29 --> 01:07:33 Now in some sense, that's correct. 
1411 01:07:33 --> 01:07:38 That what you are seeing is a creation -- the burden of the 1412 01:07:38 --> 01:07:42 lecture next time will be to say that you're always seeing 1413 01:07:42 --> 01:07:44 a theory about the world. 1414 01:07:44 --> 01:07:47 You're not seeing the world directly. 1415 01:07:47 --> 01:07:51 You're always making an interpretation, your best guess 1416 01:07:51 --> 01:07:55 about what the stimulus means. 1417 01:07:55 --> 01:07:58 And all the evidence I've been showing you for the past hour 1418 01:07:58 --> 01:08:03 or so suggests that you're only updating that theory through 1419 01:08:03 --> 01:08:07 this very narrow bottleneck. 1420 01:08:07 --> 01:08:13 So in some sense, you are only seeing this creation of your 1421 01:08:13 --> 01:08:17 mind, and the only object that you are currently updating 1422 01:08:17 --> 01:08:24 is the one that you are currently attending to. 1423 01:08:24 --> 01:08:30 But to call the whole thing an illusion, it seems to me, 1424 01:08:30 --> 01:08:34 misses an important aspect of the experience. 1425 01:08:34 --> 01:08:36 1426 01:08:36 --> 01:08:40 Let's take a very old example. 1427 01:08:40 --> 01:08:46 The French philosopher of the, I'm thinking early 18th 1428 01:08:46 --> 01:08:51 century, whose name I will now proceed to misspell. 1429 01:08:51 --> 01:08:58 1430 01:08:58 --> 01:09:00 Does that look -- any good philosopher sorts? 1431 01:09:00 --> 01:09:00 That about right? 1432 01:09:00 --> 01:09:03 Condillac, I believe is how you pronounce it properly. 1433 01:09:03 --> 01:09:09 But anyway, Condillac wrote a number of very interesting 1434 01:09:09 --> 01:09:12 things about sensation and perception. 1435 01:09:12 --> 01:09:14 He's most famous for his statue. 1436 01:09:14 --> 01:09:22 His statue that he proposed as an entity with 1437 01:09:22 --> 01:09:24 no senses at all. 1438 01:09:24 --> 01:09:27 And he asked what would the mental life of this statue be? 
1439 01:09:27 --> 01:09:30 And argued that, in the absence of any sensory input, there 1440 01:09:30 --> 01:09:32 would be no mental life. 1441 01:09:32 --> 01:09:35 And now, he said, let's imagine opening up, I think he opens 1442 01:09:35 --> 01:09:37 up the statue's nostrils. 1443 01:09:37 --> 01:09:40 And argues that the entire mental life of this 1444 01:09:40 --> 01:09:43 statue is now the smell. 1445 01:09:43 --> 01:09:45 Whatever, I think he waves a rose under it 1446 01:09:45 --> 01:09:46 or something like that. 1447 01:09:46 --> 01:09:52 But a little further on he has a different example where he 1448 01:09:52 --> 01:09:57 says imagine, you're in a dark -- a dark chateau, I believe. 1449 01:09:57 --> 01:10:00 And it's completely pitch black, because of 1450 01:10:00 --> 01:10:01 these heavy curtains. 1451 01:10:01 --> 01:10:06 And it's morning, and you throw open the curtains. 1452 01:10:06 --> 01:10:09 If it were the case -- this is not what he's saying, but if it 1453 01:10:09 --> 01:10:11 were the case that all of vision was nothing but a grand 1454 01:10:11 --> 01:10:14 illusion, you only saw the spotlight of attention, this 1455 01:10:14 --> 01:10:17 one thing that you're attending to at any one moment, your 1456 01:10:17 --> 01:10:25 experience of this brand new scene ought to be like sort 1457 01:10:25 --> 01:10:27 of a weird paint brush. 1458 01:10:27 --> 01:10:29 Initially, I don't see nothin'. 1459 01:10:29 --> 01:10:31 Because I haven't attended to anything. 1460 01:10:31 --> 01:10:32 Now I attend to an object. 1461 01:10:32 --> 01:10:36 And now this person, object, is the only thing in the scene. 1462 01:10:36 --> 01:10:37 And boom, boom, boom. 1463 01:10:37 --> 01:10:39 And I slowly fill you in. 1464 01:10:39 --> 01:10:41 That's not the impression you get ever when 1465 01:10:41 --> 01:10:42 you see a new scene. 
1466 01:10:42 --> 01:10:45 You may not know what you're looking at, but you see 1467 01:10:45 --> 01:10:48 something everywhere instantly. 1468 01:10:48 --> 01:10:52 And the grand illusion thing misses the fact that you're 1469 01:10:52 --> 01:10:57 somehow sensing something about the entire visual 1470 01:10:57 --> 01:10:59 field all at once. 1471 01:10:59 --> 01:11:06 Let me offer a way of understanding that that will 1472 01:11:06 --> 01:11:09 then tie back to the visual physiology that I was talking 1473 01:11:09 --> 01:11:11 about in the last lecture. 1474 01:11:11 --> 01:11:13 Here's the idea. 1475 01:11:13 --> 01:11:22 Early in your visual system, you've got the processes, 1476 01:11:22 --> 01:11:25 sort of a big river of information, that tell you 1477 01:11:25 --> 01:11:31 about those 12 to 18 features or attributes that you can 1478 01:11:31 --> 01:11:33 get out -- these are eyes. 1479 01:11:33 --> 01:11:35 This is my drawing again. 1480 01:11:35 --> 01:11:38 So from your eyes, you've got this big flow of information 1481 01:11:38 --> 01:11:40 up into your brain. 1482 01:11:40 --> 01:11:45 And at some point, it hits this bottleneck that's 1483 01:11:45 --> 01:11:48 taken care of by attention. 1484 01:11:48 --> 01:11:52 Object recognition, the ability to tell that that's a branch, 1485 01:11:52 --> 01:11:58 that that's a snake, and so on -- only one object at a time can 1486 01:11:58 --> 01:12:05 go in and come out and rise to the level of some sort of 1487 01:12:05 --> 01:12:09 perceptual awareness, populating your 1488 01:12:09 --> 01:12:12 visual experience. 1489 01:12:12 --> 01:12:17 And that bottleneck is guided by this collection of basic 1490 01:12:17 --> 01:12:19 features that you've got. 1491 01:12:19 --> 01:12:21 If you know you're looking for red stuff, you set 1492 01:12:21 --> 01:12:24 these settings for red. 1493 01:12:24 --> 01:12:31 And maybe vertical, and big and moving and so on. 
1494 01:12:31 --> 01:12:34 And so you can regulate what gets through here. 1495 01:12:34 --> 01:12:37 And only the one thing at any one time is 1496 01:12:37 --> 01:12:39 getting up into there. 1497 01:12:39 --> 01:12:47 And so the current object of attention gets to rise to 1498 01:12:47 --> 01:12:52 awareness, and you know what you're looking at. 1499 01:12:52 --> 01:12:55 That's the story that I've told you to this point. 1500 01:12:55 --> 01:12:57 That's the story that gives rise to the notion that 1501 01:12:57 --> 01:12:59 everything else in the visual field is some 1502 01:12:59 --> 01:13:00 sort of an illusion. 1503 01:13:00 --> 01:13:04 But look, when I was doing that red and green dot thing, it 1504 01:13:04 --> 01:13:07 wasn't that you didn't see the other red and green dots. 1505 01:13:07 --> 01:13:07 They were there. 1506 01:13:07 --> 01:13:11 You just somehow had a very impoverished ability to tell 1507 01:13:11 --> 01:13:14 me anything about them. 1508 01:13:14 --> 01:13:17 And a way to think about that is to propose that there's 1509 01:13:17 --> 01:13:23 another pathway, another big fat river of information about, 1510 01:13:23 --> 01:13:27 say, these 12 to 18 attributes, that isn't limited 1511 01:13:27 --> 01:13:28 by the bottleneck. 1512 01:13:28 --> 01:13:32 But that it doesn't let you -- it's not a cheat. 1513 01:13:32 --> 01:13:35 This doesn't now let you go and recognize objects 1514 01:13:35 --> 01:13:36 everywhere all at once. 1515 01:13:36 --> 01:13:39 It can only do a few things. 1516 01:13:39 --> 01:13:42 It can sort of give you the statistics of the world. 1517 01:13:42 --> 01:13:44 You know, I'm looking out at you guys and I'm seeing 1518 01:13:44 --> 01:13:49 a sort of texture of people amongst purple. 
And that sort of impression of purpleness, of a tilted plane, is the sort of thing that you might get out of this big, broad, unrestricted, nonselective -- as it's labeled on there -- pathway.

There's evidence that you can get a little bit of semantic information. Semantic means the meaning: when you're talking about language, it's the meaning of the utterance, let's say. When you're talking about vision, it's the meaning of the stimulus. So I might get the notion that I'm in an enclosed space. This pathway by itself is not going to tell me what enclosed space I'm in. But I'm in a space. There's a tilted surface there. And so on.

But that broad pathway is going to give me the feeling that there's something happening everywhere. And this pathway is going to tell me what's happening specifically here, now. And between the two of them, I can build up an idea in my head of, oh, I'm in 10-250. I'm talking to this bunch of people, some of whom I know by name, some of whom I recognize because they've been here before, and so on. And I can keep updating that 20, 30 times a second through this pathway.
And I can keep experiencing something, that sort of wallpaper-of-the-world effect, through this other pathway.

Now, that might tie back to things that we talked about before. Remember the idea that you can broadly cut visual processing, visual cortical processing, into two big pathways, a what and a where pathway: a what pathway going down into the temporal lobe, and a where pathway going up into the parietal lobe. This selective pathway, this thing that only does one object at a time, would then be mapped onto the what pathway. What am I looking at, what am I attending to right now? If you were to lesion that, or you were to have damage to the temporal lobe of your brain, you might well end up with an agnosia. That's not a term that ended up on the handout, so you want to write that one down.

An agnosia is a failure to know, if you like. To know what something is. So if you have a person with a fairly global visual agnosia, they would be able to say, yeah, I'm looking at a bunch of objects here, but I don't know what they are. Here's this object. It's sort of orange. It's got orange and brown and white blobs on it.
And it's got this very long part, and there are these four pointy things coming off the bottom of it. I've got no idea what that is; maybe it's furniture of some sort. You'd look at it and say, that's a giraffe. An agnosic would be able to tell you about it, but not know that it was a giraffe.

Smaller lesions produce rather specific agnosias. There are reports in the literature of agnosias specific to, say, fruits and vegetables. More common is a form of agnosia called prosopagnosia, which is a specific inability to recognize faces. You know that it's a face, it's got two eyes, it's got a nose and a mouth. You don't know who it is. Small lesions down in that pathway can produce that sort of damage.

That would suggest, then, that the other pathway ought to be mapped onto the where pathway. And if you get bilateral damage, for instance, to the parietal lobe, you can end up with a disorder known as Balint's syndrome -- named after Balint; might as well write the word down here -- that has as one of its properties what's called a simultagnosia. This is a situation where you can recognize an object if you can get your attention on it. But that's the only thing you can respond to, in some sense.
It is as if the grand illusion theory were really right, that you can only see the current object of attention. So you do something like this with a simultagnosic. Say, "What's that?" Draw his attention to it. "That's a book." OK, what else have we got here? "What's that?" "That's a cell phone." "What's that?" "That's a cell phone." "Anything else?" "No." "What's that?" "That's a book." "What's that?" "That's a book." "Anything else?" "No." So, one object at a time. As if the where of the world had disappeared.

If you get damage to the parietal lobe on one side -- we'll talk about this more later in the term -- particularly on the right side, what you can end up with is a disorder known as neglect. It comes in a variety of flavors, again depending on the particular lesion. But the characteristic is, you ignore the contralateral side, the other side. Now, that can be the other side of space, so that if I'm a patient with a right-hemisphere parietal lesion and I'm looking at MIT volleyball here, everything in the left visual field, I would simply ignore.
I would behave as though it did not exist. If I took away everything else and put a stimulus in that left visual field, I could show that the patient could still see it. But with a full visual field, he behaves as though there's nothing there at all.

Patients with neglect will do weird things. They're in the hospital, typically, because they've had a stroke. You give them their dinner. They eat everything on the right side of the plate and leave everything on the left side of the plate. Why? Because they didn't like the mashed potatoes? No. If you rotate the plate, they'll eat the stuff on the other side of the plate. It's as if it just didn't exist, in some fashion.

Now, you'll remember the parietal lobe is also where you get the representation of the body surface, and stuff like that. So neglect patients can also be patients who neglect one half of their body, and deny that that part of their body is theirs. This is a little easier to understand if you figure that the stroke might well have also knocked out the ability to control that side of the body. So a stroke on the right might leave you paralyzed on the left.
But you can end up with situations like one described, I think, by Oliver Sacks in one of his books, where a patient is saying, "This is a cheap hospital. This is a really cheap, lousy hospital." How do you know it's a cheap, lousy hospital? "Because they're doubling up on beds." What do you mean, they're doubling up on beds? He says, "Look at that leg. That's not my leg." So this is somebody looking at their own leg and denying that that leg belongs to them. That's another aspect of neglect.

OK, what I'm going to do next time is to talk about the way in which you make hypotheses about the world.