Virtual Reality
A NASA Destination Tomorrow segment explaining how NASA uses virtual reality environments to simulate its missions.
The great poet Walt Whitman once said, "I accept reality and dare not question it."
00:00:00
Well, if old Walt was here to see this, he just might question it.
00:00:10
Today, NASA researchers are working in high-tech virtual reality simulation labs
00:00:14
using numbers, graphics, mathematical models,
00:00:19
to create three-dimensional images of objects and environments.
00:00:22
Man, it's like working inside a real holodeck.
00:00:26
Now, I spoke with Dr. Chris Sandridge at NASA Langley's Immersive Design and Simulation Lab,
00:00:28
better known as the CAVE, to find out how it works.
00:00:33
What we're standing in right now is called a CAVE.
00:00:36
It stands for Cave Automatic Virtual Environment.
00:00:39
Basically, it's a multi-screen theater where we can generate 3-D images, 3-D sounds,
00:00:42
and simulate various NASA missions.
00:00:47
The CAVE has three walls made of 10-foot by 10-foot rear projection screens
00:00:50
and a floor that is projected from above,
00:00:54
giving the users a near-complete immersion in computer-generated graphics.
00:00:56
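As a point of reference, the layout just described (three 10-foot by 10-foot rear-projection walls plus a floor projected from above) can be sketched as a small data structure. This is purely illustrative; the video does not show the lab's actual configuration.

```python
# Illustrative description of the CAVE layout from the narration above:
# three 10 ft x 10 ft rear-projection walls plus an overhead-projected floor.
# Names and structure are assumptions, not NASA's actual configuration.
FEET_TO_METRES = 0.3048

CAVE_SCREENS = [
    {"name": "front wall", "projection": "rear"},
    {"name": "left wall",  "projection": "rear"},
    {"name": "right wall", "projection": "rear"},
    {"name": "floor",      "projection": "overhead"},
]

for screen in CAVE_SCREENS:
    side_m = 10 * FEET_TO_METRES  # each surface is roughly 10 ft on a side
    print(f"{screen['name']}: {side_m:.2f} m x {side_m:.2f} m, "
          f"{screen['projection']} projection")
```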
The simulation looks like double images until you put on the goggles
00:01:00
that give everything a three-dimensional quality.
00:01:03
The hardware and graphics equipment used to operate the system
00:01:06
were first developed for use in computer games and in the theme park industry.
00:01:09
So, how does this virtual environment work?
00:01:13
We need the glasses to describe that.
00:01:15
Basically, what we have here are shutter glasses,
00:01:18
and what they do is they kind of decode the stereo image so that we see the depth.
00:01:22
Basically, the computer is generating two images,
00:01:27
one for your left eye, one for your right eye,
00:01:30
and then there's a little sensor here on the glasses
00:01:32
that is detecting an infrared signal from behind the screen
00:01:34
that synchronizes the glasses so you see a 3-D image.
00:01:38
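A minimal sketch of the frame-sequential stereo described here: the computer renders one image per eye, and the shutter glasses, synchronized to the display by an infrared signal, let each eye see only its own image. All names and values below are hypothetical, not the lab's software.

```python
# Minimal sketch of frame-sequential stereo: render the scene twice per frame,
# once from each eye's position; the IR-synchronized shutter glasses ensure
# each eye sees only its own image. Hypothetical names and values.
import numpy as np

EYE_SEPARATION_M = 0.064  # approximate interpupillary distance, illustrative


def eye_positions(head_position, head_right_axis):
    """Offset the tracked head position half the eye separation to each side."""
    half_offset = 0.5 * EYE_SEPARATION_M * np.asarray(head_right_axis, float)
    head = np.asarray(head_position, float)
    return head - half_offset, head + half_offset  # (left eye, right eye)


def render_stereo_frame(head_position, head_right_axis, render_view):
    left_eye, right_eye = eye_positions(head_position, head_right_axis)
    render_view(left_eye)   # displayed while the left shutter is open
    render_view(right_eye)  # displayed while the right shutter is open


if __name__ == "__main__":
    render_stereo_frame([0.0, 1.7, 0.0], [1.0, 0.0, 0.0],
                        lambda eye: print("rendering from eye at", eye))
```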
In addition, the person who's actually running the CAVE is also being head-tracked.
00:01:42
There's a black box above us that is putting out an electromagnetic field
00:01:46
that's being picked up by this antenna,
00:01:50
and then that relays information back to the computer
00:01:52
and tells the computer where the person is looking and what his head orientation is,
00:01:55
and then it updates the visuals and it updates the sound based on this person's position.
00:01:59
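The head-tracking loop Dr. Sandridge describes can be sketched as follows: the tracker reports the viewer's position and head orientation, and both the visuals and the 3-D sound are updated from that pose each frame. The classes and method names are assumptions for illustration only.

```python
# Sketch of a head-tracked update loop: read the head pose reported by the
# tracker and push it into the graphics and audio subsystems each frame.
# All names here are hypothetical, not the CAVE's real software interfaces.
import numpy as np


def yaw_rotation(yaw_radians):
    """Rotation about the vertical axis; a stand-in for full head orientation."""
    c, s = np.cos(yaw_radians), np.sin(yaw_radians)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])


def update_from_tracker(head_position, head_yaw, renderer, audio):
    orientation = yaw_rotation(head_yaw)
    renderer.set_view(head_position, orientation)   # visuals follow the head
    audio.set_listener(head_position, orientation)  # 3-D sound follows the head


class _PrintSink:
    """Stand-in for the real renderer/audio engine; just prints the pose."""
    def set_view(self, position, orientation):
        print("view from", position)

    def set_listener(self, position, orientation):
        print("listener at", position)


if __name__ == "__main__":
    update_from_tracker(np.array([0.0, 1.7, 0.5]), 0.3,
                        _PrintSink(), _PrintSink())
```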
And then finally, because we don't have a mouse and a keyboard available to us,
00:02:05
we need some type of an input device.
00:02:10
So, what we have here is the wand that we use to control the application.
00:02:12
It has joysticks on it. It has some buttons.
00:02:17
And then also, it is tracked as well,
00:02:20
so the computer knows where the position of this is so we can interact with the environment.
00:02:22
So that's basically how it works.
00:02:27
And then, of course, there's kind of a supercomputer in the back room that's kind of driving it all.
00:02:29
So, can you show me how this application works?
00:02:34
Sure. Put your glasses on and then we'll go to town.
00:02:36
Got it, man. Test drive this thing.
00:02:40
This is a full-up configuration of the station,
00:02:42
and we're using this application basically for two different environments,
00:02:45
the radiation environment and the sound environment.
00:02:49
Currently, NASA Langley researchers are developing tools
00:02:52
to help design improved radiation shielding and reduce noise for the International Space Station.
00:02:55
They're able to move equipment or install shielding in the virtual reality image
00:03:00
and then observe and store calculations of what effects the changes make.
00:03:04
The simulations can be shared with other researchers at distant locations via computer network connections.
00:03:08
So, Johnny, you want to try giving it a shot?
00:03:13
Absolutely. Let me see this.
00:03:15
Take the wand.
00:03:16
Okay.
00:03:17
You need to put on these glasses because these are the ones that are tracked.
00:03:18
All right. Thank you.
00:03:21
And the way it works is that you point the wand in the direction you want to go
00:03:24
and then push the joystick forward.
00:03:29
Forward.
00:03:30
To go forward.
00:03:31
Oh, man.
00:03:32
And you pull it backward to go backwards.
00:03:33
And then rotating is pulling the joystick left and right.
00:03:34
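The navigation scheme just explained (point the tracked wand where you want to go, push the joystick forward or back to translate along that direction, left or right to rotate) could look roughly like the sketch below; the speeds and names are illustrative assumptions, not the real application's code.

```python
# Sketch of wand-based navigation: translate along the wand's pointing
# direction with the joystick's forward/back axis, rotate with its left/right
# axis. Speeds and function names are illustrative assumptions.
import numpy as np

MOVE_SPEED_MPS = 2.0   # illustrative translation speed
TURN_SPEED_RPS = 1.0   # illustrative rotation speed


def navigate(position, heading, wand_direction, joystick_y, joystick_x, dt):
    """Return the viewer's new position and heading after one frame."""
    direction = np.asarray(wand_direction, float)
    direction = direction / np.linalg.norm(direction)
    new_position = (np.asarray(position, float)
                    + joystick_y * MOVE_SPEED_MPS * dt * direction)
    new_heading = heading + joystick_x * TURN_SPEED_RPS * dt
    return new_position, new_heading


if __name__ == "__main__":
    pos, hdg = navigate(position=[0.0, 0.0, 0.0], heading=0.0,
                        wand_direction=[0.0, 0.0, -1.0],
                        joystick_y=1.0, joystick_x=0.0, dt=0.016)
    print("position:", pos, "heading:", hdg)
```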
Check this out.
00:03:37
You might want to back out so you can see, fly around the station.
00:03:38
I'm going to throw up.
00:03:41
All right. Here we go.
00:03:42
Rookie driver.
00:03:45
Yeah.
00:03:46
Here, take the wheel.
00:03:47
Here, your glasses back.
00:03:48
Thanks.
00:03:49
I'll take these.
00:03:50
So what are some of the other uses for this technology?
00:03:51
Another use that we're just starting to work on is to develop a simulation
00:03:54
to evaluate community noise of jets and aircraft flying near airports
00:03:58
to look at how we can quiet the aircraft
00:04:04
and be less intrusive to the neighbors around the airport.
00:04:08
And then finally, I guess, these types of cave environments are used
00:04:12
by the automotive industry to lay out the interior cockpit of the car.
00:04:16
So in a virtual environment, they'll look at, like, where the mirror is,
00:04:20
where the console is, anything where human factors are involved,
00:04:24
and you can put it in actual size and look at it in the correct perspective
00:04:28
before you build hardware prototypes, which are fairly expensive.
00:04:33
This was a lot of fun.
00:04:37
This was really something else, and thanks a lot for everything.
00:04:38
Yeah, no problem.
00:04:40
One more question?
00:04:41
Sure.
00:04:42
Can I keep the glasses?
00:04:43
Yeah, everybody wants the glasses.
00:04:44
They are very exciting.
00:04:45
Check these out, man.
00:04:46
- Language(s):
- Educational levels:
- Intermediate level
- Author(s):
- NASA LaRC Office of Education
- Uploaded by:
- EducaMadrid
- License:
- Attribution - NonCommercial - NoDerivatives
- Views:
- 422
- Date:
- May 28, 2007 - 17:04
- Visibility:
- Public
- Related link:
- NASA's Center for Distance Learning
- Duration:
- 04′ 48″
- Aspect ratio:
- 4:3. Until 2009 this was the standard used in PAL television; many computer monitors and TV sets use this standard, erroneously called square when it is in fact rectangular (wide).
- Resolution:
- 480x360 pixels
- Size:
- 27.91 MB