VPL will be a collaborative web-based virtual learning environment for first-year university physics.
The key elements of the VPL system are:
- Interactive simulations of physics experiments.
The simulations will be delivered in a near-photorealistic video-game-like 3D virtual environment (VE).
Physics-based models will drive the simulations, which will cover rigid-body statics and dynamics, heat transfer, optics, and electricity (electrostatics and circuits). The user performs the experiments in the VE, where s/he can pick/release objects, click buttons, turn knobs, push/pull objects, etc. The VE also supports multiple users, who log on through a central server. Users can collaborate in many ways to enhance the learning process and to develop teamwork skills. These include:
- They can collaborate to perform an experiment, where each user performs a certain task.
- One user can show another user how to perform a certain experiment or task.
- They can talk to each other using instant text messages.
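The rigid-body dynamics that drive such simulations can be illustrated with a minimal time-stepping sketch. The `RigidBody` class, the `step` function, and all parameters below are illustrative assumptions, not part of the VPL code; a real simulation would also handle rotation, collisions, and user interaction.

```python
from dataclasses import dataclass

@dataclass
class RigidBody:
    """Minimal point-mass stand-in for a rigid body (rotation omitted)."""
    mass: float      # kg
    position: float  # m, vertical position
    velocity: float  # m/s

GRAVITY = -9.81  # m/s^2

def step(body: RigidBody, dt: float) -> None:
    """Advance the body one time step with semi-implicit Euler integration."""
    acceleration = GRAVITY      # only gravity acts in this sketch
    body.velocity += acceleration * dt
    body.position += body.velocity * dt

# Drop a 1 kg mass from 10 m and simulate one second at 60 Hz.
ball = RigidBody(mass=1.0, position=10.0, velocity=0.0)
for _ in range(60):
    step(ball, 1.0 / 60.0)
```

Semi-implicit Euler (update velocity first, then position) is a common choice for game-style physics because it stays stable at the fixed frame rates a VE runs at.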
- Multimedia instruction.
The multimedia lectures will include synchronized speech, written text, sounds, movies and 2D/3D animated graphical illustrations.
They will also include exercise questions, including multiple-choice, fill-in-the-blanks, and drag-and-drop items.
The student can work the exercises on his/her own, or can ask for help and be guided step by step through the solution.
The instruction is delivered using a hierarchical tree-outline that is always visible so that the student can keep track of his/her own progress.
The instruction is systematically delivered by playing the tree-outline in a linear fashion.
The student can also access the material in a non-linear fashion: at any time s/he can click on a different section of the outline and be taken there.
- Intelligent near-photorealistic animated humanoids.
The virtual humans can act as tutors to deliver instruction material or as lab assistants to show the student how to operate the virtual experiment equipment.
The humanoids' lips are synchronized with the speech, and their gestures and motions are synchronized with the instruction material.
For example, they can walk up to a physics experiment and point at or manipulate objects.
The student can ask the virtual tutor questions in natural language (by speech or by typing).
The tutor then searches the lecture knowledge base and returns the most appropriate answer.
If the student asks for more information, the answer with the next-highest score is returned.
If no more information is available, the tutor offers to do a web search.
The student can also give the lab assistant natural-language commands to perform tasks in the VE, such as turning knobs on/off, showing or setting up a physics experiment, etc.
Minimum hardware: Pentium-4 2 GHz or Pentium-M 1.5 GHz (or an equivalent AMD processor); 512 MB RAM;
a DVD drive or 4 GB of free hard-disk space; and an Nvidia GeForce (or equivalent) graphics card.
Our software runs on most PC laptops up to 1-2 years old and most PC desktops up to 2-3 years old.
Software: Windows Vista/XP/2000 and Microsoft Internet Explorer 6 or 7.
Under development, funded by an SBIR grant.