Overview
System Description

The experiences occur in an examination room. The system is composed of two computers, four cameras for tracking the student's posture, a data projector, a wireless microphone, and Dragon NaturallySpeaking 9 for speech recognition. The total hardware cost is less than $8000, and the use of commodity components makes widespread adoption a realistic goal.

The user's posture is approximated by tracking IR retro-reflective tape on a ball cap, chair, and finger. Students can use the hand tracking to localize the VP's pain with simple pointing gestures. A wireless microphone captures audio input; speech recognition correctly matches roughly 70% of utterances.
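The source describes the tracking hardware but not how a pointing gesture is resolved to a location on the virtual patient. A minimal sketch of one plausible approach is below: cast a ray from the tracked ball cap through the tracked fingertip, intersect it with the projection wall, and pick the nearest labeled body region. The region coordinates, function names, and geometry are assumptions, not the system's actual implementation.

```python
import math

# Hypothetical anatomical landmarks on the projection wall, in meters
# (x: left-right, y: height on the wall); positions are illustrative only.
BODY_REGIONS = {
    "head": (0.00, 1.65),
    "chest": (0.00, 1.30),
    "upper_abdomen": (0.00, 1.10),
    "lower_abdomen": (0.00, 0.95),
    "right_lower_quadrant": (-0.15, 0.95),
}

WALL_Z = 0.0  # treat the projection wall as the plane z = 0

def ray_to_wall(head_xyz, finger_xyz, wall_z=WALL_Z):
    """Cast a ray from the tracked ball cap through the tracked fingertip
    and intersect it with the projection wall plane."""
    hx, hy, hz = head_xyz
    fx, fy, fz = finger_xyz
    t = (wall_z - hz) / (fz - hz)  # parametric distance along the ray
    return (hx + t * (fx - hx), hy + t * (fy - hy))

def localize_pain(head_xyz, finger_xyz, regions=BODY_REGIONS):
    """Return the labeled body region closest to where the student points."""
    px, py = ray_to_wall(head_xyz, finger_xyz)
    return min(regions,
               key=lambda name: math.hypot(px - regions[name][0],
                                           py - regions[name][1]))

# Example: student ~2 m from the wall, pointing at the VP's right lower quadrant
print(localize_pain(head_xyz=(0.3, 1.7, 2.0), finger_xyz=(0.2, 1.51, 1.5)))
```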
Experience

The VHs' gestures and audio responses were drafted by medical teaching faculty. Users knock on the exam room door, enter, and see VHs projected life-size on the exam room wall. The user converses with the VP for ten minutes. Several scenarios have been created, including acute abdominal pain, breast mass, and blurred vision.

Evaluation

Since August 2004, studies have been conducted to develop, evaluate, and validate the system. Medical, nursing, and physician assistant students (n > 150) have participated. The system is also being installed and tested at the Medical College of Georgia. Videos of interactions are available upon request.
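The source notes that the VP's audio responses are faculty-authored and that speech recognition matches roughly 70% of utterances, but it does not describe how a recognized utterance is mapped to a scripted response. The sketch below shows one plausible approach, keyword overlap against authored question/response pairs; the script content, threshold, and function names are hypothetical.

```python
# Hypothetical faculty-authored script: trigger phrases -> VP response lines.
SCRIPT = {
    "where does it hurt": "It hurts right here, in my lower right side.",
    "when did the pain start": "It started early this morning.",
    "have you had any nausea or vomiting": "I felt nauseated but haven't vomited.",
    "are you taking any medications": "Just some ibuprofen, but it didn't help.",
}

FALLBACK = "I'm sorry, could you ask me that another way?"

def _keywords(text):
    return set(text.lower().replace("?", "").split())

def select_response(recognized_text, script=SCRIPT, min_overlap=2):
    """Pick the scripted response whose trigger phrase shares the most words
    with the recognized utterance; fall back if nothing matches well
    (e.g. on the ~30% of utterances the recognizer gets wrong)."""
    asked = _keywords(recognized_text)
    best_trigger, best_score = None, 0
    for trigger in script:
        score = len(asked & _keywords(trigger))
        if score > best_score:
            best_trigger, best_score = trigger, score
    if best_trigger is None or best_score < min_overlap:
        return FALLBACK
    return script[best_trigger]

print(select_response("when did this pain start"))   # scripted answer
print(select_response("tell me about your family"))  # fallback
```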
What we think we know about virtual humans:

- Interaction with a virtual patient is validated. Expert observer ratings of virtual patient interactions are correlated (r = 0.49) with ratings of standardized patient interactions (see the correlation sketch after this list).
- Conversation content with a virtual patient is similar to that with a standardized patient, even if the interaction may be more robotic.
- Although the interaction with a VP is not identical to that with an SP, some educational objectives can be achieved. High-level concepts, e.g. empathy, require further research.
- The patient-doctor interaction is a usable platform for studying virtual humans, despite current technology compromises.
- Natural interaction with virtual humans is important for teaching communication skills.
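The r = 0.49 figure above refers to a Pearson correlation between expert observer ratings of the same students interacting with a virtual patient and with a standardized patient. As a small illustration of that kind of analysis (not the study's actual data or code), a minimal sketch follows; the rating values are placeholders for the example.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two paired rating lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Placeholder expert ratings of the same students (1-5 scale), for illustration
# only; these are not the study's data.
vp_ratings = [3.0, 4.0, 2.5, 4.5, 3.5, 3.0, 4.0, 2.0]
sp_ratings = [3.5, 4.0, 3.0, 4.0, 3.0, 4.0, 4.5, 3.0]

print(f"r = {pearson_r(vp_ratings, sp_ratings):.2f}")
```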
Ongoing Research Directions

Research and user studies are currently underway to:

- Validate the use of virtual humans to teach communication and interpersonal skills
- Characterize a virtual human interaction
- Expose students to abnormal findings, e.g. neurological, psychomotor, and emotional
- Elicit real-world biases (e.g. racial/ethnic, gender, age, intelligence, weight, and accent)
- Analyze and visualize the interaction with a virtual human to enable student self-reflection (see the sketch after this list)
- Visualize, categorize, and evaluate the signals from a virtual human-human interaction
- Use highly interactive virtual environments that allow real tools to perform virtual exams
- Evaluate the effect of mixed reality interaction on presence and co-presence
- Measure the impact of system performance on perception, presence, and co-presence
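The source does not specify which signals would be analyzed for student self-reflection. As one small illustration of the "analyze and visualize" item above, the sketch below computes two simple metrics, student talk-time share and question count, from a hypothetical timestamped transcript; the log format and field names are assumptions.

```python
# Hypothetical interaction log: (speaker, start_sec, end_sec, utterance)
LOG = [
    ("student", 0.0, 4.0, "Hello, I'm a second-year medical student."),
    ("vp",      4.5, 7.0, "Hi, nice to meet you."),
    ("student", 8.0, 11.0, "What brings you in today?"),
    ("vp",      11.5, 18.0, "I've had sharp pain in my side since this morning."),
    ("student", 19.0, 22.0, "Can you point to where it hurts?"),
]

def talk_time_share(log, speaker="student"):
    """Fraction of total speaking time used by one speaker."""
    total = sum(end - start for _, start, end, _ in log)
    spoke = sum(end - start for who, start, end, _ in log if who == speaker)
    return spoke / total if total else 0.0

def question_count(log, speaker="student"):
    """Number of the speaker's utterances phrased as questions."""
    return sum(1 for who, _, _, text in log
               if who == speaker and text.rstrip().endswith("?"))

print(f"Student talk-time share: {talk_time_share(LOG):.0%}")
print(f"Student questions asked: {question_count(LOG)}")
```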