All Files in ‘COSC477 (2022-S2)’ Merged

01. Introduction

Course Administration

Course objectives:

Topics:

Staff:

Labs:

Assessment:

Research project:

Mixed Reality

A continuum:

In terms of interactions:

Virtual Reality

VR:

Defining characteristics of VR:

AIP Cube (Zeltzer, 1992)

Three axes:

VR is at the extreme end of all three axes of the AIP cube.

Very hyped in 1980s/1990s:

Keys to success:

Current state of senses:

Simulator sickness:

Factors negatively influencing VR:

Delay/latency is one of the main contributing factors to simulator sickness. The system must complete several tasks in series, which can lead to noticeably high latency:
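Because the stages run in series, the end-to-end ("motion-to-photon") latency is simply the sum of the per-stage latencies. A minimal sketch; the stage names and millisecond values below are illustrative assumptions, not figures from the lecture:

```python
# Hypothetical per-stage latencies (ms) for a VR pipeline.
STAGES = {
    "tracking": 5.0,      # read head pose from sensors
    "application": 8.0,   # game logic / scene update
    "rendering": 11.0,    # GPU draws the frame
    "display": 11.0,      # scan-out to the panel
}

def motion_to_photon(stages):
    """Serial pipeline: total latency is the sum of every stage."""
    return sum(stages.values())

total = motion_to_photon(STAGES)
print(f"motion-to-photon latency: {total:.1f} ms")
```

Under these made-up numbers the total is 35 ms; shaving any single stage reduces the whole pipeline, which is why techniques like late-stage reprojection attack the final stages specifically.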

VR Output

Sound:

Smell:

Touch:

VR Interaction

Interaction with VR:

Basic VR interaction tasks:

The “optimal” interface depends on:

Augmented Reality

Azuma (1997):

AR feedback loop:

Requirements:

History:

Display types:

Tracking:

Registration:

Tracking requirements:

Tracking technologies:

Evolution of AR interfaces (more expressive/intuitive going down):

02. Developing Augmented Reality Experiences

Adrian Clark, senior lecturer, School of Product Design.

Introduction to Unity

Unity: ‘real time development platform’. Not just for games.

Unity is so big, no one knows the full extent of what it can do.

Resources:

Unity:

Editor:

Shader rendering mode:

Scripting:

// Behavior script is another component that attaches to a GameObject

// https://docs.unity3d.com/Manual/ExecutionOrder.html

using UnityEngine;

// A massive number of lifecycle callbacks
public class NodeBehaviorScript : MonoBehaviour {
  // Called before first frame update
  void Start() {
    Debug.Log("Instantiated");
  }

  // Called every frame
  void Update() {
    if (Input.GetKey(KeyCode.UpArrow)) {
      // transform: transform of the object the script is attached to
      // localPosition: position relative to parent
      transform.localPosition += new Vector3(0, 0, 0.1f);
    }
  }

  // Requires a collider on the game object,
  // e.g. box collider: invisible box (hopefully) around the object
  void OnMouseDown() {
  }

  // Use a collider that is larger than the object: when two objects
  // come close together you can add custom behavior (e.g. 'picking up'
  // the object in AR)
  void OnCollisionEnter(Collision collision) {
  }
}

Unity Remote:

AR

Many different SDKs available. Some deciding factors:

Unity also has AR Foundation: a common interface to platform-specific AR frameworks. There is no way of running it in the editor, which makes development very frustrating (although there are some rumblings of a Unity Remote-like app which does on-device processing).

This course will use Vuforia:

Vuforia AR camera:

Targets:

UI:

Prefabs:

Building for Mobile:

Adrian:

03. Developing Virtual Reality Experiences

Dr Tham Piumsomboon, School of Product Design.

Current VR Development Tools

2016: rise of consumer HMDs. Oculus, HTC Vive.

XR Fragmentation: different vendors all had their own proprietary APIs (e.g. Steam VR, Hololens, Oculus, HTC Vive, Magic Leap).

The Khronos Group (which created OpenGL, Vulkan, etc.) developed the OpenXR standard: a cross-platform API supported by many hardware vendors.

Toolkits:

Game Engines:

Developing VR Experiences

Immersion:

Models of immersion:

Unity:

Window -> Analysis -> Profiler

Game engines components:

Game loop:

The subsystems will often update at different rates (e.g. NPC behavior may update at ~1 FPS, physics engine at 120 FPS, renderer at 60 FPS).
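One common way to let subsystems tick at different fixed rates inside a single loop is a per-subsystem accumulator. A sketch under the example rates from the notes (AI ~1 Hz, physics 120 Hz, renderer 60 Hz); the outer-loop rate of 240 Hz and the use of exact fractions (instead of measured frame times) are illustrative simplifications:

```python
from fractions import Fraction

def run(sim_seconds):
    rates = {"ai": 1, "physics": 120, "render": 60}        # updates per second
    steps = {name: Fraction(1, hz) for name, hz in rates.items()}
    acc = {name: Fraction(0) for name in rates}
    counts = {name: 0 for name in rates}

    frame_dt = Fraction(1, 240)   # pretend the outer loop spins at 240 Hz
    t = Fraction(0)
    while t < sim_seconds:
        t += frame_dt
        for name in rates:
            acc[name] += frame_dt
            # run the subsystem for as many fixed steps as have accumulated
            while acc[name] >= steps[name]:
                acc[name] -= steps[name]
                counts[name] += 1   # stand-in for "update the subsystem"
    return counts

print(run(2))  # -> {'ai': 2, 'physics': 240, 'render': 120}
```

A real engine would accumulate measured wall-clock frame times rather than exact fractions, but the structure is the same: each subsystem consumes its accumulator in fixed-size steps.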

Camera placements and control mappings:

ProBuilder package: allows you to create new primitive shapes.

VR Interaction Design

Even if the graphics are good, you need to be able to interact with the environment.

Seven principles of fundamental design (Norman):

VR: mappings come from games, not reality (e.g. using scissors: click a button to use them, rather than picking them up with your fingers). Chairs: you cannot sit on VR chairs in real life.

VR affordances:

User groups:

Whole user needs:

Summary:

04. AR Tracking, Calibration and Registration

Optical tracking:

Trackable managers:

Computer vision: detecting objects and tracking their movement in 6 degrees of freedom

Vision is inferential: context, prior knowledge, etc. are required to come up with a reasonable interpretation of the scene, as an infinite number of 3D scenes can produce the same image.

3D information recovery:

TODO

The human visual system is really good and is often taken for granted; replicating this with a computer is very difficult.

Cognitive processing of color is dependent on context: neighboring colors, not absolute values. Hence, using a color mask for filtering is likely to fail unless you have control over lighting.
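A toy illustration of why absolute-value color masks are brittle (not from the lecture; the threshold values and the 0.6 dimming factor are made-up): the same physical marker yields different RGB values under different lighting, so a fixed threshold that works under one light fails under another.

```python
# A fixed RGB threshold mask for a "red" marker. The mask tests
# absolute channel values, while human color perception is relative
# to context -- hence the failure mode shown below.
def red_mask(pixel, lo=150, hi=255):
    r, g, b = pixel
    return lo <= r <= hi and g < 100 and b < 100

bright_red = (200, 40, 30)                          # marker under studio light
dim_red = tuple(int(c * 0.6) for c in bright_red)   # same marker, dimmer room

print(red_mask(bright_red))  # True  -- detected
print(red_mask(dim_red))     # False -- (120, 24, 18) falls below the threshold
```

This is why color-based filtering is only reliable when you control the lighting, or when you normalize for illumination first (e.g. thresholding hue rather than raw RGB).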

Low-level image processing:

TODO

Recognition:

TODO:

Perfect 3D point cloud -> 3D model is very difficult

Modelling the natural world: extremely difficult as there is variation. Manufacturing produces many copies of a single product, but nature does not.

Vision systems:

COLOR:

TODO

Fiducial:

05. Mixed Reality Displays

Rob Lindeman: Professor & Director of HIT Lab.

Displays

Definitions

Virtual Reality

Rob first defined VR as:

Fooling the senses into believing they are experiencing something they are not actually experiencing

Lindeman, 1999 (PhD)

Today, he has a new definition:

Fooling the brain into believing it is experiencing something it is not actually experiencing

Mixed Reality

Mixing (not overlaying) of real-world (RW) and computer-generated (CG) stimuli.

This requires matching attributes such as:

Milgram’s Reality-Virtuality continuum: different displays influence the quality of the experience.

General Display Types

NB: humans are animals, and as such, evolutionary pressures have guided the development of our senses. Displays that leverage the different strengths and weaknesses are more likely to be effective.

Senses:

Display anchoring:

Visual display types:

Mixing Reality

Visual

NB: we don’t need to simulate reality, just need to make it good enough to make the brain believe it is physically correct.

Direct:

                                    Human
Real-world ----> Environment ----> sensory ----> Nerves ----> Brain
 signal                           subsystem

                  Display?         Retina        Optic    Direct cranial
                                                 nerve     stimulation

Captured/mediated

Real-world ----> Environment ----> Capture device ----> Post-processing ----> Captured signal

Audio

Real-world ----> Environment ----> Outer ear ----> Middle ear ---> Inner ear ----> Nerves ----> Brain

Mic-through AR:

Hear-through AR:

Visual Mixing

Projection:

Optical-see-through AR:

Optical-see-through Projective AR:

Video-see-through AR:

Visual Cues

Do we need stereo, which is one of the major things added by VR compared to traditional displays?

Monoscopic cues:

Stereoscopic cues:

Motion depth:

Physiological cues:

Masking/Occlusion

Making a physical object block a virtual one.

Real-world Problems with Immersion

Dynamic immersion:

Visuals & Sound

Visuals and sound are non-intrusive senses; touch etc. requires something on or in your body.

Final Thoughts

06. Interaction in VR

Rob Lindeman, Director of HITLab NZ.

User interaction:

The state of VR:

Motivation for studying VR interaction:

Existing input methods:

Classification Schemes

Relative vs absolute movement:

Integrated vs separable degrees of freedom:

Analog vs digital:

Isometric vs isotonic:

Rate control vs position control:

Special-purpose vs general-purpose:

Direct vs indirect:

3D Input Devices:

3D Spatial Input Devices:

Motion-Capture/Tracking Systems:

Other Input Devices:

Special Purpose Input Devices:

Interaction in VR

Mapping Devices to Actions:

VR interaction:

Main interaction tasks (Bowman et al.):

Objects:

Object selection in the real world:

Selection-task decomposition:

Reaching objects:

Manipulation:

Design Guidelines:

Research papers:

The ‘optimal’ interface depends on:

07. Interaction in AR

Stephan Lukosch

AR Interface Foundations:

Designing an AR system = designing an interface which satisfies the user and allows real-time interaction

Interacting with AR content:

Evolution of AR interfaces

Expressiveness and intuitiveness have increased over time:

Designing AR Systems

Basic design guidelines:

Affordances

Objects are purposely built: they include affordances and make them obvious.

Affordance: an attribute of an object that allows people to know how to use it

Physical affordances:

Interfaces:

Augmented reality:

Case Studies

Navigating a spatial interface:

Workspace awareness in collaborative AR:

Depth perception in AR:

3D AR lens:

Magic book:

Interaction Design

The process of:

Solving the right problem:

Double diamond of design:

Involving users:

Interaction design:

Practical issues:

What are the users’ needs?

Alternative generation:

Choosing between alternatives:

Prototyping:

Typical development:

Low fidelity prototypes:

High fidelity prototypes:

08. Collaboration in Mixed Reality

Tuckman’s model of group formation:

  1. Forming
  2. Storming
  3. Norming
  4. Performing
  5. Adjourning

Drexler’s team performance model:

  1. Orientation: why am I here?
  2. Trust building: who are you?
  3. Goal clarification: what are we doing?
  4. Commitment: how are we doing it?
  5. Implementation: who does what, when, where?
  6. High performance
  7. Renewal

Collaboration

Definitions

Designing collaboration

Collaboration is affected by internal and external factors:

Collaboration outcomes:

e.g. using AR to help people understand impacts of climate change.

Collaboration Challenges

Piirainen et al., 2012 - group perspective:

Nunamaker et al. 1997 - process perspective:

Haake et al., 2010 and Olson and Olson, 2000 - tool perspective:

Collaboration Design from a Tool Perspective

Time-space matrix of Computer-Supported Cooperative Work (CSCW):

In AR:

3C model:

Human-Computer-Human Interaction Design

Oregon Software Development Process (OSDP) (Lukosch, 2007):

Workspace Awareness in Collaborative AR

Types:

Awareness categories and elements:

Workspace awareness:

Case Studies

Workspace awareness in collaborative AR:

A collaborative game to study presence and situational awareness in a physical and an augmented reality environment:

CSI The Hague:

Burkhardt et al., 2009: seven dimensions of collaboration:

09. Creating Multiple-Sensory VR Experiences

Part 1: Yuanjie Wu

Yuanjie Wu, post-doc researcher at HIT Lab.

(currently in Auckland).

Senses

Creating a realistic experience means providing a multi-sensory experience and creating a sense of presence.

A VR system can be modeled as a loop of:

What is ‘input’ and ‘output’ depends on the point of view: the system or the human.

Subjective reality: the way an individual experiences and perceives the external world in their own mind.

Brains consciously and sub-consciously find patterns. The sub-conscious can be thought of as a filter that only allows information that does not conform to the patterns to pass through.

Perceptual illusions provide insight into some of the shortcuts the brain makes:

Mental models: NLP (neuro-linguistic programming)

VR research problems:

Multi-sensory VR systems

Subject wearing HMD in a cage:

Avatar system:

Realism:

Part 2: Rory Clifford

Dr. Rory Clifford, post-doc research fellow at HIT Lab.

Focus on training simulations, cultural preservation.

What creates a profound VR experience?

In the first 30 seconds, you must:

Sound:

Movement:

Smell:

Vibro-tactile feedback:

Haptics:

Fire Emergency NZ (FENZ):

Modeling the real world:

10. Human Perception and Presence in MR

Rob Lindeman, Director HITLab NZ.

In popular media:

Terms:

‘Being there’

What does it mean to ‘be here’?

What does it mean to be together?

How can we re-create these using technology?

In a real environment, we can use:

For a remote physical environment:

In virtual environments:

In described environments:

Game Design

What makes a good game?

A great game is a series of interesting and meaningful choices made by the player in pursuit of a clear and compelling goal

  • Sid Meier

‘Natural Funativity’:

Game structure:

Flow

Mihály Csíkszentmihályi, Flow: The Psychology of Optimal Experience (1990):
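The flow-channel idea can be sketched as a tiny classifier: flow occurs when challenge roughly matches skill, too much challenge yields anxiety, too little yields boredom. The band width of 0.2 is an arbitrary illustrative parameter, not a value from Csíkszentmihályi:

```python
# Classify a (challenge, skill) pair against the flow channel.
def flow_state(challenge, skill, band=0.2):
    if challenge > skill * (1 + band):
        return "anxiety"   # overwhelmed: challenge far exceeds skill
    if challenge < skill * (1 - band):
        return "boredom"   # under-stimulated: skill far exceeds challenge
    return "flow"          # challenge and skill are roughly matched

print(flow_state(challenge=5, skill=5))   # flow
print(flow_state(challenge=9, skill=3))   # anxiety
print(flow_state(challenge=1, skill=8))   # boredom
```

Good games keep the player inside the channel by ramping challenge as skill grows, which connects directly to the convexity of game play discussed below.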

Convexity of game play:

Flow and convexity can be combined:

flOw

Characterizing Flow:

Immersion

Immersion:

Haptic ChairIO (Feng et al., 2016):

Natural interaction:

Personal experiences:

Presence

Types of presence:

Measuring presence:

The Real World

The real world is great:

Hence, it is useful to use existing things from the real world: this makes AR easier than VR in terms of fidelity.

But beyond perceptual, there is:

We can tap into experiences already anchored in the mind of the user: provide the essence and let the brain fill in the details, or plant new experiences: seeds that can grow and become scaffolding for future experiences.

To do this:

The myth of technical immersion:

Impossible spaces:

11. Data Visualization in Mixed Reality

Master in Human Interface Technology (MHIT)

HITLab NZ:

MHIT:

Data Visualization in Mixed Reality

Immersive analytics (Immersive Analytics, Springer, 2018):

Very dependent on availability of immersive technologies:

Immersive analytics allows engagement:

Opportunities:

Possible Values of 3D for Data Visualizations

Additional visual channel (3rd spatial dimension) for data visualization:

Immersive display technologies have advanced considerably: higher resolution, lower latency, wider range of interaction technologies

Immersive workspaces:

Depth Cues and Display Technology

Limitations of depth perception:

Comparing 2D with 3D Representations - Potential Benefits of Immersive Visualization

Cone Trees:

Data mountains:

Aviation:

3D shapes/landscapes:

Network visualization:

Multivariate data visualization:

Spatial and spatio-temporal data visualization:

Overall:

Summary:

Data Visualization in AR - Situated Analytics

Conceptual model:

Physically vs perceptually-situated visualizations:

Embedded vs non-embedded visualizations:

Interaction:

12. Evaluating Immersive Experiences

Can simply ask the player for their opinion, but these statements are qualitative.

Through validated instruments that use questionnaires, you can get quantitative data (e.g. on situational awareness, workload).

There are many methods to achieve this:

Engagement

What is engagement?

Elements of Flow (Csikszentmihalyi):

O’Brien & Toms:

Situational Awareness

AR promises the ability to provide additional information to the environment you are in.

Many jobs require high situational awareness to make effective and timely decisions.

Situational awareness (Endsley, 1995):

Assessing situational awareness:

Metrics

NASA Task Load Index (TLX):
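The standard NASA-TLX weighted score combines six subscale ratings (0-100) using weights from the 15 pairwise comparisons, where each dimension's weight is the number of times it was chosen (weights sum to 15). A sketch; the ratings and weights below are made-up example data:

```python
# Weighted NASA-TLX: sum(rating * weight) / 15, giving a 0-100 score.
def tlx_score(ratings, weights):
    assert sum(weights.values()) == 15, "pairwise weights must sum to 15"
    return sum(ratings[d] * weights[d] for d in ratings) / 15

ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 35}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}

print(round(tlx_score(ratings, weights), 1))  # -> 55.3
```

The unweighted "Raw TLX" variant, which simply averages the six ratings, is also widely used when the pairwise-comparison step is too burdensome.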

Simulation workload measure (SIM-TLX):

System Usability Scale (SUS):
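SUS has a fixed scoring rule: ten items on a 1-5 scale, odd items contribute (response - 1), even items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. A sketch with made-up example responses:

```python
# Standard SUS scoring for a single respondent's 10 answers (1-5 each).
def sus_score(responses):
    assert len(responses) == 10, "SUS has exactly 10 items"
    total = 0
    for i, r in enumerate(responses, start=1):
        # odd items are positively worded, even items negatively worded
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```

Scores are commonly interpreted against a benchmark of roughly 68 as the average across published studies.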

Game Experience Questionnaire (GEQ; the Game Engagement Questionnaire below shares this acronym):

Igroup Presence Questionnaire (IPQ):

Immersive Tendency Questionnaire (ITQ):

User Experience Questionnaire (UEQ):

Game Engagement Questionnaire (GEQ):

Revised Game Engagement Model (R-GEM):

Case Studies

AR game to assess upper extremity motor dysfunctions:

Human augmentation for distributed situational awareness:

Aerial wildfire firefighting training:

Superhuman Sports

Designing MR games that motivate and engage users in physical activity.

You can: