At least 2 semesters of coursework, or the equivalent, using a
Audience (long version)
Pedagogical Footnote: Design of and Audience for this Tutorial Project
This course is a Computer Science course, not an Industrial Design or
Industrial Art course of the kind appropriate for individuals seeking an MFA in digital product art. Of course,
real-world interactive 3D applications -- especially entertainment
products, though less so data visualization applications -- require team members
from dozens of professions. For instance, Dr. Wartell's first
post-baccalaureate job was at the NCR Human Interface Technology Center [MacTavish97], whose members included computer
scientists, industrial artists, industrial designers, cognitive
scientists, psychologists, electrical engineers, and industrial
engineers. He was part of a team of 3D graphics programmers hired by Dr.
Mark Tarlton and Dr. Nong Tarlton to develop the
Tarltons' cross-platform, object-oriented real-time rendering engine [Tarlton92].
For this tutorial, a CS student's goal is to gain
familiarity with the Unity Editor's capabilities as a 3D modeling
tool, but not to become an expert at the level of a digital production
artist, architect, or industrial designer. These professionals generally use more sophisticated 3D modeling tools as well,
such as Blender.
To avoid the cost of additional textbooks (or on-line certificate courses), this tutorial project interjects custom exercises into readings from free, advertisement-free (i.e. distraction-free) on-line manuals. In Unity, the on-line manuals are split between the Unity Manual (https://docs.unity3d.com/Manual/UnityManual.html) and the Scripting API (https://docs.unity3d.com/ScriptReference/index.html). The Unity Manual focuses on Unity's Editor (e.g. its 3D modeling tool) with only cursory coverage of the Unity scene graph and renderer API that makes the entire 3D simulation system work. The documentation on the latter, the Scripting API, tends to assume either:
the reader is already a competent graphics programmer familiar with either a low-level API like OpenGL or some other object-oriented scene graph API, or
the reader has already mastered the Unity Editor and its conceptual introduction to the classes and algorithms implemented by the Unity scene graph and rendering API.
This tutorial project interweaves readings from both the Unity Manual and the Unity Scripting API. The goal is to start CS students on the path toward mastering the 3D graphics and geometric programming necessary to implement 3D user interfaces, and the related 3D interaction, on VR/AR/MR or WIMP systems.
Guide To Reading These Instructions
Spring 2020 Students: Items 1-3 are
unchanged from previous assignments, but Item 4 is new.
Screen Shot Figures: Most screen captures in this document
can be zoomed to full size by clicking on the image. Click
anywhere outside the zoomed image to return the image to its original
size. For example, click on the image below:
your_user_id – this indicates
you should input a specific text string. The specifics will be
indicated in the instructions.
– this indicates you should input a specific text string. It is
assumed the reader can interpret what to type based on the context
and their general computer science knowledge.
[…additional output will
appear…] – this is a comment describing output from a command.
Shell Code Examples: Shell code instructions appear in black
boxes. For lengthy code examples there is an interactive scroll bar on
the right. Additionally, on the right there is a button, ▽;
clicking the button expands the box to display its entire
contents without scrolling. An example is below:
lucretius@CCIWD-435B-1 ~/ITCS_4120 $ ls -la
total 20
drwxr-xr-x 1 lucretius 197609 0 Sep 23 21:57 ./
drwxr-xr-x 1 lucretius 197609 0 Sep 23 21:37 ../
-rw-r--r-- 1 lucretius 197609 0 Sep 23 21:55 'The Hitchhickers Guide to the Galaxy.azw'
-rw-r--r-- 1 lucretius 197609 0 Sep 23 21:36 'On the nature of things - Lucretius.pdf'
-rw-r--r-- 1 lucretius 197609 0 Sep 23 21:56 'Ethica - Spinzoa.pdf'
-rw-r--r-- 1 lucretius 197609 0 Sep 23 21:57 'Moses the Egyptian: The Memory of Egypt in
-rw-r--r-- 1 lucretius 197609 0 Sep 23 21:37 Notes.docx
[…misc additional output…]
Application Context: Written
instructions are often annotated with a bracket notation:
1 ) [Application/Program Name] Do xyz.
The Application/Program Name is the name of the application
or program in which the instructions are to be carried out. If a
sequence of listed instructions are all performed within the same
application, then the bracket notation will show:
2) [ ^ ] Do pdq.
The up arrow ( ^ ) indicates instruction #2 is performed within the
same application as the previous numbered instruction.
Git Commit Protocol and Rubric
In this document's instructions, you are required to
make git commits at specific points using specific commit
messages. The git commits are a required part of the
assignment; this protocol also introduces you to good git practices.
For a given exercise that includes specific git
commit instructions, if there is not at least 1 git commit message
associated with that exercise, 1 point will be subtracted for "poor
software development practices" for that exercise.
Your commit message need not be a verbatim copy of the commit message
given in the exercise's instructions, but it should be very
similar. (Since you can simply cut & paste
the messages, they might as well be verbatim.)
The grader will allow for a few of the required commit messages to be
missing without incurring a penalty.
Finally, extra commits are perfectly fine and common practice, for example
for fixing other types of coding mistakes, etc.
For exercises that require you to modify code already
given to you, if only the original example code is committed with no
change & there is no evidence from the student (documentation,
notes, etc) of trying to solve the exercise, then minimal points will
be awarded for that exercise.
Git Repo Setup
As you read through various sections of the U3DDocs you will perform
various exercises. You must submit them to various directories
inside a single Git repo.
~/ITCS_VRAR/ $ git clone … Unity_XR_Tutorial_1-git-start
Cloning into 'Unity_XR_Tutorial_1-git-start'...
warning: You appear to have cloned an empty repository.
In the Unity_XR_Tutorial_1-git-start directory (yes, for now, use this unusual name) create
a text file called README.md.
Add your name on the first line of the file.
Git commit the file with the commit message “initial README.md”
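The repo-setup steps above can be sketched in the shell as follows. Note this is a sketch only: the course clone URL is omitted in the instructions, so `git init` stands in for the clone here, and "Student Name" and the identity settings are placeholders.

```shell
# Sketch of the setup steps above ('git init' substitutes for the
# course-specific 'git clone', whose URL is not shown here).
mkdir -p Unity_XR_Tutorial_1-git-start
cd Unity_XR_Tutorial_1-git-start
git init -q
git config user.email "student@example.com"   # placeholder identity
git config user.name  "Student Name"          # placeholder identity
echo "Student Name" > README.md               # your name on the first line
git add README.md
git commit -q -m "initial README.md"
git log --oneline                             # shows the single initial commit
```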
Time: "[not calculated yet] minutes to read;[not calculated yet] minutes to do"
The instructions below walk the reader through a subset of the Unity Manual and the Scripting API and interject multiple required exercises. Most of the instruction steps are labeled as "Chapter : Section : Subsection", indicating the related sections found in the Unity documentation.
When successive steps refer to the same online Chapter, the short-hand notation "[ ^ ]" is used to indicate that the reading in that step is part of the same chapter (or section) as in the previous step.
Rename both Cube objects to "Left Controller" and "Right Controller".
Change all scale factors of each cube object to 0.1
From the GameObject menu create four empty objects (menu
item Create Empty) with
the following names: Platform, Tracker, Left Sensor, and Right Sensor.
Change the parent-child relationships of objects in order to
create the following transform hierarchy:
Using the Inspector
window, set the transform positions as follows:
Platform position (0,0,0)
Tracker position (0,2,0)
Left Sensor position (-0.25,0,0)
Right Sensor position (+0.25,0,0)
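For reference, the same hierarchy could also be built in script. The sketch below is illustrative only: the parent-child structure shown (Sensors under Tracker, Tracker under Platform) is an assumption inferred from the names and positions above, since the hierarchy figure is not reproduced here.

```csharp
using UnityEngine;

// Sketch: building the Platform hierarchy in code. The parent-child
// structure is an assumption; in the exercise you create it in the Editor.
public class BuildPlatform : MonoBehaviour
{
    void Start()
    {
        var platform = new GameObject("Platform");
        var tracker  = new GameObject("Tracker");
        var left     = new GameObject("Left Sensor");
        var right    = new GameObject("Right Sensor");

        // SetParent re-parents a Transform within the scene graph.
        tracker.transform.SetParent(platform.transform);
        left.transform.SetParent(tracker.transform);
        right.transform.SetParent(tracker.transform);

        // Positions from the steps above (localPosition is relative to parent).
        platform.transform.localPosition = new Vector3(0f, 0f, 0f);
        tracker.transform.localPosition  = new Vector3(0f, 2f, 0f);
        left.transform.localPosition     = new Vector3(-0.25f, 0f, 0f);
        right.transform.localPosition    = new Vector3(0.25f, 0f, 0f);
    }
}
```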
From the Project window,
right-click Assets. In
the pop-up menu select Create.
In the cascading pop-up menu select Material.
In the Project window's
Assets sub-window a new material will appear.
Name the material "Left".
Left-click the "Left" material.
The Inspector window now
shows the "Left" material. In the Inspector,
left-click the Color Dialog Button (see image
below). This creates a color selector
dialog. Pick a shade of blue for this material.
Repeat the above two steps to create a second material
called "Right". Pick a shade of red for this material.
Read the first paragraph of https://docs.unity3d.com/Manual/Materials.html
regarding how to apply a material to a GameObject.
Then apply each of the materials to its corresponding
GameObject (i.e. "Left" to "Left Controller" and "Right" to "Right Controller").
git commit with message "-created basic view coordinate
Note: The next steps return to the chapter and section "Creating Gameplay : GameObjects".
[ ^ ] : Creating Gameplay : GameObjects : [...] - Return to this sub-section (where we left off in step 6). Read all the remaining sub-sub-sections ("Creating components with scripting" through "Saving your
work"). After reading, perform the following exercises.
Exercise: Debug Print the Hierarchy - I
[Unity Editor] Add a tag to each
object in the "Platform" hierarchy. Use the
tag name "ViewPlatform"
[U3DDocs] Read in depth the U3DDocs Scripting API page on the C# class Transform.
[Visual Studio] Modify MainScript.cs to do the following for each of Platform's descendent GameObjects:
Debug.Log the position and orientation in local coordinates. For orientation include both the quaternion and Euler angles.
Debug.Log the position and orientation in world coordinates. For orientation include both the quaternion and Euler angles.
[Git] git commit with message "-added further debug log of pose"
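A hedged sketch of what the traversal above might look like is shown below. The name LogHierarchy matches the method name referenced later in these instructions; the exact format of the logged pose, and the use of GameObject.Find, are illustrative choices, not requirements.

```csharp
using UnityEngine;

// Sketch: recursively log local and world pose (position plus quaternion
// and Euler-angle orientation) for every descendant of "Platform".
public class MainScript : MonoBehaviour
{
    void Start()
    {
        GameObject platform = GameObject.Find("Platform");
        if (platform != null)
            LogHierarchy(platform.transform, 0);
    }

    void LogHierarchy(Transform t, int depth)
    {
        string pad = new string(' ', depth * 2);
        // Local-coordinate pose (relative to parent).
        Debug.Log(pad + t.name + " local pos=" + t.localPosition
                  + " rot(q)=" + t.localRotation
                  + " rot(euler)=" + t.localEulerAngles);
        // World-coordinate pose.
        Debug.Log(pad + t.name + " world pos=" + t.position
                  + " rot(q)=" + t.rotation
                  + " rot(euler)=" + t.eulerAngles);
        foreach (Transform child in t)          // Transform enumerates children
            LogHierarchy(child, depth + 1);
    }
}
```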
[ ^ ] : Creating Gameplay : Prefabs. Unity Prefabs are Unity's implementation of scene graph instancing. The steps below walk you through reading the sub-sections of the "Prefabs" section, with several exercises.
Pedagogical Footnote: Scene Graph Instancing
Scene graphs are called scene graphs and not scene trees because the scene's data structure as a whole is not a tree but rather a directed acyclic graph (DAG). In a DAG, a geometric object, such as a wheel, can be instantiated multiple times by making the same programming-language geometry object a child of multiple transform nodes in the scene graph. A single wheel geometry object would thus be a child of four different transform nodes, where each transform node has its own location and orientation ("pose"). The rendering engine will draw 4 copies of the wheel at the four locations corresponding to the four wheels of the car object. DAG-based instancing has been ubiquitous in scene graph APIs at least since the 80's (if not earlier).
First, make sure to select "Pivot" and "Local" for Tool Handle Rotation options.
Create a GameObject that is a simple 3D model of a wheel with a tire (a sphere) and hubcap (a cylinder). Assume Unity's units of distance are in meters. Hint: To help manually align the parts, first set their World CS positions to (0,0,0) and then manually adjust the scale factors and positions to achieve a reasonable alignment. Finally, create an Empty parent object to contain both of them. Your result should be something similar to the following:
As discussed in "Creating Prefab Assets"
create a Prefab Asset from the Wheel GameObject.
As discussed in "Creating Prefab instances", create 4 instances of this Prefab and adjust their transforms so they appear roughly like the image below (hubcaps should point outward):
Using a Cube, add a "Body" to the Car
Save all changes and git commit with -m "-create car"
Exercise: Unity scene graph
[Unity Editor] Create a new C# script "Car.cs"
[ ^ ] Associate the script with the Car GameObject
[Visual Studio] Copy, paste, and modify your MainScript.cs code into Car.cs. Car.cs should recursively traverse the Car object and print (via Debug.Log) all the descendent GameObjects. Simplify Car.cs using the fact that the MonoBehaviour instance in a script automatically has access to all the parts of the GameObject that the script is associated with. In other words, it is not actually necessary to have the statement "car = GameObject.Find("Car 1");". The MonoBehaviour code can directly access all parts of its GameObject (https://docs.unity3d.com/ScriptReference/GameObject.html). So, having modified the original MainScript.cs code into Car.cs, the statement "LogHierarchy(this.transform, 0);" can directly access the transform of the Car GameObject.
[Unity Editor] Run the simulation (press "Play") and observe how the local coordinate transforms of an instantiated object such as the HubCap
are the same for all instances, while the world coordinate transform differs for each of the 4 HubCaps.
[Git] git commit with -m "-added Car.cs with debugging code"
Note: The next few steps require reading the U3DDoc sub-sections slightly out of order.
Observe that Unity's Camera class properties and methods can roughly be divided into two groups: those that define and compute the standard view frustum geometry (as reviewed in course lectures) and those that manage the rendering pipeline. From a scene graph standpoint this is a design choice. Some scene graph APIs put both groups of functionality in their Camera class; others put only the view-frustum-related parts in the Camera class and delegate the rendering control functionality to another class. (For the curious student, an example of the latter is the OpenSceneGraph API's Camera and Viewer.)
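To make the two groups concrete, here is a minimal sketch that logs a few members of each. The property names are real Unity Camera members; the grouping into "frustum geometry" versus "rendering pipeline" follows the discussion above and is one reasonable partition, not an official one.

```csharp
using UnityEngine;

// Sketch: inspect a few Camera properties from each of the two groups.
// Attach to any GameObject in a scene that has a main Camera.
public class CameraFrustumInfo : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        if (cam == null) return;

        // View frustum geometry group:
        Debug.Log("fieldOfView (vertical, degrees): " + cam.fieldOfView);
        Debug.Log("aspect: " + cam.aspect);
        Debug.Log("nearClipPlane: " + cam.nearClipPlane);
        Debug.Log("farClipPlane: " + cam.farClipPlane);
        Debug.Log("projectionMatrix:\n" + cam.projectionMatrix);

        // Rendering pipeline group (a few examples):
        Debug.Log("clearFlags: " + cam.clearFlags);
        Debug.Log("cullingMask: " + cam.cullingMask);
        Debug.Log("targetTexture: " + cam.targetTexture);
    }
}
```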
In your questions file answer the following questions:
Question "Familiar Parts": Make a list called "Familiar Parts" and list those view frustum related parts of Unity.Camera that directly correspond to prior courses you have taken (including material reviewed in this course lecture).
Question "Unfamiliar Parts": Make a list called "Unfamiliar Parts" and list those view frustum related parts of Unity.Camera that seem less familiar to you based on prior courses, etc. Relying only on what you already know, for each of these unfamiliar parts make an educated guess about how it might relate to items on your "Familiar Parts" list.
git commit with message "-answered Unity.Camera questions"
Note: Now skip sub-section [ ^ ] : Creating Gameplay : Adding Random Gameplay Elements. (This section basically reviews standard Computer Science 101 material in the context of Unity.)
Having done some actual coding :) , go back to the previously skipped section "Asset workflow"
[Unity Editor] Create an Empty object called "JigglySphere" and a C# script "JigglySphere.cs" associated with that GameObject.
Set the JigglySphere position to (0,0,0).
In the Inspector window add 2 components: a Mesh Filter and a Mesh Renderer
[Visual Studio] Create code to generate and render a mesh that displays just a set of Points (see MeshTopology) that lie on a sphere. Use the code example in https://docs.unity3d.com/ScriptReference/Mesh.html as a starting point. Further, use the following equation to generate the vertices on a unit sphere:
[Git] Git commit with message "-sphere of points"
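A hedged starting-point sketch for the point-sphere exercise is below. Note: the assignment's specific vertex equation is given in the figure above and is not reproduced here; this sketch assumes a standard latitude/longitude parameterization, so adapt the vertex loop to the equation you were actually given.

```csharp
using UnityEngine;

// Sketch: a unit sphere rendered as point primitives. The vertex formula
// here (latitude/longitude) is an assumption standing in for the
// assignment's own equation.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class JigglySphere : MonoBehaviour
{
    const int Rings = 16, Segments = 32;

    void Start()
    {
        var vertices = new Vector3[Rings * Segments];
        var indices  = new int[Rings * Segments];
        int i = 0;
        for (int r = 0; r < Rings; r++)
        {
            float theta = Mathf.PI * (r + 0.5f) / Rings;      // polar angle
            for (int s = 0; s < Segments; s++)
            {
                float phi = 2f * Mathf.PI * s / Segments;     // azimuth
                vertices[i] = new Vector3(
                    Mathf.Sin(theta) * Mathf.Cos(phi),
                    Mathf.Cos(theta),
                    Mathf.Sin(theta) * Mathf.Sin(phi));
                indices[i] = i;                               // one index per point
                i++;
            }
        }
        var mesh = new Mesh();
        mesh.vertices = vertices;
        // MeshTopology.Points renders each indexed vertex as a point primitive.
        mesh.SetIndices(indices, MeshTopology.Points, 0);
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```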
[Visual Studio] Now make the sphere into a surface rendered as a sphere of triangles.
This requires changing the indices and the MeshTopology.
Draw a picture to help yourself determine how to set up the indices!
Add normals to the vertices (calculating the normal at a point on a sphere is quite easy).
[Git] Git commit with message "-sphere surface with triangles"
[Visual Studio] Modify the "Update" method of JigglySphere to make each vertex 'jiggle' continuously (as an animation) in some visually discernible fashion. Feel free to be creative in your geometric computations.
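One possible jiggle, sketched below, displaces each vertex radially with a phase-offset sine wave. This is purely illustrative; the assignment leaves the choice of animation to you, and in your submission this logic would live in JigglySphere.cs's Update method rather than a separate component.

```csharp
using UnityEngine;

// Hypothetical 'jiggle' (one of many possible choices): push each vertex
// in and out along its radial direction. Assumes the mesh was generated on
// a unit sphere, so a vertex's radial direction is its normalized position.
public class JiggleAnimation : MonoBehaviour
{
    void Update()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] v = mesh.vertices;
        for (int i = 0; i < v.Length; i++)
        {
            Vector3 dir = v[i].normalized;
            // Per-vertex phase offset (+ i) makes the surface ripple.
            float radius = 1f + 0.05f * Mathf.Sin(Time.time * 4f + i);
            v[i] = dir * radius;
        }
        mesh.vertices = v;          // re-upload the modified vertices
        mesh.RecalculateBounds();   // keep culling bounds in sync
    }
}
```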
Oculus XR Plugin: The course grader will grade your
tutorial using an Oculus Rift S. Therefore, you
must next install the Oculus XR Plugin. In the Project window right-click Packages. In the pop-up
menu select View in Package Manager. In the Package
Manager window find and select the Oculus XR
Plugin. Click the Install button
to add the package.
Shaking Hands with a Star Bot
ZJW: I decided this level of procedural geometry generation was too much for this project
In this part you will develop a VR 3D User Interface that is bi-manual
(i.e. 6DOF controllers for left and right hand) and utilizes
head-tracking for both tracked viewing and as part of the 3D user
interface and 3D interaction. The tutorial is designed to be
implementable on either an HMD or a "home-made" fish-tank VR [Arthur93]
setup. As of 2/24/2020, Dr. Wartell is evaluating solutions that
will allow you to set up a home-made fish-tank VR environment using
minimal hardware such as a webcam or a LeapMotion device, which the
VisCenter can lend out to individual students.
A Procedurally Generated Star Bot
Procedurally Generated Trees (from link)
Star Bot Concept Sketch
In Section 6.1, in order to demonstrate basic knowledge of Unity's
implementation of the common scene graph paradigm, you will create a procedurally generated robot/creature, nicknamed
a "star bot". (Later, in Section 6.2, you will add the VR 3D
user interface to interact with the Star Bot.)
Unity C# Scripting Basics
ZJW TODO: Tutorial Reading on C# script integration into Unity
Generating a Star Bot
The most common example of procedurally generated computer graphics is the
3D tree. The star bot is essentially a variation on this idea.
ZJW TODO: Discuss requirements for number of limbs, branching levels
and branching depth and hierarchical coordinate systems
XR 3D User Interface
xR 3D UI Basics
Shaking Hands, Poking and Prodding, etc.
See the course syllabus regarding partial credit and the late penalty
This is an individual student project. Each student should
be writing and submitting their own code.
Students can discuss general aspects of the various APIs and tools
with each other, and this is strongly encouraged. Discussing
algorithms at the pseudo-code level is also acceptable.
However, it is not permissible:
to copy code from other students
to copy code implementing entire functions or entire algorithms from
Internet resources other than the resources explicitly referenced in
this tutorial
to copy code from other students from prior semesters
to translate an algorithm found on the Internet implemented in
programming language X and re-write it in language Y.
If you have questions about whether using a particular resource is
acceptable or not, email the TA and/or Professor.
T. J. MacTavish & R. L. Henneman,
"The NCR Human Interface Technology Center", CHI '97 Extended Abstracts on Human Factors in Computing Systems,
83–84 (1997) [DOI]
M. A. Tarlton & P. N. Tarlton,
"A framework for dynamic visual applications", I3D '92: Proceedings of the 1992 Symposium on Interactive 3D Graphics,
161–164 (1992) [DOI]
M. A. Tarlton, P. N. Tarlton, E. J. Lee & Z. Wartell,
"Objects, Modeling and Media: A Framework for Interactive 3D Applications", AT&T Middleware Day and Software Symposium (1995)
This section is experimental and/or incomplete and should not be visible to students.
This approach requires you have a webcam mounted on top of your monitor.
Install Vuforia Unity XR Package
From the Unity Editor, in the Project
window right-click Packages.
In the pop-up menu select View in Package
Manager. In the Package Manager window find and select
the Vuforia Engine AR Plugin. Click the Install
button to add the package.
Mount it on a cardboard box that you can stand upright on your
desk. For example:
Figure: Dr. Wartell's home office desk :)
With your webcam in a single fixed position, use your webcam
software to take at least 25 pictures of the box at a variety of
positions. Each picture must show the entire sheet of
paper. Half the images should be with the box rotated on
its side. You might try standing the box on a book or
two to raise its position within the camera's view, to increase the
variety of angles from which the photos are taken.
[GML] Create a New Project. Enter 1 for the number of
templates. For the grid size enter 7 x 8 and 21.5 mm.
[GML] Load all ~25 images into GML by pressing the Green + button.
[GML] Press the Detect All
button. Each image should then appear like this:
Figure: Dr. Wartell's home office desk :)
[GML] Press the Calibrate
button (camera icon). GML will calculate the camera's
intrinsic parameters from these images.
In your ITCS_x165 directory (but not in
your project directory for this tutorial), git clone