Unity XR Tutorial 1:
Unity Scene Graph API and Editor


Spring 2020

Due: Wednesday April 1, 2020 (11:59PM).

Author: Zachary Wartell
https://webpages.uncc.edu/~zwartell/

Updated:  $REVISION$

Change Log:  [link]



Creative Commons License
"Unity XR Tutorial 1" by Zachary Justin Wartell is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Git Repo at https://gitlab.com/zwartell/Unity_XR_Tutorial_1



Compatibility

This page has been tested on Firefox and Chrome. 

Time Estimate:

I estimate this tutorial will take roughly 30 hours over the three-week assignment period.

Objectives

  1. Read and learn the basics of the Unity Editor from https://docs.unity3d.com/Manual/ (abbreviated U3DDocs)
  2. Familiarize oneself with the larger corpus of material at U3DDocs for future reference
  3. Learn to configure Unity with Microsoft Visual Studio (MSVS) and the Unity XR Management plug-in architecture
  4. Learn the basics of using Unity's C# scene graph API.
  5. Apply knowledge of 3D transformations and geometric computation to procedurally generate an object.
  6. Develop a VR (or AR) 3D user interface to interact with the objects, applying knowledge of 3D transformations and basic 3D computations such as collision detection and ray casting, etc.

Prerequisites

  1. Project Pre-req's:
  2. General Pre-req's:
    1. CS MS Program:  https://cci.uncc.edu/academics/computer-science/masters-program/ms-admissions
    2. ITCS 6125/8125:

Audience (long version)

Pedagogical Footnote: Design of and Audience for this Tutorial Project

This course is a Computer Science course, not an Industrial Design or Industrial Art course that would be appropriate for individuals seeking an MFA in digital product art.  Of course, real world interactive 3D applications -- especially entertainment products, but less so for data visualization applications -- require team members from dozens of professions.  For instance, Dr. Wartell's first post-baccalaureate job was at the NCR Human Interface Technology Center [MacTavish97], whose members included computer scientists, industrial artists, industrial designers, cognitive scientists, psychologists, electrical engineers, and industrial engineers. He was part of a team of 3D graphics programmers hired by Drs. Mark and Nong Tarlton to develop the Tarltons' cross-platform, object-oriented real-time rendering engine, "Mirage" [Tarlton92][Tarlton95].

For this tutorial, a CS student's goal is to become familiar with the Unity Editor's capabilities as a 3D modeling tool, but not to be an expert at the level of a digital production artist, architect or industrial designer. These professionals generally use more sophisticated 3D modeling tools such as Blender, AutoCAD, etc.

To avoid the cost of additional textbooks (or on-line certificate courses), this tutorial project interjects custom exercises into readings from free, advertisement-free (i.e. distraction-free) on-line manuals. For Unity, the on-line manuals are split between the Unity Manual (https://docs.unity3d.com/Manual/UnityManual.html) and the Scripting API (https://docs.unity3d.com/ScriptReference/index.html). The Unity Manual focuses on Unity's Editor (e.g. its 3D modeling tool) with only cursory coverage of the Unity scene graph and renderer API that makes the entire 3D simulation system work. The documentation on the latter, the Scripting API, tends to assume more background knowledge.

This tutorial project interweaves readings from both the Unity Manual and the Unity Scripting API. The goal is to start CS students on the path toward mastering the 3D graphics and geometric programming necessary to implement 3D user interfaces on VR/AR/MR or WIMP systems.

Guide To Reading These Instructions


Spring 2020 Students: Items 1-3 are unchanged from previous assignments, but Item 4 is new.

  1. Screen Shot Figures: Most screen captures in this document can be zoomed to full size by clicking on the image.   Click anywhere outside the zoomed image to return the image to its original size. For example, click on the image below:
    Screen_Capture
  2. Fonts:
    • your_user_id – this indicates you should input a specific text string. The specifics will be indicated in the instructions.
    • path_to_a_directory_where_you_can_save_your_work – this indicates you should input a specific text string. It is assumed the reader can interpret what to type based on the context and their general computer science knowledge.
    • […additional output will appear…] – this is a comment describing output from a command-line command.
  3. Shell Code Examples: Shell code instructions appear in black boxes. For lengthy code examples there is an interactive scroll bar on the right. Additionally, on the right there is a button,  ▽  ; clicking the button will expand the box to display its entire contents without scrolling. An example is below:
    lucretius@CCIWD-435B-1 ~/ITCS_4120
    $ ls -la
    total 20
    drwxr-xr-x 1 lucretius 197609 0 Sep 23 21:57 ./
    drwxr-xr-x 1 lucretius 197609 0 Sep 23 21:37 ../
    -rw-r--r-- 1 lucretius 197609 0 Sep 23 21:55 'The Hitchhikers Guide to the Galaxy.azw'
    -rw-r--r-- 1 lucretius 197609 0 Sep 23 21:36 'On the nature of things - Lucretius.pdf'
    -rw-r--r-- 1 lucretius 197609 0 Sep 23 21:56 'Ethica - Spinoza.pdf'
    -rw-r--r-- 1 lucretius 197609 0 Sep 23 21:57 'Moses the Egyptian: The Memory of Egypt in Western Monotheism.azw'
    -rw-r--r-- 1 lucretius 197609 0 Sep 23 21:37 Notes.docx
    […misc additional output will appear…]

  4. Application Context:   Written instructions are often annotated with a bracket notation:

    1 ) [Application/Program Name] Do xyz.

    The Application/Program Name is the name of the application or program in which the instructions are to be carried out.  If a sequence of listed instructions are all performed within the same application, then the bracket notation will show:

    2)  [ ^ ]   Do pdq.

    The up arrow ( ^ ) indicates instruction #2 is performed within the same application as the previous numbered instruction.

Git Commit Protocol and Rubric

In this document's instructions, you are required to make git commits at specific points using specific commit messages.   The git commits are a required part of the assignment.   They also introduce you to good git practices.

  1. For a given exercise that includes specific git commit instructions, if there is not at least 1 git commit message associated with that exercise, 1 point will be subtracted for "poor software development practices" for that exercise.   

    Your commit message need not be a verbatim copy of the commit message given in the exercise’s instructions, but it should be very similar.    (Since you can simply cut & paste the messages, they really should be verbatim.)

    The grader will allow for a few of the required commit messages to be missing without incurring a penalty.

    Note: If you accidentally commit with a missing or bad message, you can correct the message using the git commit --amend option (see https://git-scm.com/book/en/v2/Git-Basics-Undoing-Things)

    Finally, extra commits are perfectly fine and common practice, e.g. for fixing other types of coding mistakes, etc.

  2. For exercises that require you to modify code already given to you, if only the original example code is committed with no change and there is no evidence from the student (documentation, notes, etc.) of trying to solve the exercise, then minimal points will be awarded for that exercise.

Git Repo Setup

As you read through various sections of the U3DDocs you will perform various exercises.  You must submit them to various directories inside a single Git repo.

  1. Create a Git repository:

    https://cci-git.uncc.edu/
    your_userid/Unity_XR_Tutorial_1

    Note: Make sure the repository is Private and give Reporter access to both Dr. Wartell and the TA.
  2. If you have not already, create a subdirectory for your work in this class:
    lucretius@CCIWD-435B-1 ~/
    $ mkdir ITCS_VRAR

    lucretius@CCIWD-435B-1 ~/
    $ cd ITCS_VRAR

  3. Clone the repo.
    lucretius@CCIWD-435B-1 ~/ITCS_VRAR/
    $git clone https://cci-git.uncc.edu/your_user_id/Unity_XR_Tutorial_1.git Unity_XR_Tutorial_1-git-start
    Cloning into 'Unity_XR_Tutorial_1-git-start'...
    warning: You appear to have cloned an empty repository.

  4.  Inside the Unity_XR_Tutorial_1-git-start directory (yes, for now use this unusual name) create a text file called README.md.  Add your name on the first line of the file.  

    Git commit the file with the commit message “initial README.md”

  5. Copy the file from https://cci-git.uncc.edu/UNCC_Graphics/gitignore/blob/master/Unity_VisualStudio.gitignore
    to a local file .gitignore in Unity_XR_Tutorial_1.

    Git add & commit the file.

  6. Add either a plain text file or a MSWord file called Questions.txt or Questions.docx. Put your name at the head of the file.

    Git add & commit the file.

Setting up Unity with Visual Studio and XR Management Package

Install Unity Hub and Unity Editor

You will need a computer that has the Unity Hub and Unity Editor installed.  Unity provides a free Personal Edition for students that you can access through an on-line signup mechanism.  U3DDocs describes the install and setup process at https://docs.unity3d.com/Manual/GettingStartedInstallingUnity.html.

Computers in the VisCenter and in the CCI Open Lab have Unity Hub and Unity Editor installed.

Unity, MSVS and XR Management Package

Time:  [not calculated yet] minutes to read; [not calculated yet] minutes to do
    1. [U3DDocs] - https://docs.unity3d.com/Manual/ - Familiarize yourself with the outline of this manual for later reference.
    2. Watch the following video that shows how to create a Unity project, connect it with MSVS, and install key XR Management packages.

      Note, the text below the video reviews the key steps and contains hyperlinks to referenced materials.


      Key Steps in Video:
      1. [Unity Hub]:     Create Unity 3D Project
      2. [Unity Editor]:  Brief overview of Editor and Scene tabbed window
      3. [         ^         ]:  Create a 3D Plane object and create an empty C# Script
      4. [         ^         ]:  Configure and build MSVS solution
      5. [Visual Studio]:   Brief overview of the created Visual Studio project
      6. [bash]:  As always follow good git commit practices throughout all steps :)

      Mac Users: Video Modifications: The "Preferences" item is under the "Unity" menu, and there is no "Copy PDB files" option (it is a Windows-only option, https://docs.unity3d.com/Manual/BuildSettingsStandalone.html). Mac users should check this link for guidance: https://docs.microsoft.com/en-us/visualstudio/mac/setup-vsmac-tools-unity?view=vsmac-2019

      Reference Material:
        1. Unity Build Settings and MSVS - The video covers the basics, which should be sufficient knowledge for now.  For a more in-depth understanding of the steps in the video, see the following as needed:
        1. https://docs.unity3d.com/Manual/BuildSettings.html
        2. https://docs.unity3d.com/Manual/comp-ManagerGroup.html
        3. https://docs.unity3d.com/Manual/VisualStudioIntegration.html
        4. https://docs.unity3d.com/Manual/ManagedCodeDebugging.html
      2. Unity Packages - https://docs.unity3d.com/Manual/PackagesList.html - Note this chapter for later reference on the Unity Package mechanism.  The video covers installing this tutorial's required packages, which is sufficient detail for now.

  1. [bash]:  As always follow good git commit practices throughout all steps :)

Part I: Unity Editor and Unity Scene Graph

Time:  "[not calculated yet] minutes to read;[not calculated yet] minutes to do"

The instructions below walk the reader through a subset of the Unity Manual and the Scripting API and interject multiple required exercises. Most of the instruction steps are labeled as "Chapter : Section : Subsection", indicating the related sections found in the Unity documentation. When successive steps refer to the same online Chapter, the short-hand notation "[ ^ ]" is used to indicate that the reading in that step is part of the same chapter (or section) as in the previous step.

  1. "Unity User Manual (2019.3)" - (https://docs.unity3d.com/Manual/UnityManual.html) Read this chapter
  2. Note:

  3. Working in Unity : Unity's Interface - (https://docs.unity3d.com/Manual/UsingTheEditor.html) Read the entire section and explore the Unity Editor while reading.

    Note:  The steps below walk through specific sections of chapter Creating Game Play (https://docs.unity3d.com/Manual/CreatingGameplay.html).

  4. [ ^ ] : Creating Gameplay : Asset Workflow : [...] - For now, read only through the first two sub-sub-sections, skipping the rest of "Asset Workflow". More specifically read:
    1. https://docs.unity3d.com/Manual/AssetWorkflow.html
    2. https://docs.unity3d.com/Manual/AssetTypes.html
    3. https://docs.unity3d.com/Manual/AssetPackages.html

  5. [ ^ ] : Creating Gameplay : Scenes - Read all

  6. [ ^ ] : Creating Gameplay : GameObjects : [...] : - Read only sub-sub-sections from "GameObject" through "Transform".

    Note: The next steps skip to a different sub-section to focus in greater detail on Unity transforms to prepare for a programming exercise.

  7. [ ^ ] : Creating Gameplay : Transform - (https://docs.unity3d.com/Manual/Transforms.html)
    1. Read the above section carefully
    2. Exercise:
      1. Create a second "Cube" object
      2. Rename both Cube objects to names "Left Controller" and "Right Controller"
      3. Change all scale factors of each cube object to 0.1
      4. From the GameObject menu create four empty objects (menu item Create Empty) with the following names:  Platform, Tracker, Left Sensor, Right Sensor
      5. Change the parent-child relationships of objects in order to create the following transform hierarchy:
        Screen_Capture
      6. Using the Inspector window, set the transform positions as follows:
        1. Platform position (0,0,0)
        2. Tracker position  (0,2,0)
        3. Left Sensor position (-0.25,0,0)
        4. Right Sensor position (+0.25,0,0)
      7. From the Project window, right-click Assets. In the pop-up menu select Create.  In the cascading pop-up menu select Material.
      8. In the Project window's Assets sub-window a new material will appear.  Name the material "Left".
      9. Left-click the "Left" material.
      10. The Inspector window now shows the "Left" material.   In the Inspector, left-click the Color Dialog Button (see image below).   This creates a color selector dialog.   Pick a shade of blue for this material.
        Screen_Capture
      11. Repeat the above two steps to create a second material called "Right".  Pick a shade of red for this material.
      12. Read the first paragraph of https://docs.unity3d.com/Manual/Materials.html regarding how to apply a material to a GameObject.   Then apply each of the materials to their corresponding GameObjects (e.g. "Left Controller" and "Right Controller").
      13. Save All
      14. git commit with message "-created basic view coordinate system hierarchy"

  8. Note: The next steps return to the chapter and section "Creating Gameplay : GameObjects".

  9. [ ^ ] : Creating Gameplay : GameObjects : [...] - Return to this sub-section (where we left off in step 6). Read all the remaining sub-sub-sections ("Creating components with scripting" through "Saving your work"). After reading, perform the following exercises.

    1. Exercise: Debug Print the Hierarchy - I
      1. [Unity Editor]     Add a tag to each object in the "Platform" hierarchy.    Use the tag name "ViewPlatform"
      2. [Visual Studio]   Modify your script "ViewPlatform.cs" as shown in "ViewPlatform.cs" Version #2
      3. [Git] git commit with message "-added debug log of hierarchy"
      4. [Visual Studio] Set a breakpoint on the line of code  if (debugLog). Leave the breakpoint disabled (but do not delete it).
      5. [Unity Editor]  Press the Play button. Wait until the Unity Editor enters Play mode.
      6. [Visual Studio] From the main menu select Debug → Attach to Unity Debugger. In the dialog box, Select Unity Instance, select the first instance listed and select Ok
      7. [ ^ ] Enable the breakpoint. When the debugger breaks at that point, go to the Watch window and manually change this.debugLog to true
      8. [Unity Editor] View the Debug.Log output in the Console tabbed window.
      9. [U3DDocs] Briefly familiarize yourself with just the basics of the U3DDocs Debug.Log documentation.

    2. Exercise: Debug Print the Hierarchy - II
      1. [U3DDocs] Review the U3DDoc's on Rotation and Orientation in Unity. (Note, you should already be familiar with this math from lectures and prior courses).
      2. [U3DDocs] Read in depth U3DDoc's Scripting API on the C# class Transform.
      3. [Visual Studio] Modify the "Main Script.cs" to do the following for each of Platform's descendent GameObjects:
        1. Debug.Log the position and orientation in local coordinates. For orientation include both the quaternion and Euler angles.
        2. Debug.Log the position and orientation in world coordinates. For orientation include both the quaternion and Euler angles.
      4. [Git] git commit with message "-added further debug log of pose"
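    As a sketch (not the required solution) of what the recursive pose logging in this exercise might look like, assuming an illustrative class name PoseLogger and method name LogPose:

```csharp
using UnityEngine;

public class PoseLogger : MonoBehaviour
{
    // Recursively logs each node's pose in local and world coordinates.
    // Orientation is logged both as a quaternion and as Euler angles.
    void LogPose(Transform node, int depth)
    {
        string indent = new string(' ', depth * 2);
        Debug.Log(indent + node.name
            + " local pos=" + node.localPosition
            + " local rot(q)=" + node.localRotation
            + " local rot(euler)=" + node.localEulerAngles);
        Debug.Log(indent + node.name
            + " world pos=" + node.position
            + " world rot(q)=" + node.rotation
            + " world rot(euler)=" + node.eulerAngles);
        foreach (Transform child in node)   // iterates direct children
            LogPose(child, depth + 1);
    }

    void Start()
    {
        LogPose(transform, 0);
    }
}
```

    Attached to the Platform GameObject, this prints the pose of every descendant; your own script's structure and names may differ.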

  10. [ ^ ] : Creating Gameplay : Prefabs   Unity Prefabs are Unity's implementation of scene graph instancing. The steps below walk you through reading the sub-sections of the "Prefabs" section with several exercises.

    Pedagogical Footnote: Scene Graph Instancing

    Scene graphs are called scene graphs and not scene trees because the scene's data structure as a whole is not a tree but rather a directed acyclic graph (DAG). In a DAG, a geometric object, such as a wheel, can be instantiated multiple times by making the same programming-language geometry object a child of multiple transform nodes in the scene graph. So a single wheel geometry object would be a child of four different transform nodes, where each transform node has its own location and orientation ("pose"). The rendering engine will draw 4 copies of the wheel at the four locations corresponding to the four wheels of the car object. DAG-based instancing has been ubiquitous in scene graph APIs at least since the 80's (if not earlier).

    Read only U3DDoc's introduction part to "Prefabs" (https://docs.unity3d.com/Manual/Prefabs.html), then continue with the detailed sub-sub-sections as listed in the following steps.

  11. [ ^ ] : Creating Gameplay : Prefabs : Creating Prefabs (https://docs.unity3d.com/Manual/CreatingPrefabs.html)
    1. Read the above sub-section
    2. Exercise: Re-invent the Wheel
      1. First, make sure to select "Pivot" and "Local" for Tool Handle Rotation options.

      2. Create a GameObject that is a simple 3D model of a wheel with a tire (a sphere) and hub cap (a cylinder). Assume Unity's units of distance are meters. Hint: To help manually align the parts, first set their World CS positions to (0,0,0) and then manually adjust the scale factors and positions to achieve a reasonable alignment. Finally, create an Empty parent object to contain both of them. Your result should be something similar to the following:

      3. As discussed in "Creating Prefab Assets" create a Prefab Asset from the Wheel GameObject.
      4. As discussed in "Creating Prefab instances", create 4 instances of this Prefab and adjust their transforms so they appear roughly like the image below (hub caps should point outward):
      5. Using a Cube, add a "Body" to the Car
      6. Save all changes and git commit with -m "-create car"

  12. Exercise: Unity scene graph
    1. [Unity Editor] Create a new C# script "Car.cs"
    2. [ ^ ] Associate the script with the Car GameObject
    3. [Visual Studio] Copy, paste and modify your MainScript.cs code into Car.cs. Car.cs should recursively traverse the Car object and print (via Debug.Log) all the descendent GameObjects. Simplify Car.cs using the fact that the MonoBehaviour instance in a script automatically has access to all the parts of the GameObject that the script is associated with. In other words, the statement "car = GameObject.Find("Car 1");" is not actually necessary; the MonoBehaviour code can directly access all parts of its GameObject (https://docs.unity3d.com/ScriptReference/GameObject.html). So, having modified the original MainScript.cs code into Car.cs, the statement "LogHierarchy(this.transform, 0);" can directly access the transform of the Car GameObject.
    4. [Unity Editor] Run the simulation (Press "Play") and observe how the local coordinate transform of an instantiated object such as the HubCap is the same for all instances, while the world coordinate transform differs for each of the 4 HubCaps.
    5. [Git] git commit with -m "-added Car.cs with debugging code"
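    A minimal sketch of the simplified Car.cs described above (the LogHierarchy body here is illustrative; yours will also log pose information):

```csharp
using UnityEngine;

public class Car : MonoBehaviour
{
    void Start()
    {
        // No GameObject.Find needed: a MonoBehaviour attached to the Car
        // GameObject can reach its own transform directly.
        LogHierarchy(transform, 0);
    }

    // Recursively print the names of all descendant GameObjects.
    void LogHierarchy(Transform node, int depth)
    {
        Debug.Log(new string('-', depth) + node.name);
        foreach (Transform child in node)
            LogHierarchy(child, depth + 1);
    }
}
```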

  13. Note: The next few steps require reading the U3DDoc sub-sections slightly out of order.

  14. [ ^ ] : Creating Gameplay : Prefabs : Nested Prefabs (https://docs.unity3d.com/Manual/NestedPrefabs.html)
    1. Read this section.
    2. Exercise: Two Cars
      1. Make your Car GameObject into a Prefab. It will be a Nested Prefab. Your result should be similar to below:
      2. Associate the Car.cs script with the second car in the scene.
      3. Run the simulation (Press "Play") and observe the LogHierarchy output. You should see two hierarchies printed, one for "Car 1" and one for "Car 2"
      4. Save all changes and git commit with -m "-two cars".

  15. [ ^ ] : Creating Gameplay : Prefabs : [...] - Return to sub-section "Editing a Prefab in Prefab Mode" (https://docs.unity3d.com/Manual/EditingInPrefabMode.html) and read the remaining sub-sub-sections regarding Prefabs.
    1. Exercise: Instantiating Prefabs at Runtime
      1. [Unity Editor] Create a script InstantiateCar.cs
      2. [Visual Studio] Based on https://docs.unity3d.com/Manual/InstantiatingPrefabs.html make the script instantiate 10 cars. Place them around a circle with each car's forward direction being aligned with the normal vector to the circle.
        Your result should be similar to the image below:


      3. [Git] git commit with message "-added 10 cars"
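    One possible sketch of this exercise, assuming the prefab is assigned to an Inspector field and interpreting "aligned with the normal vector to the circle" as facing radially outward (field names and values here are illustrative assumptions):

```csharp
using UnityEngine;

public class InstantiateCar : MonoBehaviour
{
    public GameObject carPrefab;   // assign the Car Prefab in the Inspector
    public int count = 10;
    public float radius = 10f;     // circle radius in meters (assumed)

    void Start()
    {
        for (int i = 0; i < count; i++)
        {
            float angle = i * 2f * Mathf.PI / count;
            // Point on the circle in the XZ (ground) plane.
            Vector3 pos = new Vector3(radius * Mathf.Cos(angle), 0f,
                                      radius * Mathf.Sin(angle));
            // Face outward along the radial direction from the circle center.
            Quaternion rot = Quaternion.LookRotation(pos.normalized, Vector3.up);
            Instantiate(carPrefab, pos, rot);
        }
    }
}
```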
  16. [ ^ ] : Creating Gameplay : Input   (https://docs.unity3d.com/Manual/Input.html) Read this section
    1. Exercise: WIMP Navigation Control
      1. [Visual Studio] Add keyboard navigation to ViewPlatform.cs by re-using the example code at https://docs.unity3d.com/ScriptReference/Input.GetAxis.html. Be sure you are modifying the Platform CS.
      2. [ ^ ] Add support for mouse-look to ViewPlatform.cs as well (see other code examples at above link).
      3. [ ^ ] Modify the mouse-look to only be enabled if the left mouse button is pressed (see https://docs.unity3d.com/ScriptReference/Input.GetKey.html).
      4. [ ^ ] Add the ability to slide left/right with keypress 'Z' and 'C' and move up/down (Y-axis) with keypress 'R' and 'F'.
        Reminder: Be sure all navigation is modifying the Platform CS.
      5. [Git] git commit with message "-added WIMP navigation control to Platform CS"
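    The navigation behavior from the steps above might be sketched as follows (a minimal sketch based on the Input.GetAxis example code; speeds, key mappings shown, and the class layout are assumptions to adapt to your own ViewPlatform.cs):

```csharp
using UnityEngine;

public class ViewPlatform : MonoBehaviour
{
    public float moveSpeed = 3f;    // meters per second (assumed)
    public float lookSpeed = 120f;  // degrees per second (assumed)

    void Update()
    {
        // Keyboard translation of the Platform coordinate system.
        float forward = Input.GetAxis("Vertical") * moveSpeed * Time.deltaTime;
        float strafe = 0f, up = 0f;
        if (Input.GetKey(KeyCode.Z)) strafe = -moveSpeed * Time.deltaTime; // slide left
        if (Input.GetKey(KeyCode.C)) strafe =  moveSpeed * Time.deltaTime; // slide right
        if (Input.GetKey(KeyCode.R)) up =  moveSpeed * Time.deltaTime;     // move up
        if (Input.GetKey(KeyCode.F)) up = -moveSpeed * Time.deltaTime;     // move down
        transform.Translate(strafe, up, forward); // defaults to local space

        // Mouse-look enabled only while the left mouse button is held.
        if (Input.GetMouseButton(0))
        {
            float yaw   =  Input.GetAxis("Mouse X") * lookSpeed * Time.deltaTime;
            float pitch = -Input.GetAxis("Mouse Y") * lookSpeed * Time.deltaTime;
            transform.Rotate(pitch, yaw, 0f);
        }
    }
}
```

    Note this simple incremental Rotate can accumulate roll over time; handling pitch and yaw as separate stored angles is a common refinement.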
  17. Note: Skip "[ ^ ] : Creating Gameplay : Transforms". (You read it earlier).

  18. [ ^ ] : Creating Gameplay : Layers  - Read this section.

    Note: The next two sections are read out-of-order!

  19. [ ^ ] : Creating Gameplay : Light - (https://docs.unity3d.com/Manual/Lights.html). Read this section.
    1. Exercise: Car with Headlight
      1. [Unity Editor] Make a Prefab Variant of your Car called "Car With Headlight"
      2. [ ^ ] Add a single spot light to this Prefab
      3. [ ^ ] Add an instance of this Prefab to the scene.
      4. [Git] git commit with message "-Car with Headlight"

  20. [ ^ ] : Creating Gameplay : Constraints  - (https://docs.unity3d.com/Manual/Constraints.html) Read this section.
    1. Exercise: CirclingCar
      1. [Unity Editor] - Create another instance of the car, calling it "CirclingCar", and create a script CirclingCar.cs associated with that car.
      2. [Visual Studio] - write script code to make the car drive around in a large circle around all the other cars in the scene (see https://docs.unity3d.com/ScriptReference/Time-deltaTime.html for how to use time for animation).
      3. [Git] git commit with message "-CirclingCar"
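    A sketch of one way to drive the car in a circle using Time.deltaTime for frame-rate-independent animation (the radius, speed, and field names are illustrative assumptions):

```csharp
using UnityEngine;

public class CirclingCar : MonoBehaviour
{
    public float radius = 20f;        // circle radius, larger than the parked cars (assumed)
    public float angularSpeed = 0.5f; // radians per second (assumed)
    private float angle;

    void Update()
    {
        // Advance the angular position by elapsed frame time so the motion
        // is independent of frame rate.
        angle += angularSpeed * Time.deltaTime;
        transform.position = new Vector3(radius * Mathf.Cos(angle), 0f,
                                         radius * Mathf.Sin(angle));
        // Face along the circle's tangent (the direction of travel).
        Vector3 tangent = new Vector3(-Mathf.Sin(angle), 0f, Mathf.Cos(angle));
        transform.rotation = Quaternion.LookRotation(tangent, Vector3.up);
    }
}
```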

  21. [ ^ ] : Creating Gameplay : Constraints : Aim Constraints - (https://docs.unity3d.com/Manual/Constraints.html) Read this section.
    1. Exercise: Search Light Tower
      1. [Unity Editor] Create a simple search light tower with a perhaps 2 cylinders and 1 spot light
      2. [Unity Editor] Add an Aim constraint to the appropriate component of the tower so the search light follows the CirclingCar. Mine looks something like this:

      3. [Git] git commit with message "-Search Light Tower Light"
  22. Note: Skip "Rotation and Orientation in Unity " and skip "Lights" (You read them earlier).

  23. [ ^ ] : Creating Gameplay : Cameras - (https://docs.unity3d.com/Manual/Cameras.html). Read this section.
    1. Read: Also, from the above section follow and read the hyperlink: https://docs.unity3d.com/Manual/CamerasOverview.html
    2. Read: Similarly also follow and read the hyperlink: https://docs.unity3d.com/Manual/class-Camera.html
    3. Review: Next review the U3DDoc's Scripting API's full description of C# class Camera:
      https://docs.unity3d.com/ScriptReference/Camera.html

    4. Exercise: Unity.Camera class

      Observe that Unity's Camera class properties and methods can roughly be divided into two groups: those that define and compute the standard view frustum geometry (as reviewed in course lectures) and those that manage the rendering pipeline. From a scene graph standpoint this is a design choice. Some scene graph APIs put both groups of functionality in their Camera class; others only put the view-frustum-related parts in the Camera class and delegate the rendering control functionality to another class. (For the curious student, an example of the latter is the OpenSceneGraph API's Camera and Viewer.)

      In your questions file answer the following questions:

      1. Question "Familiar Parts": Make a list called "Familiar Parts" and list those view frustum related parts of Unity.Camera that directly correspond to prior courses you have taken (including material reviewed in this course lecture).
      2. Question "Unfamiliar Parts": Make a list called "Unfamiliar Parts" and list those view frustum related parts of Unity.Camera that seem less familiar to you based on prior courses, etc. Relying only on what you already know, for each of these unfamiliar parts make an educated guess about how it might relate to items on your "Familiar Parts" list.

      3. git commit with message "-answered Unity.Camera questions"

  24. Note: Now skip sub-section [ ^ ] : Creating Gameplay : Adding Random Gameplay Elements . (This section basically reviews standard Computer Science 101 material in the context of Unity).

    Having done some actual coding :) , go back to the previously skipped section "Asset workflow"

  25. [ ^ ] : Creating Gameplay : Cross-Platform Considerations - (https://docs.unity3d.com/Manual/CrossPlatformConsiderations.html). Read this section

  26. [Scripting API] : Meshes : [...] ( https://docs.unity3d.com/ScriptReference/Mesh.html) Read this section.
    1. If needed for your own review, follow and read the cited links
      1. https://docs.unity3d.com/ScriptReference/Mesh-vertices.html
      2. https://docs.unity3d.com/ScriptReference/Mesh-normals.html
      3. https://docs.unity3d.com/ScriptReference/Vector3.html
    2. Further follow and read the links
      1. https://docs.unity3d.com/ScriptReference/MeshTopology.html
      2. https://docs.unity3d.com/ScriptReference/Mesh.SetNormals.html
      3. https://docs.unity3d.com/ScriptReference/Mesh.SetVertices.html
    3. Exercise: Jiggly Sphere
      1. [Unity Editor] Create an Empty object called "JigglySphere" and a C# script "JigglySphere.cs" associated with that GameObject.
        1. Set the JigglySphere position 0,0,0
        2. In the Inspector window add 2 components: a Mesh Filter and a Mesh Renderer
      2. [Visual Studio] Create code to generate and render a mesh that displays just a set of Points (see MeshTopology) that lie on a sphere. Use the code example in https://docs.unity3d.com/ScriptReference/Mesh.html as a starting point. Further, use the following equations to generate the vertices on a unit sphere:

        x = cos θ sin φ
        y = sin θ sin φ
        z = cos φ

        over θ ∈ [0, 2π], φ ∈ [0, π]

        (https://mathworld.wolfram.com/SphericalCoordinates.html)

        You should see something like this:
      3. [Git] Git commit with message "-sphere of points"
      4. [Visual Studio] Now make the sphere into a surface rendered as a sphere of triangles.
        1. This requires changing the indices and the MeshTopology.
        2. Draw a picture to help yourself determine how to set up the indices!
        3. Add normals to the vertices (calculating the normal at a point on a sphere is quite easy).
      5. [Git] Git commit with message "-sphere surface with triangles"
      6. [Visual Studio] Modify the "Update" method of JigglySphere to make each vertex 'jiggle' continuously (as an animation) in some visually discernible fashion. Feel free to be creative in your geometric computations.
        Here's mine:
      7. [Git] Git commit with message "-jiggly sphere"
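    A sketch of the point-sphere generation and a simple jiggle, using the spherical coordinate equations above (the tessellation counts, wobble amplitude, and frequency are illustrative assumptions, and this shows only the Points stage, not the triangle stage):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class JigglySphere : MonoBehaviour
{
    const int Slices = 24, Stacks = 16;  // tessellation resolution (assumed)
    Mesh mesh;
    Vector3[] baseVertices;

    void Start()
    {
        // Generate vertices on a unit sphere: x = cosθ sinφ, y = sinθ sinφ, z = cosφ.
        var verts = new List<Vector3>();
        for (int i = 0; i <= Stacks; i++)
        {
            float phi = Mathf.PI * i / Stacks;
            for (int j = 0; j <= Slices; j++)
            {
                float theta = 2f * Mathf.PI * j / Slices;
                verts.Add(new Vector3(Mathf.Cos(theta) * Mathf.Sin(phi),
                                      Mathf.Sin(theta) * Mathf.Sin(phi),
                                      Mathf.Cos(phi)));
            }
        }
        baseVertices = verts.ToArray();

        // For MeshTopology.Points each index simply names one vertex.
        var indices = new int[baseVertices.Length];
        for (int k = 0; k < indices.Length; k++) indices[k] = k;

        mesh = new Mesh();
        mesh.SetVertices(verts);
        mesh.SetIndices(indices, MeshTopology.Points, 0);
        mesh.RecalculateBounds();
        GetComponent<MeshFilter>().mesh = mesh;
    }

    void Update()
    {
        // Jiggle: displace each vertex along its sphere normal, which for a
        // unit sphere centered at the origin is the vertex position itself.
        var verts = new Vector3[baseVertices.Length];
        for (int k = 0; k < verts.Length; k++)
        {
            float wobble = 1f + 0.1f * Mathf.Sin(Time.time * 5f + k);
            verts[k] = baseVertices[k] * wobble;
        }
        mesh.vertices = verts;
    }
}
```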
  • https://docs.unity3d.com/Manual/EditorFeatures.html
  • https://docs.unity3d.com/Manual/analysis.html

  • Part II: 3D Interaction and Manipulation

    Time:  "[not calculated yet] minutes to read; [not calculated yet] minutes to do"
    1. Leap Motion Core: Sign up as a LeapMotion developer. Download and install the Leap Motion Core developer package from https://developer.leapmotion.com/unity/
    2. Oculus XR Plugin:  The course grader will grade your tutorial using an Oculus Rift S.  Therefore, you must next install the Oculus XR Plugin.    In the Project window right-click Packages.  In the pop-up menu select View in Package Manager. In the Package Manager window find and select the Oculus XR Plugin.   Click the Install button to add the package.

    Shaking Hands with a Star Bot

    ZJW: I decided this level of procedural geometry generation was too much for this project

    In this part you will develop a VR 3D user interface that is bi-manual (i.e. 6DOF controllers for left and right hand) and utilizes head-tracking both for tracked viewing and as part of the 3D user interface and 3D interaction.   The tutorial is designed to be implementable on either an HMD or a "home made" fish-tank VR [Arthur93] setup.  As of 2/24/2020 Dr. Wartell is evaluating solutions that will allow you to set up a home-made fish-tank VR environment using minimal hardware such as a webcam or a LeapMotion device, which the VisCenter can lend out to individual students.

    A Procedurally Generated Star Bot

    Procedurally Generated Trees (from link)                                     Star Bot Concept Sketch

    In Section 6.1, in order to demonstrate basic knowledge of Unity's implementation of the common scene graph paradigm, you will create a procedurally generated robot/creature, nicknamed a "star bot".   (Later, in Section 6.2, you will add the VR 3D user interface to interact with the Star Bot.)

    Unity C# Scripting Basics

    ZJW TODO: Tutorial Reading on C# script integration into Unity scene

    Generating a Star Bot

    The most common example of procedurally generated computer graphics is the 3D tree.  The star bot is essentially a variation on this idea.

    ZJW TODO: Discuss requirements for number of limbs, branching levels and branching depth and hierarchical coordinate systems

    XR 3D User Interface

    xR 3D UI Basics

    Shaking Hands, Poking and Prodding, etc.
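    One building block such an interface needs is selecting scene objects with a tracked controller. The sketch below assumes the controller's Transform is already being driven by tracking (e.g. by a pose driver component) and uses Unity's Physics.Raycast for picking; the field names and distances are illustrative, not part of the assignment:

```csharp
using UnityEngine;

// Hypothetical sketch of one common XR interaction: ray-casting from a
// tracked 6DOF controller's pose to pick objects that have Colliders.
public class ControllerRayPicker : MonoBehaviour
{
    public Transform controller;   // tracked controller pose, assigned in the Inspector
    public float maxDistance = 10.0f;

    void Update()
    {
        if (controller == null) return;

        // Cast a ray along the controller's forward (+z) axis.
        Ray ray = new Ray(controller.position, controller.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            // hit.collider is the first Collider along the ray; a real 3D UI
            // would highlight it here and test a controller button to grab it.
            Debug.Log("Pointing at: " + hit.collider.gameObject.name +
                      " at distance " + hit.distance);
        }
    }
}
```

    Proximity-based interactions such as "shaking hands" or poking can be sketched the same way with overlap tests (e.g. Physics.OverlapSphere around the hand) instead of a ray.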

    Academic Integrity

    See the course syllabus regarding partial credit and the late penalty policy.  

    This is an individual student project.   Each student should be writing and submitting their own code. 

    Students can discuss general aspects of the various APIs and tools with each other, and this is strongly encouraged.   Discussing algorithms at the pseudo-code level is also acceptable.

    However, the following are not permissible:

    If you have questions about whether using a particular resource is acceptable or not, email the TA and/or Professor.

    References

    1. T. J. MacTavish & R. L. Henneman,
      "The NCR Human Interface Technology Center",
      CHI ’97 Extended Abstracts on Human Factors in Computing Systems , 83–84 (1997)  [DOI]  [URL]

    2. M. A. Tarlton & P. N. Tarlton,
      "A framework for dynamic visual applications",
      I3D '92: Proceedings of the 1992 symposium on Interactive 3D graphics , 161-164 (1992)  [DOI]

    3. M. A. Tarlton, P. N. Tarlton, E. J. Lee & Z. Wartell,
      "Objects, Modeling and Media: A Framework for Interactive 3D Applications",
      AT&T Middleware Day and Software Symposium , (1995)

    4. K. W. Arthur, K. S. Booth & C. Ware,
      "Evaluating 3D Task Performance for Fish Tank Virtual Worlds",
      ACM Transactions on Information Systems, 11(3), 239-265 (1993)

    Appendix I:

    Terms

    Appendix II:

    "ViewPlatform.cs"   Version #2

    	using System.Collections;
    	using System.Collections.Generic;
    	using System;
    	using UnityEngine;
    
    	public class MainScript : MonoBehaviour
    	{
    
    		private GameObject platform;
    		private GameObject tracker;
    		private GameObject leftSensor;
    		private GameObject rightSensor;
    		private GameObject leftController;
    		private GameObject rightController;
    
    		// Set to true to dump the scene hierarchy to the Console each frame.
    		private bool debugLog = false;
    
    		// Start is called before the first frame update
    		void Start()
    		{
    			// GameObject.Find returns null if no active scene object has the
    			// given name, so these names must match the scene hierarchy exactly.
    			tracker = GameObject.Find("Tracker");
    			leftSensor = GameObject.Find("Left Sensor");
    			rightSensor = GameObject.Find("Right Sensor");
    			leftController = GameObject.Find("Left Controller");
    			rightController = GameObject.Find("Right Controller");
    		}
    
    		// Update is called once per frame
    		void Update()
    		{
    			if (debugLog)
    				LogHierarchy(transform, 0);
    		}
    
    		// Recursively log each node's name with its position in its local
    		// coordinate system (LCS) and the world coordinate system (WCS),
    		// indented by its depth in the scene graph.
    		static void LogHierarchy(Transform t, int level)
    		{
    			String indent = "";
    			indent = indent.PadLeft(level * 4, ' ');
    
    			Debug.Log(indent + t.gameObject.name + " LCS: " + t.localPosition + " WCS: " + t.position);
    
    			for (int i = 0; i < t.childCount; i++)
    				LogHierarchy(t.GetChild(i), level + 1);
    		}
    	}
    	

    Appendix III:   Alternative 6DOF Tracked Display plus dual 6DOF Controllers

     

    Fish-Tank VR[Arthur93] plus dual 6DOF Controllers

    Idea #1 -With LeapMotion plus LeapMotion glasses-based head-tracking

    1. Install Leap Motion V2 SDK https://developer-archive.leapmotion.com/v2a
    2. https://github.com/rajarshiroy/CS231A_PROJECT/tree/master/FinalReportCode/MarkerTracking

    Idea #2- With Personal Webcam and ARToolkit

    This section is experimental and/or incomplete and should not be visible to students.

    This approach requires you have a webcam mounted on top of your monitor.

    1. Install Vuforia Unity XR Package
      1. From the Unity Editor, in the Project window right-click Packages.  In the pop-up menu select View in Package Manager. In the Package Manager window find and select the Vuforia Engine AR Plugin.   Click the Install button to add the package.
    2. Calibrate your webcam:
      1. Install GML from http://graphics.cs.msu.ru/en/node/909
      2. Print this checkerboard pattern: images/Checkboard 1.pdf.
      3. Mount it on a cardboard box that you can stand upright on your desk.   For example:
        Figure:  Dr. Wartell's home office desk :)
      4. With your webcam in a single fixed position, use your webcam software to take at least 25 pictures of the box at a variety of positions.   Each picture must show the entire sheet of paper.   Half the images should be taken with the box rotated onto its side.   You might try standing the box on a book or two to raise its position within the camera's view, to increase the variety of angles from which the photos are taken.
      5. [GML] Create a New Project.  Enter 1 for the number of templates.  For the grid size enter 7 x 8 and 21.5 mm.
      6. [GML] Load all ~25 images by pressing the green + button.
      7. [GML] Press the Detect All  button.  Each image should then appear like this:
        Figure:  GMLResult.png


      8. [GML] Press the Calibrate  button (camera icon).  GML will calculate the camera's intrinsic parameters from these images.
      9. ARGG STUCK

    3. Install UniCave:
      1. In your ITCS_x165 directory (but not in your project directory for this tutorial), git clone https://github.com/livingenvironmentslab/UniCAVE.git
      2. In the Unity Editor for your tutorial project, follow the instructions at https://unicave.discovery.wisc.edu/2018-documentation/ to add the UniCAVE Unity Package to your Unity project.
      3. *** ARG STRUCK ***  Documentation for UniCAVE is not consistent with the actual code...
    4. What Next?