Develop

Using the Ultrahaptics SDK with Unity®

26/07/2018 Nicci

This article shows you the steps needed to use the Ultrahaptics C# API with Unity®.

We are also launching the Ultrahaptics Core Asset (UCA) for Unity®, our new haptics plugin. UCA for Unity® is currently in Closed Beta. Watch our video at the bottom of this article to find out more. If you would like to join the Closed Beta, please register your interest.

Add the Ultrahaptics API to Unity®

  • Download and unpack the latest Ultrahaptics SDK to your working folder.
  • Install the SDK as normal.
  • Either double-click the *.unitypackage file in the Ultrahaptics SDK’s ./Libraries/Unity3D folder or, from within Unity®, right-click in the Project window, select ‘Import Package -> Custom Package’, then select the *.unitypackage file and open it.
  • This will open an import dialog box as shown (this may differ slightly in the Unity® package for Mac OS). Select all and click ‘Import’:

Once imported into your project, you will be able to use the Ultrahaptics C# API in any script files.

Example C# Code 

The C# API is the same as the C++ version, the main exception being that everything is under the Ultrahaptics namespace. Where C++ classes are nested in namespaces, in C# the namespace name is prefixed to the class name:

In C++

Ultrahaptics::AmplitudeModulation::Emitter

In C#

Ultrahaptics.AmplitudeModulationEmitter
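
The same flattening applies to other nested classes. As a sketch (assuming your SDK version includes the Time Point Streaming emitter in its C# API), the streaming emitter follows the identical pattern:

```csharp
// C++: Ultrahaptics::TimePointStreaming::Emitter
// C#:  Ultrahaptics.TimePointStreamingEmitter
using Ultrahaptics;

var emitter = new TimePointStreamingEmitter();
```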

In addition, in cases where a set of objects must be provided, for example in AmplitudeModulationEmitter.update(…), the C# API accepts standard .NET collections. You can see this in the example code listing below, which projects a fixed haptic point 20 cm above the array.

Unity_AMFocus.cs  

using UnityEngine;
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using Ultrahaptics;

public class Unity_AMFocus : MonoBehaviour {
    AmplitudeModulationEmitter _emitter;

    void Start()  {
        // Initialize the emitter
        _emitter = new AmplitudeModulationEmitter();
        _emitter.initialize();
    }

    // Update on every frame
    void Update()  {
        // Set the position to be 20cm above the centre of the array
        Ultrahaptics.Vector3 position = new Ultrahaptics.Vector3(0.0f, 0.0f, 0.2f);
        // Create a control point object using this position, with full intensity, at 200Hz
        AmplitudeModulationControlPoint point = new AmplitudeModulationControlPoint(position, 1.0f, 200.0f);
        // Output this point; technically we don't need to do this every update since nothing is changing.
        _emitter.update(new List<AmplitudeModulationControlPoint> { point });
    }

    // Ensure the emitter is stopped on exit
    void OnDisable()  {
        _emitter.stop();
    }

    // Ensure the emitter is immediately disposed when destroyed
    void OnDestroy()  {
        _emitter.Dispose();
        _emitter = null;
    }
}

This script is implemented as a Unity® “MonoBehaviour” class, allowing it to be used as a Behaviour Component and attached to Unity® objects. Use this script in your scene by adding a Script component to a Game Object, then dragging and dropping the script from the Project explorer onto the ‘Script’ field in the Inspector.
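
Alternatively, the component can be attached from code, for example from a small bootstrap script (a minimal sketch; the Bootstrap class name is illustrative):

```csharp
using UnityEngine;

public class Bootstrap : MonoBehaviour {
    void Start() {
        // Attach the haptics behaviour to this Game Object at runtime,
        // equivalent to dragging the script onto it in the Inspector
        gameObject.AddComponent<Unity_AMFocus>();
    }
}
```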

Ultrahaptics Unity script

Unity® provides extensive scripting documentation and tutorials on its website.

Add tracking with the Leap Motion Unity assets

Unity® applications requiring hand tracking capabilities, such as the example below, will need the Leap Motion® SDK and Unity® core asset.

Windows

For building your AR and VR applications we recommend using the latest Leap Motion® SDK and Unity Core Asset.

To build the tracking example below, use Leap Motion® Orion (v3) SDK and Unity Core Asset.

The Orion Unity® Core Asset comes with prefabs for direct integration into your Unity project. To add the asset and prefabs to your project:

  1. Import the asset into your Unity project as above,
  2. From LeapMotion/Core/Prefab, add the LeapHandController prefab to your Scene (you can delete the missing script from the component),
  3. In the LeapHandController Hand Model Manager component, type ‘2’ into the Model Pool’s Size field, ensure that both “Is Enabled” boxes are checked and add the following objects to the Graphics and Physics Hands sections, as shown below:
    • HandModelsNonHuman/Capsule Hand Left
    • HandModelsNonHuman/Capsule Hand Right
    • HandModelsPhysical/RigidRoundHand_L
    • HandModelsPhysical/RigidRoundHand_R

MacOS

MacOS users will need to install the Leap Motion® V2 SDK and V2 compatible Unity assets. The Unity asset comes with a full suite of demos and cross-platform libraries. When first opening, you may be prompted to update to the latest version of Unity® – accept and continue. To start using Leap Motion® in your project, drag and drop the HandController prefab into your hierarchy.

Tracking Example Code

This example uses the Leap Motion Controller to place a point on the user’s hand. The emitter will stop when a hand is not detected, or the Leap Motion Controller is not connected. You should refer to our C++ Examples, included with the SDK, for additional information on using hand tracking and how to use Time Point Streaming.

using UnityEngine;
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using Ultrahaptics;

public class Unity_AMHandTracking : MonoBehaviour {
  AmplitudeModulationEmitter _emitter;
  Alignment _alignment;
  Leap.Controller _leap;

  void Start() {
    // Initialize the emitter
    _emitter = new AmplitudeModulationEmitter();
    _emitter.initialize();
    _leap = new Leap.Controller();

    // NOTE: This example uses the Ultrahaptics.Alignment class 
    // to convert Leap coordinates to the Ultrahaptics device space.
    // You can either use this as-is for Ultrahaptics development kits, 
    // create your own alignment files for custom devices,
    // or replace the Alignment references in this example with your own
    // coordinate conversion system.

    // Load the appropriate alignment file for the currently-used device
    _alignment = _emitter.getDeviceInfo().getDefaultAlignment();
    // Load a custom alignment file (absolute path, or relative path 
    // from current working directory)
    // _alignment = new Alignment("my_custom.alignment.xml");
  }

  // Converts a Leap Vector directly to a UH Vector3
  Ultrahaptics.Vector3 LeapToUHVector(Leap.Vector vec) {
    return new Ultrahaptics.Vector3 (vec.x, vec.y, vec.z);
  }

  // Update on every frame
  void Update()  {
    if (_leap.IsConnected)
    {
      var frame = _leap.Frame ();
      if (frame.Hands.Count > 0)
      {
        // The Leap Motion can see a hand, so get its palm position
        Leap.Vector leapPalmPosition = frame.Hands[0].PalmPosition;
        // Convert to our vector class, and then convert to our coordinate space
        Ultrahaptics.Vector3 uhPalmPosition = 
          _alignment.fromTrackingPositionToDevicePosition(LeapToUHVector(leapPalmPosition));
        // Create a control point object using this position, 
        // with full intensity, at 200Hz
        AmplitudeModulationControlPoint point = 
          new AmplitudeModulationControlPoint(uhPalmPosition, 1.0f, 200.0f);
        // Output this point
        _emitter.update(new List<AmplitudeModulationControlPoint> { point });
      }
      else  {
        Debug.LogWarning ("No hands detected");
        _emitter.stop();
      }
    }
    else  {
      Debug.LogWarning ("No Leap connected");
      _emitter.stop();
    }
  }

  // Ensure the emitter is stopped when disabled
  void OnDisable()  {
    _emitter.stop();
  }

  // Ensure the emitter is immediately disposed when destroyed
  void OnDestroy() {
    _emitter.Dispose();
    _emitter = null;
    _alignment.Dispose();
    _alignment = null;
  }
}
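
As a variation on the listing above, you could target a fingertip rather than the palm. The sketch below assumes the Orion C# API’s Hand.Fingers list and Finger.TipPosition property, and would replace the palm-position lines inside Update():

```csharp
// Inside the frame.Hands.Count > 0 branch, instead of PalmPosition:
Leap.Finger index = frame.Hands[0].Fingers[1]; // fingers are ordered thumb to pinky
Leap.Vector leapTipPosition = index.TipPosition;
Ultrahaptics.Vector3 uhTipPosition =
    _alignment.fromTrackingPositionToDevicePosition(LeapToUHVector(leapTipPosition));
```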

Unity Plugin | Ultrahaptics Core Asset (UCA)

Our new Ultrahaptics Core Asset (UCA) for Unity® provides a library of prefabs, prebuilt Sensations and example Unity Scenes. Sensations can be easily played back and edited dynamically in the Unity Inspector, integrated into the Timeline window, or controlled via scripting.
