
Several different haptic interface devices have been used in this research. The Phantom from SensAble Technologies is the device we have used the most at Certec, but I have also used the FEELit Mouse from Immersion and force feedback joysticks from Logitech and Microsoft.

Two Software Development Kits (SDKs) for the Phantom have been commercially available for some time now: GHOST by SensAble Technologies Inc. (Boston, Massachusetts) and the Reachin API³ by Reachin AB (Stockholm). A third SDK for haptic development, the e-Touch SDK by Novint Technologies (Albuquerque, New Mexico), is currently available as a beta version.

When we started our haptics work, none of these SDKs or APIs was available, so we began by writing our own simple object-oriented package. It handled the basic, necessary steps in haptic programming, such as communication with the haptics hardware, coordinate system conversions, temperature and force tracking, basic shape geometry, and sinusoidal textures. We started using GHOST as soon as the first beta version became available (in 1997), and since 2001 we have also been using the Reachin API. All the APIs described here constitute a huge leap forward compared to the essentially force-level programming we had to do in the beginning.
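To give a flavor of what such force-level programming involves, the sinusoidal textures mentioned above can be sketched as follows. This is an illustrative reconstruction, not Certec's actual code; the function and parameter names (amplitude, wavelength, stiffness) are ours:

```cpp
#include <cmath>

// Illustrative sketch: a sinusoidal texture modulates the height of a
// nominally flat surface along one axis.
double textureHeight(double x, double amplitude, double wavelength) {
    const double pi = 3.14159265358979323846;
    return amplitude * std::sin(2.0 * pi * x / wavelength);
}

// Penetration below the textured surface maps to a spring force
// (Hooke's law) pushing the stylus back out -- the standard penalty
// approach in haptic rendering.
double textureForce(double x, double z, double amplitude,
                    double wavelength, double stiffness) {
    double depth = textureHeight(x, amplitude, wavelength) - z;
    return depth > 0.0 ? stiffness * depth : 0.0;
}
```

Running this force function every servo tick as the user drags the stylus across the surface produces the sensation of a rippled texture.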

4.1 The Phantom

Technically, the Phantom is a small robot with very low back-drive friction. The standard A-model Phantom has three full degrees of freedom, i.e., three motors and three encoders. The tip of the robot is attached to a stylus or thimble via a passive gimbal that allows rotational movements (Figure 4.1). The normal use of the Phantom, however, is the opposite of a robot's: the user holds on to the stylus (or puts a finger in the thimble) at the end of the robot arm and moves it around, and the robot provides feedback by applying forces to the user's hand via the stylus.

3. API stands for Application Programmer's Interface.

The basic principle of the haptic rendering is simple: every millisecond, the computer that controls the Phantom reads the position of the stylus. It then compares this position to the boundaries of the objects in the virtual environment. If the user is not near any of the virtual objects, no current is sent to the motors and the user is free to move the stylus around. However, if the system detects a collision between the stylus and one of the virtual objects, it drives the motors to exert a force on the user’s hand (via the stylus) to push the user’s hand back to the surface of the virtual object. In practice, the user is prevented from penetrating the virtual object just as if the stylus collided with a real object (Figure 4.2).
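The loop described above can be sketched for the simplest possible scene, a single virtual sphere at the origin. This is a minimal illustration of the principle, not the Phantom's actual control code; the function name, radius and stiffness values are invented for the example:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// One iteration of the ~1 kHz control loop for a sphere at the origin.
// Outside the sphere: no force (no motor current). Inside: a spring
// force along the outward surface normal, proportional to penetration.
Vec3 renderSphere(const Vec3& stylus, double radius, double stiffness) {
    double dist = std::sqrt(stylus.x * stylus.x + stylus.y * stylus.y
                            + stylus.z * stylus.z);
    if (dist >= radius || dist == 0.0)
        return {0.0, 0.0, 0.0};          // free space: stylus moves freely
    double depth = radius - dist;        // penetration into the sphere
    double s = stiffness * depth / dist; // scale for the unit normal
    return {s * stylus.x, s * stylus.y, s * stylus.z};
}
```

In a real system this function would be called roughly every millisecond with the measured stylus position, and the returned force vector converted to motor currents.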

Other haptic devices — such as Immersion Corporation’s Impulse Engine or CyberGrasp — use the same principle but with different mechanical systems for force generation and sometimes more than one point of interaction.

4.2 The FEELit Mouse

Figure 4.1. The Phantom 1.0, a haptic interface, with a close-up of the motors.

Figure 4.2. The basic haptic rendering control loop.

The FEELit Mouse is a 2D haptic device intended as a mass-market product, and as such it needs to be inexpensive. It has a smaller work area than the other devices and can only exert a fraction of the force that can be felt with many other devices. Immersion justified force feedback for the mass market by emphasizing benefits such as increased targeting speed in Windows and better ergonomics.

The commercial version of the FEELit Mouse was the Logitech Wingman Force Feedback Mouse (Figure 4.3), marketed mainly as a device that gives an added dimension to computer games.

4.3 Force Feedback Joysticks

These are intended to be used as gaming devices with a home-user price tag. However, it is possible to make special programs for force feedback joysticks that can be both educational and fun for blind children [Johansson & Linde 1998]. I have used force feedback joysticks from Microsoft (Figure 4.4) and Logitech.

4.4 GHOST

The General Haptics Open Software Toolkit (GHOST SDK) from SensAble Technologies is a C++ object-oriented toolkit that represents the haptic environment as a hierarchical collection of geometric objects and spatial effects. The GHOST SDK provides an abstraction that allows application developers to concentrate on the generation of haptic scenes, manipulation of the properties of the scene and of the objects within it, and control of the resulting effects on or by one or more haptic interaction devices.

Using GHOST, developers can specify object geometry and properties, or global haptic effects, using a haptic scene graph. A scene graph is a hierarchical collection (tree) of nodes. The internal nodes of the tree provide a means for grouping objects, orienting and scaling the subtree relative to the parent node, and adding dynamic properties to their subtrees. The terminal nodes (leaves) of the tree represent actual geometries or interfaces. Leaves also contain an orientation and scale relative to their parent nodes.
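The grouping-and-transform idea can be illustrated with a toy scene graph. This is a conceptual mock-up of the structure just described, not the GHOST API itself; the class and function names are ours, and the transforms are simplified to a translation plus a uniform scale:

```cpp
#include <memory>
#include <vector>

// A node's transform is expressed relative to its parent, as in the
// scene graph described above (simplified: translation + uniform scale).
struct Transform { double tx, ty, tz, scale; };

struct Node {
    Transform local{0.0, 0.0, 0.0, 1.0};
    std::vector<std::unique_ptr<Node>> children;
    Node* addChild() {
        children.push_back(std::make_unique<Node>());
        return children.back().get();
    }
};

// Compose a parent transform with a child's: the child's offsets are
// scaled by the parent, so transforms accumulate down the tree.
Transform compose(const Transform& parent, const Transform& child) {
    return {parent.tx + parent.scale * child.tx,
            parent.ty + parent.scale * child.ty,
            parent.tz + parent.scale * child.tz,
            parent.scale * child.scale};
}
```

Walking from the root to a leaf while composing transforms yields the leaf geometry's placement in world coordinates, which is what both the haptic and graphic renderers need.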

The GHOST SDK does not generate visual representations of objects within the haptic scene graph. The GHOST SDK does, however, provide graphical callback mechanisms to facilitate integration between the haptic and graphic domains. SensAble also provides a graphics toolkit called GhostGL that works with GHOST.

GhostGL is a library that can render any GHOST SDK scene using OpenGL. It provides an easy way to add graphics to any GHOST SDK application. Once a GHOST SDK scene graph has been created, it can be passed to the GhostGL routines, which traverse the graph and render a graphical representation of each node in the scene [SensAble 2001; 2002].

Figure 4.3. Logitech Wingman Force Feedback Mouse, the commercial version of the FEELit Mouse.

Figure 4.4. The Microsoft SideWinder Force Feedback Pro Joystick.

Key features of the GHOST SDK include the ability to:

• Model haptic environments using a hierarchical haptic scene graph.
• Specify the surface properties (for example, compliance and friction) of the geometric models.
• Use behavioral nodes that can encapsulate either stereotypical behaviors or full free-body dynamics.
• Use an event callback mechanism to synchronize the haptics and graphics processes.
• Extend the functionality through subclassing; application developers can extend, modify or replace all object classes.

4.5 Reachin API

The Reachin API (formerly known as Magma) is an object-oriented C++ application programming interface for creating applications with touch. The Reachin API lets the developer create haptic applications by means of a node concept: to let the user sense different forms, one creates geometric nodes; to let the user sense different surface qualities, one defines surface property nodes; and so on. The Reachin API includes an extensible library of shape nodes, surface property nodes, simulation and scripting nodes, and control nodes for the different haptic and tracking devices.

The Reachin API also makes heavy use of a VRML loader. This means that many haptic environments can be built without having to go into C++ coding. Reachin uses an extended version of VRML97 that supports touch properties on objects in addition to the standard visual properties. More complicated behavior can also be programmed without C++ by adding "script nodes": small programs written in Python (an object-oriented programming language well suited to scripting) that can connect to and manipulate the states and properties of different nodes in the scene. Reachin VRML uses a field network approach to event handling: instead of defining callback procedures (as in GHOST), the developer uses "field routing" to make a direct connection between the fields of different nodes or within a single node. This way, for example, the "pressed" field (a state property) of a virtual button node can be connected to the "playing" field (an activation property) of a sound node to make a sound when the button is pressed [Reachin 2001; 2002].
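The field-routing idea can be sketched in C++. This toy Field class is our own illustration of the concept, not the Reachin API: setting a field's value automatically propagates it along every route, so the button-to-sound connection needs no callback code in the application logic.

```cpp
#include <functional>
#include <vector>

// Toy field: holds a value and a list of routes to other fields.
// A route forwards every new value to the destination field.
template <typename T>
struct Field {
    T value{};
    std::vector<std::function<void(const T&)>> routes;

    void routeTo(Field<T>& dst) {
        routes.push_back([&dst](const T& v) { dst.set(v); });
    }
    void set(const T& v) {
        value = v;
        for (auto& route : routes) route(v);  // propagate downstream
    }
};
```

With `Field<bool> pressed, playing;` and `pressed.routeTo(playing);`, executing `pressed.set(true)` also sets `playing` to true, mirroring the button/sound example above. (A production implementation would also have to guard against cycles in the field network.)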

The Reachin API features:

• A high-frequency loop (1–5 kHz) for time-critical force calculations and a slower loop for prediction, force interpolation and dynamic scene graph updates.
• Haptic environments and models built using a hierarchical haptic scene graph.
• Touching modeled objects with a finite-ball stylus tip, which makes edges feel more realistic and prevents fall-through.
• Haptic texture algorithms on a 3D oriented volume or in free space.
• Surface friction and damping.
• Additional toolkits for NURBS, soft tissues, etc.

4.6 E-Touch

E-Touch is a 3D, multi-sensory (sight, touch and hearing) software package from Novint Technologies (Albuquerque, New Mexico). It is the first software that has been developed and delivered as an Open Module system, which is an outgrowth of the Open Source movement.

The e-Touch SDK is a modular, multi-process system for multi-sensory programming: with it, programmers can build 3D applications that make use of the senses of sight, touch and hearing. The SDK includes programming tools for creating 3D tools, navigation techniques and 3D models, as well as a full set of user interface tools including an extensive API [Novint 2002].
