The first draft of the paper is done! It comes out at about 12 pages. I’ll need to cut it down to 6 to submit for CHI 2014 WIP. Easier than writing though. Of course, that’s just the first draft. More to come, I’m guessing. Still, it’s a nice feeling, and since I’ve burned through most of my 20% time, it’s time for me to get back to actually earning my pay, so I’ll be taking a break from this blog for a while. More projects are coming up though, so stay tuned. I’ll finish up this post with some images of all the design variations that led to the final, working version:

Prototype Evolution

The chronological order of development is from left to right and top to bottom. Starting at the top left:

  • The first proof of concept. Originally force input / motion feedback. It was with this system that I discovered that all actuator motion had to be in relation to a proximal base.
  • The first prototype. It had six degrees of freedom, allowing a user to move a gripper within a 3D environment and grab items. It worked well enough that it led to…
  • The second prototype. A full five-finger gripper attached to an XYZ base. I ran into problems with this one: it turned out that motion feedback imposed too much cognitive load to work. The user would lose track of where their fingers were, even with the proximal base. So that led to…
  • The third prototype. This used resistive force sensors and vibrotactile feedback. The feedback was provided by voice coils, which were capable of the full audio range, which meant that all kinds of sophisticated contact and surface effects could be provided. That proved the point that five fingers could work with vibrotactile feedback, but the large-scale motions of the base still seemed to need motion (I’ve since learned that isometric devices are most effective over short ranges). This prototype was also loaded with electronics concepts that I wanted to try out – Arduino sensing, MIDI synthesizers per finger, etc.
  • For the fourth prototype, to explore direct motion of the base, I 3D-printed a five-finger Force Input / Vibrotactile Output (FS/VO) system that sat on top of a mouse. This was a plug-and-play substitution that worked with the previous electronics, and it worked quite nicely, though the ability to grip doesn’t give you much to do in the XY plane.
  • To get 3D interaction, I took two FS/VO modules and added them to a Phantom Omni. I also dropped the Arduino and the synthesizer, using XAudio2 8-channel audio and a Phidgets interface card instead. This system worked very nicely. The FS/VO elements combined with a force-feedback base turned out to be very effective. That’s what became the basis for the paper, and hopefully the basis for future work.
  • Project code is here (MD5: B32EE89CEA9C8E02E5B99BFAF24877A0).

First Results :-)

Downloaded several WAV files of sine wave tones, ranging from 100 Hz to 1,000 Hz. The files were created using this generator, and are 0.5 seconds in length.

Glued the tactor actuators in place, since they kept coming loose during testing.

Fixed the file output. Each test result is now ordered.

Fixed a bug where the number of targets and the number of goals were not being recorded

Added a listing of the audio files used in the experiment.

Got some initial results based on my self-testing today:

firstResults
The pure haptic and tactor times to perform the task are all over the place, but it’s pretty interesting to note that Haptic/Tactor and Open Loop are probably significantly different. Hmmmm.


Ok, here it is, all ready to travel:


It’s still a bit of a rat’s nest inside the box, but I’ll clean that up later today.

Adding a “practice mode” to the app. It will read in a setup file and allow the user to try any of the feedback modalities, with the trial order randomized using srand(current milliseconds) – done
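Nothing fancy – a minimal sketch of the seeding, assuming the app uses the C runtime RNG; the function name and the choice of clock here are illustrative, not the project’s actual code:

	#include <chrono>
	#include <cstdlib>

	// Seed rand() with the current time in milliseconds so each practice
	// session gets a different trial order.
	void seedPracticeOrder(){
		using namespace std::chrono;
		auto ms = duration_cast<milliseconds>(
			system_clock::now().time_since_epoch()).count();
		std::srand(static_cast<unsigned int>(ms));
	}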

Sent an email off to these folks asking how to get their C2-transducers.

Need to look into perceptual equivalence WRT speech/nonspeech tactile interaction. Here’s one paper that might help:

Fixed my truculent pressure sensor and glued the components into the enclosure. Need to order a power strip.

Life can be a drag sometimes

  • Cleaning up commands. Mostly done.
  • While testing the “test” part of the app, I’m realizing that my “ratio” calculations have some issues. Before trying to fix them directly, I’m going to try just making a different gripper that has three “sensor spheres” on each finger. Then I can let my “drag-based” physics do the whole job. Finished. That’s much better.
  • Quick! What’s wrong with the following code?
	for(int i = 0; i < 3; ++i){
		position[i] += velocityVec[i];
		if(velocityVec[i] > drag*ratio){
			velocityVec[i] -= drag*ratio;
		} else {
			velocityVec[i] = 0;
		}
	}
  • Yep, the drag is only being applied to objects moving in a positive direction. This is a problem that has been driving me crazy for days. I thought it was some artifact of the communication between the Phantom control loop (1 kHz) and the simulation loop (100 Hz). Nope. Simple math mistake. Facepalm. A corrected version of the loop is sketched just after this list.
  • Pretty picture for the day. Notice that the grippers now have multiple points of contact:


  • I’ve also started to notice how feedback changes the speed at which you can perform the task. Haptic and tactor seem pretty close. Open loop is much worse, at least subjectively. Let’s see what the data says.
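For what it’s worth, here’s one way to repair that loop – a minimal sketch, assuming drag*ratio is the per-tick drag impulse; this is my fix as written here, not necessarily the exact code that went into the project. The drag is applied against the direction of motion, and the velocity is clamped to zero once it falls below the drag threshold:

	#include <cmath>

	// position, velocityVec, drag, and ratio as in the snippet above
	for(int i = 0; i < 3; ++i){
		position[i] += velocityVec[i];
		if(std::fabs(velocityVec[i]) > drag*ratio){
			// subtract the drag impulse opposing the direction of motion
			velocityVec[i] -= std::copysign(drag*ratio, velocityVec[i]);
		} else {
			velocityVec[i] = 0;	// below the drag threshold: stop dead
		}
	}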

Zeno’s development schedule

Ok, it’s not as bad as that paradox, but the amount of work remaining always grows when you get closer to the end. Still, I think I’ll be ready by the end of the week.

  • Goal Box – done. I had to add collision detection to determine whether a TargetSphere was touching the GoalBox (for setup) or fully inside it (achieving the goal). I basically wrote an axis-aligned bounding-box test where the radius of the TargetSphere is either subtracted from (inside) or added to (touching) the size of the GoalBox. I’m not calculating penetration, just looking for a sign change on the line segment. There’s a sketch of the idea just after this list.
  • Added a <map> of UI_cmd to handle commands coming from the control system and results coming from the sim. Worked right the first time. Yay, C++ templates! (Also sketched after the list.)
  • Enable/disable haptics/tactors – done. It’s interesting to see how the behavior and feel of the system changes with the different capabilities enabled/disabled.
  • Started working on the experiment session management.
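Here’s a minimal sketch of the goal-box test described above, assuming the box is axis-aligned and the test operates on the sphere’s center; the struct fields and function names are my own illustration, not the project’s actual code. Growing the box by the sphere’s radius gives the touching test, shrinking it gives the inside test:

	struct TargetSphere { double c[3]; double radius; };
	struct GoalBox      { double center[3]; double halfSize[3]; };	// axis-aligned

	// pad > 0 grows the box (touching test); pad < 0 shrinks it (inside test)
	bool centerInBox(const TargetSphere& s, const GoalBox& b, double pad){
		for(int i = 0; i < 3; ++i){
			double h = b.halfSize[i] + pad;
			if(s.c[i] < b.center[i] - h || s.c[i] > b.center[i] + h)
				return false;	// outside the padded box on this axis
		}
		return true;
	}

	bool touching(const TargetSphere& s, const GoalBox& b){ return centerInBox(s, b,  s.radius); }
	bool inside  (const TargetSphere& s, const GoalBox& b){ return centerInBox(s, b, -s.radius); }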
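And a guess at what the command map might look like – UI_cmd’s fields and the handler signature are assumptions on my part, since all I’ve said above is that a <map> of UI_cmd routes traffic between the control system and the sim:

	#include <map>
	#include <string>
	#include <functional>

	// Hypothetical command record; the real UI_cmd type isn't shown here.
	struct UI_cmd { std::string name; std::string arg; };

	// One handler per command name, registered by the control system and the sim.
	std::map<std::string, std::function<void(const UI_cmd&)>> handlers;

	void dispatch(const UI_cmd& cmd){
		auto it = handlers.find(cmd.name);
		if(it != handlers.end())
			it->second(cmd);	// route the command to whoever registered for it
	}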

Picture for the day: GoalBox

Deadlines and schedules

I was just asked to see how many hours I have left for working on this research. It turns out that, at the rate I’m going, I can continue until mid-October. This is basically a big shout-out to Novetta, which has granted a continuation of the 20% time that was originally a hiring condition when I went to work for Edge. Thanks. And if you’d like a programming job in the DC area that supports creativity, give them a call.

I just can’t make the audio code break when writing out results. Odd. Maybe a corrupt input file could have unforeseen effects? Regardless, I’m going to stop pursuing this particular bug until I have more information.

Fixing the state problem. Done.

Fixing the saving issue. Also changing the naming of the speakers to reflect Dolby or not. Done.

New version release built and deployed.

And back to Phantom++


I started to add the user interface that will support experiments. Since it was already done, I pulled in most of the Fluid code from the vibrotactile headset project, which made things pretty easy. I needed to add an enclosing control-system class that can move commands between the various pieces.

I’ve also decided that each sound will have an associated object. This lets each object have a simple “acoustic texture” without requiring any fancy data structure.
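Something like the following is what I have in mind – a sketch only, with made-up field names rather than the project’s actual types:

	#include <string>

	// Each scene object carries its own contact sound, so the "acoustic
	// texture" is just a per-object field rather than a separate structure.
	struct SceneObject {
		std::string name;
		std::string contactWav;	// e.g. one of the 0.5 s sine-tone files
		float       gain;		// how strongly contact drives the tactors
	};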

At this point, I’m estimating that the first version of the test program should be ready by Friday.