Milestones

The first draft of the paper is done! It comes out at about 12 pages. I’ll need to cut it down to 6 to submit to the CHI 2014 WIP track. Easier than writing, though. Of course, that’s just the first draft. More to come, I’m guessing. Still, it’s a nice feeling, and since I’ve burned through most of my 20% time, it’s time for me to get back to actually earning my pay, so I’ll be taking a break from this blog for a while. More projects are coming up though, so stay tuned. I’ll finish up this post with some images of all the design variations that led to the final, working version:

Prototype Evolution

The chronological order of development is from left to right and top to bottom. Starting at the top left:

  • The first proof of concept. Originally force-input / motion-feedback. It was with this system that I discovered that all actuator motion had to be made relative to a proximal base.
  • The first prototype. It had 6 degrees of freedom, allowing a user to move a gripper within a 3D environment and grab items. It worked well enough that it led to…
  • The second prototype. A full 5-finger gripper attached to an XYZ base. I ran into problems with this one: it turned out that motion feedback imposed too much cognitive load to work. The user would lose track of where their fingers were, even with the proximal base. So that led to…
  • The third prototype. This used resistive force sensors and vibrotactile feedback. The feedback was provided by voice coils capable of the full audio range, which meant that all kinds of sophisticated contact and surface effects could be provided. That proved the point that 5 fingers could work with vibrotactile feedback, but the large-scale motions of the base seemed to need motion (I’ve since learned that isometric devices are most effective over short ranges). This prototype was also loaded with electronic concepts that I wanted to try out – Arduino sensing, MIDI synthesizers per finger, etc.
  • To explore direct motion for the base, the fourth prototype was a 3D-printed 5-finger Force Input / Vibrotactile Output (FS/VO) system that sat on top of a mouse. This was a plug-and-play substitution that worked with the previous electronics and worked quite nicely, though the ability to grip doesn’t give you much to do in the XY plane.
  • To get 3D interaction, I took two FS/VO modules and added them to a Phantom Omni. I also dropped the Arduino and the synthesizer, using XAudio2 8-channel audio and a Phidgets interface card instead (a rough sketch of the audio-as-tactor idea follows this list). This system worked very nicely. The FS/VO elements combined with a force-feedback base turned out to be very effective. That’s what became the basis for the paper, and hopefully the basis for future work.
  • Project code is here (MD5: B32EE89CEA9C8E02E5B99BFAF24877A0).
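Since the XAudio2-as-tactor-driver approach is what made the final prototype practical, here is a minimal sketch of the idea: each finger’s vibrotactile signal is just a short PCM buffer submitted to its own source voice. This is illustrative only, not the project code; the buffer contents, the helper name (playTactorBuffer), and the setup details are assumptions.

// Minimal sketch: drive one vibrotactile voice coil per finger by submitting
// a PCM buffer to its own XAudio2 source voice. Illustrative only; link
// against the XAudio2 version that ships with your SDK.
#include <windows.h>
#include <xaudio2.h>
#include <cmath>
#include <vector>

// Hypothetical helper: play a mono 16-bit PCM waveform on one finger's voice coil.
bool playTactorBuffer(IXAudio2* xaudio, const std::vector<short>& samples, int sampleRate)
{
    WAVEFORMATEX fmt = {};
    fmt.wFormatTag      = WAVE_FORMAT_PCM;
    fmt.nChannels       = 1;
    fmt.nSamplesPerSec  = sampleRate;
    fmt.wBitsPerSample  = 16;
    fmt.nBlockAlign     = fmt.nChannels * fmt.wBitsPerSample / 8;
    fmt.nAvgBytesPerSec = fmt.nSamplesPerSec * fmt.nBlockAlign;

    IXAudio2SourceVoice* voice = nullptr;
    if (FAILED(xaudio->CreateSourceVoice(&voice, &fmt)))
        return false;

    XAUDIO2_BUFFER buf = {};
    buf.AudioBytes = static_cast<UINT32>(samples.size() * sizeof(short));
    buf.pAudioData = reinterpret_cast<const BYTE*>(samples.data());
    buf.Flags      = XAUDIO2_END_OF_STREAM;

    voice->SubmitSourceBuffer(&buf);  // the buffer must stay alive while playing
    voice->Start(0);
    return true;
}

int main()
{
    CoInitializeEx(nullptr, COINIT_MULTITHREADED);

    IXAudio2* xaudio = nullptr;
    XAudio2Create(&xaudio);                 // create the engine
    IXAudio2MasteringVoice* master = nullptr;
    xaudio->CreateMasteringVoice(&master);  // default output device

    // A short 250 Hz "contact" burst for one finger (values are placeholders).
    const int rate = 44100;
    std::vector<short> burst(rate / 10);
    for (size_t i = 0; i < burst.size(); ++i)
        burst[i] = static_cast<short>(10000 * std::sin(2.0 * 3.14159 * 250.0 * i / rate));

    playTactorBuffer(xaudio, burst, rate);
    Sleep(200);                             // let the burst finish
    xaudio->Release();
    return 0;
}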

A little more direction?

  • In a meeting with Dr. Kuber, I brought up something that I’ve been thinking about since the weekend. The interface works, provably so. The pilot study shows that it can be used for (a) training and (b) “useful” work. If the goal is to produce “blue collar telecommuting”, then the question becomes: how do we actually achieve that? A dumb master-slave system makes very little sense for a few reasons:
    • Time lag. It may not always be possible to get a fast enough response loop to make haptics work well.
    • Machine intelligence. With robots like Baxter coming online, there is certainly some level of autonomy that the on-site robot can provide. So, what’s a good human-robot synergy?
  • I’m thinking that a hybrid virtual/physical interface might be interesting.
    • The robotic workcell is constantly scanned and digitized by cameras. The data is then turned into models of the items that the robot is to work with.
    • These items are rendered locally to the operator, who manipulates the virtual objects using tight-loop haptics, 3D graphics, etc. Since (often?) the space is well known, the objects can be rendered from a library of CAD-correct parts.
    • The operator manipulates the virtual objects, and the robot follows the “path” laid down by the operator. The position and behavior of the actual robot is represented in some way (ghost image, warning bar, etc.). This is known as Mediated Teleoperation, and is described nicely in this paper.
    • The novel part, at least as far as I can determine at this point, is using mediated telepresence to train a robot in a task:
      • The operator can instruct the robot to learn some or all of a particular procedure. This probably entails setting entry, exit, and error conditions for tasks, which the operator is able to create on the local workstation (see the sketch after this list).
      • It is reasonable to expect that in many cases this sort of work will be a mix of manual control and automated behavior. For example, placing a part may be manual, but screwing a bolt into place to a particular torque could be entirely automatic. If a robot’s behavior is made fully autonomous, the operator simply needs to monitor the system for errors or non-optimal behavior. At that point, the operator could engage another robot and repeat the above process.
      • User interfaces that seamlessly inform the operator when the robot is coming out of autonomous modes need to be explored.
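To make the entry/exit/error-condition idea above a little more concrete, here is a rough sketch of what a task description handed from the operator’s workstation to the on-site robot might look like. Every name and field here is hypothetical; it’s meant to show the shape of the idea, not a real interface.

// Hypothetical task descriptor for mediated teleoperation: the operator lays
// down a path against the virtual scene, tags it with entry/exit/error
// conditions, and hands it off for the robot to run autonomously.
#include <functional>
#include <string>
#include <vector>

struct Pose { double x, y, z, roll, pitch, yaw; };

struct RobotTask {
    std::string name;                      // e.g. "seat bolt to target torque"
    std::vector<Pose> path;                // path laid down by the operator
    std::function<bool()> entryCondition;  // e.g. part detected in the fixture
    std::function<bool()> exitCondition;   // e.g. torque target reached
    std::function<bool()> errorCondition;  // e.g. force limit exceeded
};

enum class TaskResult { Completed, Error };

// The robot runs the task and yields back to manual control on error, at
// which point the operator's UI should flag the hand-off.
TaskResult runTask(const RobotTask& task) {
    if (!task.entryCondition()) return TaskResult::Error;
    for (const Pose& waypoint : task.path) {
        (void)waypoint;                    // a real robot would move here
        if (task.errorCondition()) return TaskResult::Error;
    }
    return task.exitCondition() ? TaskResult::Completed : TaskResult::Error;
}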

Results!

With 10 subjects running two passes each through the system, I now have significant results (one-way ANOVA) for the Phantom setup. First, user errors:

Linear Hypotheses:
                               Estimate Std. Error t value Pr(>|t|)
HAPTIC_TACTOR - HAPTIC == 0     -0.3333     0.3123  -1.067   0.7110
OPEN_LOOP - HAPTIC == 0          0.5833     0.3123   1.868   0.2565
TACTOR - HAPTIC == 0             1.0000     0.3123   3.202   0.0130 *
OPEN_LOOP - HAPTIC_TACTOR == 0   0.9167     0.3123   2.935   0.0262 *
TACTOR - HAPTIC_TACTOR == 0      1.3333     0.3123   4.269   <0.001 ***
TACTOR - OPEN_LOOP == 0          0.4167     0.3123   1.334   0.5466
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
(Adjusted p values reported -- single-step method)
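For reference, the t values in these tables are just the pairwise estimates divided by their standard errors, and the adjusted p values come from the single-step (Tukey-style) correction. For example, for the TACTOR - HAPTIC error contrast:

t = \frac{\bar{x}_{\mathrm{TACTOR}} - \bar{x}_{\mathrm{HAPTIC}}}{\mathrm{SE}} = \frac{1.0000}{0.3123} \approx 3.20

The same arithmetic applies to the speed table below.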

Next, normalized user task completion speed:

Linear Hypotheses:
                               Estimate Std. Error t value Pr(>|t|)
HAPTIC_TACTOR - HAPTIC == 0     0.11264    0.07866   1.432   0.4825
OPEN_LOOP - HAPTIC == 0         0.24668    0.07866   3.136   0.0118 *
TACTOR - HAPTIC == 0            0.17438    0.07866   2.217   0.1255
OPEN_LOOP - HAPTIC_TACTOR == 0  0.13404    0.07866   1.704   0.3269
TACTOR - HAPTIC_TACTOR == 0     0.06174    0.07866   0.785   0.8612
TACTOR - OPEN_LOOP == 0        -0.07230    0.07866  -0.919   0.7947
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
(Adjusted p values reported -- single-step method)

So what this says is that HAPTIC_TACTOR has the lowest error occurrence, and that HAPTIC is the fastest in achieving the task. (Note: there may be some force-feedback artifacts that contribute to this result, but that will be dealt with in the next study.)

This can be shown best by looking at some plots. Here are the error results as means plots:

ErrorMeansPlot

And here are means plots for the task completion speed:

Fastest50Percent

Since this is a pilot study with only 10 participants, the populations are only just separating in a meaningful way, but the charts suggest that HAPTIC and HAPTIC_TACTOR will probably continue to separate from OPEN_LOOP and TACTOR.

What does this mean?

First (and this is only implicit from the study), it is possible to attach simpler, cheaper sensors and actuators (force and vibration) to a haptic device and get good performance. Even with simple semi-physics, all users were able to grip and manipulate the balls in the scenario in such a way as to achieve the goal. Ninety percent of the users who made no errors in placing 5 balls in a goal took between 20 and 60 seconds, or between 4 and 12 seconds per ball (including moving to the ball, grasping it, and successfully depositing it in a narrow goal). Not bad for less than $30 in sensors and actuators.

Second, force feedback really makes a difference. Doing tasks in an “open loop” framework is significantly slower than doing the same task with force feedback. I doubt that this is something that users will get better at, so the question with respect to gesture-based interaction is how to compensate. As can be seen from the results, it is unlikely that tactors alone can help with this problem. What will?

Third, not every axis needs to have full force feedback. It seems that as long as the “reference frame” has force feedback, the inputs that work with respect to that frame don’t need to be as sophisticated. This means that low(ish)-cost, high-DOF systems using hybrid technologies such as force feedback plus force/vibration may be possible. This might open up a new area of exploration in HCI.

Lastly, how multiple modalities could effectively perform as assistive technologies needs to be explored with this system. There are only a limited set (4?) of ways to render positional information to a user (visual, tactile, auditory, proprioceptive), and this configuration as it currently stands is capable of three of them. However, because of the way that the DirectX sound library is used to provide tactile information, it is trivial to extend the setup so that 5 channels of audio information could also be provided to the user (a rough channel-routing sketch follows below). I imagine having four speakers placed at the four corners of a monitor, providing an audio rendering of the objects in the scene. A subwoofer channel could be used to provide additional tactile(?) information.
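As a rough illustration of that channel-routing idea, a mono source can be steered to a particular speaker of a multi-channel output through XAudio2’s output matrix. The channel count, speaker index, gain values, and function name below are placeholders, not the project’s actual configuration:

// Sketch: steer a mono source voice to one channel of a multi-channel
// mastering voice (e.g. four corner speakers plus a subwoofer).
#include <xaudio2.h>

void routeToSpeaker(IXAudio2SourceVoice* voice,
                    IXAudio2MasteringVoice* master,
                    UINT32 destinationChannels,  // e.g. 6 for a 5.1 output
                    UINT32 speakerIndex)         // hypothetical corner/sub index
{
    if (speakerIndex >= destinationChannels || destinationChannels > 8)
        return;

    // One gain per (source channel x destination channel) pair; a mono source
    // needs just one row of destinationChannels gains.
    float gains[8] = { 0.0f };
    gains[speakerIndex] = 1.0f;                  // full level to the chosen speaker
    voice->SetOutputMatrix(master, 1, destinationChannels, gains);
}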

Once multiple modalities are set up, then the visual display can be constrained in a variety of ways. It could be blurred, intermittently frozen or blacked out. Configurations of haptic/tactile/auditory stimuli could then be tested against these scenarios to determine how they affect the completion of the task. Conversely, the user could be distracted (for example in a driving game), where it is impossible to pay extensive attention to the placement task. There are lots of opportunities.

Anyway, it’s been a good week.

The Saga Continues, and Mostly Resolves.

Continuing the ongoing saga of trying to get an application written in MSVC under Visual Studio 2010 to run on ANY OTHER WINDOWS SYSTEM than the dev system. Today I should be finishing the update of the laptop from Vista to Win7. Maybe that will work. Sigh.

Some progress. It seems you can’t use the “Global” namespace in the way specified in the Microsoft documentation for CreateFileMapping() unless you want to run everything as admin. See StackOverflow for more details.
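For reference, creating a mapping under the “Global\” namespace requires the SeCreateGlobalPrivilege, which normal (non-elevated) processes don’t have on Vista and later; a session-local name works fine for processes in the same login session. A minimal sketch, with a made-up mapping name and size:

// Sketch: session-local shared memory. "Global\\" names need elevation on
// Vista and later; "Local\\" (or no prefix) does not.
#include <windows.h>

HANDLE openSharedBlock(DWORD bytes)
{
    // "Local\\" maps into the caller's session namespace, so no admin is needed.
    return CreateFileMappingW(
        INVALID_HANDLE_VALUE,          // backed by the page file
        nullptr,                       // default security
        PAGE_READWRITE,
        0, bytes,                      // size (high DWORD, low DWORD)
        L"Local\\HapticSharedBlock");  // hypothetical name
}

// Usage: both processes call openSharedBlock() and then MapViewOfFile() to
// get a pointer; the second caller sees GetLastError() == ERROR_ALREADY_EXISTS.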

However, now the code is crashing on initialization issues. Maybe something to do with OpenGL?

It’s definitely OpenGL. All apps that use it either crash or refuse to draw.

Fixed. I needed to remove the drivers and install NVIDIA’s (earlier) versions. I’m not getting the debug text overlay, which is odd, but everything else is working. Sheesh. I may re-install the newest drivers since I now have a workable state that I know I can reach, but I think it’s time to do something else than wait for the laptop to go through another install/reboot cycle.

Started writing the haptic paper. Targets are CHI, UIST, or HRI. Maybe even MIG? This is now a very different paper from the Motion Feedback paper from last year, and I’m not sure what the best way to present the information is. The novel invention part is the combination of a simple (i.e., 3-DOF) haptic device with an N-DOF force-based device attached. The data shows that this combination has much lower error rates and faster task completion times than the other configurations (tactor only and open loop), and the same times as a purely haptic system. Not sure how to organize this yet….

This is also pretty interesting… http://wintersim.org/. Either for iRevolution or ArTangibleSim

The unbearable non-standardness of Windows

I have been trying to take the Phantom setup on the road for about two weeks now. It’s difficult because the Phantom uses FireWire (IEEE 1394), and it’s hard to find something small and portable that supports that.

My first attempt was to use my Mac Mini. Small. Cheap(ish). Ports galore. Using Boot Camp, I installed a copy of Windows 7 Pro. That went well, but when I tried to use the Phantom, the system would hang when attempting to send forces. Reading joint angles was OK, though.

I then tried my new Windows 8 laptop, which has an expansion slot. The shared memory wouldn’t even run there; access to the shared space appears not to be allowed.

The next step was to try an old development laptop that had a Vista install on it. The Phantom ran fine, but the shared memory communication caused the graphics application to crash. So I migrated the Windows 7 install from the Mac to the laptop, where I’m currently putting all the pieces back together.

It’s odd. It used to be that if you wrote code on one Windows platform, it would run on all Windows platforms. Those days seem long gone. It looks like I can get around this problem if I change my communication scheme to sockets or something similar, but I hate that. Shared memory is fast and clean.

Slow. Painful. Progress. But at least it gives me some time to do writing…

Results?

Looks like we got some results with the headset system. Still trying to figure out what it means (other than the obvious that it’s easier to find the source of a single sound).

HeadsetPrelimResults

Here are the confidence intervals:

confidenceIntervals

Next I try to do something with the Phantom results. I think I may need some more data before anything shakes out.

Strain Relief and Shorts

IMG_2194

Yesterday, just as I was about to leave work, one of my coworkers dropped by to see what I was doing and thought it would be fun to be experimented upon. Cool.

I fired up the system, created a new setup file, and ran the test. Everything ran perfectly, and I got more good results. When I came in this morning, though, the rig was pretty banged up. A wiring harness that had been fine for me working out bugs was nowhere near robust enough to run even one person through a suite of tasks. It’s the Law of Enemy Action.

You’ve heard of Murphy’s Law (everything that can go wrong, will). The Law of Enemy Action is similar: “People will use your product as if they are trying to destroy it.” In a previous life I designed fitness equipment, and it was jaw-dropping to see the amount of damage a customer could inflict on a product. Simply stated: you need to overdesign and overbuild if at all possible.

With that in mind, I pulled all the hardware off the Phantom and started over. New, lighter, more flexible wire. Strain-relieved connections. Breakaway connections. The works.

When it was done, I fired it up and started to test. Sensors – check. Actuators – check. Yay! And then the right pressure sensor started to misbehave. It was kind of beat up, so it made sense to replace it. But when I went to test, the new sensor was misbehaving in the same way, and it seemed to be related to turning on the vibro-acoustic actuators.

Time to open the box up and poke around. Nope – everything looked good. Maybe the connector? Aha! My new, more flexible cable was stranded rather than solid, and a few strands from one of the wires were touching the right sensor connection.

So I pulled everything apart and replaced the cable that went into the connection with 22-gauge solid wire, which then connected to my stranded cable. All fixed. And it’s another example that even though Murphy’s Law is bad enough, you should always be prepared for Enemy Action.