I can’t wait to feel my mobile screen

image of Michelangelo’s hand touching

I read about second generation haptics earlier this week and
have been fascinated to find out more about it. As with most
cutting edge tech it doesn’t have a very sexy name. The
word ‘touch’ has been used and abused to death, but this is
truly a way of touching the screen of a mobile phone. What it
means is that the mobile screen can create a sensory effect in
your finger’s nerves that imitates the sensation of touching a
textured surface. ‘So what?’ you may well ask. Well, there are five
questions that I am going to try and answer to address that.

1) Does this require any new hardware, or will it work on existing
hardware?

2) If it doesn’t work on existing hardware, how much work will it
take to adapt or build new hardware to run it?

3) What benefits does this technology bring to the consumer?

4) Can this be commercialised?

5) When will the benefits justify the costs, or when will the costs
decrease enough to make it feasible?

But before we try and answer those questions, here’s a bit more
about how it works. Disclaimer: this is just my attempt to
explain the technology in layman’s terms; here’s a link to the Senseg page
where you can get properly techy.

image of the word ‘haptic’ in Greek
The word ‘haptic’ comes from the Greek word ‘haptikos’, meaning ‘to
grasp’ or ‘to perceive’. First generation haptics used small
motors to create a vibration; you’ll feel them occasionally
on mobile devices as a sort of error signal if you press the
wrong button, and they make a quiet ‘brr’ sound.

But second generation haptics are very much cleverer. A very
thin film of conductive material is placed over the screen of
the phone/tablet. The film uses a very low electrical current to
create an electrostatic pull on the finger. By modulating the
current, the electrostatic attraction can be increased or
decreased, giving the sensation that the finger is moving toward
the screen or away from it respectively. The film is
mapped into pixels that correspond to the screen, and software
creates a virtual 3D map of contours that are then “felt” when
the finger moves over the co-ordinates of a button or
any other graphic icon…

image of finger touching haptic screen

Source: Senseg
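
To make that concrete, here’s a toy sketch of the software side: a
virtual ‘height map’ of the UI gets sampled at the finger’s position
and turned into a drive level for the film. Everything here (the grid
size, the 8-bit drive level, the function names) is my own invention
for illustration, not Senseg’s implementation.

```python
# Toy model of second generation haptics: a virtual contour map is
# sampled at the finger position and converted into a drive level for
# the conductive film. All names and numbers are invented for
# illustration -- the real pipeline will differ.

# A "height map" for a 4x4 haptic grid: 0.0 = flat screen, 1.0 = the
# raised edge of a button. A real grid would match the screen.
HEIGHT_MAP = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 1.0, 0.0],  # a raised 2x2 "button" in the middle
    [0.0, 1.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]

MAX_DRIVE = 255  # hypothetical 8-bit drive level for the modulator

def drive_level(x: int, y: int) -> int:
    """Map a finger position on the grid to a film drive level.

    Higher 'terrain' means stronger electrostatic attraction, so the
    finger feels extra friction where the virtual contour rises.
    """
    height = HEIGHT_MAP[y][x]
    return round(height * MAX_DRIVE)

# Sweep a finger across the middle row: the drive jumps as it crosses
# the button edge, which the user feels as a change in texture.
for x in range(4):
    print(f"x={x}: drive={drive_level(x, 1)}")
```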

So let’s answer the questions:

1) Does this require new hardware, or will it work on existing
hardware?

Yes, it will require new hardware. Manufacturers
would have to add three things to the phone: i) the conductive film;
ii) the modulator, which sits within the phone and converts the
software’s x, y, z co-ordinates onto the conductive film’s grid;
iii) the software to map the screen to the grid. And no, it can’t
be retrofitted to consumer products, or at least not unless your
name is Heath Robinson.
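
For component iii), the mapping itself is conceptually simple. Here’s
a minimal sketch, assuming a made-up screen resolution and film grid
size (the real film’s cell layout isn’t something Senseg has published,
as far as I know):

```python
# Hypothetical numbers throughout: the screen and grid resolutions are
# made up for illustration.

def screen_to_grid(px: int, py: int,
                   screen_w: int = 1080, screen_h: int = 1920,
                   grid_w: int = 64, grid_h: int = 114) -> tuple:
    """Convert a touch at screen pixel (px, py) to the coarser cell on
    the conductive film's grid that the modulator actually drives."""
    gx = min(px * grid_w // screen_w, grid_w - 1)
    gy = min(py * grid_h // screen_h, grid_h - 1)
    return gx, gy

# A tap near the centre of the screen lands on roughly the centre cell.
print(screen_to_grid(540, 960))  # -> (32, 57)
```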

2) How much work will it take to adapt or build new hardware to run it?

It sounds like the ‘film’ can be added relatively easily as a
‘coating’ on the screen. It would need to be added during
manufacture, so depending on what the coating is made from
(i.e. not liquid gold) it sounds like it could be mass manufactured
relatively cheaply. I’m not sure how big the modulator is, though,
and it would take up space and weight in the phone. The software
might also eat into the overall memory space.

3) What are the consumer benefits?

To be honest, I haven’t ever thought ‘I wish I could feel the
shape of that button’, or wanted an enhanced sensation when I swipe
left on my iPad. It’s not something that consumers really need,
but it is something that will enhance the overall experience. It
aims to make the whole interaction more intuitive, because it
will feel like you are actually touching the items on your
screen when you move them or tap them. The main use will be for
navigation (i.e. buttons and swipes); additional and perhaps more
useful applications will be within mobile games and mobile apps.
It adds another layer, a deeper, subtler, more intuitive layer
that will make the mobile phone feel less like a slab of liquid
metal and more organic. Once the SDK is out there you can
expect developers to start using it within apps in all sorts of
novel ways. I look forward to a Braille app and an interactive
3D map that you navigate by grabbing hold of mountains.
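
Just as a daydream, here’s what an app-facing haptics API might look
like once an SDK ships. Every class, method and texture name below is
invented for the sake of the example; it is not Senseg’s API.

```python
# A daydream of an app-facing haptics API; all names are invented for
# illustration and bear no relation to Senseg's actual SDK.

class HapticSurface:
    """Collects (region, texture, strength) requests from an app."""

    def __init__(self):
        self.regions = []

    def add_texture(self, rect, texture, strength):
        """Ask the screen to render a named texture over a rect
        (x, y, width, height), at a strength between 0.0 and 1.0."""
        self.regions.append((rect, texture, strength))

surface = HapticSurface()
# A list item that feels ridged when swiped, and a button whose raised
# edge can be felt before tapping it.
surface.add_texture((0, 0, 1080, 200), "ridged", 0.4)
surface.add_texture((100, 500, 300, 120), "raised_edge", 0.9)
print(surface.regions)
```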

4) Can this be commercialised?

I’m struggling to see how this could make money. Yes, it’s cool
and fun, but so are lots of other things that are already
available as apps without any additional hardware required (e.g.
augmented reality, 3D vision).

5) When will the benefits justify the costs?

I don’t see Samsung or Nokia rushing
to put this onto their phones in the short term. I could see a
smaller, techier player jumping on this and carving out a small
niche for themselves. Someone like ASUS, perhaps? It’s too much
of a gimmick at the moment, without specific applications or
commercial return. But I am sure it will come. It’s in that
catch-22 situation where the technology exists, but it requires
consumer demand to make the manufacturers (and operators) take
an interest, and the consumer demand will only come when there
are practical applications (i.e. apps that use the technology
practically).

I’m going to follow Senseg’s development with interest, but
they’ve been around for a year or so and there don’t seem to
have been that many developments. Maybe there are deals going on
in the background with manufacturers, operators and app
developers. I hope so.

But a growing fear I have is about the net result of bundling
all this electrical interference into one small shiny phone.
Right now my iPhone gets seriously hot when I’m downloading
data; will consumers really warm (literally) to the idea of
having their phones electrify their fingertips?

So what about third generation haptics, who’s looking at
that? I used to love reading about the experiments from the 50s
(which are sadly no longer considered ethical) whereby
psychologists, who had probably taken too much LSD under
laboratory conditions, would poke around inside subjects’ brains
with electrically charged probes and elicit ‘sensations’ by
stimulating specific parts of the sensory cortex. In the
same way that we’ll soon be directly interfacing video into the
visual cortex, the next logical step would be to try and
interface touch sensations into the sensory cortex. That really
will be cool… and probably bumpy.

image of sensory cortex

Source: NPR. Credit: Adam Cole, Nelson Hsu
