Sunday, November 25, 2007

iPhone Typing and Alternatives to Traditional Tactile Feedback

I read an article a couple of weeks ago discussing a study that found iPhone users can type as fast as users of phones with traditional keypads, but they make more mistakes. I found part of these results quite surprising. Given my own experience with the iPhone, I’d certainly agree that iPhone typing is more error prone. However, I wouldn’t have guessed that iPhone typing speeds are on par with other QWERTY devices, even disregarding errors. For me, typing on the iPhone seems to require much more focus and attention, and I am much slower at typing on the touch screen than I was on my BlackBerry.

In searching for information about alternative forms of tactile feedback, I stumbled across Immersion, a company that offers tactile feedback for touch screens. Their interfaces add vibration feedback to onscreen interaction: “varying the frequency, waveform, amplitude, and duration of the vibration.” They claim such feedback can decrease “glance time”. This could certainly be useful for interacting with interfaces while on-the-go, or multi-tasking in other situations. I'd love to try an Immersion interface out! I wonder if it would improve iPhone typing, or if typing is too granular a task?...
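Just to make those parameters concrete, here's a toy sketch of how a vibration pulse might be described by its frequency, amplitude, and duration. This has nothing to do with Immersion's actual API; it's only my own illustration of how varying those knobs could produce different-feeling pulses:

```python
import math

def vibration_waveform(frequency_hz, amplitude, duration_s, sample_rate=1000):
    """Generate a sine vibration pulse as a list of samples.

    A toy illustration of the parameters Immersion mentions
    (frequency, amplitude, duration); the waveform shape here is
    fixed to a sine for simplicity.
    """
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * frequency_hz * i / sample_rate)
            for i in range(n)]

# A short, sharp "click" vs. a longer, softer buzz:
click = vibration_waveform(frequency_hz=250, amplitude=1.0, duration_s=0.02)
buzz = vibration_waveform(frequency_hz=80, amplitude=0.4, duration_s=0.15)
```

Presumably a keypress "click" would be short and strong so it registers instantly, while a softer, longer buzz could signal something less urgent.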

Sunday, October 21, 2007

Conversing with my iPhone

Let me start off by saying I’ve had my iPhone for a little over two months now, and I love it. It’s pretty, it plays music, it lets me talk to my friends… it even wakes me up in the morning. And soon, I’ll be able to develop custom applications for it, too. Cool!

With the announcement of the forthcoming iPhone SDK, I got to thinking: I wonder what the market is for iPhone development. What kinds of apps will people want? Will they be business-related or for everyday use? What makes the iPhone a better tool than existing smartphones? What makes it worse?

One thing that sticks out to me is the lack of tactile feedback. No physical buttons means a slick, pretty package, but at what cost to usability? OK, so you’re not supposed to text while you drive, but what about simply dialing a phone number? With my previous phones, my fingers could feel around the keypad and knowingly push the numbers I needed (thank goodness for all those hours I spent on the phone as a teenager!). I never had to look at the screen until I finished dialing. With the iPhone, I pretty much have to give full visual attention to the screen for each number or letter I type. Interacting with the iPhone definitely does not lend itself to multi-tasking.

Which leads me to one of the first things I’d like to develop for the iPhone with that shiny new SDK: an audio interface. I envision the ability to speak my emails or texts or notes or appointments or phone numbers (you get the picture) into my iPhone. In my vision, iPhone will speak back to me. It will read me my incoming texts or emails, and speak up when I have an upcoming appointment. I will be able to say, “iPhone, call my mom”, and it will ask me, “at work or at home?”.
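The back-and-forth I'm imagining could be sketched as a simple command dispatcher. Everything here is hypothetical (the contact book, the follow-up question, the command grammar) and a real version would of course sit on top of speech recognition and synthesis, but it shows the kind of clarifying dialogue I have in mind:

```python
# Toy sketch of the envisioned voice dialogue; commands and replies
# are plain strings standing in for recognized/synthesized speech.
CONTACTS = {"mom": {"home": "555-0101", "work": "555-0102"}}  # hypothetical data

def handle_command(command, follow_up=None):
    """Dispatch a spoken command; ask a follow-up question if it's ambiguous."""
    words = command.lower().split()
    if words[:2] == ["call", "my"]:
        name = words[2]
        numbers = CONTACTS.get(name, {})
        if len(numbers) > 1 and follow_up is None:
            # More than one number on file: ask which one to use.
            return "at " + " or at ".join(numbers) + "?"
        choice = follow_up or next(iter(numbers))
        return f"calling {name} at {choice}: {numbers[choice]}"
    return "sorry, I didn't understand"

print(handle_command("call my mom"))          # asks "at home or at work?"
print(handle_command("call my mom", "work"))  # dials the work number
```

The nice thing about this flow is that the phone only asks a question when it actually needs to; an unambiguous command goes straight through, which is exactly what you'd want while walking or driving.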

My master’s research on reading on-the-go showed promise for audio interfaces for "reading" passages for comprehension while walking. In particular, the audio interfaces allowed people to navigate their environments more accurately and fluidly. I wonder if we would see similar results in real life with an audio interface for the iPhone...