When Intel Corp. started talking about user-aware technologies last year, my first thought was that this was nothing new. Machines have always been very good at sensing when the user is facing a particularly critical deadline. That’s how they know when to break down.
But Intel was talking about something else. Actually, Intel was talking about quite a few things, some of which have more to do with user awareness than others. That’s probably why the company now uses the phrase “essential computing” instead of talking so much about user-aware technologies.
One example of user awareness is the Human Activity Recognition Project. Intel is working on ways of inferring what someone is doing so as to be able to help. For instance, say you’re cooking. Sensors attached to utensils and packages of ingredients would allow a computer to figure out what you’re trying to do and present a recipe, or remind you to preheat the oven. One possible use of this would be helping seniors remain in their homes, suggests Russ Schafer, director of marketing and technology management at Intel Research.
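To give a rough idea of how that kind of inference could work, here is a toy sketch in Python. The sensor tags, rules and suggestions are my own illustrations, not Intel’s actual system:

```python
# Toy sketch: inferring a kitchen activity from tagged-sensor events.
# The tags, rules and suggestions below are illustrative only.

ACTIVITY_RULES = {
    frozenset({"mixing_bowl", "flour_bag", "egg_carton"}): "baking",
    frozenset({"kettle", "tea_box"}): "making tea",
}

SUGGESTIONS = {
    "baking": "Preheat the oven and open the cake recipe?",
    "making tea": "Boil water for two cups?",
}

def infer_activity(observed_tags):
    """Return the first activity whose required sensor tags are all present."""
    observed = set(observed_tags)
    for required, activity in ACTIVITY_RULES.items():
        if required <= observed:
            return activity
    return None

if __name__ == "__main__":
    seen = ["mixing_bowl", "flour_bag", "egg_carton", "spatula"]
    activity = infer_activity(seen)
    if activity:
        print(f"Detected: {activity}. {SUGGESTIONS[activity]}")
```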
Another of Intel’s ideas is tracking your physical movement and presenting information on a phone or handheld computer to help you meet your exercise goals.
Intel isn’t alone in looking at this sort of idea. The Human Media Lab at Queen’s University is making technology aware of what people are doing in several interesting ways.
At the Queen’s lab, for instance, a computer is hooked up to sensors that measure its operator’s heart rate and arm muscle movement. That way, it can tell whether the person is working or not.
The uses of that could range from benign to big-brotherish. How would you like an automated system to track whether you’re working every minute? True, something like this may be the only way certain less trusting bosses will ever let employees work from home, but still, it’s not a pretty prospect.
On the other hand, Dr. Roel Vertegaal, director of the lab, says it could also be used to control interruptions. If I’m pounding away on my keyboard, as I am right now, my PC might be set not to notify me of incoming e-mail until I stop. It could also tell my phone not to ring.
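A sketch of what that interruption logic might look like, with the idle-time reading and the notification hooks standing in for whatever sensing and messaging software would really be involved:

```python
# Hypothetical interruption-control sketch; the idle reading and the
# notification hooks are stand-ins, not any real product's API.
IDLE_THRESHOLD = 30  # seconds of no typing before interruptions are allowed

def should_interrupt(keyboard_idle_seconds):
    """Allow alerts only once the user has stopped typing for a while."""
    return keyboard_idle_seconds >= IDLE_THRESHOLD

def deliver(message, keyboard_idle_seconds, pending):
    if should_interrupt(keyboard_idle_seconds):
        print(f"Notify now: {message}")
    else:
        pending.append(message)  # hold it (and silence the phone) until the user pauses

if __name__ == "__main__":
    queue = []
    deliver("New e-mail from the editor", keyboard_idle_seconds=2, pending=queue)
    deliver("New e-mail from the editor", keyboard_idle_seconds=45, pending=queue)
    print("Held while typing:", queue)
```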
There’s also a wall between two adjoining cubicles that has a special property. Unless the people on both sides of the wall look toward it, it’s opaque. But if they both turn toward the wall – and thus toward each other – at the same time, the wall becomes transparent. This is done with privacy glass, which works something like the liquid crystal display in your laptop.
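The logic behind the wall is simple enough to sketch; the gaze readings and the glass controller here are hypothetical stand-ins for the lab’s hardware:

```python
# Toy sketch of the mutual-gaze wall; the sensor readings and the glass
# controller are hypothetical stand-ins for the lab's privacy-glass setup.
def update_wall(gaze_a_toward_wall, gaze_b_toward_wall, set_transparent):
    """Make the wall see-through only when both neighbours look toward it."""
    set_transparent(gaze_a_toward_wall and gaze_b_toward_wall)

if __name__ == "__main__":
    show = lambda transparent: print("transparent" if transparent else "opaque")
    update_wall(True, True, show)   # both look toward the wall
    update_wall(True, False, show)  # only one does
```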
The lab is also playing with various applications of sensor technology that can detect eye contact. It works by bouncing light off eyeballs. It can be used for something as simple as turning on a light, or to activate a camera when it’s pointed at someone’s face, or to pause video automatically when the viewer looks away.
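Here is one way the video-pausing idea might be scripted, with simulated sensor readings standing in for the lab’s eye-contact detector:

```python
# Hypothetical sketch of gaze-controlled playback; the readings below are
# simulated stand-ins for an eye-contact sensor.
def control_playback(eye_contact_readings):
    """Yield a play/pause command for each sensor reading."""
    for looking in eye_contact_readings:
        yield "play" if looking else "pause"

if __name__ == "__main__":
    readings = [True, True, False, False, True]  # viewer looks away, then back
    for command in control_playback(readings):
        print(command)
```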
Done right and used wisely, user-aware technology could make computers and consumer electronics easier to use. The big issue will be getting it right. We already have software that thinks it’s as smart as the people using it, and it’s often wrong.
Comment: cdnedit@itbusiness.ca