In this article, Intel's Chief Technology Officer Justin Rattner talks about context-aware devices that can "learn about who you are, how you live, work and play". He suggests that future handheld devices will use a variety of sensory technologies to collect and analyze information about their human users. For example, last Wednesday at IDF, Rattner demoed "a television remote control that figures out who is holding it based on how it is held and learns the viewer's entertainment preferences". Overall, the article suggests that context-aware computing is Intel's way of becoming more competitive in the smartphone and mobile computing market. This seems like a bold ambition for Intel, which the article describes as currently lagging behind Apple and Research In Motion in the smartphone industry.
I selected this article as my 202 in the news story because it takes the concepts of MyLifeBits and the Memex to a whole new level. Namely, not only do these future "smart" devices organize information about us, but they also use that information to make value judgments about us. However, as we discussed in class, identifying a person's state of mind is more complex than identifying the physical state of a lettuce, so I have my doubts about how much information context-aware devices can really record. Beyond the security issues that this kind of technology raises, I also wonder who it would really benefit. I feel like knowing my TV viewing preferences would benefit cable companies more than it would benefit me.
Click here to see a webcast of Justin Rattner's IDF keynote speech: http://www.intel.com/idf/keynote-speakers/
Click here for the article I read about Rattner's keynote speech: http://www.reuters.com/article/idUSTRE68E5TN20100915