
4 Ways One Database Would Help Music Fans, Industry

You'd think that, after finishing the final and the course, adding to the 202 blog would be the last thing on anyone's mind. But I saw this article and reflexively fired up the "Create Content" form, because this is essential 202, in my mind. It helped me realize why I've never been comfortable with music subscription services (I have a curated library that I don't want stuck behind a paywall) and how musicians and the industry would be better off if they could all agree to agree, for once.

Tagging with your mind (well, of course, all tagging uses your mind)

According to this article, Microsoft researchers have seen some limited success in reading subjects' minds and extracting tagging info. That is, when exposed to images of human faces, the subjects' brains fire in certain areas more than when shown images of cars, non-human animals, etc. The researchers think that, though the technique is both crude and narrowly specific, it points to a way of adding tag info to images in a more rapid and "automatic" fashion.

IR for left-brain vs. right-brain people

 "One of the challenges of info pros has been to use the structured information-retrieving and -filtering tools, which really do require sequential, left-brained thinking, while simultaneously thinking creatively and intuitively about the entire spectrum of information sources and features, which requires right-brained analysis. It sort of feels like I'm trying to solve a quadratic equation while playing the piano."

Searching Congress

Can't believe I didn't know about this site before: http://metavid.org/wiki/

It allows you to search video of members' speeches before Congress with some interesting features. In addition to keyword search (through transcripts, I think, which are provided by Congress), you can filter by date, speaker, category (setting off 202 alarms), and bill name/number; there are also some featured semantic queries.
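MetaVid's actual backend isn't described here, but the combination of keyword search over transcripts plus metadata facets is a familiar pattern. A minimal sketch, with an invented `Clip` record and hypothetical sample data, assuming simple exact-match facets and case-insensitive transcript search:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    """One speech segment with its metadata facets (hypothetical schema)."""
    speaker: str
    date: str        # ISO date, e.g. "2009-07-15"
    category: str
    bill: str
    transcript: str

def search(clips, keyword=None, speaker=None, category=None, bill=None):
    """Return clips matching every facet that was supplied."""
    results = []
    for c in clips:
        if keyword and keyword.lower() not in c.transcript.lower():
            continue  # keyword search runs against the transcript text
        if speaker and c.speaker != speaker:
            continue
        if category and c.category != category:
            continue
        if bill and c.bill != bill:
            continue
        results.append(c)
    return results

# Hypothetical data for illustration
clips = [
    Clip("Rep. Smith", "2009-06-01", "Health", "H.R. 3200",
         "We must reform health care now."),
    Clip("Sen. Jones", "2009-06-02", "Energy", "S. 1733",
         "Clean energy creates jobs."),
]
hits = search(clips, keyword="health", category="Health")
```

Each facet narrows the result set independently, which is why a faceted UI can show counts per category before you click.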

More lifebits: recording everything makes other people uncomfortable

Here's another attempt at a MyLifeBits project. He only does it for a week, but he includes his daily conversations as well. He doesn't seem to offer many new insights into the issues surrounding capturing every possible ounce of digital data.

Google gets a break with lucky.

I apparently got cookied for a Google experiment that removes the long-running "I'm feeling lucky" button. Interestingly, it also defaults to removing nearly all elements from the page - the only remaining items are the Google logo, a search box, and the words "Press Enter to Search." A mouse-over brings back some links, minus the "I'm feeling lucky" option. See the pictures for details.

Google "Wonder Wheel" Visualizes Search Results

Apparently this launched back in May, but I just noticed it when playing around with Google for this week's discussion.

The Wonder Wheel takes the input query as the center of the wheel and produces a set of expansions and related terms as petals. Clicking a related term moves out from the origin node and shows a new set of expansions; the original node shrinks and fades. You can repeat this multiple times to branch out over a conceptual space. Search results appear on the left (where the ads normally are). Here's a sample search for flower:
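The interaction model described above is essentially a tree you grow by clicking: each wheel is a node, its petals are candidate child nodes, and expanding a petal makes it the new center while the old center fades into the background. A minimal sketch of that structure, with a hypothetical `RELATED` table standing in for whatever related-term computation Google actually uses:

```python
# Hypothetical expansion table; Google's real related-term source is unknown.
RELATED = {
    "flower": ["flower delivery", "types of flowers", "flower pictures"],
    "types of flowers": ["annuals", "perennials", "wildflowers"],
}

class WheelNode:
    """One wheel: a center query with related terms as petals."""
    def __init__(self, query, parent=None):
        self.query = query
        self.parent = parent               # the faded previous center, if any
        self.petals = RELATED.get(query, [])

    def expand(self, term):
        """Click a petal: that term becomes the center of a new wheel."""
        if term not in self.petals:
            raise ValueError(f"{term!r} is not a petal of {self.query!r}")
        return WheelNode(term, parent=self)

root = WheelNode("flower")
child = root.expand("types of flowers")
```

Keeping the parent pointer is what lets the UI draw the chain of shrinking, fading nodes back to the original query.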

facial recognition is sooo last year... how about object recognition?

In lecture today, Bob made a point about the gap between humans' and computers' ability to sense and describe an object, and that a 3-year-old can recognize objects better than any computer. But what about facial recognition software?

Facial recognition software has seen heavy investment for a long time... perhaps we will move beyond recognizing faces to recognizing objects?
