mobile content management   

The decade of the 2000s saw a rapid move towards the digitization of media in everyday life. Photos, music, user-generated videos, movies, books, and more all made the move from physical artifacts to digital objects. This change brought about many new opportunities for the use of these media in daily life, since digital artifacts can more easily afford different uses than physical atoms tied to a particular place and time.

The projects on this page describe our journey from studying photo and music use, to the concepts those investigations inspired, to the products we created based on findings from field studies of these concepts in the lives of a variety of people.

photo sharing study
GENERATIVE RESEARCH (2002)

In 2002, photos were moving from physical artifacts towards digital files. Camera phones were also just on the horizon, and we saw an opportunity for the phone to become a central media hub in people’s lives, with a variety of photos always on hand. To better understand the motivations for taking and sharing photos, we conducted a study to observe current photo practices. This study involved home tours and interviews covering previous instances of photo sharing or receiving. We observed how photos were stored in the home and how they were searched for on special occasions or just to reminisce. We also observed the (often desirable) sidetracking behavior where participants ended up on tangents, exploring photos they happened upon on the way to their target. We used the data from this study in designing future concepts such as the Media Assistant and Metadata Services, as well as in the search portion of ZoneTag, as described below.

Publications:
Personal vs. commercial content: the similarities between consumer use of photos and music. Frank Bentley, Crysta Metcalf, Gunnar Harboe. CHI ’06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2006.

music context study
GENERATIVE RESEARCH (2004)

As large storage cards and hard drives were making their way into portable media devices, we knew that the phone would be the next central music device in people’s lives. We wanted to explore current music practices in order to invent novel ways of browsing collections, finding music, and sharing music recommendations. This study consisted of home tours, interviews about music acquisition and sharing, and contextual tasks to find and play music for certain occasions in the home or car. From this study, we learned about the ways that our participants refined music searches. Often, they started with music that was near at hand (or found at random) and then decided to play music that was similar to or different from it in various ways. This inspired the Metadata Knob (described below) and PlayTree concepts, which helped users focus on music that met their mood and social context. This work also helped us adapt the Metadata Services to support music content and a wider range of metadata attributes for music search.

Publications:
Personal vs. commercial content: the similarities between consumer use of photos and music. Frank Bentley, Crysta Metcalf, Gunnar Harboe. CHI ’06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2006.

Patents:
Method and system for generating a play tree for selecting and playing media content. Gunnar Harboe, Frank Bentley, Crysta Metcalf, Vivek Thakkar. US Patent 7,685,154.

Multimedia Device For Providing Access to Media Content. Frank Bentley, Gunnar Harboe, Crysta Metcalf, Vivek Thakkar, Guy Romano.

Method and Apparatus of Determining Access Rights to Content Items. Jerome Picault, Frank Bentley, David Bourne, Nicolas Lhuillier, Crysta Metcalf, Joseph Wodka. US Patent Application 20070073694.

metadata knob
CONCEPT DEVELOPMENT (2005)

The metadata knob concept was inspired by the Music Context Study, where we observed our participants start to play a song and then desire to play songs that were related to that seed song in some way. Perhaps they wanted to listen to music that had a faster or slower beat. Perhaps they wanted music that was newer or older. Or perhaps they wanted music that they listened to more or less frequently. It was almost impossible to adjust music playback in these ways at the time. We invented a new way to browse music through the use of a knob. The knob worked like a radio dial: turning it adjusted the selection attributes for the music being played. The songs would “tune” synchronously as the knob turned, much like a radio, so that the user had instant feedback. The knob could be set to control BPM, Song Published Date, Song Acquired Date, or Playcount. A simple prototype was created using a Handyboard (in the days before Arduino) and an Atari controller as the knob. A user interface on the computer could be used to set the attribute that the knob was controlling, and the software scoured the computer for music, indexed it, and provided playback through the computer speakers.
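
As a rough illustration of the tuning behavior, the sketch below (in Python, with a hypothetical Song record; not the original Handyboard code) sorts the library by whichever attribute the knob controls and maps the knob position to a position in that ordering:

    from dataclasses import dataclass

    @dataclass
    class Song:
        title: str
        bpm: float
        published_year: int
        acquired_year: int
        play_count: int

    # The attribute the knob currently controls, selectable in the UI.
    ATTRIBUTES = {
        "bpm": lambda s: s.bpm,
        "published": lambda s: s.published_year,
        "acquired": lambda s: s.acquired_year,
        "play_count": lambda s: s.play_count,
    }

    def tune(library, attribute, knob_position):
        """Map a knob position in [0.0, 1.0] to a song.

        Turning the knob up selects songs with a higher value of the
        active attribute (faster, newer, more played), much like
        sweeping a radio dial across stations.
        """
        key = ATTRIBUTES[attribute]
        ordered = sorted(library, key=key)
        index = min(int(knob_position * len(ordered)), len(ordered) - 1)
        return ordered[index]

    # Example: a quarter turn on the BPM dial picks a relatively slow song.
    library = [
        Song("Slow Burn", 72, 1998, 2003, 40),
        Song("Mid Tempo", 104, 2001, 2004, 12),
        Song("Fast Lane", 150, 2004, 2005, 3),
    ]
    print(tune(library, "bpm", 0.25).title)  # -> "Slow Burn"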

Patents:
Multimedia Device For Providing Access to Media Content. Frank Bentley, Gunnar Harboe, Crysta Metcalf, Vivek Thakkar, Guy Romano

metadata services
CONCEPT DEVELOPMENT (2003-2004)

The Metadata Services were a middleware solution for managing multimedia content on mobile devices. The services allowed for annotating content with a variety of extensible metadata tags, searching based on combinations of these tags, interfacing with hardware such as GPS and Cell ID for tagging content, and sharing content via the communications stacks available on the device. This system was created based on findings from the Photo and Music studies and was developed in conjunction with the Media Assistant application (below) to ensure that it was robust enough to meet demanding use cases. The system was productized as part of the Media Finder (below) and was the most versatile personal media management solution available at the time.
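
A minimal sketch of the annotate-and-search model, in Python with hypothetical names (the actual middleware was a native on-device service): content carries an open-ended tag dictionary, so new attributes can be added without changing any schema, and searches match on arbitrary combinations of tags.

    from dataclasses import dataclass, field

    @dataclass
    class MediaItem:
        path: str
        # Extensible metadata: any tag name maps to a value, so new
        # attributes (GPS, Cell ID, season, sharing history) can be
        # added without changing the schema.
        tags: dict = field(default_factory=dict)

    class MetadataStore:
        def __init__(self):
            self.items = []

        def annotate(self, item, **tags):
            item.tags.update(tags)
            if item not in self.items:
                self.items.append(item)

        def search(self, **criteria):
            """Return items matching every given tag/value pair."""
            return [i for i in self.items
                    if all(i.tags.get(k) == v for k, v in criteria.items())]

    store = MetadataStore()
    photo = MediaItem("photo_001.jpg")
    store.annotate(photo, cell_id="310-410-1234", season="fall", year=2003)
    print(store.search(season="fall", year=2003))  # -> [photo]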

Publications:
Intelligent Multimedia Content Management on Mobile Devices. Bhavan Gandhi, Alfonso Martinez, Frank Bentley. IEEE International Conference on Multimedia and Expo. June, 2004.

media assistant
CONCEPT DEVELOPMENT (2002-2004)

The Media Assistant was our showcase application for the Metadata Services as well as a vehicle to explore interaction with large personal content libraries. The application could ingest personal photo libraries and provided unique ways to search content based on combinations of location, year, month, season, people in the photos, and usage history (e.g. who photos were shared with or received from), the attributes that our research showed were the most common ways people remember and describe their photos. We built the system iteratively and tested key components with users and their own personal photo libraries throughout the design and development cycle. We were the first to use a zoomable map interface to display clusters of photos, in addition to providing a variety of menu-based navigation options for browsing large personal photo collections (thousands of photos). We also learned about the importance of query modification through our studies, as our participants often misremembered key details about a target photo (e.g. the year or month) and had to change these as they were searching. We supported voice queries in addition to touchscreen navigation, both novel concepts in 2002-2003. This concept influenced our ultimate design of the Media Finder application.
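
A sketch of the query-modification idea, with hypothetical Python names rather than the original implementation: a search is a set of facets, so a user who misremembered one detail can revise just that facet and re-run the query instead of starting over.

    def search_photos(photos, query):
        """Return photos whose metadata matches every facet in the query."""
        return [p for p in photos
                if all(p.get(facet) == value for facet, value in query.items())]

    photos = [
        {"file": "beach.jpg", "year": 2002, "month": "July", "person": "Anna"},
        {"file": "lake.jpg", "year": 2002, "month": "August", "person": "Anna"},
    ]

    query = {"year": 2002, "month": "July", "person": "Anna"}
    print(search_photos(photos, query))   # finds beach.jpg, not the target

    # The target was actually taken in August; change one facet and retry.
    query["month"] = "August"
    print(search_photos(photos, query))   # finds lake.jpg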

Publications:
Flexible Views: Annotating and finding context-tagged mobile content. Frank Bentley and Crysta Metcalf. Ubicomp 2006 workshop on Pervasive Image Capture and Sharing. September, 2006.

Intelligent Multimedia Content Management on Mobile Devices. Bhavan Gandhi, Alfonso Martinez, Frank Bentley. IEEE International Conference on Multimedia and Expo. June, 2004.

ambient memory player
CONCEPT DEVELOPMENT (2006)

The Ambient Memory Player was a concept that came from our Photo and Music studies. We observed that photos and music were both tied to events in people’s lives. Often when a song played, a person would remember a specific time in their life when they heard that song. In particular, one user had a vivid memory of sitting on the porch with her sisters in thick Nike sweatshirts on a cool fall night, playing a particular song. Our concept hoped to make connections like this more explicit and encourage reminiscing around music. Ed DeGuzman implemented the prototype, which used the Metadata Services to scan a user’s photo and music libraries. When a song started playing, the system would search for photos taken at the same time as one of the play times of the song or around the published or acquired date of the music. These photos would be displayed in a photo pane and would change with each song. The idea was that these photos could evoke memories of times associated with the music and enhance the music listening experience. In a two-week study, we observed how the images provoked instances of reminiscing and communication with others who were involved in the memories evoked by the photos. The photos were not seen as distracting and were sometimes placed on a second monitor or left peeking out from behind a primary window so that they could be glanced at without much distraction from a primary task on the computer.
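
The matching rule can be sketched as follows (hypothetical Python names and an assumed seven-day window; the prototype itself used the Metadata Services): a photo is shown for a song if it was taken near one of the song’s play times or near the song’s published or acquired date.

    from datetime import datetime, timedelta

    WINDOW = timedelta(days=7)  # assumed tolerance, not from the prototype

    def photos_for_song(song, photos):
        """Return photos taken near any of the song's temporal anchors."""
        anchors = list(song["play_times"])
        anchors += [song["published"], song["acquired"]]
        return [p for p in photos
                if any(abs(p["taken"] - t) <= WINDOW for t in anchors)]

    song = {
        "title": "Autumn Porch",
        "play_times": [datetime(2005, 10, 14, 21, 30)],
        "published": datetime(2005, 9, 1),
        "acquired": datetime(2005, 9, 20),
    }
    photos = [
        {"file": "porch.jpg", "taken": datetime(2005, 10, 15, 18, 0)},
        {"file": "july4.jpg", "taken": datetime(2005, 7, 4, 20, 0)},
    ]
    print(photos_for_song(song, photos))  # -> porch.jpg only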

media finder
PRODUCT (2005-2008)

The Media Finder system was our commercial implementation of the Media Assistant and Metadata Services. An initial product-grade implementation was built for a mobile platform that never launched, but it included the world’s first metadata-based filesystem on a mobile device. Later implementations of this service launched on Motorola’s Linux devices in Asia and then on 3G devices worldwide (such as 3G versions of the original RAZR), giving tens of millions of users the ability to find photos and music based on a variety of metadata attributes. Unfortunately, many of our most interesting user interface findings, such as the ability to easily modify queries or to adjust playing music with “knob”-like interactions, were not incorporated in the final product design.

ZoneTag
PRODUCT (2006)

In 2006, we partnered with Yahoo! Research Berkeley to create a J2ME client for their ZoneTag mobile photo tagging service that would run on popular Motorola devices such as the 3G RAZR and SLVR. This application used network-based Cell ID to determine a user’s location and then retrieved a list of suggested tags from the Yahoo! server based on nearby venues and events from Yahoo! Local, Upcoming, and other services. This made it easy to tag photos on devices with 12-key keypads, the most common input method at the time. In addition to the base functionality of tagging and uploading newly taken photos to Flickr, our J2ME application was the world’s first to support nearby photo browsing, offering an easy way to find photos taken by others nearby using Cell ID location technology. The ZoneTag system remained operational until 2010.
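
The suggestion flow can be sketched like this (hypothetical Python names with a local dictionary standing in for Yahoo!’s server-side index; the shipped client was a J2ME application talking to the ZoneTag servers): the phone reads the Cell ID of the tower it is camped on and asks the server for tags associated with nearby venues and events.

    # Stand-in for the server-side index built from Yahoo! Local,
    # Upcoming, and other services.
    TAGS_BY_CELL = {
        "310-410-1234": ["wrigleyville", "cubs game", "chicago"],
    }

    def suggested_tags(cell_id):
        """Return tag suggestions for the tower the phone is camped on."""
        return TAGS_BY_CELL.get(cell_id, [])

    # On a 12-key keypad, picking "cubs game" from a suggestion list
    # beats multi-tapping it out letter by letter.
    print(suggested_tags("310-410-1234"))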