(Girardin, 2007) argues that the current state of location-aware computing focuses perhaps too strongly on increasing the accuracy and granularity of location sensing and tracking technologies. He calls for
…user-centered field studies…that would discuss (and perhaps challenge) the need of fine-grained location information…Similarly, few user-centered studies have been done to understand how to design applications that take into account the lack of maturity, the underlying imperfections and inherent uncertainties of location technologies.
Our team discovered through our brainstorming sessions that location-awareness alone is not a sufficient method of discovering “situational context”. (Vogel et al, 2004) suggests several criteria that are necessary for the four phases of interaction with public displays. For example, the user’s body position and orientation are useful in determining the user’s “interruptibility”, or openness to communication. Motion sensing is useful for subtle interaction, so that the system can tell whether the user has paused or slowed down near the display (to determine interest in interaction). Fine-grained location awareness is sufficient for determining whether the user is “close enough” for personal interaction with the system. To support (Vogel et al, 2004)’s four phases of interaction, therefore, we need methods of determining interruptibility, interest in interaction, and whether or not the user is within the radius of personal interaction.
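The mapping from sensor cues to interaction phase could be sketched as a simple decision function. The phase names, distance threshold, and boolean cues below are placeholders of our own, not values from (Vogel et al, 2004):

```python
def classify_phase(distance_ft, facing_display, is_moving):
    """Infer an interaction phase from coarse sensor cues.

    Thresholds are illustrative; real values would come from
    calibration with the actual display and sensors.
    """
    if distance_ft > 10:
        return "ambient"      # user is merely within viewing range
    if not facing_display:
        return "implicit"     # nearby but not open to interaction
    if is_moving:
        return "subtle"       # slowing or pausing near the display
    return "personal"         # close, oriented toward the display, stationary
```

A user standing three feet away, facing the screen, and standing still would be classified as `"personal"`, while the same user with their back turned would be `"implicit"`.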
Several proposals by members of our team build upon the four phases listed above, extending the “personal” phase of interaction to also support small-group interaction (2-6 people) and transitions from small-group to personal and from small-group to large-group interaction. Personalized small-group interaction differs from the subtle interaction phase in that subtle interactions do not include taking control of and explicitly issuing requests to the system. The BlueBoard and MERBoard systems described in (Russell et al, 2004) are examples of personalized small-group interaction. The system must therefore also include sensor(s) that can distinguish between people merely collocated around the public display (subtle interaction) and members of a team gathering around the display (personal interaction).
- Is Bluetooth technology accurate enough for location-awareness in public displays, and what level of granularity is necessary to support (Vogel et al, 2004)’s four phases of interaction?
- How can we detect interruptibility and interest in interaction in our users?
- What access controls are necessary to support both personal interactions with the public display and group interactions with the display without sacrificing functionality and/or privacy?
- What kinds of content do users wish to share with others in a public forum? In a semi-private forum?
Cellular phones are probably the most pervasive devices among students at SI, and while it is possible to determine location by triangulating their cellular signals, intercepting such signals would likely require expensive equipment (and there are probably not enough cellular towers in suburban areas for triangulation to be possible). The next most common wireless technology that can be used for location-awareness is also found on most cell phones: Bluetooth wireless adapters. (Madhavapeddy & Tse, 2005) found that Bluetooth technology was “poorly suited to the purpose of fine-grained, low latency location inference”, but we do not believe that fine-grained location-awareness is necessary for our project. Part of this project will involve looking for either a software or a hardware method of attenuating Bluetooth signals so that we can vary the detection range.
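One software alternative to physically attenuating the signal is thresholding on received signal strength (RSSI). A minimal sketch, assuming a log-distance path-loss model; the function names, the reference power, and the path-loss exponent are our own placeholder assumptions and would need calibration against the actual hardware:

```python
def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Estimate distance (metres) from a Bluetooth RSSI reading.

    Uses the log-distance path-loss model, where tx_power_dbm is the
    expected RSSI at 1 m. Both defaults are uncalibrated placeholders.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))


def within_range(rssi_dbm, threshold_m=1.0):
    """Treat a device as 'present' only if it appears within threshold_m."""
    return estimate_distance_m(rssi_dbm) <= threshold_m
```

Varying `threshold_m` would then play the role of varying the detection range, without any hardware modification, at the cost of RSSI's well-known noisiness indoors.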
AttentionMeter, developed by Jackie Lee, Jon Wetzel, and Ted Selker at MIT, is a freely available software toolkit that analyses “facial expression, body motion, and attentive activities”. The second part of this project will involve integrating AttentionMeter as the main mechanism for detecting interruptibility and interest in interaction. AttentionMeter requires the use of a webcam.
AttentionMeter will be used in conjunction with Bluetooth detection to control the type and scope of information that is displayed. Bluetooth detection can tell the system how many people are in the room/area, but mere presence does not convey level of engagement with the display. For example, a person can be standing very close to the device, but with his back towards the display. AttentionMeter can be used together with location-awareness to identify when it is appropriate for the display to show individually-relevant information.
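The combination of the two sensing channels might look like the following sketch, where a presence count from Bluetooth and per-face attention scores from AttentionMeter select a content mode. The mode names and the attention threshold are illustrative assumptions, not part of either toolkit:

```python
def choose_display_mode(num_present, attention_levels, min_attention=4):
    """Pick a content mode from presence and attention readings.

    num_present:       people detected via Bluetooth in the room/area
    attention_levels:  one AttentionMeter-style score per detected face
    min_attention:     placeholder engagement threshold
    """
    attentive = [a for a in attention_levels if a >= min_attention]
    if num_present == 0:
        return "public"      # nobody nearby: most public content
    if not attentive:
        return "ambient"     # people present, but none engaged
    if len(attentive) == 1:
        return "personal"    # one engaged viewer: individually-relevant info
    return "group"           # multiple engaged viewers: shared content
```

Note that the close-but-back-turned person in the example above would contribute to `num_present` but not to `attentive`, so the display would stay in an ambient mode.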
One possible scenario (with numbers completely made up):
Mouly is the only user within the 3-ft radius of the Bluetooth detection device, and AttentionMeter returns an attention level of 5, which the system infers to mean that he alone is looking at the display, either already interacting or interested in interaction. The system then displays ThankYous that have been sent to him recently, along with reminders of upcoming deadlines and events. When Hung walks up to the display, AttentionMeter detects another face, and the system realizes that Mouly has not given Hung permission to read Mouly’s notes, so they are faded out and no longer legible.
When Hung disengages from the screen (i.e., by turning and walking away), the system recognizes that Mouly is once again the sole user of the system, and the notes come back into view. When Mouly also disengages from the system and walks away, the system recognizes that nobody is directly interacting with it, and scans the room for other people who are within viewing distance (setting up for ambient interaction).
It finds that Ben, Perry, and David, along with Hung and Mouly, are in the room, and so displays information that is relevant to those five people. When everybody leaves the room, the system reverts to displaying the most public information, such as MichiPoster.
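The permission check that drives the fade-out in this scenario could be sketched as a filter over the notes on screen. The names mirror the made-up scenario above, and the data structures (`notes` as owner/text pairs, `permissions` as owner-to-allowed-viewers) are our own assumptions:

```python
def visible_notes(notes, viewers, permissions):
    """Return only the notes that every current viewer may read.

    notes:        list of (owner, text) pairs currently on screen
    viewers:      users AttentionMeter reports as looking at the display
    permissions:  dict mapping an owner to the set of users allowed
                  to read that owner's notes (owners always may)
    """
    return [
        (owner, text) for owner, text in notes
        if all(v == owner or v in permissions.get(owner, set())
               for v in viewers)
    ]


notes = [("Mouly", "ThankYou from a colleague")]
perms = {"Mouly": set()}  # Mouly has granted nobody else access

visible_notes(notes, ["Mouly"], perms)          # note is shown
visible_notes(notes, ["Mouly", "Hung"], perms)  # note is hidden
```

When Hung leaves the `viewers` list, the note passes the filter again, matching the fade-back-in behaviour in the scenario.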
Figure 1: System Information Flow – Arrows indicate direction of information flow; items in red indicate the components of the system that I will be responsible for.
Figure 2: Proposed 4 Phases of Interaction. This differs from (Vogel et al, 2004) in that personal interaction can involve multiple users.
Figure 3a: Even though a user is close enough to the display to interact with it, AttentionMeter recognizes that the user is actually not open to interaction, and will refrain from displaying private information.
Figure 3b: Now that the user is close and is looking at the public display, AttentionMeter will notify the system and allow it to display the user’s personal information.
Figure 3c: Two friends are using the display. Personal information continues to be displayed, and collaborative functionalities are enabled as well to support groupwork, similar to MERBoard and BlueBoard.
Figure 3d: The two users are not friends: AttentionMeter notifies the system, which removes private information from the display.