21.514 Companions Project Newsletter

From: Humanist Discussion Group (by way of Willard McCarty <willard.mccarty_at_kcl.ac.uk>)
Date: Fri, 1 Feb 2008 06:54:47 +0000

               Humanist Discussion Group, Vol. 21, No. 514.
       Centre for Computing in the Humanities, King's College London
                     Submit to: humanist_at_princeton.edu

         Date: Fri, 01 Feb 2008 06:46:07 +0000
         From: Yorick Wilks <Yorick_at_dcs.shef.ac.uk>
         Subject: Companions Project Newsletter

Dear Colleague:

Welcome to the first newsletter of the COMPANIONS project: a 12.88m-Euro
interdisciplinary research project which focuses on combining
advanced technologies to create personal, persistent 'agents' or
'Companions'. A Companion is an agent or 'presence' that communicates
with its user, and develops a relationship with them, primarily
through using and understanding speech.

This version is in text, with appropriate web links, but you can also
download the newsletter as a PDF (316 KB):

You can go to our website to find out more: http://www.companions-project.org/

Please forgive any duplication of this in your mailbox.
Best wishes,
Professor Yorick Wilks

1. What are Companions?
2. Demonstrators: The Health and Fitness Companion
3. Demonstrators: Introducing the PhotoPal WoZ Platform
4. Fourth Bellagio International Workshop on Human-Computer Conversation
5. Scientists Discuss Artificial Companions
6. Feature: Nabaztag as a first Companion
7. Our Industrial Partners
8. Scientific Governing Council
9. Contact

1. What are Companions?

COMPANIONS proposes a new paradigm for the way people deal with the
Internet: computational agents that are personalized and persistent,
sensitive to the needs of, and the relationship with, a single owner.
The embodiment of a Companion is relatively unimportant: it could be
a head on a screen, a mobile phone, or some simple object that is
easy to carry about, like a handbag. The key thing is talk: the
Companion, whatever its size or shape, will be a conversational
entity, interacting with its owner over long periods. It is an ECA
(Embodied Conversational Agent), but with the emphasis on the
conversation rather than the graphical form. That is what the
project's first trial demonstrators are aiming at, however simplified
they turn out to be.

The Companion vision is that a Companion becomes, in a precise sense,
part of the user's memory on the web, essentially their memory of
themselves and their life events. The originality here is the use of
conversation as a tool of reminiscence for users who will already
have much of their life's data in digital form, such as images, texts
and videos. The Companion is there to give that data a narrative
form, a life story, for the benefit of the user and their successors.
In the Senior Companion, of course, the aim is also to divert,
entertain, instruct and help.

A crucial component in the Companion vision is the relationship of
the Companion, not only to its owner, but to the Companions of other
users; as well as the relationship of users to each other mediated by
their Companions. Users are likely to have revealed details of
themselves (their financial intentions, their health, etc.) to their
Companion that they may not want shared or gossiped about. Companions
will need to show trustworthiness by understanding the different
levels and types of information entrusted to them and what can be
done with it: how far, when making a date with another user via his
or her Companion, it can reveal relevant and useful things ('I think
my owner likes yours!').
Companions must therefore know how they are affecting the moods and
emotions of their users and how far they are doing what they were
intended to do, and not more; they must be able to assess both their
owner's state and their own capacities.


2. Demonstrators: The Health and Fitness Companion

The Companions Demonstrators will show off the capabilities of the
technology, and a number of components or modules that future
research and products can build on to create a variety of Companions.

The Health and Fitness Companion supports a healthy lifestyle by
maintaining records of a user's health-related information (eating
habits, physical exercise) over a long period.

In the first prototype we have focused on dialogues where the user
gives the Companion background information, such as current exercise
and eating habits. Based on this, the Companion will provide the user
with information on healthier eating habits (including how to cook
healthier food) and how these relate to various fitness activities.
Typically, the user and the Companion make an overall plan for the
whole day, and there can be more detailed conversations during the
day. The first prototype is capable of having a dialogue with the
user covering all daily meals. The following dialogue excerpt gives
an example of the current system:

Markku: <Touches the Companion with the RFID-tag>
Companion: Hi Markku! How are you doing today?
Markku: I'm fine, thank you.
Companion: How about taking a look at your eating habits?
Markku: Sure, why not.
Companion: Would you say yesterday was an average day?
Markku: Yes, I guess so.
Companion: What time did you wake up?
Markku: At seven o'clock
Companion: Do you usually wake up around seven in the morning?
Markku: No, usually around 8 in the morning.
Companion: What did you eat and drink between getting up and going to work?
Markku: Some coffee and juice
Companion: Did you eat something between getting up and going to work?
Markku: Yes, I had toast
Companion: Did you have milk or sugar in your coffee?
Markku: Yes, both milk and sugar

The current implementation uses a unique combination of spoken
dialogue system technology and physical Nabaztag/tag rabbits. In
future prototypes, cooking and physical exercises will be
incorporated, including a mobile Companion that provides feedback and
advice during physical activities, such as jogging exercises.
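The kind of form-filling exchange shown in the excerpt above can be sketched as a simple slot-filling loop. The following is a hypothetical illustration only: the slot names, questions and scripted answers are invented for the sketch and are not the project's actual implementation.

```python
# Minimal slot-filling sketch of a health-diary dialogue.
# Hypothetical illustration; in a real system the replies would come
# from a speech recognizer rather than a scripted dictionary.

QUESTIONS = {
    "average_day": "Would you say yesterday was an average day?",
    "wake_time": "What time did you wake up?",
    "breakfast": "What did you eat and drink between getting up and going to work?",
}

def run_dialogue(answers):
    """Ask each unfilled slot's question and record the user's answer."""
    filled = {}
    transcript = []
    for slot, question in QUESTIONS.items():
        transcript.append(("Companion", question))
        reply = answers[slot]          # stand-in for speech input
        transcript.append(("User", reply))
        filled[slot] = reply
    return filled, transcript

if __name__ == "__main__":
    answers = {"average_day": "Yes, I guess so",
               "wake_time": "At seven o'clock",
               "breakfast": "Some coffee and juice"}
    filled, transcript = run_dialogue(answers)
    for speaker, line in transcript:
        print(f"{speaker}: {line}")
```

The sketch also shows why the Companion can ask sensible follow-ups: once a slot is filled ('At seven o'clock'), its value is available for comparison against the stored routine ('Do you usually wake up around seven?').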


Watch the demonstrator on YouTube:

3. Demonstrators: Introducing the PhotoPal WoZ Platform

In order to explore how people talk about their digital photos (the
details of events, places and memories), the Companions project has
developed a series of Wizard of Oz (WoZ) platforms.

The idea of the WoZ approach is that, to explore advanced
technologies that are beyond the reach of the researcher (through
barriers of cost, technical feasibility, etc.), we can replace the
technology with a human 'wizard'. The wizard allows the unknowing
participant to experience, describe and interact with the
technological functionality in which the researcher is interested.

The Companions WoZ examples allow participants to discuss their
photos with an on-screen avatar, their PhotoPal. The setup enables
people to engage in an experience which is not yet technically
possible, namely to talk openly and naturally about their photos to
what they believe to be an intelligent computer system. The value of
this approach is twofold:

1. On an interaction design level it is possible to investigate how
best to develop elements such as interface, avatar design, aesthetics
and general functionality.

2. The sessions provide a hugely rich source of data for a variety of
dialogue corpora, thus aiding the speech work which will make such an
interaction a reality.


4. Fourth Bellagio International Workshop on Human-Computer Conversation

This workshop brings together academic researchers and industrialists
concerned with all aspects of human-computer conversation and the
associated research issues (emotion, relationships, companionship,
embodiment, ECAs, memory, evaluation, etc.) for which persistent,
personalised computational agents will need solutions if they are to
exist, as they will, in the reasonably near future.

Earlier workshops in this series were highly successful at getting
industry groups (as well as university and independent researchers)
to display their interests and prototypes.

Since 2000, there has been a great upsurge in research funding, and
another aim of the meeting will be to bring together members of
currently active research consortia in the EU and US, such as CALLAS.

The call for papers, the list of invited speakers and the programme
committee will be sent out nearer the time.

Location: Grand Hotel Villa Serbelloni, Bellagio, Italy
Date: 6-7 October, 2008
Register your interest: events_at_oii.ox.ac.uk
Details: http://www.companions-project.org/events/

5. Scientists Discuss Artificial Companions

The Oxford Internet Institute, a Companions project partner, hosted a
workshop meeting of scientists from Europe and the USA to discuss the
future of 'Artificial Companions in Society' in Oxford on October
26th, 2007. Research into such characters is fast expanding and the
general perception is one of a future in which people will become
emotionally attached to these companions and will rely on them in
various ways, such as storing and accessing their personal memories
on the Internet.

The workshop was preceded by a lecture by Sherry Turkle, Abby
Rockefeller Mauzé Professor of the Social Studies of Science and
Technology at MIT and Director of the MIT Initiative on Technology
and Self. Her lecture, entitled 'Cyberintimacies/Cybersolitudes',
asked questions
such as 'What kinds of relationships are appropriate to have with machines?'

Papers presented at the workshop, on speakers' perspectives on the
present and future of this emerging science, included titles such
as: 'Conversationalists, maybe - But Confidants?' (by Margaret
Boden); 'Falling in Love with a Companion' (David Levy); 'Are
Artificial Companions Better Than Real Ones?' (Joanie Gillispie); 'A
Victorian Companion?' (Yorick Wilks) and 'Robots Should be Slaves'
(Joanna Bryson) (the Position papers are available on the Companions website).

The event was organized by Professor Yorick Wilks, Director of the
EU-funded Companions project, on behalf of the e-Horizons Institute,
a unit of the James Martin 21st Century School; the workshop was
also supported by Microsoft Research.


6. Feature: Nabaztag as a first Companion

The Nabaztag rabbit is being used as the first Companion avatar in
two initial demonstrators in the project. In the Health and Fitness
Companion, being integrated at the University of Tampere, Nabaztag is
the free-standing rabbit available commercially from Violet in
Paris, adapted with an API so as to receive speech input as well as
output. In the Senior Companion, being integrated at the University
of Sheffield, Nabaztag appears as a talking 2D screen rabbit,
discussing the content of personal photos.

Nabaztag is the brainchild of Rafi Haladjian (the name is Armenian
for 'rabbit'). It is a plastic rabbit that can move its ears, flash
colours and speak. Its input is entirely over wifi, driven from a
remote website. The standard use is to convey messages and emotions,
all entered at a website, to a remote friend or lover: it can flash
blue with its ears down and say or sing 'I miss you', or indeed
anything else you type in. Rafi Haladjian is on the Companions
Governing Council, and we hope for a closer relationship in the
future, as Nabaztag has much of the attractive, distinctive and
expressive quality one wants in a physical embodiment of the
Companion idea, especially as the Tampere version can now listen as
well as talk!

7. Our Industrial Partners

The automatic speech recognizer (ASR) and the text-to-speech (TTS)
system integrated in the present Companions demonstrators have been
provided by Loquendo. While themes related to improving acoustic
quality, naturalness, and expressivity of speech generation are
subjects of research done by Loquendo in Companions, the ASR
integrated in the demonstrators is the Loquendo commercial product.
Loquendo ASR can support speech applications that need accurate
recognition of broad vocabularies, up to 1,000,000 word forms. The
speech recognition engine is based on algorithms that integrate
neural networks and continuous-density hidden Markov models. The TTS
pronunciation lexicon ensures that specialized vocabularies,
abbreviations, acronyms, and even regional pronunciation differences
can sound as the speech application developer intends them to. The
acoustic characteristics of the voices (pitch, speaking rate, and
volume) can be fine-tuned and controlled at the application level.

One of today's buzzwords among telecoms companies, and a technology
that could significantly enable some future Companions features, is
IMS (IP Multimedia Subsystem). This acts as the glue that will bring
today's mobile, fixed and IP telephony into a convergent, unified
entity. IMS will offer significant advantages in areas such as
multi-device use, usability, session transfer, identity tracking,
personalization and security, all of which will undoubtedly be
crucial in future Companions prototypes. The same Telefonica I+D
team working on Companions has recently collaborated in the
development of scenarios and an implementation for one of these
projects, involving many of the above concepts. Our mixed presence
and identity tracking demonstrator transfers sessions between
different devices such as PCs, smartphones and touch panels,
enabling users to perform complex tasks such as reading their
incoming mail and SMS/MMS, accessing the internet and granting
access to their homes. We hope to bring some of this technology into
later stages of the project, to make the Companion able not only to
recognise each user's presence but to follow him or her across
different devices. http://www.tid.es/html_eng/index.html

As An Angel
As An Angel is a European SME, founded in 2001 and based in Paris,
specializing in human-computer dialogue and agent-based multimodal
interfaces. It has produced a dozen conversational agents, prototypes
and demonstrators for advertising agencies, banks, and other
advertisers. As An Angel is about to launch an online automated
generator of personified avatars, named AngelStudio. This generator
allows users to develop their virtual double in just a few minutes.
This personified, learning avatar will be able to carry out delegated
actions such as representing the user on his or her personal site or
blog, on meeting sites and forums, and in online communities, virtual
worlds and simulation games. The first application of AngelStudio
technology is the game SimDate, currently in its final stage of
development and due to launch online in January 2008. In this
simulation of a seduction dialogue, the player's purpose is to seduce
a character (derived from a real person) through an instant
messaging-like dialogue with a webcam. http://www.asanangel.com/

The major undertaking of TeliaSonera has been the development of a
dialogue management method suitable for automatic troubleshooting and
other problem-solving applications. The method has a theorem-proving
flavour, in that it recursively decomposes tasks into sequences of
subtasks and atomic actions. An explicit objective when designing the
method has been that it should be usable by people other than the
designers themselves. A pilot implementation in the domain
of over-the-phone broadband support has been developed and has been
used for internal data collection in order to validate the basic
ideas and fine-tune the design. We are particularly interested in
trying to apply unsupervised machine learning methods, such as
reinforcement learning, to enable this kind of system to learn to
improve its behaviour by itself. http://www.teliasonera.com/
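The recursive decomposition described above can be sketched in a few lines. This is a hypothetical illustration: the task names and decomposition rules below are invented for the sketch, and the actual TeliaSonera method is, of course, richer than this.

```python
# Sketch of recursive task decomposition for a troubleshooting dialogue.
# A task with a rule expands into subtasks; a task with no rule is an
# atomic action. The expansion is depth-first, preserving order.

RULES = {
    "fix_broadband": ["check_cables", "restart_modem", "test_connection"],
    "restart_modem": ["power_off", "wait", "power_on"],
}

def decompose(task):
    """Recursively expand a task into its flat sequence of atomic actions."""
    if task not in RULES:          # atomic action: no decomposition rule
        return [task]
    actions = []
    for subtask in RULES[task]:
        actions.extend(decompose(subtask))
    return actions

print(decompose("fix_broadband"))
# -> ['check_cables', 'power_off', 'wait', 'power_on', 'test_connection']
```

In a dialogue setting, each atomic action would correspond to a system turn (an instruction or question to the caller), and the theorem-proving flavour comes from treating a task as proved once all of its subtasks succeed.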


8. Companions Scientific Governing Council

Chair: Professor Ricardo Baeza-Yates (Yahoo! Research, Barcelona, Spain);
Members: Professor James Allen (University of Rochester, USA);
Professor Dr Harry C. Bunt (University of Tilburg, Netherlands); Dr
Gregory Grefenstette (CEA, France); Dr Rafi Haladjian (Violet,
France); Dr David Levy (Intelligent Systems, UK).

9. Contact

Professor Yorick Wilks
Computer Science Dept, Univ. of Sheffield, Regent Court, 211 Portobello St,
Sheffield S1 4DP, UK
Tel: +44 (0)114 2221804
Email: yorick_at_dcs.shef.ac.uk
Web: http://www.companions-project.org/

Companions is a European Commission Sixth Framework Programme
Information Society Technologies Integrated Project (IST-34434)
Received on Fri Feb 01 2008 - 02:11:27 EST
