Call for Participation: 3rd VideOlympics Showcase Event at ACM CIVR '09
http://www.VideOlympics.org
ACM International Conference on Image and Video Retrieval (ACM CIVR'09)
July 8-10, 2009
Island of Santorini, Greece
http://www.civr2009.org
Demo sessions of video retrieval systems are ideal venues to disseminate
scientific results. Existing demo sessions, however, fail to engage the
audience fully. Real-time evaluation of several video retrieval systems in
a single showcase increases impact. Encouraged by the success of previous
editions, we will again organize a VideOlympics showcase at the 2009 ACM
International Conference on Image and Video Retrieval.
The major aim of the VideOlympics is to promote research in video
retrieval. A further goal is to give the audience a good perspective on
the possibilities and limitations of current state-of-the-art systems.
Where traditional evaluation campaigns like TRECVID focus primarily on
the effectiveness of collected retrieval results, the VideOlympics also
takes into account the influence of interaction mechanisms and advanced
visualizations in the interface. Specifically, we aim for a showcase that
goes beyond the regular demo session: it should be fun for the
participants to take part in and fun for the conference audience to
watch. For all these reasons, the VideOlympics should only have winners.
As in last year's edition, a number of TRECVID participants will
simultaneously perform an *interactive search task* during the
VideOlympics showcase event. See the video trailer available at
www.VideOlympics.org for an impression of the event.
New in 2009
For the first time, the 2009 edition of the VideOlympics will include a
round with *novice* users, in addition to the round with expert users.
The novice users will be selected from a group of high-school teenagers
from the island of Santorini, who can be assumed to have a decent command
of English. Moreover, each novice user may be given a short training
session with your video search engine (duration to be defined).
Setup
We will be working with TRECVID 2008 test data (100 hrs of broadcast TV
news, documentaries, and educational programming) from the Netherlands
Institute for Sound and Vision. Text-only search topics will be revealed
during the showcase, similar in spirit to those used in several editions
of the TRECVID interactive search tasks, e.g.:
* Find shots of traffic signs
* Find shots of a meeting with a large table and people visible
* Find shots of a daytime demonstration or protest with at least part
of one building visible
In contrast to TRECVID's interactive search task, where results are
submitted at the end of the search session, results should be submitted
immediately after they are found (see our protocol). This encourages
quick retrieval of relevant results as well as unique results not found
by others. Note that the main aim of the VideOlympics is to give the
audience a good perspective on the possibilities and limitations of
current state-of-the-art systems. Results are not meant for publication,
but for
winning the *Golden Retriever* awards. Golden Retrievers will be awarded
for:
* Best performer
* Most impressive interface
* The public's favorite
* The one the public could use
* ...
In summary, the VideOlympics will only have winners.
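To illustrate the difference with TRECVID-style batch submission, the
hypothetical sketch below sends each relevant shot to the scoreboard the
moment the user marks it, rather than buffering results until the end of
the search session. The endpoint URL, field names, team/topic identifiers,
and the submit_shot helper are illustrative assumptions only; the actual
protocol and submit software are provided by the organizers (see
http://www.VideOlympics.org).

    # Hypothetical illustration of immediate result submission.
    # The endpoint, payload format, and identifiers below are assumptions;
    # the real protocol and software come from www.VideOlympics.org.
    import json
    import urllib.request

    SCOREBOARD_URL = "http://example.org/videolympics/submit"  # placeholder

    def submit_shot(team_id, topic_id, shot_id):
        """Send a single relevant shot to the scoreboard as soon as it is found."""
        payload = json.dumps({
            "team": team_id,
            "topic": topic_id,
            "shot": shot_id,
        }).encode("utf-8")
        request = urllib.request.Request(
            SCOREBOARD_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            response.read()  # response body is ignored; delivery is what matters

    # Submit each shot immediately when the user marks it relevant,
    # instead of collecting results and submitting once at session end.
    for shot in ["shot123_4", "shot87_12"]:  # hypothetical shot identifiers
        submit_shot(team_id="my-team", topic_id="0901", shot_id=shot)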
Constraints
* Only one system per team can be connected to the scoreboard.
* Your interactive systems should work on at least one laptop, with all
relevant TRECVID 2008 test data on board.
* Your interactive system cannot use online information extraction from
the Internet.
* We provide the protocol and software for a simple result-submission
system (see http://www.VideOlympics.org).
Participation
If you would like to participate in the VideOlympics Showcase, send an
email to: submit@videolympics.org before February 1, 2009.
Please include the following:
* A 1-page description of your video search engine, possibly including a
screenshot, formatted in PDF using the camera-ready templates available at
http://www.acm.org/sigs/publications/proceedings-templates/
* An indication of whether you will participate in the novice and/or
expert round.
* 3 possible text-only search topics of varying complexity (NOT similar to
existing TRECVID 2007/2008 topics).
All system descriptions will be peer-reviewed by at least 2 referees.
Criteria for selection are novelty, uniqueness, and potential for audience
involvement. If your system is accepted, you are required to provide
ground-truth annotation for one of your submitted search topics (the
topic will be selected by the organizers).
Organizers
Cees Snoek, University of Amsterdam, The Netherlands
Marcel Worring, University of Amsterdam, The Netherlands
Rong Yan, IBM Research, USA
Alex Hauptmann, Carnegie Mellon University, USA
Co-Organizers
Ork de Rooij, University of Amsterdam, The Netherlands
Koen van de Sande, University of Amsterdam, The Netherlands
Stefanos Vrochidis, Informatics and Telematics Institute, Greece