LSC '20: Proceedings of the Third Annual Workshop on Lifelog Search Challenge
SESSION: Oral Paper Session
Interactive Lifelog Retrieval with vitrivr
- Silvan Heller
- Mahnaz Amiri Parian
- Ralph Gasser
- Loris Sauter
- Heiko Schuldt
The variety and amount of data being collected in our everyday life poses unique challenges for multimedia retrieval. In the Lifelog Search Challenge (LSC), multimedia retrieval systems compete in finding events based on descriptions containing hints about structured, semi-structured and unstructured data. In this paper, we present the multimedia retrieval system vitrivr with a focus on the changes and additions made based on the new dataset and on our successful participation at LSC 2019. Specifically, we show how the new dataset can be used for retrieval in different modalities without sacrificing efficiency, describe two recent additions, temporal scoring and staged querying, and discuss the deep learning methods used to enrich the dataset.
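The abstract mentions temporal scoring, where evidence for consecutive moments is combined. As a rough illustration of that general idea (not vitrivr's actual implementation; the weighting scheme and time window below are invented), a minimal sketch in Python:

```python
# Illustrative sketch of temporal scoring: combine per-item scores from two
# sub-queries, rewarding items whose best partner for the second sub-query
# lies within a fixed time window. Names and weighting are assumptions, not
# vitrivr's actual implementation.

from datetime import datetime, timedelta

def temporal_score(hits_a, hits_b, window=timedelta(minutes=30)):
    """hits_a / hits_b: lists of (item_id, timestamp, score) per sub-query."""
    combined = []
    for id_a, t_a, s_a in hits_a:
        # Best-scoring hit of the second sub-query within the time window.
        partners = [s_b for _, t_b, s_b in hits_b if abs(t_b - t_a) <= window]
        bonus = max(partners, default=0.0)
        combined.append((id_a, s_a + bonus))
    return sorted(combined, key=lambda x: x[1], reverse=True)

hits_a = [("img1", datetime(2020, 1, 1, 9, 0), 0.8),
          ("img2", datetime(2020, 1, 1, 15, 0), 0.7)]
hits_b = [("img3", datetime(2020, 1, 1, 9, 10), 0.9)]
print(temporal_score(hits_a, hits_b))  # img1 benefits from a nearby partner
```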
VRLE: Lifelog Interaction Prototype in Virtual Reality: Lifelog Search Challenge at ACM ICMR 2020
- Aaron Duane
- Björn Þór Jónsson
- Cathal Gurrin
The Lifelog Search Challenge (LSC) invites researchers to share their prototypes for interactive lifelog retrieval and encourages competition to develop and evaluate effective methodologies for this task. In this paper we present a novel approach to visual lifelog exploration based on our research to date utilising virtual reality as a medium for interactive information retrieval. The prototype discussed is an iteration of a previous system which participated in the first LSC, held at ACM ICMR 2018, where it ranked first.
LifeGraph: A Knowledge Graph for Lifelogs
- Luca Rossetto
- Matthias Baumgartner
- Narges Ashena
- Florian Ruosch
- Romana Pernischová
- Abraham Bernstein
The data produced by efforts such as lifelogging is commonly multi-modal and can have manifold interrelations with itself as well as with external information. Representing this data in such a way that these rich relations and all the different sources can be leveraged is a non-trivial undertaking. In this paper, we present the first iteration of LifeGraph, a knowledge graph for lifelogging data. LifeGraph aims not only at capturing all aspects of the data contained in a lifelog but also at linking them to external, static knowledge bases in order to put the log as a whole, as well as its individual entries, into a broader context. At the Lifelog Search Challenge 2020, we show a first proof-of-concept implementation of LifeGraph as well as a retrieval system prototype which utilizes it to search the log for specific events.
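As a loose illustration of the knowledge-graph idea, a minimal sketch of a triple store in which lifelog entries and external facts can be matched jointly; all identifiers and predicates are invented for illustration, and LifeGraph's actual schema and linking are described in the paper:

```python
# Minimal sketch of the triple-based idea behind a lifelog knowledge graph:
# lifelog entries and external facts share one set of (subject, predicate,
# object) triples that can be queried together. Identifiers are invented.

triples = {
    ("image:0041", "depicts", "concept:coffee"),
    ("image:0041", "takenAt", "place:office"),
    ("concept:coffee", "isA", "wd:beverage"),       # link to external KB
}

def match(pattern):
    """Pattern with None as wildcard, e.g. (None, 'depicts', 'concept:coffee')."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Which images depict something that is a beverage in the external KB?
beverages = {s for s, _, _ in match((None, "isA", "wd:beverage"))}
hits = [s for b in beverages for s, _, _ in match((None, "depicts", b))]
print(hits)  # ['image:0041']
```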
Exquisitor at the Lifelog Search Challenge 2020
- Omar Shahbaz Khan
- Mathias Dybkjær Larsen
- Liam Alex Sonto Poulsen
- Björn Þór Jónsson
- Jan Zahálka
- Stevan Rudinac
- Dennis Koelma
- Marcel Worring
We present an enhanced version of Exquisitor, our interactive and scalable media exploration system. At its core, Exquisitor is an interactive learning system using relevance feedback on media items to build a model of the users' information need. Relying on efficient media representation and indexing, it facilitates real-time user interaction. The new features for the Lifelog Search Challenge 2020 include support for timeline browsing, search functionality for finding positive examples, and significant interface improvements. Participation in the Lifelog Search Challenge allows us to compare our paradigm, relying predominantly on interactive learning, with more traditional search-based multimedia retrieval systems.
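A minimal sketch of one interactive-learning round as described above, assuming a linear model fit to the user's judged examples; Exquisitor's real pipeline relies on compressed feature representations and indexing, which this toy omits:

```python
# Rough sketch of one relevance-feedback round: fit a linear model on the
# user's positive/negative examples and rank the collection by its score.
# Features and feedback indices are synthetic, purely for illustration.

import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 64))       # stand-in for media features
pos, neg = [3, 17, 256], [5, 99, 640, 812]   # user feedback so far

X = features[pos + neg]
y = [1] * len(pos) + [0] * len(neg)
model = LinearSVC().fit(X, y)

scores = features @ model.coef_.ravel()      # rank the whole collection
suggestions = np.argsort(-scores)[:25]       # next screen of suggestions
print(suggestions[:5])
```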
Myscéal: An Experimental Interactive Lifelog Retrieval System for LSC'20
- Ly-Duyen Tran
- Manh-Duy Nguyen
- Nguyen Thanh Binh
- Hyowon Lee
- Cathal Gurrin
The Lifelog Search Challenge (LSC) is an annual comparative benchmarking activity for comparing approaches to interactive retrieval from multi-modal lifelogs. Being an interactive search challenge, issues such as retrieval accuracy, search speed and usability of interfaces are key challenges that must be addressed by every participant. In this paper, we introduce Myscéal, an interactive lifelog retrieval engine designed to support novice users in retrieving items of interest from a large multimodal lifelog. Additionally, we introduce a new similarity measure called "aTFIDF" to match a user's free-text information need with the multimodal lifelog index.
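The exact definition of "aTFIDF" is given in the paper; as context, a minimal sketch of the plain TF-IDF ranking it adapts, over toy documents built from per-image annotations:

```python
# Plain TF-IDF ranking over toy per-image annotation "documents". This shows
# only the baseline that aTFIDF adapts; the paper's modifications are not
# reproduced here.

import math
from collections import Counter

docs = {"img1": "coffee cup desk laptop", "img2": "car road coffee",
        "img3": "desk laptop monitor"}
tokenized = {d: t.split() for d, t in docs.items()}
df = Counter(w for toks in tokenized.values() for w in set(toks))
N = len(docs)

def tfidf_score(query, doc_id):
    toks = tokenized[doc_id]
    tf = Counter(toks)
    return sum((tf[w] / len(toks)) * math.log(N / df[w])
               for w in query.split() if w in df)

ranked = sorted(docs, key=lambda d: tfidf_score("coffee desk", d), reverse=True)
print(ranked)  # img1 matches both terms and ranks first
```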
A Multi-level Interactive Lifelog Search Engine with User Feedback
- Jiayu Li
- Min Zhang
- Weizhi Ma
- Yiqun Liu
- Shaoping Ma
With the rise of portable wearable devices, it is easier for users to save their lifelog data. As a lifelog is usually disorganized, multi-modal, and sometimes noisy, an interactive search engine is crucial for users to review and explore it. Unlike traditional search engines, lifelog search covers multi-modal information comprising images, text and other sensor data, which brings challenges to data arrangement and search. Accordingly, users' information needs are also multi-level, so a single interaction mechanism may not satisfy their requirements. As the dataset is highly personalized, interaction and feedback from users should also be considered in the search engine. Therefore, in this paper we present an interactive multi-modality lifelog search engine to help users manage and find lifelog data. To this end, lifelog data is clustered and processed at multiple levels. We then build an interactive search engine comprising text-as-query, image-as-query, and timeline-view modules. In addition, the system adopts user feedback mechanisms in multi-round queries. Our system shows promising experimental results on the LSC'20 dataset and development topics: the text-based search module gives correct results on more than 60% of the development topics.
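As an illustration of multi-round feedback in general (not necessarily this system's mechanism), a minimal Rocchio-style sketch that pulls the query vector toward accepted results and away from rejected ones; the coefficients and vectors are invented:

```python
# Rocchio-style feedback update: move the query vector toward items the user
# marked relevant and away from rejected ones. Coefficients are illustrative.

import numpy as np

def rocchio(query_vec, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.15):
    q = alpha * query_vec
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(non_relevant):
        q = q - gamma * np.mean(non_relevant, axis=0)
    return q

query = np.array([0.2, 0.0, 0.9])                     # e.g. text-query embedding
rel = np.array([[0.3, 0.1, 0.8], [0.25, 0.0, 0.85]])  # accepted results
nonrel = np.array([[0.9, 0.8, 0.1]])                  # rejected results
print(rocchio(query, rel, nonrel))  # refined query for the next round
```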
lifeXplore at the Lifelog Search Challenge 2020
- Andreas Leibetseder
- Klaus Schoeffmann
Since its first iteration in 2018, the Lifelog Search Challenge (LSC) -- an interactive competition for retrieving lifelogging moments -- has been co-located with the annual ACM International Conference on Multimedia Retrieval (ICMR) and has drawn international attention. With the goal of making an ever-growing public lifelogging dataset searchable, several teams develop systems for quickly solving time-limited queries during the challenge. Having participated in both previous LSC iterations, i.e. LSC2018 and LSC2019, we present our lifeXplore system -- a video exploration and retrieval tool combining feature map browsing, concept search and filtering, as well as hand-drawn sketching. The system is improved by including additional deep concepts from YOLO9000 and optical character recognition (OCR), as well as by adding uniform sampling as an alternative to the system's traditional underlying shot segmentation.
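Uniform sampling, mentioned above as an alternative to shot segmentation, can be sketched in a few lines; the step size below is an arbitrary example, not the system's setting:

```python
# Uniform sampling instead of shot segmentation: take every n-th frame as a
# keyframe rather than detecting shot boundaries. Step size is illustrative.

def uniform_sample(n_frames, step=150):
    """Return keyframe indices: one frame every `step` frames."""
    return list(range(0, n_frames, step))

print(uniform_sample(1000))  # [0, 150, 300, 450, 600, 750, 900]
```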
BIDAL-HCMUS@LSC2020: An Interactive Multimodal Lifelog Retrieval with Query-to-Sample Attention-based Search Engine
- Anh-Vu Mai-Nguyen
- Trong-Dat Phan
- Anh-Khoa Vo
- Van-Luon Tran
- Minh-Son Dao
- Koji Zettsu
In this paper, we introduce an interactive multimodal lifelog retrieval system whose search engine is built on an attention mechanism. The underlying algorithm rests on two observations: (1) most of the images belonging to one event probably contain cues (e.g., objects) that relate to the content of queries, and these cues contribute to representing the event; and (2) instances of one event can be associated with the content and context of that event. Hence, once we determine the seed (by leveraging the first observation), we can find all relevant instances (by utilizing the second observation). We also take advantage of querying by samples (e.g., images) by converting the text query to images using the attention-based mechanism. Thus, we can enrich the simple text query of users with more semantic meaning towards more accurate results, as well as discovering hidden results that cannot be reached using text queries alone. The system is designed for both novice and expert users, with several filters that help users express their queries from general to particular descriptions and polish their results.
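A very loose sketch of attention-based scoring in the spirit described above, assuming each query term attends over an image's detected-concept embeddings; all shapes and embeddings are invented, and the paper's model is more elaborate than this:

```python
# Score an image by letting each query term attend over the image's
# detected-concept embeddings (softmax over dot products), then comparing
# each term with its attended summary. Purely illustrative.

import numpy as np

def attention_score(query_terms, image_concepts):
    """Both arguments: arrays of embedding vectors, shape (n, d)."""
    sims = query_terms @ image_concepts.T            # (n_terms, n_concepts)
    weights = np.exp(sims) / np.exp(sims).sum(axis=1, keepdims=True)
    attended = weights @ image_concepts              # one vector per term
    return float(np.sum(np.einsum("ij,ij->i", query_terms, attended)))

rng = np.random.default_rng(1)
terms = rng.normal(size=(3, 16))      # e.g. "red", "car", "street"
concepts = rng.normal(size=(5, 16))   # detected objects in one image
print(attention_score(terms, concepts))
```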
Multimodal Retrieval through Relations between Subjects and Objects in Lifelog Images
- Tai-Te Chu
- Chia-Chun Chang
- An-Zi Yen
- Hen-Hsen Huang
- Hsin-Hsi Chen
With the development of wearable devices, people nowadays record their life experiences much more easily than before, and lifelog retrieval has become an emerging task. Because of the semantic gap between visual data and textual queries, retrieving lifelog images with text queries can be challenging. This paper proposes an interactive lifelog retrieval system aimed at retrieving more intuitive and accurate results. Our system is divided into an offline and an online part. In the offline part, we incorporate the original visual and textual concepts from images into our system using pre-trained word embeddings. Moreover, we encode the relationships between subjects and objects in images by using a pre-trained relation graph generation model. In the online part, we provide an intuitive frontend with various metadata filters, which not only gives users a convenient interface but also a mechanism that supports detailed memory recall. In this way, users clearly see the differences between the concepts in the clusters and can efficiently browse the retrieved image clusters in a short time.
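As a toy illustration of matching a query against subject-relation-object triples with embedding similarity, a minimal sketch with hand-made vectors standing in for real pre-trained embeddings:

```python
# Match a (subject, relation, object) query against triples extracted from
# an image using cosine similarity of word vectors. The tiny hand-made
# vectors below stand in for real pre-trained embeddings.

import numpy as np

emb = {"person": np.array([1.0, 0.1, 0.0]), "man": np.array([0.9, 0.2, 0.1]),
       "hold":   np.array([0.0, 1.0, 0.1]), "grip": np.array([0.1, 0.9, 0.0]),
       "cup":    np.array([0.0, 0.1, 1.0]), "mug":  np.array([0.1, 0.0, 0.9])}

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def triple_sim(query, triple):
    """Average similarity over subject, relation, and object slots."""
    return sum(cos(emb[q], emb[t]) for q, t in zip(query, triple)) / 3

image_triples = [("man", "grip", "mug"), ("man", "hold", "cup")]
query = ("person", "hold", "cup")
print(max(triple_sim(query, t) for t in image_triples))  # image-level score
```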
LifeSeeker 2.0: Interactive Lifelog Search Engine at LSC 2020
- Tu-Khiem Le
- Van-Tu Ninh
- Minh-Triet Tran
- Thanh-An Nguyen
- Hai-Dang Nguyen
- Liting Zhou
- Graham Healy
- Cathal Gurrin
In this paper we present our interactive lifelog retrieval engine for the LSC'20 comparative benchmarking challenge. The LifeSeeker 2.0 interactive lifelog retrieval engine is developed jointly by Dublin City University and Ho Chi Minh University of Science, and represents an enhanced version of the two corresponding interactive lifelog retrieval engines from LSC'19. The implementation of LifeSeeker 2.0 focuses on searching by text query using a Bag-of-Words model with visual concept augmentation, with additional improvements in query processing time, enhanced result display and browsing support, and interaction with visual graphs for both query and filter purposes.
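A minimal sketch of a Bag-of-Words index with visual concept augmentation as described above, assuming augmentation simply means indexing detected concepts alongside text terms; the field names are invented:

```python
# Bag-of-Words inverted index over per-image annotations, "augmented" here
# by appending detected visual concepts to the text terms before indexing.

from collections import defaultdict

images = {
    "img1": {"text": ["kitchen", "morning"], "concepts": ["kettle", "cup"]},
    "img2": {"text": ["office"], "concepts": ["monitor", "keyboard"]},
}

index = defaultdict(set)
for img_id, fields in images.items():
    for term in fields["text"] + fields["concepts"]:   # concept augmentation
        index[term].add(img_id)

def search(query):
    """Rank images by how many query terms they match."""
    counts = defaultdict(int)
    for term in query.split():
        for img_id in index.get(term, ()):
            counts[img_id] += 1
    return sorted(counts, key=counts.get, reverse=True)

print(search("kitchen cup"))  # ['img1']
```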
VIRET Tool with Advanced Visual Browsing and Feedback
- Gregor Kovalčík
- Vít Škrhak
- Tomáš Souček
- Jakub Lokoč
VIRET is a video retrieval system that has performed competitively in several recent interactive multimedia search competitions. For the Lifelog Search Challenge 2020, we implement several new features that focus on result set presentation based on dynamically computed self-organizing maps, as well as additional feedback about the distribution of relevance scores for a query. The latter feature provides users with more insight for assessing input queries and could help with the design of new ranking models in the future.
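A compressed sketch of the self-organizing map idea behind the result-set display: fit a small grid of prototype vectors to the result features, then place each result at its best-matching grid cell. The grid size, learning rate, and neighborhood below are arbitrary, and VIRET's dynamically computed SOMs differ:

```python
# One pass of classic SOM training over result-item features, then look up
# the grid cell at which a given result would be displayed. Illustrative only.

import numpy as np

rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 32))                 # features of result items
grid = rng.normal(size=(4, 4, 32))                 # 4x4 SOM prototypes
coords = np.stack(np.meshgrid(np.arange(4), np.arange(4), indexing="ij"), -1)

for step, x in enumerate(feats):
    lr = 0.5 * (1 - step / len(feats))             # decaying learning rate
    d = np.linalg.norm(grid - x, axis=2)
    bmu = np.unravel_index(np.argmin(d), d.shape)  # best-matching unit
    h = np.exp(-np.linalg.norm(coords - bmu, axis=2) ** 2 / 2.0)
    grid += lr * h[..., None] * (x - grid)         # pull neighborhood toward x

cell = np.unravel_index(np.argmin(np.linalg.norm(grid - feats[0], axis=2)),
                        (4, 4))
print(cell)  # grid position at which result 0 would be displayed
```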
FIRST - Flexible Interactive Retrieval SysTem for Visual Lifelog Exploration at LSC 2020
- Minh-Triet Tran
- Thanh-An Nguyen
- Quoc-Cuong Tran
- Mai-Khiem Tran
- Khanh Nguyen
- Van-Tu Ninh
- Tu-Khiem Le
- Hoang-Phuc Trang-Trung
- Hoang-Anh Le
- Hai-Dang Nguyen
- Trong-Le Do
- Viet-Khoa Vo-Ho
- Cathal Gurrin
A lifelog can provide useful insights into our daily activities. It is essential to provide a flexible way for users to retrieve events or moments of interest corresponding to a wide variety of query types. This motivates us to develop FIRST, a Flexible Interactive Retrieval SysTem, which helps users combine and integrate various query components in a flexible manner to handle different query scenarios, such as clustering visual data based on color histograms, visual similarity, GPS location, or scene attributes. We also employ personalized concept detection and image captioning to enhance image understanding from visual lifelog data, and develop an autoencoder-like approach for mapping between query text and image features. Furthermore, we refine the user interface of the retrieval system to better assist users in query expansion and in verifying sequential events at a flexible temporal resolution to control the navigation speed through sequences of images.
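As a greatly simplified stand-in for the autoencoder-like text-to-image feature mapping mentioned above, a sketch using a single linear map fit by least squares on paired features; the paper's actual model is learned differently:

```python
# Fit a linear text->image feature map on paired (text, image) features and
# use it to project a new text query into the image space for ranking.
# All features here are synthetic; this is not the paper's autoencoder.

import numpy as np

rng = np.random.default_rng(2)
text_feats = rng.normal(size=(500, 50))      # paired training text features
W_true = rng.normal(size=(50, 64))
image_feats = text_feats @ W_true + 0.01 * rng.normal(size=(500, 64))

# Least-squares fit of the mapping on the pairs.
W, *_ = np.linalg.lstsq(text_feats, image_feats, rcond=None)

query = rng.normal(size=50)                  # features of a new text query
projected = query @ W                        # now comparable to image vectors
sims = image_feats @ projected
print(np.argsort(-sims)[:5])                 # top-5 candidate images
```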
SOMHunter for Lifelog Search
- František Mejzlík
- Patrik Veselý
- Miroslav Kratochvíl
- Tomáš Souček
- Jakub Lokoč
We present an extended version of SOMHunter, an interactive retrieval tool designed for known-item and ad-hoc search tasks over image and video datasets. Even though SOMHunter achieved significant success at the most recent Video Browser Showdown 2020 competition, lifelog datasets and related search tasks constitute a different challenge with specific problems to be addressed. The presented version of SOMHunter integrates functionality for dealing with the large numbers of highly similar images collected in typical lifelogs. Additionally, we support search over the available metadata, including date and time filters.
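A simple sketch of suppressing the near-duplicate images that dominate lifelogs, assuming greedy filtering by cosine similarity; the threshold is an arbitrary illustrative value, not SOMHunter's mechanism:

```python
# Greedy near-duplicate suppression: keep an image only if its cosine
# similarity to every already-kept image stays below a threshold.

import numpy as np

def deduplicate(feats, threshold=0.95):
    feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    kept = []
    for i, f in enumerate(feats):
        if all(f @ feats[j] < threshold for j in kept):
            kept.append(i)
    return kept

rng = np.random.default_rng(3)
base = rng.normal(size=(5, 128))
burst = base[0] + 0.01 * rng.normal(size=(3, 128))  # near-identical shots
feats = np.vstack([base, burst])
print(deduplicate(feats))  # the burst collapses onto image 0
```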
Voxento: A Prototype Voice-controlled Interactive Search Engine for Lifelogs
- Ahmed Alateeq
- Mark Roantree
- Cathal Gurrin
In this paper, we describe an interactive voice-based retrieval system for lifelogs, developed to participate in the third Lifelog Search Challenge (LSC'20) at ACM ICMR'20. Based on a standard text-based retrieval methodology, the novelty of Voxento lies in the interactive voice facility that allows a user to interact with a personal lifelog using simple voice commands. Voxento was developed as an initial prototype of a pervasive computing system that could be deployed on wearable technologies such as Google Glass. The version of Voxento described in this paper has been optimised for use with a desktop computer in order to be competitive in the LSC'20 challenge.