|
Records |
Links |
|
Author |
Eduardo Aguilar; Bhalaji Nagarajan; Beatriz Remeseiro; Petia Radeva |
|
|
Title |
Bayesian deep learning for semantic segmentation of food images |
Type |
Journal Article |
|
Year |
2022 |
Publication |
Computers and Electrical Engineering |
Abbreviated Journal |
CEE |
|
|
Volume |
103 |
Issue |
|
Pages |
108380 |
|
|
Keywords |
Deep learning; Uncertainty quantification; Bayesian inference; Image segmentation; Food analysis |
|
|
Abstract |
Deep learning has provided promising results in various applications; however, algorithms tend to be overconfident in their predictions, even though they may be entirely wrong. Particularly for critical applications, the model should provide answers only when it is very sure of them. This article presents a Bayesian version of two different state-of-the-art semantic segmentation methods to perform multi-class segmentation of foods and estimate the uncertainty about the given predictions. The proposed methods were evaluated on three public pixel-annotated food datasets. As a result, we can conclude that Bayesian methods improve the performance achieved by the baseline architectures and, in addition, provide information to improve decision-making. Furthermore, based on the extracted uncertainty map, we propose three measures to rank the images according to the degree of noisy annotations they contain. Note that the top 135 images ranked by one of these measures include more than half of the worst-labeled food images. |
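The abstract does not spell out the inference scheme behind the uncertainty maps; a common way to obtain per-pixel uncertainty in Bayesian semantic segmentation is Monte Carlo dropout, sketched below. The function name, sample count, and tensor shapes are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def mc_dropout_predict(forward, image, n_samples=20):
    """Run a stochastic forward pass n_samples times (dropout kept
    active at test time) and return the per-pixel predicted class
    plus a predictive-entropy uncertainty map."""
    # Stack T stochastic softmax outputs: shape (T, H, W, C)
    probs = np.stack([forward(image) for _ in range(n_samples)])
    mean = probs.mean(axis=0)                              # (H, W, C)
    # Predictive entropy: high where the averaged prediction is diffuse
    entropy = -(mean * np.log(mean + 1e-12)).sum(axis=-1)  # (H, W)
    return mean.argmax(axis=-1), entropy
```

Ranking images by statistics of the entropy map (e.g. its mean or upper quantiles) is one plausible way to surface noisily annotated images, in the spirit of the measures the abstract mentions.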
|
|
Address |
October 2022 |
|
|
Corporate Author |
|
Thesis |
|
|
|
Publisher |
ScienceDirect |
Place of Publication |
|
Editor |
|
|
|
Language |
|
Summary Language |
|
Original Title |
|
|
|
Series Editor |
|
Series Title |
|
Abbreviated Series Title |
|
|
|
Series Volume |
|
Series Issue |
|
Edition |
|
|
|
ISSN |
|
ISBN |
|
Medium |
|
|
|
Area |
|
Expedition |
|
Conference |
|
|
|
Notes |
MILAB |
Approved |
no |
|
|
Call Number |
Admin @ si @ ANR2022 |
Serial |
3763 |
|
Permanent link to this record |
|
|
|
|
Author |
Jordi Vitria; M. Bressan; Petia Radeva |
|
|
Title |
Bayesian classification of cork stoppers using class-conditional independent component analysis |
Type |
Journal Article |
|
Year |
2006 |
Publication |
IEEE Transactions on Systems, Man and Cybernetics (Part C) |
Abbreviated Journal |
|
|
|
Volume |
36 |
Issue |
6 |
Pages |
|
|
|
Keywords |
|
|
|
Abstract |
|
|
|
Address |
|
|
|
Corporate Author |
|
Thesis |
|
|
|
Publisher |
|
Place of Publication |
|
Editor |
|
|
|
Language |
|
Summary Language |
|
Original Title |
|
|
|
Series Editor |
|
Series Title |
|
Abbreviated Series Title |
|
|
|
Series Volume |
|
Series Issue |
|
Edition |
|
|
|
ISSN |
|
ISBN |
|
Medium |
|
|
|
Area |
|
Expedition |
|
Conference |
|
|
|
Notes |
OR;MILAB;MV |
Approved |
no |
|
|
Call Number |
BCNPCL @ bcnpcl @ VBR2006 |
Serial |
723 |
|
Permanent link to this record |
|
|
|
|
Author |
Jordi Vitria; M. Bressan; Petia Radeva |
|
|
Title |
Bayesian classification of cork stoppers using class-conditional independent component analysis |
Type |
Journal Article |
|
Year |
2007 |
Publication |
IEEE Transactions on Systems, Man and Cybernetics (Part C) |
Abbreviated Journal |
|
|
|
Volume |
37 |
Issue |
1 |
Pages |
32–38 |
|
|
Keywords |
|
|
|
Abstract |
|
|
|
Address |
|
|
|
Corporate Author |
|
Thesis |
|
|
|
Publisher |
|
Place of Publication |
|
Editor |
|
|
|
Language |
|
Summary Language |
|
Original Title |
|
|
|
Series Editor |
|
Series Title |
|
Abbreviated Series Title |
|
|
|
Series Volume |
|
Series Issue |
|
Edition |
|
|
|
ISSN |
|
ISBN |
|
Medium |
|
|
|
Area |
|
Expedition |
|
Conference |
|
|
|
Notes |
OR;MILAB;MV |
Approved |
no |
|
|
Call Number |
BCNPCL @ bcnpcl @ VBR2007 |
Serial |
795 |
|
Permanent link to this record |
|
|
|
|
Author |
Alejandro Cartas; Juan Marin; Petia Radeva; Mariella Dimiccoli |
|
|
Title |
Batch-based activity recognition from egocentric photo-streams revisited |
Type |
Journal Article |
|
Year |
2018 |
Publication |
Pattern Analysis and Applications |
Abbreviated Journal |
PAA |
|
|
Volume |
21 |
Issue |
4 |
Pages |
953–965 |
|
|
Keywords |
Egocentric vision; Lifelogging; Activity recognition; Deep learning; Recurrent neural networks |
|
|
Abstract |
Wearable cameras can gather large amounts of image data that provide rich visual information about the daily activities of the wearer. Motivated by the large number of health applications that could be enabled by the automatic recognition of daily activities, such as lifestyle characterization for habit improvement, context-aware personal assistance and tele-rehabilitation services, we propose a system to classify 21 daily activities from photo-streams acquired by a wearable photo-camera. Our approach combines the advantages of a late fusion ensemble strategy relying on convolutional neural networks at image level with the ability of recurrent neural networks to account for the temporal evolution of high-level features in photo-streams without relying on event boundaries. The proposed batch-based approach achieved an overall accuracy of 89.85%, outperforming state-of-the-art end-to-end methodologies. These results were achieved on a dataset consisting of 44,902 egocentric pictures captured by three persons over 26 days on average. |
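The two ingredients named in the abstract, a late-fusion CNN ensemble at image level and fixed-size temporal batches fed to a recurrent layer, can be sketched as follows. This is a schematic of the general idea under assumed shapes, not the paper's architecture; the function names are illustrative:

```python
import numpy as np

def late_fusion(prob_maps):
    """Late-fusion ensemble: average per-image class probabilities
    from several CNNs, then take the argmax class per image."""
    fused = np.mean(np.stack(prob_maps), axis=0)  # (n_images, n_classes)
    return fused.argmax(axis=1)

def make_batches(predictions, batch_size=10):
    """Group a photo-stream into contiguous fixed-size batches; a
    recurrent layer would then aggregate each batch, with no need
    to detect event boundaries first."""
    n = len(predictions) - len(predictions) % batch_size
    return [predictions[i:i + batch_size] for i in range(0, n, batch_size)]
```

The batch-based grouping is the key design choice the title refers to: temporal context is exploited within each batch, so the stream never has to be segmented into events beforehand.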
|
|
Address |
|
|
|
Corporate Author |
|
Thesis |
|
|
|
Publisher |
|
Place of Publication |
|
Editor |
|
|
|
Language |
|
Summary Language |
|
Original Title |
|
|
|
Series Editor |
|
Series Title |
|
Abbreviated Series Title |
|
|
|
Series Volume |
|
Series Issue |
|
Edition |
|
|
|
ISSN |
|
ISBN |
|
Medium |
|
|
|
Area |
|
Expedition |
|
Conference |
|
|
|
Notes |
MILAB; no proj |
Approved |
no |
|
|
Call Number |
Admin @ si @ CMR2018 |
Serial |
3186 |
|
Permanent link to this record |
|
|
|
|
Author |
Khalid El Asnaoui; Petia Radeva |
|
|
Title |
Automatically Assess Day Similarity Using Visual Lifelogs |
Type |
Journal Article |
|
Year |
2020 |
Publication |
International Journal of Intelligent Systems |
Abbreviated Journal |
IJIS |
|
|
Volume |
29 |
Issue |
|
Pages |
298–310 |
|
|
Keywords |
|
|
|
Abstract |
Today, we witness the appearance of many lifelogging cameras that are able to capture the life of the person wearing them and that produce a large number of images every day. Automatically characterizing the experience and extracting patterns of behavior of individuals from this huge collection of unlabeled and unstructured egocentric data present major challenges and require novel and efficient algorithmic solutions. The main goal of this work is to propose a new method to automatically assess day similarity from the lifelogging images of a person. We propose a technique to measure the similarity between images based on Swain's distance and generalize it to detect the similarity between daily visual data. To this end, we apply dynamic time warping (DTW) combined with Swain's distance for the final day-similarity estimation. For validation, we apply our technique on the Egocentric Dataset of the University of Barcelona (EDUB) of 4912 daily images acquired by four persons, with encouraging preliminary results. |
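The two components named in the abstract, Swain's histogram-intersection distance between images and DTW over day-long sequences, can be sketched as follows. This is a minimal illustration assuming precomputed image histograms; it is not the paper's exact formulation:

```python
import numpy as np

def swain_distance(h1, h2):
    """Histogram-intersection distance (after Swain & Ballard):
    1 minus the normalized sum of bin-wise minima of two histograms.
    Identical histograms give 0; disjoint ones give 1."""
    return 1.0 - np.minimum(h1, h2).sum() / min(h1.sum(), h2.sum())

def dtw_day_distance(day_a, day_b):
    """Align two days' image-histogram sequences with dynamic time
    warping, using Swain's distance as the local cost."""
    n, m = len(day_a), len(day_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = swain_distance(day_a[i - 1], day_b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

DTW is a natural fit here because two days rarely contain the same number of images or unfold at the same pace; the warping path absorbs those differences before the day-level distance is read off.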
|
|
Address |
|
|
|
Corporate Author |
|
Thesis |
|
|
|
Publisher |
|
Place of Publication |
|
Editor |
|
|
|
Language |
|
Summary Language |
|
Original Title |
|
|
|
Series Editor |
|
Series Title |
|
Abbreviated Series Title |
|
|
|
Series Volume |
|
Series Issue |
|
Edition |
|
|
|
ISSN |
|
ISBN |
|
Medium |
|
|
|
Area |
|
Expedition |
|
Conference |
|
|
|
Notes |
MILAB; no proj |
Approved |
no |
|
|
Call Number |
AsR2020 |
Serial |
3409 |
|
Permanent link to this record |