Yearly Archives: 2016

Detection of fast incoming objects with a moving camera

  • Fabio Poiesi and Andrea Cavallaro. Detection of fast incoming objects with a moving camera. In British Machine Vision Conference, pages 1-11, York, UK, 2016.
    [BibTeX] [Abstract] [Download PDF]

    Using a monocular camera for early collision detection in cluttered scenes to elude fast incoming objects is a desirable but challenging functionality for mobile robots, such as small drones. We present a novel moving object detection and avoidance algorithm for an uncalibrated camera that uses only the optical flow to predict collisions. First, we estimate the optical flow and compensate the global camera motion. Then we detect incoming objects while removing the noise caused by dynamic textures, nearby terrain and lens distortion by means of an adaptively learnt background-motion model. Next, we estimate the time to contact, namely the expected time for an incoming object to cross the infinite plane defined by the extension of the image plane. Finally, we combine the time to contact and the compensated motion in a Bayesian framework to identify an object-free region the robot can move towards to avoid the collision. We demonstrate and evaluate the proposed algorithm using footage of flying robots that observe fast incoming objects such as birds, balls and other drones.

    @InProceedings{2016-09-POIESI,
    title = {{Detection of fast incoming objects with a moving camera}},
    author = {Fabio Poiesi and Andrea Cavallaro},
    booktitle = {{British Machine Vision Conference}},
address = {York, UK},
    date = {2016-09-19/2016-09-22},
    year = {2016},
pages = {1--11},
    url = {http://fabio-poiesi.com/files/papers/conferences/2016_BMVC_DetectionFastIncomingObjects_Poiesi_Cavallaro.pdf},
    abstract = {Using a monocular camera for early collision detection in cluttered scenes to elude fast incoming objects is a desirable but challenging functionality for mobile robots, such as small drones. We present a novel moving object detection and avoidance algorithm for an uncalibrated camera that uses only the optical flow to predict collisions. First, we estimate the optical flow and compensate the global camera motion. Then we detect incoming objects while removing the noise caused by dynamic textures, nearby terrain and lens distortion by means of an adaptively learnt background-motion model. Next, we estimate the time to contact, namely the expected time for an incoming object to cross the infinite plane defined by the extension of the image plane. Finally, we combine the time to contact and the compensated motion in a Bayesian framework to identify an object-free region the robot can move towards to avoid the collision. We demonstrate and evaluate the proposed algorithm using footage of flying robots that observe fast incoming objects such as birds, balls and other drones.}
    }
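
The time-to-contact step described in the abstract can be illustrated with a small numeric sketch: for a frontally approaching object, the motion-compensated flow field diverges radially, and its divergence relates to the time to contact τ as div(v) ≈ 2/τ. The snippet below is a minimal illustration on a synthetic radial flow field, not the paper's implementation; the function name and the uniform-divergence assumption are ours:

```python
import numpy as np

def time_to_contact(flow_x, flow_y, dx=1.0):
    """Estimate time to contact (in frames) from the divergence of a dense
    optical-flow field: for a frontal approach div(v) ~= 2 / tau."""
    dvx = np.gradient(flow_x, dx, axis=1)  # d(vx)/dx
    dvy = np.gradient(flow_y, dx, axis=0)  # d(vy)/dy
    div = np.mean(dvx + dvy)
    return 2.0 / div

# Synthetic radial flow for an object 50 frames away from contact:
h, w, tau = 64, 64, 50.0
ys, xs = np.mgrid[0:h, 0:w]
vx = (xs - w / 2) / tau
vy = (ys - h / 2) / tau
print(round(time_to_contact(vx, vy), 1))  # 50.0
```

On real footage the divergence would be averaged only over the detected object region, after the global-motion compensation the paper describes.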

Posted in Dissemination
Positioning System for Recreated Reality Applications Implemented on a Multi-Processing Embedded System

  • Patricia Martinez and Eugenio Villar. Positioning System for Recreated Reality Applications Implemented on a Multi-Processing Embedded System. In I Jornadas de Computacion Empotrada y Reconfigurable (JCER 2016), Salamanca, Spain, 2016.
    [BibTeX]
    @InProceedings{2016-09-MARTINEZ,
    author = {Patricia Martinez and Eugenio Villar},
    title = {{Positioning System for Recreated Reality Applications Implemented on a Multi-Processing Embedded System}},
    booktitle = {{I Jornadas de Computacion Empotrada y Reconfigurable (JCER 2016)}},
    date = {2016-09-14/2016-09-16},
    year = {2016},
    address = {Salamanca, Spain}
    }

Posted in Dissemination
Simple Gait Parameterization and 3D Animation for Anonymous Visual Monitoring Based on Augmented Reality

  • Piotr Szczuko. Simple Gait Parameterization and 3D Animation for Anonymous Visual Monitoring Based on Augmented Reality. Multimedia Tools and Applications, 75(17):10561-10581, 2016. doi:10.1007/s11042-015-2874-0
    [BibTeX] [Abstract]

    The article presents a method for video anonymization and replacing real human silhouettes with virtual 3D figures rendered on a screen. Video stream is processed to detect and to track objects, whereas anonymization stage employs animating avatars accordingly to behavior of detected persons. Location, movement speed, direction, and person height are taken into account during animation and rendering phases. This approach requires a calibrated camera, and utilizes results of visual object tracking. A procedure for transforming objects visual features and bounding boxes into gait parameters for animated figures is presented. Conclusions and future work perspectives are provided.

    @Article{2016-09-SZCZUKOa,
    author = {Piotr Szczuko},
    title = {{Simple Gait Parameterization and 3D Animation for Anonymous Visual Monitoring Based on Augmented Reality}},
    journal = {{Multimedia Tools and Applications}},
    volume = {75},
    number = {17},
    date = {2016-09-01},
    pages = {10561--10581},
    doi = {10.1007/s11042-015-2874-0},
    issn = {1380-7501},
    publisher = {Springer},
    abstract = {The article presents a method for video anonymization and replacing real human silhouettes with virtual 3D figures rendered on a screen. Video stream is processed to detect and to track objects, whereas anonymization stage employs animating avatars accordingly to behavior of detected persons. Location, movement speed, direction, and person height are taken into account during animation and rendering phases. This approach requires a calibrated camera, and utilizes results of visual object tracking. A procedure for transforming objects visual features and bounding boxes into gait parameters for animated figures is presented. Conclusions and future work perspectives are provided.},
    year = {2016}
    }
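
The box-to-gait mapping described in the abstract can be sketched as a simple transformation from a tracked bounding-box sequence to speed, heading and height for the animated figure. The helper below is hypothetical: the function name, the flat-ground `meters_per_pixel` scale and the bottom-centre box convention are our assumptions, standing in for the calibrated-camera transformation the paper uses:

```python
import math

def gait_parameters(track, meters_per_pixel, fps):
    """Derive simple gait parameters (speed, heading, height) from a sequence
    of bounding boxes (x, y, w, h), where (x, y) is the bottom-centre ground
    point of the tracked person. Hypothetical sketch, not the paper's code."""
    (x0, y0, _, _), (x1, y1, _, h1) = track[0], track[-1]
    dt = (len(track) - 1) / fps
    dx, dy = (x1 - x0) * meters_per_pixel, (y1 - y0) * meters_per_pixel
    return {
        "speed_mps": math.hypot(dx, dy) / dt,
        "heading_deg": math.degrees(math.atan2(dy, dx)),
        "height_m": h1 * meters_per_pixel,
    }

track = [(100 + 5 * i, 200, 40, 170) for i in range(25)]  # 25 frames, 5 px/frame
params = gait_parameters(track, meters_per_pixel=0.01, fps=25)
print(params)  # speed 1.25 m/s, heading 0 deg, height 1.7 m
```

The resulting parameters would then drive the avatar's walk-cycle animation speed and orientation at render time.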

Posted in Dissemination
Camera coalitions

  • Andrea Cavallaro. Camera coalitions. Invited talk at IEEE-EURASIP Summer School on Signal Processing (S3P-2016), 2016.
    [BibTeX] [Abstract]

    Cameras are everywhere. Miniature high-quality cameras are increasingly worn by people, mounted on dashboards and micro-drones, omnipresent in hallways, streets and stores; and in your smartphone. Countless applications will benefit from the capabilities offered by networks of wireless cameras that can autonomously sense, compute, decide and communicate. These networks are composed of cameras whose algorithms need to adapt in response to unknown or dynamic environments and to changes in the assigned task. In this lecture I will present recent methods for cameras to move and to interact locally based on content and context, and to form coalitions that reach coordinated decisions under resource and physical constraints. I will discuss how cameras self-evaluate their performance and improve the quality of the task they are executing through collaboration, adaptively.

    @Misc{2016-09-CAVALLAROa,
    author = {Andrea Cavallaro},
    title = {{Camera coalitions}},
    howpublished = {Invited talk at IEEE-EURASIP Summer School on Signal Processing (S3P-2016)},
    date = {2016-09-04/2016-09-10},
    year = {2016},
    address = {Trento, IT},
    abstract = {Cameras are everywhere. Miniature high-quality cameras are increasingly worn by people, mounted on dashboards and micro-drones, omnipresent in hallways, streets and stores; and in your smartphone. Countless applications will benefit from the capabilities offered by networks of wireless cameras that can autonomously sense, compute, decide and communicate. These networks are composed of cameras whose algorithms need to adapt in response to unknown or dynamic environments and to changes in the assigned task. In this lecture I will present recent methods for cameras to move and to interact locally based on content and context, and to form coalitions that reach coordinated decisions under resource and physical constraints. I will discuss how cameras self-evaluate their performance and improve the quality of the task they are executing through collaboration, adaptively.}
    }

Posted in Dissemination
Detection of fast incoming objects with a moving camera

  • Fabio Poiesi and Andrea Cavallaro. Detection of fast incoming objects with a moving camera. 2016. Software of Detection of fast incoming objects with a moving camera
    [BibTeX] [Download software]
    @Misc{2016-09-POIESIb,
    author = {Fabio Poiesi and Andrea Cavallaro},
    title = {{Detection of fast incoming objects with a moving camera}},
    note = {Software of Detection of fast incoming objects with a moving camera},
    date = {2016-09-07},
    year = {2016},
    software = {http://www.eecs.qmul.ac.uk/~andrea/avoidance.html}
    }

Posted in Dissemination
Detection of fast incoming objects with a moving camera

  • Fabio Poiesi and Andrea Cavallaro. Detection of fast incoming objects with a moving camera. 2016. Online video, illustration of Detection of fast incoming objects with a moving camera
    [BibTeX] [Watch video]
    @misc{2016-09-POIESIa,
    author = {Fabio Poiesi and Andrea Cavallaro},
    title = {{Detection of fast incoming objects with a moving camera}},
    note = {Online video, illustration of Detection of fast incoming objects with a moving camera},
    date = {2016-09-07},
    year = {2016},
    video = {http://www.eecs.qmul.ac.uk/~andrea/avoidance.html}
    }

Posted in Dissemination
Online multi-target tracking with strong and weak detections

  • Ricardo Sanchez-Matilla, Fabio Poiesi, and Andrea Cavallaro. Online multi-target tracking with strong and weak detections. 2016. Online video, illustration of Online multi-target tracking with strong and weak detections
    [BibTeX] [Watch video]
    @Misc{2016-09-MATILLA,
    author = {Ricardo Sanchez-Matilla and Fabio Poiesi and Andrea Cavallaro},
    title = {{Online multi-target tracking with strong and weak detections}},
    note = {Online video, illustration of Online multi-target tracking with strong and weak detections},
    date = {2016-09-07},
    year = {2016},
    url = {http://www.eecs.qmul.ac.uk/~andrea/eamtt.html}
    }

Posted in Dissemination
Ear in the sky: Ego-noise reduction for auditory micro aerial vehicles

  • Lin Wang and Andrea Cavallaro. Ear in the sky: Ego-noise reduction for auditory micro aerial vehicles. In Proceedings of 13th IEEE International Conference on Advanced Signal and Video based Surveillance (AVSS), Colorado Springs, USA, 2016. doi:10.1109/AVSS.2016.7738063
    [BibTeX] [Abstract]

    We investigate the spectral and spatial characteristics of the ego-noise of a multirotor micro aerial vehicle (MAV) using audio signals captured with multiple onboard microphones and derive a noise model that grounds the feasibility of microphone-array techniques for noise reduction. The spectral analysis suggests that the ego-noise consists of narrowband harmonic noise and broadband noise, whose spectra vary dynamically with the motor rotation speed. The spatial analysis suggests that the ego-noise of a P-rotor MAV can be modeled as P directional noises plus one diffuse noise. Moreover, because of the fixed positions of the microphones and motors, we can assume that the acoustic mixing network of the ego-noise is stationary. We validate the proposed noise model and the stationary mixing assumption by applying blind source separation to multi-channel recordings from both a static and a moving MAV and quantify the signal-to-noise ratio improvement. Moreover, we make all the audio recordings publicly available.

    @InProceedings{2016-08-WANG,
    title = {{Ear in the sky: Ego-noise reduction for auditory micro aerial vehicles}},
    author = {Lin Wang and Andrea Cavallaro},
    booktitle = {{Proceedings of 13th IEEE International Conference on Advanced Signal and Video based Surveillance (AVSS)}},
address = {Colorado Springs, USA},
    date = {2016-08-23/2016-08-26},
    doi = {10.1109/AVSS.2016.7738063},
    year = {2016},
    abstract = {We investigate the spectral and spatial characteristics of the ego-noise of a multirotor micro aerial vehicle (MAV) using audio signals captured with multiple onboard microphones and derive a noise model that grounds the feasibility of microphone-array techniques for noise reduction. The spectral analysis suggests that the ego-noise consists of narrowband harmonic noise and broadband noise, whose spectra vary dynamically with the motor rotation speed. The spatial analysis suggests that the ego-noise of a P-rotor MAV can be modeled as P directional noises plus one diffuse noise. Moreover, because of the fixed positions of the microphones and motors, we can assume that the acoustic mixing network of the ego-noise is stationary. We validate the proposed noise model and the stationary mixing assumption by applying blind source separation to multi-channel recordings from both a static and a moving MAV and quantify the signal-to-noise ratio improvement. Moreover, we make all the audio recordings publicly available.}
    }
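
The harmonic component of the noise model can be illustrated with a synthetic signal: the narrowband ego-noise concentrates at integer multiples of the rotor rotation frequency, so its strongest spectral peaks fall at k·f0. A minimal sketch with our own toy simulation, not the paper's recordings or method:

```python
import numpy as np

def harmonic_peaks(signal, fs, n_peaks=3):
    """Return the n strongest spectral peaks (Hz), used here to check that
    simulated rotor ego-noise is concentrated at harmonics of the rotation
    frequency."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    strongest = np.argsort(spec)[::-1][:n_peaks]
    return sorted(float(f) for f in freqs[strongest])

fs, f0 = 8000, 100.0                       # 1 s at 8 kHz, 100 Hz rotor speed
t = np.arange(fs) / fs
noise = sum(np.sin(2 * np.pi * k * f0 * t) / k for k in (1, 2, 3))
noise += 0.05 * np.random.default_rng(0).standard_normal(len(t))
print(harmonic_peaks(noise, fs))  # peaks near 100, 200 and 300 Hz
```

In the paper the rotation speed (and hence f0) varies with the flight manoeuvre, which is why the harmonic model must be tracked adaptively rather than fixed as in this toy example.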

Posted in Dissemination
Autonomous robotic cameras for collaborative target localization

  • Andrea Cavallaro. Autonomous robotic cameras for collaborative target localization. Invited talk at IEEE AVSS 2016 Workshop on Surveillance for Location-aware Data Protection, 2016.
    [BibTeX] [Download handouts]
    @Misc{2016-08-CAVALLARO,
    author = {Andrea Cavallaro},
    title = {{Autonomous robotic cameras for collaborative target localization}},
    howpublished = {Invited talk at IEEE AVSS 2016 Workshop on Surveillance for Location-aware Data Protection},
    date = {2016-08-23},
    year = {2016},
    address = {Colorado Springs, CO, USA},
    handouts = {http://www.eecs.qmul.ac.uk/~andrea/dwnld/2016.08.23_ColoradoSprings_AutonomousRoboticCameras.pdf}
    }

Posted in Dissemination
ARTEMIS Technology Conference 2016: Form the Future Together!

The ARTEMIS Industry Association organises the ARTEMIS Technology Conference (ATC), which will take place on 4-6 October 2016 in Madrid, Spain. As the name indicates, the sessions on 5-6 October focus on in-depth technical presentations covering both project achievements and state-of-the-art technology, while on 4 October (starting at 12:00 hrs) and 5 October a Pre-Brokerage for H2020 project calls in the area of Embedded Intelligence will take place.

Theme sessions

This 2nd edition of the ATC again contains a collection of parallel technical sessions on 5-6 October, built around the following themes:

1. Smart Cities
2. Smart Energy
3. Interoperability in CPS and IoT
4. Future CPS industrial research challenges

H2020 Pre-Brokerage

As the ATC takes place in early October, its timing fits perfectly into the preparation period for project proposals for the related H2020 calls, including the ECSEL calls. ARTEMIS-IA has therefore organised a Pre-Brokerage at this event on 4-5 October, including a project-idea exhibition in which consortia from communities facing Embedded Intelligence challenges (Embedded & Cyber-Physical Systems, Internet of Things, Digital Platforms) will present their ideas for new projects. If you would like to present a project idea, please use the Project Idea Tool.

Registration

Registration for the event is now open. Make use of the opportunity to discuss Embedded Intelligence related technological innovations and challenges, shake hands with key players in industry, R&I and knowledge centres, get first-hand information about new H2020 project ideas and even join these project consortia. In short: do not miss out on this important event and register today!

More information about the event, programme and hotel can be found on the ARTEMIS Technology Conference webpage.

Posted in News, Public-events