ABSTRACT: This paper presents a system for obstacle detection in railway level crossings from 3D point clouds acquired with tilting 2D laser scanners. Although large obstacles in railway level crossings are detectable with current solutions, the detection of small obstacles remains an open problem. By relying on a tilting laser scanner, the proposed system is able to acquire highly dense and accurate point clouds, enabling the detection of small obstacles, like rocks lying near the rail. During an offline training phase, the system learns a background model of the level crossing from a set of point clouds. Then, online, obstacles are detected as occupied space contrasting with the background model. To reduce the need for manual on-site calibration, the system automatically estimates the pose of the level crossing and railway with respect to the laser scanner. Experimental results show the ability of the system to successfully perform on a set of 41 point clouds acquired in an operational one-lane level crossing.
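The offline background model and online occupied-space contrast described above can be sketched as voxel-based background subtraction. This is an illustrative sketch, not the paper's implementation; the voxel size, function names, and data layout are assumptions:

```python
import numpy as np

VOXEL = 0.05  # voxel edge length in metres (illustrative value)

def voxelize(points, voxel=VOXEL):
    """Map an (N, 3) array of XYZ points to a set of integer voxel keys."""
    return set(map(tuple, np.floor(points / voxel).astype(int)))

def learn_background(training_clouds):
    """Offline phase: union of occupied voxels over obstacle-free scans."""
    background = set()
    for cloud in training_clouds:
        background |= voxelize(cloud)
    return background

def detect_obstacles(cloud, background):
    """Online phase: keep points whose voxel is absent from the background."""
    keys = np.floor(cloud / VOXEL).astype(int)
    mask = np.array([tuple(k) not in background for k in keys])
    return cloud[mask]
```

A dense voxel grid keeps the lookup constant-time per point, which matters when each tilting-scan cloud contains millions of returns.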
Article · Feb 2016 · Journal of Sensors
ABSTRACT: This paper presents an aerial-ground field robotic team, designed to collect and transport soil and biota samples in estuarine mudflats. The robotic system has been devised so that its sampling and storage capabilities are suited for radionuclide and heavy-metal environmental monitoring. Automating these time-consuming and physically demanding tasks is expected to positively impact both their scope and frequency. The success of an environmental monitoring study heavily depends on the statistical significance and accuracy of the sampling procedures, which most often require frequent human intervention. The bird's-eye view provided by the aerial vehicle aims at supporting remote mission specification and execution monitoring. This paper also proposes a preliminary experimental protocol tailored to exploit the capabilities offered by the robotic system. Preliminary field trials in real estuarine mudflats show the ability of the robotic system to successfully extract and transport soil samples for offline analysis.
ABSTRACT: Terrains of different natures exhibit distinct structural dynamics when exposed to wind, owing to their intrinsic material composition. This translates into characteristic optical flow patterns that can be used to identify the terrain type. Accordingly, this paper proposes an active vision-based water detection model that exploits the predictable optical flow patterns induced by the downwash of vertical take-off and landing Unmanned Aerial Vehicles (UAV). To determine whether a water surface is below the UAV, the system tracks the optical flow in the video feed captured by a downward-looking camera. Then, a histogram of optical flow orientations is built and compared against a model histogram encoding the expected downwash-induced pattern. The similarity between both histograms is used as the likelihood that the terrain is covered by water. The resulting classification can be used to guide the landing of the UAV or to produce cost maps supporting ground vehicles' safe navigation. The model was successfully validated on 20 videos acquired with a hexacopter while hovering above sandy, grassy, and water-covered terrains.
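The orientation-histogram comparison described above can be sketched as follows. This is an illustrative sketch under assumed details (the bin count and histogram intersection as the similarity measure); the abstract does not specify the paper's exact choices:

```python
import numpy as np

N_BINS = 16  # number of orientation bins (illustrative)

def orientation_histogram(flow, n_bins=N_BINS):
    """Normalised histogram of flow-vector orientations.

    flow: (N, 2) array of per-feature optical-flow displacements (dx, dy).
    """
    angles = np.arctan2(flow[:, 1], flow[:, 0])  # range [-pi, pi]
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    return hist / max(hist.sum(), 1)

def water_likelihood(observed, model):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    return float(np.minimum(observed, model).sum())
```

Over water, the downwash pushes ripples radially outward from below the vehicle, so the model histogram of flow orientations is roughly uniform; a strongly peaked observed histogram therefore yields a low water likelihood.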
ABSTRACT: This paper presents an open-source watertight multirotor Unmanned Aerial Vehicle (UAV) capable of vertical take-off and landing on both solid terrain and water for environmental monitoring. The UAV's propulsion system has been designed so as to also enable the active control of the UAV's drift along the water surface. This low-power locomotion method, novel to such a vehicle, aims to extend the available operation time during surveys of water bodies. The ability to take off from water allows the UAV to overcome any obstruction that appears on its path, such as a dam. A set of field trials shows the UAV's watertightness, its take-off and landing capabilities on both land and water, and also the ability to actively control its on-surface drifting.
ABSTRACT: Robust maintenance of ecosystems demands highly accurate and frequent monitoring of their status. The extension and remoteness of some environments render their human-based monitoring extremely difficult. Riverine environments are a notorious example, as their sampling requires taking into account both streams and riverbanks. The relevance of monitoring riverine environments is magnified by the intricate interactions that occur between river waters and coastal waters. This article provides a critical survey of existing solutions using robots for environmental monitoring of water bodies. Based on the survey, this article argues that autonomous marsupial robotic systems are especially adequate for the tasks at hand. Lessons learned, as well as future avenues for the application of marsupial robotic teams to environmental monitoring, are laid out in this article.
ABSTRACT: This paper presents a method for 3-D-based obstacle detection on autonomous vehicles navigating in vegetated environments. At its core, three different methods processing the surrounding occupancy, applied at separate stages and volumetric resolutions, are combined into a reliable and broad solution. Geometric relationships are evaluated on a coarse, yet robust, volumetric representation to form an initial assessment of obstacles. Then, a more careful evaluation takes place, at finer resolutions, to determine which candidate obstacles are part of the scene's vegetation and thus not real obstacles. Field experiments validate the method's applicability on two different autonomous vehicles: a water surface robot and a terrestrial four-wheeled one.
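The coarse-then-fine evaluation described above can be sketched along these lines. The grid sizes, thresholds, and the sparsity-based vegetation test are illustrative assumptions, not the paper's actual criteria:

```python
import numpy as np

def coarse_obstacle_cells(cloud, cell=0.5, height_thresh=0.3):
    """Coarse stage: flag 2-D grid cells whose vertical point span exceeds
    a threshold, giving an initial (conservative) obstacle assessment."""
    spans = {}
    keys = map(tuple, np.floor(cloud[:, :2] / cell).astype(int))
    for key, z in zip(keys, cloud[:, 2]):
        lo, hi = spans.get(key, (z, z))
        spans[key] = (min(lo, z), max(hi, z))
    return [key for key, (lo, hi) in spans.items() if hi - lo > height_thresh]

def is_vegetation(cell_points, fine=0.05, density_thresh=0.2):
    """Fine stage: within a flagged cell, measure how densely the points'
    bounding box is occupied at a fine voxel resolution. Sparse, scattered
    returns suggest penetrable vegetation rather than a solid obstacle."""
    voxels = set(map(tuple, np.floor(cell_points / fine).astype(int)))
    lo = np.floor(cell_points.min(axis=0) / fine)
    hi = np.floor(cell_points.max(axis=0) / fine)
    bbox_voxels = int(np.prod(np.maximum(hi - lo + 1, 1)))
    return len(voxels) / bbox_voxels < density_thresh
```

The cheap coarse pass keeps the expensive fine-resolution analysis confined to the few cells that actually look like obstacles.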
ABSTRACT: This paper presents RIVERWATCH, an autonomous surface-aerial marsupial robotic team for riverine environmental monitoring. The robotic system is composed of an Autonomous Surface Vehicle (ASV) piggybacking a multirotor Unmanned Aerial Vehicle (UAV) with vertical take-off and landing capabilities. The ASV provides the team with long-range transportation in all-weather conditions, whereas the UAV assures an augmented perception of the environment. The coordinated aerial, underwater, and surface-level perception allows the team to assess navigation cost from the near field to the far field, which is key for safe navigation and the gathering of environmental monitoring data. The robotic system is validated on a set of field trials.
ABSTRACT: This paper presents the core ideas of the RIVERWATCH experiment and describes its hardware architecture. The RIVERWATCH experiment considers the use of autonomous surface vehicles piggybacking multi-rotor unmanned aerial vehicles for the automatic monitoring of riverine environments. While the surface vehicle benefits from the aerial vehicle to extend its field of view, the aerial vehicle benefits from the surface vehicle to ensure long-range mobility. This symbiotic relation between both robots is expected to enhance the robustness and longevity of the ensemble. The hardware architecture includes a considerable set of state-of-the-art sensory modalities and is abstracted from the perception and navigation algorithms by resorting to the Robot Operating System (ROS). A set of field trials shows the ability of the prototype to scan a closed water body. The datasets obtained from the field trials are freely available to the robotics community.
ABSTRACT: Testing and debugging real hardware is a time-consuming task, in particular for aquatic robots, which must be transported to and deployed on the water. Performing waterborne and airborne field experiments with expensive hardware embedded in not yet fully functional prototypes is a highly risky endeavour. In this sense, physics-based 3D simulators are key for a fast-paced and affordable development of such robotic systems. This paper contributes a modular, open-source, and soon freely available online, ROS-based multi-robot simulator especially focused on aerial and water surface vehicles. The simulator is being developed as part of the RIVERWATCH experiment in the ECHORD European FP7 project, which aims at demonstrating a multi-robot system for remote monitoring of riverine environments.
ABSTRACT: This paper presents a robot navigation system capable of online self-reconfiguration according to the needs imposed by the various contexts present in heterogeneous environments. The ability to cope with heterogeneous environments is key for a robust deployment of service robots in truly demanding scenarios. In the proposed system, flexibility is present at the several layers composing the robot's navigation system. At the lowest layer, appropriate locomotion modes are selected according to the environment's local context. At the highest layer, appropriate motion and path planning strategies are selected according to the environment's global context. While the local context is obtained directly from the robot's sensory input, the global context is read from semantic labels registered off-line on geo-referenced maps. The proposed system leverages the well-known Robot Operating System (ROS) framework for the implementation of the major navigation components. The system was successfully validated over approximately 1 km-long experiments on INTROBOT, an all-terrain industrial-grade robot equipped with four independently steered wheels.
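The layered selection described above can be sketched as a simple dispatch from context labels to navigation components. All labels, mode names, and planner names below are hypothetical placeholders, not the paper's actual configuration:

```python
from dataclasses import dataclass

# Hypothetical context labels and component names (illustrative only).
LOCOMOTION_BY_LOCAL_CONTEXT = {
    "smooth": "differential_drive",
    "rough": "independent_steering",
}
PLANNER_BY_GLOBAL_CONTEXT = {
    "urban": "lattice_planner",
    "offroad": "sampling_planner",
}

@dataclass
class NavigationConfig:
    locomotion_mode: str
    planner: str

def reconfigure(local_context, global_context):
    """Select the locomotion mode from the sensed local context and the
    planner from the semantic label of the current map region."""
    return NavigationConfig(
        LOCOMOTION_BY_LOCAL_CONTEXT[local_context],
        PLANNER_BY_GLOBAL_CONTEXT[global_context],
    )
```

Keeping the two mappings separate mirrors the paper's layering: the lowest layer reacts to sensory input, while the highest layer follows off-line semantic annotations.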
ABSTRACT: This paper presents ARES-III, a multi-purpose service robot for robust operation in all-terrain outdoor environments. Currently in pre-production, ARES-III is aimed at fulfilling the requirements of a robotic platform able to support the development of real-world applications in surveillance, agriculture, environmental monitoring, and related domains. These demanding scenarios motivate a design focused on the reliability of the mechanical platform, the scalability of the control system, and the flexibility of its self-diagnosis and error recovery mechanisms. These key features of ARES-III are often disregarded in current commercial and research platforms. First, a comprehensive set of field trials demonstrated the ability of the ARES-III chassis, made of durable materials and with no-slip quasi-omnidirectional kinematics, to perform robustly in rough terrain. Second, the scalability of ARES-III is enforced by a control system fully compliant with the widespread Robot Operating System (ROS). Finally, the integration of active self-diagnosis and error recovery mechanisms in the ARES-III control system fosters long-lasting operation.
During the course of my MSc (completed in 2014) and my early research work towards a PhD, under the supervision of José Barata, I have been involved in diverse mobile-robotics projects at the UNINOVA institute. I received my MSc equivalency from the Faculdade de Ciências e Tecnologia (Universidade Nova de Lisboa), submitting a dissertation on volumetric approaches to hazard mapping for ground vehicles' off-road navigation. In line with the work on further national and European research projects, I am working on a PhD study on the artificial intelligence of mobile vehicles, namely on the cognitive end of visual attention.
An Aerial-Ground Robotic Team for Systematic Soil and Biota Sampling in Estuarine Mudflats
Conference · Second Iberian Robotics Conference (ROBOT'2015)