Simple item record

dc.contributor.author: Robla Gómez, María Sandra
dc.contributor.author: Llata García, José Ramón
dc.contributor.author: Torre Ferrero, Carlos
dc.contributor.author: González Sarabia, Esther
dc.contributor.author: Becerra, Víctor M.
dc.contributor.author: Pérez Oria, Juan María
dc.contributor.other: Universidad de Cantabria [es_ES]
dc.date.accessioned: 2014-08-08T10:53:58Z
dc.date.available: 2014-08-08T10:53:58Z
dc.date.issued: 2014-06-12
dc.identifier.issn: 1687-6180
dc.identifier.other: DPI2012-36959 [es_ES]
dc.identifier.uri: http://hdl.handle.net/10902/5054
dc.description.abstract: This work presents a method of information fusion involving data captured by both a standard charge-coupled device (CCD) camera and a time-of-flight (ToF) camera, to be used in the detection of proximity between a manipulator robot and a human. Both cameras are assumed to be located above the work area of an industrial robot. The fusion of colour images and time-of-flight information makes it possible to know the 3D localisation of objects with respect to a world coordinate system and, at the same time, provides their colour information. Considering that the ToF information given by the range camera contains inaccuracies, including distance error, border error, and pixel saturation, corrections to the ToF information are proposed and developed to improve the results. The proposed fusion method uses the calibration parameters of both cameras to reproject 3D ToF points, expressed in a coordinate system common to both cameras and a robot arm, into 2D colour images (a minimal sketch of this reprojection step follows the record below). In addition, using the 3D information, motion detection in an industrial robot environment is achieved, and the fusion of information is applied to the previously detected foreground objects. This combination of information results in a matrix that links colour and 3D information, making it possible to characterise an object by its colour in addition to its 3D localisation. Further development of these methods will make it possible to identify objects and their position in the real world and to use this information to prevent possible collisions between the robot and such objects. [es_ES]
dc.description.sponsorship: This work has been supported by the Ministry of Economy and Competitiveness of the Spanish Government (project DPI2012-36959). [es_ES]
dc.format.extent: 20 p. [es_ES]
dc.language.iso: eng [es_ES]
dc.publisher: SpringerOpen [es_ES]
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International [es_ES]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.source: EURASIP Journal on Advances in Signal Processing 2014, 2014:88 [es_ES]
dc.subject.other: Active security [es_ES]
dc.subject.other: Industrial robot [es_ES]
dc.subject.other: ToF and colour cameras [es_ES]
dc.subject.other: Information fusion [es_ES]
dc.title: Visual sensor fusion for active security in robotic industrial environments [es_ES]
dc.type: info:eu-repo/semantics/article [es_ES]
dc.relation.publisherVersion: https://doi.org/10.1186/1687-6180-2014-88 [es_ES]
dc.rights.accessRights: openAccess [es_ES]
dc.identifier.DOI: 10.1186/1687-6180-2014-88
dc.type.version: publishedVersion [es_ES]
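
The abstract above describes reprojecting 3D ToF points, expressed in a common world coordinate system, into the 2D colour image using the calibration parameters of both cameras, producing a matrix that links colour and 3D information. The following Python sketch illustrates that step under a standard pinhole camera model; the calibration values K, R, t and the helper fuse are illustrative assumptions, not the paper's actual calibration or code.

    import numpy as np

    # Minimal sketch of the reprojection/fusion step, assuming a pinhole model.
    # K, R and t are illustrative placeholders, NOT the paper's calibration.
    K = np.array([[800.0,   0.0, 320.0],    # intrinsics of the colour camera
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    R = np.eye(3)                            # rotation: world frame -> colour camera
    t = np.array([0.0, 0.0, 0.5])            # translation: world frame -> colour camera

    def fuse(points_world, colour_image):
        """Hypothetical helper: project Nx3 world-frame ToF points into the
        colour image and return an Nx6 matrix [X, Y, Z, R, G, B], with NaN
        colour for points outside the image or behind the camera."""
        cam = points_world @ R.T + t                  # world -> camera coordinates
        uvw = cam @ K.T                               # apply intrinsics
        uv = uvw[:, :2] / uvw[:, 2:3]                 # perspective divide -> pixels
        px = np.round(uv).astype(int)
        h, w = colour_image.shape[:2]
        valid = ((px[:, 0] >= 0) & (px[:, 0] < w) &
                 (px[:, 1] >= 0) & (px[:, 1] < h) & (cam[:, 2] > 0))
        fused = np.full((len(points_world), 6), np.nan)
        fused[:, :3] = points_world                   # keep the 3D localisation
        fused[valid, 3:] = colour_image[px[valid, 1], px[valid, 0]]
        return fused

    # Example usage with a blank colour image and a single world-frame ToF point
    image = np.zeros((480, 640, 3), dtype=np.uint8)
    print(fuse(np.array([[0.1, -0.05, 1.0]]), image))

Points that project outside the image keep their 3D coordinates with NaN colour entries, mirroring the idea that colour can only be attached to the foreground points the colour camera actually sees.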



Except where otherwise noted, the item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International.