Please use this identifier to cite or link to this item: http://lib.uet.vnu.edu.vn/handle/123456789/703
Title: LOCALIZATION AN OBJECT IN 3D-SPACE BY IMAGE-SENSOR AND KINECT DEVICE
Authors: Hoang-Anh Tran
Issue Date: 2014
Abstract: In the 1970s, with the advent of high-speed computers, researchers began to pay greater attention to computer vision. Today it is a broad and increasingly popular field with many applications in industry, security, and society. As the technology develops, however, the precision demanded of image processing grows: an image should carry not only form, shape, and color, but also the depth of the scene. The most popular method for perceiving depth is stereo imaging, which was inspired by the way our two eyes perceive depth. Another approach is structured-light technology, which projects a known pattern of pixels onto the scene and deduces depth from how the pattern deforms. The purpose of this research is to gain a better understanding of these two methods and to compare them: what their advantages and disadvantages are in typical image-processing applications. The methodology is to calculate the depth of several points in images captured by both a stereo camera and a Kinect device, then compare the results point by point, identify each method's advantages and disadvantages, and decide whether the errors at certain points are acceptable. Finally, based on these statistics, we estimate which method is better suited to locating an object in 3D space.
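The stereo-imaging depth calculation the abstract refers to is classically done by triangulation: for a rectified stereo pair, depth Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity of a matched point. The sketch below is illustrative only; the camera parameters are assumed values, not ones from the thesis.

```python
# Minimal sketch of stereo depth from disparity (Z = f * B / d).
# All numeric parameters below are assumed for illustration; they are
# not taken from the thesis or from a real camera calibration.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Return depth Z in metres for one matched point pair.

    focal_px     -- focal length in pixels (from calibration)
    baseline_m   -- distance between the two camera centres, in metres
    disparity_px -- horizontal pixel offset of the point between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Hypothetical example: f = 700 px, baseline = 0.10 m, disparity = 35 px
z = depth_from_disparity(700.0, 0.10, 35.0)
print(z)  # 2.0 (metres)
```

Note the inverse relationship: distant points have small disparities, so a fixed matching error of a pixel or two produces much larger depth errors far from the camera, which is exactly the kind of point-by-point error the thesis compares against the Kinect's structured-light depth.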
URI: https://lib.uet.vnu.edu.vn/handle/123456789/703
Appears in Collections: Undergraduate theses, Faculty of Engineering Physics and Nanotechnology (Khóa luận Khoa Vật lý kỹ thuật và Công nghệ Nano)

Files in This Item:
File: K55Đ - Anh Tran Hoang - Tom tat tieng Anh.doc
Size: 23.5 kB
Format: Microsoft Word


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.