Live 3D-TV Streaming

University essay from Blekinge Tekniska Högskola/Sektionen för ingenjörsvetenskap

Abstract: The world is not flat as a pancake. It has height, width and depth, and that is how we should see it on TV as well. So far, however, we cannot watch three-dimensional programs directly on our TVs. Not even 3D cinema works "for real": the magic still sits in the glasses. Glasses with differently colored filters separate the impressions reaching the right and left eye, so that each eye sees a different image, and that is what creates the illusion of three dimensions. The goal of this thesis is to be a step towards changing that: achieving the same sensation without glasses, using a different technique. Do you remember the pictures that used to come with cereal packets? Tilted one way they showed Donald Duck, tilted the other way Mickey Mouse. Our work operates in the same way, though with different perspectives rather than different images. The same kind of ribbed surface found on those cereal-packet pictures is in fact used on our 3D TV screen. Depending on the angle from which you look, certain image information is hidden behind the ribbed surface; the screen thereby separates the views. This thesis project focuses on prototyping a live 3D TV streaming application in which live video of a scene is shown on an auto-stereoscopic 3D display that presents two different perspectives, or views, simultaneously. The TV uses a face-search (eye-tracking) system to configure the display optimally for viewers who want to see 3D without glasses. During the thesis a simple 3D studio was built, with the focus on demonstrating depth perception. Two cameras were used for scene capture, and we found an engineering solution for capturing images from both cameras simultaneously: the input images from the two cameras are fed to an analog-to-digital converter (frame grabber) as the two channels of a virtual color camera, which gives real-time, synchronized capture in a simple way. The project comprises several applications written in C++ using various open-source libraries, which grab stereo image sequences from the cameras via the frame grabber, transfer the image sequences to other applications over a server connection, and render the live video on the 3D display using an exclusive rendering method. Communication between the applications, for transmitting and receiving video data, is implemented with socket programming. The results of the project are very promising: the live video of a scene can be viewed with noticeable depth, despite obvious lag in the video timing.
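As a rough illustration of the socket-based transfer between applications described in the abstract, the sketch below shows how a grabber application might send one captured stereo frame to another application over TCP. This is not the thesis code: the frame layout, the port number 5000, and the sendFrame helper are assumptions made here for illustration only.

// Minimal sketch (assumptions: raw interleaved pixel data, POSIX sockets, localhost:5000).
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>
#include <vector>

struct FrameHeader {
    uint32_t width;    // pixels per row
    uint32_t height;   // rows
    uint32_t channels; // e.g. two 3-channel views packed together
};

// Send one frame (header followed by the raw pixel buffer) over a connected socket.
static bool sendFrame(int sock, const FrameHeader& hdr, const std::vector<uint8_t>& pixels) {
    FrameHeader net = { htonl(hdr.width), htonl(hdr.height), htonl(hdr.channels) };
    if (send(sock, &net, sizeof(net), 0) != static_cast<ssize_t>(sizeof(net)))
        return false;

    size_t sent = 0;
    while (sent < pixels.size()) {            // send() may write fewer bytes than requested
        ssize_t n = send(sock, pixels.data() + sent, pixels.size() - sent, 0);
        if (n <= 0) return false;
        sent += static_cast<size_t>(n);
    }
    return true;
}

int main() {
    // Connect to the (hypothetical) receiving application listening on localhost:5000.
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(5000);
    inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);
    if (connect(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) != 0) {
        perror("connect");
        return 1;
    }

    // In the real system the pixel data would come from the frame grabber;
    // here a dummy left/right frame pair of an assumed size is sent instead.
    FrameHeader hdr{640, 480, 6};
    std::vector<uint8_t> pixels(static_cast<size_t>(hdr.width) * hdr.height * hdr.channels, 0);
    if (!sendFrame(sock, hdr, pixels))
        perror("send");

    close(sock);
    return 0;
}

A receiving application would do the reverse: read the fixed-size header first, then loop on recv() until width * height * channels bytes have arrived before handing the frame to the renderer.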
