Welcome! This site was created to be a place where creativity and imagination blend in harmony, covering games, movies, arts, songs, health, and technology. The site is currently under development; any suggestions about the content will be happily received. We hope you enjoy your visit!

Thursday, March 13, 2014

Google's Project Tango





Google recently announced a new project called "Project Tango": a prototype phone with highly customized hardware and software, designed to track its motion in full 3D, in real time, as you hold it. The prototype Google unveiled is fitted with 3D sensors that aim to give mobile devices a human-scale understanding of space and motion.


The phone is equipped with a variety of cameras and vision sensors that provide a whole new perspective on the world around it. It works by emitting pulses of infrared light and recording them with a sensor when they reflect back. The sensor can make a quarter of a million 3D measurements every second, allowing the phone to build a detailed depth map of the surrounding space. Google says the Tango smartphone can capture a wealth of data never before available to app developers, including depth and object tracking and real-time 3D mapping. And it is no bigger, or more power-hungry, than your typical smartphone.
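The basic physics behind this kind of infrared ranging can be sketched in a few lines. The round-trip model below (depth = speed of light × round-trip time ÷ 2) is a simplified illustration of time-of-flight sensing in general, not Google's actual pipeline, and the numbers are made up for the example.

```java
// Sketch: converting an infrared pulse's round-trip time to a depth reading.
// Assumes a simple time-of-flight model: depth = (c * t) / 2, where t is the
// measured time for the pulse to reach a surface and bounce back.
public class TimeOfFlight {
    static final double SPEED_OF_LIGHT = 299_792_458.0; // meters per second

    // Round-trip time in seconds -> distance to the reflecting surface, meters.
    static double depthFromRoundTrip(double roundTripSeconds) {
        return SPEED_OF_LIGHT * roundTripSeconds / 2.0;
    }

    public static void main(String[] args) {
        // A pulse that returns after roughly 6.67 nanoseconds bounced off a
        // surface about one meter away.
        System.out.printf("%.3f%n", depthFromRoundTrip(6.671e-9));
    }
}
```

Repeating that calculation a quarter of a million times per second across the sensor's field of view is what turns individual readings into a dense depth map.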




It runs Android and includes development APIs that provide position, orientation, and depth data to standard Android applications written in Java or C/C++, as well as to the Unity game engine. These early prototypes, algorithms, and APIs are still in active development, so the experimental devices are intended only for the adventurous and are not a final shipping product.


The processing chip used in Project Tango is the Myriad 1 vision processor developed by Movidius, a company that has been working on computer vision technology for the past seven years. Google paired it with sensors and cameras to give the smartphone a level of computer vision and tracking that formerly required much larger equipment. In fact, Movidius CEO Remi El-Ouazzane says the technology isn't very different from what NASA's Mars Exploration Rovers used to map the surface of Mars a decade ago, except that instead of riding in a 400-pound vehicle, it fits in the palm of your hand.


Once you have all that data, what can you do with it? That's really up to app developers, and it's the reason Google is giving out 200 of these prototype devices to developers in the coming weeks. The devices we saw were equipped with a few demonstration apps to show off some of the hardware's capabilities. One app displayed a distance heat map on top of what the camera sees, layering blue over faraway objects and red over things that are close up. Another took the data from the image sensors, paired it with the device's standard motion sensors and gyroscopes to map out paths of movement to within 1 percent accuracy, and then plotted that onto an interactive 3D map.
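The heat-map demo boils down to mapping each depth reading onto a red-to-blue gradient. This is a minimal sketch of that idea; the near/far range limits are illustrative choices, not values from Google's app.

```java
// Sketch: coloring pixels by depth, red for near and blue for far, as in the
// distance heat-map demo. Range limits here are assumptions for illustration.
public class DepthHeatMap {
    static final double NEAR_M = 0.5; // anything closer renders fully red
    static final double FAR_M = 4.0;  // anything farther renders fully blue

    // Returns {red, green, blue}, each 0-255, interpolated linearly on depth.
    static int[] colorForDepth(double meters) {
        double t = (meters - NEAR_M) / (FAR_M - NEAR_M);
        t = Math.max(0.0, Math.min(1.0, t)); // clamp into [0, 1]
        int red = (int) Math.round(255 * (1.0 - t));
        int blue = (int) Math.round(255 * t);
        return new int[] {red, 0, blue};
    }

    public static void main(String[] args) {
        int[] close = colorForDepth(0.5); // fully red
        int[] far = colorForDepth(4.0);   // fully blue
        System.out.println(close[0] + "," + close[2] + " / " + far[0] + "," + far[2]);
    }
}
```

Applied per pixel of the depth map, this produces the overlay effect described above: close surfaces glow red while the far wall fades to blue.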




One of the demos was an app that captured a 3D model of a scene in real time and drew it on the display as you moved the device around the room. It's pretty amazing to see a three-dimensional model of the table in front of you drawn in just a few seconds by a smartphone. The potential applications for this type of technology are widespread, the most obvious being 3D-mapping apps for room and building planning. But El-Ouazzane notes that the depth-tracking technology could also help the visually impaired "see" what is in front of them, with warnings and alerts about obstacles in their paths. It's not hard to imagine this being integrated into a wearable necklace that replaces the age-old walking stick. Other applications could be advanced augmented-reality games that render scenes in far greater detail and integrate more models of real-world objects. Movidius' processor can feed a lot more data to a smartphone's graphics chip than a standard camera, giving the GPU that much more to work with when building a scene. The visual-effects world could also use this technology to build 3D sets and models in record time, with far less work than is required today.
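The core idea behind that real-time capture demo is that each depth measurement arrives in the camera's own frame, so building a model of the room means transforming points into a shared world frame using the device's tracked pose before accumulating them. The sketch below simplifies the pose to a position plus a yaw rotation; the class and method names are hypothetical, not the Tango API.

```java
// Sketch: accumulating camera-frame depth points into a world-frame point
// cloud, using a simplified pose (position + yaw about the vertical axis).
// Illustration only; real pose tracking uses a full 6-degree-of-freedom pose.
import java.util.ArrayList;
import java.util.List;

public class PointCloudBuilder {
    static final class Point {
        final double x, y, z;
        Point(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    private final List<Point> model = new ArrayList<>();

    // Rotate the camera-frame point by the device's yaw (radians), then
    // translate by the device's position, and store the world-frame result.
    void addMeasurement(Point cameraPoint, Point devicePosition, double yaw) {
        double c = Math.cos(yaw), s = Math.sin(yaw);
        double wx = c * cameraPoint.x + s * cameraPoint.z + devicePosition.x;
        double wz = -s * cameraPoint.x + c * cameraPoint.z + devicePosition.z;
        model.add(new Point(wx, cameraPoint.y + devicePosition.y, wz));
    }

    int size() { return model.size(); }

    Point get(int i) { return model.get(i); }

    public static void main(String[] args) {
        PointCloudBuilder builder = new PointCloudBuilder();
        // A point 2 m straight ahead, device at the origin with no rotation:
        // it lands at (0, 0, 2) in the world frame.
        builder.addMeasurement(new Point(0, 0, 2), new Point(0, 0, 0), 0.0);
        System.out.println(builder.size());
    }
}
```

As the phone moves, the same surface is seen from many poses; because every measurement is transformed into the common frame first, the stored points line up into a single coherent model rather than a smear of overlapping snapshots.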


El-Ouazzane wasn't able to tell us how much the Project Tango device costs to build or when we'll see this technology in a consumer product, but he was confident that it won't be very long before everyone has a computer-vision-equipped smartphone in their hands. Google says that developers who have applied for access to the prototype should have it by the middle of March, and chances are we'll see the products of those developers' efforts in the very near future.



Name: Hans Chandra
Lecturer Name: Aditya Pratomo
Campus: Surya University




Source: https://www.google.com/atap/projecttango/