  1. #1
    Tech Ubër-Dominus Jorge-Vieira's avatar
    Joined
    Nov 2013
    Location
    City 17
    Posts
    30,017
    Rating
    1 (100%)

    Intel Project Tango

    Intel’s Project Tango Augmented Reality Tablet Lets You Shoot At Robots

    Google’s got a lot of projects that don’t line up with its operations as a search engine or a software provider. That was in fact one of the primary reasons we saw the company diversify its core and become part of a larger parent company, now to be known as Alphabet. Whether Google and Co. get to keep the name, given that BMW already owns a company with a similar name, we’ll find out soon. But the company has just shown off its latest Project Tango tablet, this time manufactured in conjunction with US chipmaking giant Intel.

    Intel Shows Off Project Tango Tablet With RealSense 3D

    Given Google’s preference to give anything even remotely futuristic a try, we got the first Project Tango tablet last year when the company partnered with Nvidia for its release. Powered by the Tegra K1, the device came with 4 GB of RAM, 128 GB of internal storage and a depth sensor, and became the second device to run Nvidia’s mobile SoC, after the Xiaomi MiPad. We also saw Google cut its price in half after a while; whether that was due to disappointing demand or a bid to further boost market interest is unknown. But things are moving ahead with Project Tango, as this time it’s Intel’s turn to experiment with Augmented Reality on Google’s behalf.





    As far as its functionality goes, the device manages to stay true to the concept of Augmented Reality. Not only does it let you simulate 3D models of your environment in real time, but it also lets you interact with them in new and interesting ways. Examples include building Minecraft-style building blocks into your mapped environment. In fact, one of the coolest features the company showed off with the Tango device at IDF was a custom-built NERF gun that nests the device and lets you duck away from, and shoot at, robots that are mapped into your environment.

    The device comes with a 6-inch touchscreen, Intel’s Cherry Trail X5 processor ticking under the hood and a combined USB/HDMI port. The Project Tango developer kit is expected to go on sale by this year’s end and will cost around the same $512 price tag that accompanies Nvidia’s device.



  2. #2
    Intel Adds Hand-Tracking To Project Tango Phone, Without Peripherals

    Back at IDF 2015, I saw a couple of demos with Intel’s Project Tango smartphone, and although I could see the potential for what the group was trying to accomplish, the whole thing was a little rough.
    In just a few months, the group inside Intel working on Project Tango has added at least one major update, in the way of hand and object tracking, and the Intel team did it without using any additional peripherals. At IDF, they showed hand tracking with an OSVR headset and a Leap Motion sensor mounted on the front. Now, they can do it with just the RealSense camera (ZR300) embedded into the phone, with six degrees of freedom (6DoF).
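    Six degrees of freedom (6DoF) means the camera tracks both where the phone is (3D translation) and how it is oriented (3D rotation). As a hedged illustration of the concept only, not Intel's actual implementation, a 6DoF pose can be represented as a 4x4 homogeneous transform and applied to tracked points:

```python
import math

def pose_matrix(tx, ty, tz, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a 6DoF pose:
    translation (tx, ty, tz) plus rotation given as roll/pitch/yaw
    in radians, composed as Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, tx],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, ty],
        [-sp,     cp * sr,                cp * cr,                tz],
        [0.0,     0.0,                    0.0,                    1.0],
    ]

def transform(matrix, point):
    """Apply the pose to a 3D point (e.g. a tracked hand landmark)."""
    x, y, z = point
    return tuple(row[0] * x + row[1] * y + row[2] * z + row[3]
                 for row in matrix[:3])
```

    With a pose like this updated every frame, anything rendered in the virtual scene can be kept anchored to the real world as the phone moves and turns.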
    To be more specific, project head Paul Zhao calls it “hand virtualization,” not “hand tracking,” but the effect is the same. You can see your hands rendered in the virtual environment. If you get the device’s camera close enough, you can see a facsimile of any walls or other people you might bump into, as well.
    HMD And No HMD

    There are really two ways to use a device like Intel’s Project Tango phone. One way is as an augmented reality device, where you hold up the camera, and through the phone’s display you can see various things superimposed on the real world. Another way is to drop the phone into an HMD and enter a VR environment.
    Zhao showed me a simple HMD that the Tango team whipped up, and he ran me through a couple of demos. Mainly, I was presented with a mountainous environment and my virtualized hands extended out in front of me. I could walk over to a stone structure and knock it over, put my hand under a waterfall and watch the water splash off of it, and walk off the edge of the cliff. (I didn’t, though, because I could not force myself to do it, and furthermore, when I edged close to the precipice, I bumped into a real physical wall, and after that I was done pushing my luck.)


    The hand virtualization was far superior to what I saw at IDF, but there’s still some work to do. As you can see from the images, the facsimile hand is quite pixelated, and you have to keep your hands extended far out in front of you so that the RealSense camera can see them. With slower movements, the tracking was persistent, but I’m not confident that it will hold up when you move more quickly. (To be honest, I didn’t even think to test that out, because when I had the HMD on, I was in an area with lots of foot traffic and nearby walls, and I didn’t want to whack anyone.)
    There’s a two-pronged approach to achieving this tracking, one active and one passive. The first part consists of laser projection (active); lasers bounce off of objects to let the camera know where they are. However, this leaves “holes” in the picture. Zhao noted that although they could send out a shotgun blast of lasers to get a more complete picture, what ends up happening is that you get too much noise and false positives. Therefore, they limit the number of lasers and instead employ “ambient IR” (passive) to fill in those holes without introducing all the noise.
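    The result of the active pass is a depth map with "holes" where no laser return was recorded. Intel hasn't published how the passive ambient-IR step fills them, but conceptually it is a guided hole-filling problem. A deliberately simplified stand-in, filling each hole from its valid neighbours:

```python
def fill_depth_holes(depth, passes=1):
    """Conceptual hole-filling for a sparse depth map (2D list of floats).
    0.0 marks a hole where no laser return was recorded; each pass fills a
    hole with the average of its valid 4-neighbours. This is only a stand-in
    for the passive ambient-IR step, whose real algorithm is not public."""
    rows, cols = len(depth), len(depth[0])
    for _ in range(passes):
        out = [row[:] for row in depth]
        for r in range(rows):
            for c in range(cols):
                if depth[r][c] == 0.0:
                    neighbours = [
                        depth[nr][nc]
                        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                        if 0 <= nr < rows and 0 <= nc < cols and depth[nr][nc] > 0.0
                    ]
                    if neighbours:  # leave the hole if nothing valid is adjacent
                        out[r][c] = sum(neighbours) / len(neighbours)
        depth = out
    return depth
```

    The trade-off Zhao describes maps directly onto this picture: more lasers would mean fewer holes to fill but more noise and false positives, so the sparse active measurements are kept and the gaps are filled passively instead.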




    Intel Is In The Camera Business

    Intel’s Project Tango team deserves some acclaim for making big strides in a relatively short time.
    They’ve built a phone and added the ability to produce virtualized versions of your hands, other people nearby, and planes such as walls and floors, and they’ve done it with an embedded RealSense camera instead of a peripheral. To enable VR experiences, they made their own HMD, too.
    Even so, the phone is very much a phone; Zhao told me that the device has passed FCC certification, so you could drop a SIM card into it and use it like any other handset. And the whole package, including the final version of the dock I saw at IDF and the SDK, costs $399. It’s shipping in Q2, and if you want the dev kit, you can get it here.
    But really, Intel doesn’t want to sell phones. It wants to sell RealSense cameras. Its Project Tango phone is an elaborate proof of concept for what RealSense can do on a mobile device with Google’s Project Tango technology (or a variety of others, such as Unity). It hopes that its relatively inexpensive dev kits will encourage developers to innovate on the hardware.
    It’s been rather slow going, and somewhat up and down, for RealSense cameras across the board, but Intel’s handset shows some of the capabilities the camera unlocks -- including what it would be like to have both AR and VR capabilities baked into a smartphone.
    Source:
    http://www.tomshardware.com/news/int...ing,31270.html

  3. #3
    Tengo Tango: Lenovo Demos Project Tango Experience In Art Museum


    As far as trade shows go, you can hardly ask for a better locale than Barcelona, Spain, and you can’t go wrong with using the Museu Nacional D’Art De Catalunya as a backdrop for your event and product demo, which is what Lenovo did at Mobile World Congress to showcase its Project Tango tablet concept.
    Was it earth-shattering? No. But it did give us a taste of what Project Tango can do and how augmented reality can offer benefits in a variety of ways.



    Lenovo Demos Project Tango Tablet At Barcelona Art Museum
    A number of journalists, including yours truly, were shuttled from the MWC convention center to the art museum for refreshments and demos. Lenovo had us sorted into groups, and as our number was called, we walked through a gallery with a tour guide. She showed us how some paintings were virtually tagged so that, when you aimed the Project Tango tablet’s camera at them, little white rectangles appeared. When you tapped a box, you were presented with additional information about the work, its artist and so on.

    Further, when you aimed the tablet at the floor, you could see a blue dotted line showing you where you should go next. At times, though, following the line technically meant that you would be making a sharp turn into a priceless work of art, as the device seemed to have trouble drawing the line around corners. At the end of the tour—the end of the blue dotted line—we saw an inverted tear drop, like on Google Maps, that indicates you’ve arrived at your destination.

    Think of it like those ear-piece-walking-tour things, but better—far more interactive, comprehensive and flexible.

    While we waited for our turn, we grazed and meandered about in a big hall in the museum. A number of people had Project Tango tablets, and you could track where they all were on a live map (they were represented by little arrows) on a huge display set up in one corner.

    If you had one of the tablets, you could hold it up and point it around the room. When the device “found” another Tango tablet, you could see a virtual icon floating above that person’s head.

    So, then, there were really two use cases on display at the Lenovo event, but to me, both point towards navigation. In one case, you can see how you could engage more fully with people, places and things that are tagged. Sure, it’s cool in a museum, but imagine that you’re in an airport trying to figure out where you’re supposed to go. You could hold your phone (or tablet, but in this fantasy it’s a phone) up to a gate and see the flight information, such as when it’s departing and if it’s delayed. When walking the streets of a city, you could point your phone at buildings along the block and see the info you’d normally extract from a Google search (business name, hours, menu, ratings, etc.) on your display. And so on and so forth. All of this data would be superimposed on the image your camera sees and displays on your screen.
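    The "superimposed on the image your camera sees" part reduces to projecting each tagged object's 3D position into pixel coordinates on your screen. A minimal pinhole-camera sketch (illustrative only; a real Tango pipeline also handles lens distortion and continuous pose tracking):

```python
def project_to_screen(point_cam, fx, fy, cx, cy):
    """Project a 3D point (in camera coordinates, metres) to pixel
    coordinates with a simple pinhole model. fx/fy are focal lengths
    in pixels and (cx, cy) is the principal point; returns None when
    the point is behind the camera."""
    x, y, z = point_cam
    if z <= 0:
        return None  # behind the camera: nothing to overlay
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)
```

    For example, a tagged gate 2 m ahead and 0.5 m to the right would land at `project_to_screen((0.5, 0.0, 2.0), 800, 800, 640, 360)`, i.e. right of centre on a 1280x720 display, and the flight-information card would be drawn at that pixel position.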


    I spent a good deal of time in Barcelona using my phone and Google Maps to get around. How much better would it have been, though, if instead of glancing down at an arrow slowly creeping its way down an illustrated facsimile of the street, I could have followed that blue dotted line from our flat all the way to La Boqueria for lunch?

    I also love the location feature. If you and your acquaintances were meeting somewhere—say, a music festival, which is crowded and expansive—and your devices could all “see” each other, you could hold your phone up to the crowds and just look for the icons dancing above your friends’ heads.
    It’s important to note that none of this is obtrusive to others nearby. You’re the only person that can see the superimpositions and overlays on your device’s screen (unless someone is being creepy and trying to steal a glance).

    We’ve seen progress on Project Tango from both Lenovo and Intel just since CES (which was mere weeks ago), and the two companies are taking Google’s technology in different, but promising, directions.
    Source:
    http://www.tomshardware.com/news/len...mos,31293.html
