Image & Vision Technology
New Technologies for Image and Computer Vision, such as Biometrics, Visual Surveillance, Augmented Reality, Kinect, 3D, Wearable Vision, etc.
Curated by Yuan-Kai Wang

Kinect for Xbox One: An always-on, works-in-the-dark camera and microphone. What could possibly go wrong? | ExtremeTech

The Xbox One will feature, by default, an always-on, works-in-the-dark microphone and camera that's constantly connected to the internet and 300,000 servers. What could possibly go wrong?

Facebook’s New Colocation And Image Recognition Patents Tease The Future Of Sharing

Facebook’s empire was built on photo tags and sharing, but it’s a grueling process many neglect.

Natalia Immersive SmartGoggles: A dose of augmented reality (Wired UK)

This is what you wear when reality is not enough.

Artificial Intelligence, Powered by Many Humans - Technology Review

Crowdsourcing is usually mindless -- parcelling out microtasks to many people, each of whom contributes a bit of work or perception (often for a few cents).

Tech Journalists’ Initial Impressions of the Google Glass Camera Glasses


Michael Zhang · Sep 16, 2012

 

Developers who pre-ordered Google’s Project Glass glasses for $1,500 won’t be receiving them until early 2013, but a number of lucky journalists were recently given the opportunity to take the camera-equipped, augmented reality eyepiece for a test drive. The New York Times’ gadget kingmaker David Pogue writes that the device has the potential to be one of the rare devices that introduces a whole new gadget category to the world:

 

[...] a few things are clear. The speed and power, the tiny size and weight, the clarity and effectiveness of the audio and video, are beyond anything I could have imagined. The company is expending a lot of effort on design — hardware and software — which is absolutely the right approach for something as personal as a wearable gadget.

 

[...] it’s much too soon to predict Google Glass’s success or failure. But it’s easy to see that it has potential no other machine has ever had before — and that Google is shepherding its development in exactly the right way.

 

Spencer E. Ante of the Wall Street Journal is enthusiastic about the technology as well, but thinks it still needs a “killer app” in order to be accepted:

 

After 10 minutes of playing with the glasses [...] I could see their long-term potential. The device fit well. It was easy to snap a picture or video without taking my smartphone out of my pocket. It was cool to see the information there in front of my right eye, though a little disorienting. I kept closing my left eye, which was uncomfortable.

[...] What’s really missing, though, is a killer app that could really show the technology’s potential. As Mr. Brin tells it, the glasses are like a less obtrusive smartphone that rids the world of people looking down at their devices while walking on the street. That is great, but it doesn’t seem ambitious enough.

 

The ability to photograph life hands-free is already something of a “killer app” in my eyes, since it would allow people to snap photos in situations in which even GoPros would seem unwieldy. Google seems to agree, since virtually all of the Project Glass demos and promos up to this point have focused on the device’s potential as a camera.

 

The spread of smartphone photography shows that the general public wants as-easy-as-possible photos of their lives and memories, so the big challenge for Google is to make its product look great in all aspects: how the device looks while it’s being worn and how the images look when they pop out.

 

If Google can nail both of those things, Glass stands a much better chance of replacing a huge part of how smartphones are currently used.

 

 


Wearable Tech Makes a Fashion Statement

A new movie premieres in New York today and chances are none of you will ever see it.

It’s a short film titled “DVF Through Glass” and it’s video that models working for designer Diane von Furstenberg shot during New York’s Fashion Week using Google glasses they were wearing. (Google prefers to call its augmented reality devices Google Glass to distinguish them from actual glasses because they contain no glass. Got that?)

They’re the frames that caused such a stir last spring when Google unveiled them: wearable computers that can shoot videos and photos and tell you where the nearest Starbucks can be found. By wearing them as they strolled down the runway, von Furstenberg’s models were accessorized with high tech. For its part, Google managed to de-geek its invention a tad by putting it on fashion models, not to mention grab some New York media exposure before all the spotlights swung over to Apple’s iPhone 5.

As Spencer Ante pointed out in The Wall Street Journal this week, Google Glass remains a work in progress, with much of its software unfinished. It won’t be available until next year and, at $1,500 a pop, will likely be a novelty bauble for a while.

Getting appcessorized

Still, it’s already the best known of what are being called “appcessories,” wearable devices that work with smart phones. Earlier this week, a potential challenger, glasses developed by a British firm called The Technology Partnership (TTP), made its debut. Unlike Google Glass, the TTP device looks like regular glasses and beams an image directly into the wearer’s eye, instead of making him or her shift focus to a tiny screen attached to the frame.

Then there’s the Pebble, a smart watch that tells you the time, but also connects wirelessly with your iPhone or Android phone to show you who’s calling, display text messages, Facebook or email alerts and let you control, from your wrist, what’s playing on your smartphone. Its inventors had hoped to raise $100,000 on Kickstarter, with the goal of selling 1,000 watches. Instead they raised $10 million and already have orders for 85,000 watches–so many that they’ve had to push back the first shipment, which was supposed to start this month.

It’s that kind of response that has a lot of people predicting that wearable computing is the next big wave, the thing that will free us from what’s been called the “black mirror” of our smartphone screens. Your phone may still be the powerful little computer you carry around, but it may never have to leave your pocket.

Ring power

Or you can do without the phone altogether. London digital art director Dhani Sutanto created an enamel ring with the electronics of a transit card implanted in it. One swipe of his ring and he can ride the London subway.

His goal, he says, is to design “interactions without buttons,” to link physical items–such as a ring–to your virtual identity and preferences.

“Imagine a blind person using an ATM and fumbling with the buttons or touch screen,” Sutanto recently told an interviewer. “If they had wearable technology in the form of a ring, for example, they could approach and just touch it. The ATM would say, ‘Welcome, Mr. Smith. Here’s your £20.’”

Turn me on

Google wasn’t alone in infusing tech into Fashion Week. Microsoft was there, too, presenting a dress that tweeted. Okay, the dress, made of paper, didn’t actually tweet, but the person wearing it could, using a keyboard on its bodice, decorate the bottom of the dress with Twitter banter.

My guess–and hope–is that this won’t catch on and we will never have to live in a world where people wear their tweets on their sleeves. But another breakthrough in wearable tech a few months ago could dramatically change what we expect our clothes to do for us.

Scientists at the University of Exeter in the U.K. have created a substance that can be woven into a fabric to produce the lightest, most transparent and flexible material ever made that conducts electricity. One day, they say, we could be walking around in clothing that carries a charge.

To me, this would not seem a good fashion choice if there’s even a chance of thunder and lightning. But the researchers at Exeter have happier thoughts. They talk of shirts that turn into MP3 players and of charging your phone with your pants.

Which could give new meaning to “wardrobe malfunction.”

Plugged in

Here are other recent developments in wearable tech:

You’ve got the power: A British professor is trying to produce clothing made with materials capable of generating electricity from either the warmth or movement of the human body.
If you must talk in public, do it with style: Nothing stylish about walking around wearing a Bluetooth headset. But now, at least for women, there are other options, such as a pendant that works like a headset, but looks like a necklace.
One device to rule them all: Scientists at Dartmouth are developing a device worn like a bracelet that would authenticate a user’s identity and connect any other medical devices he or she has had implanted or is wearing.
...


Wearable Cameras May One Day Give Us Ultra-HDR Vision


Michael Zhang · Sep 12, 2012

 

When doing certain types of welding, special helmets with dark lens shades should be used to protect the eyes from the extremely bright welding arc and sparks. The masks help filter out light, protecting your eyes, but at the same time make it hard to see the details in what you’re doing. In other words, the dynamic range is too high, and wearers are unable to see both the arc and the objects they’re welding.

 

A group of researchers in the EyeTap Personal Imaging Lab at the University of Toronto have a solution, and it involves cameras. They’ve created a “quantigraphic camera” that can give people enhanced vision. Instead of being tuned to one particular brightness, it attempts to make everything in front of the wearer visible by using ultra high dynamic range imaging.

 

For example, a welder using the helmet would be able to see both the details of the bright welding arc and the details on the metal he or she is working on.
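The EyeTap implementation itself isn't shown here, but the underlying idea of ultra high dynamic range imaging (combining frames captured at different exposures so the bright arc and the dark workpiece both stay visible) can be sketched with the HDR utilities that ship in recent OpenCV releases. This is a minimal illustration with placeholder filenames, not the EyeTap pipeline.

```python
# Minimal exposure-fusion sketch (Mertens fusion) with OpenCV:
# combine differently exposed frames of the same scene so that both the
# very bright and the very dark regions keep their detail.
import cv2
import numpy as np

# Placeholder filenames: three shots of the same scene at different exposures.
images = [cv2.imread(p) for p in ("dark.jpg", "mid.jpg", "bright.jpg")]

# Mertens fusion blends the best-exposed pixels from each frame into a single
# displayable image, without needing the camera's response curve.
fused = cv2.createMergeMertens().process(images)
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))
```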

Here’s a slightly technical video explaining the project:

...


Kintinuous - Kinect Creates Full 3D Models

From iProgrammer: the system is being designed to allow robots to build models of their surroundings using Simultaneous Localization and Mapping (SLAM).

Projection-based Augmented Reality Demo

Projected augmented reality prototype by RTT AG (www.rtt.ag) and Extend3D (www.extend3d.de), shown at the "RTT Excite" conference in Detroit, MI, on May 31st and June...

Olympus Announces Augmented Reality Glasses Project

The Japanese firm Olympus, perhaps in reaction to Google's recent unveiling of Project Glass, has let it be known that it has been researching and developing wearable displays for more than 20 years.

How to Beat Facial-Recognition Software

Kilpatrick gives presentations about the capabilities of facial-recognition systems, but is concerned about privacy as well.

Infographic: We Want Our Connected TV - Magnet Media

An infographic from Tremor Video examines some data about connected TV users and shows why marketers need to be paying attention. ...

TV Guide rolls out iOS app update to enhance the second screen ...

While there's a growing number of TV show discovery services using mobile devices (aka the second screen) to enhance the viewing experience, few have the name recognition of TV Guide.

Google+ Now Using 'Computer Vision' to Identify and Index Photos by Content

Today, Google’s Search blog announced that the company has started implementing technology that will allow you to search for your photos based on what they contain visually, even if there’s not a tag in sight.

No More Needle Pricks for Diabetics: Eye-Based Blood Glucose Meter Headed to Market


 

October 3, 2012

 

Good news for diabetics! Taiwan's Industrial Technology Research Institute (ITRI, 工研院) has developed a new non-invasive "real-time self-monitoring blood glucose meter" technology: low-energy infrared light is shone through the aqueous humor in front of the eye's crystalline lens to detect changes in glucose concentration, from which the body's corresponding blood glucose level is estimated. The technology has already entered human trials and could reach the market in as little as five years.

 

詹益仁, director of ITRI's Electronics and Optoelectronics Research Laboratories, said the prevalence of diabetes keeps driving up healthcare costs: global spending on diabetes care in 2010 was roughly US$380 billion. In Taiwan, the prevalence of diabetes among people over 40 has risen from 5% thirty years ago to about 10% today. For diabetics, the hardest part to bear is pricking the skin to draw blood for glucose tests four to six times a day.


詹益仁 explained that ITRI's first-of-its-kind non-invasive "real-time self-monitoring blood glucose meter" shines low-energy infrared light through the aqueous humor in front of the crystalline lens and measures the changes in infrared absorbance and optical rotation produced by glucose in that fluid; from the detected change in glucose concentration in the eye's aqueous humor, the subject's blood glucose level is then estimated.

 

Deputy director 刁國棟 pointed out that the most widely used glucose-monitoring products on the market are still invasive meters: a lancet draws blood from the fingertip, a test strip is used, and an electrochemical reaction measures the glucose concentration in the blood. Because fingertip sampling causes pain and raises concerns about infection, some diabetics do not measure their blood glucose regularly, resulting in poor glucose control.

 

陳治誠, a division manager at the same laboratories, added that non-invasive glucose monitoring measures a patient's blood glucose without drawing blood, reducing the pain and stress of blood sampling. The technique is not affected by the shape of the eye; anyone without a severe visual impairment such as blindness or cataracts can be tested. Its hallmarks are no pain, no blood, no infection, and no consumables, and it allows continuous monitoring. Animal testing has been completed successfully, human trials are about to begin, and the product is expected to reach the market in five years.
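The estimation step described above rests on a simple optical relationship (Biot's law): the rotation of polarized light passing through a solution is proportional to the solute's specific rotation, the optical path length, and the concentration. The numbers below are rough, illustrative assumptions (a visible-light specific rotation for D-glucose and a guessed anterior-chamber path length), not ITRI's infrared calibration.

```python
# Back-of-the-envelope sketch of the optical-rotation principle (Biot's law):
#   rotation = specific_rotation * path_length * concentration
# so a measured rotation angle can be turned into a concentration estimate.
# Values are illustrative assumptions, not ITRI's calibration.

SPECIFIC_ROTATION = 52.7   # deg*mL/(g*dm), equilibrium value for D-glucose in visible light
PATH_LENGTH_DM = 0.03      # assumed optical path through the anterior chamber, ~3 mm

def glucose_concentration(rotation_deg):
    """Estimate glucose concentration in g/mL from a measured rotation angle."""
    return rotation_deg / (SPECIFIC_ROTATION * PATH_LENGTH_DM)

# A rotation of a few millidegrees corresponds to a physiological glucose level:
# 0.0016 deg -> ~0.001 g/mL, i.e. roughly 100 mg/dL.
print(glucose_concentration(0.0016))
```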


Kinect and PlayStation Eye combined in 3D augmented reality coffee table

Microsoft Surface is cool, sure, but it could be even cooler if it took inspiration from another division of the company.

Google Glass Prepares To Change The World


By David Pogue, New York Times, September 13, 2012

New gadgets — I mean whole new gadget categories — don’t come along very often. The iPhone was one recent example. You could argue that the iPad was another. But if there’s anything at all as different and bold on the horizon, surely it’s Google Glass.
That, of course, is Google’s prototype of a device you wear on your face. Google doesn’t like the term “glasses,” because there aren’t any lenses. (The Glass team, part of Google’s experimental labs, also doesn’t like terms like “augmented reality” or “wearable computer,” which both have certain baggage.)
Instead, Glass looks like only the headband of a pair of glasses — the part that hooks on your ears and lies along your eyebrow line — with a small, transparent block positioned above and to the right of your right eye. That, of course, is a screen, and the Google Glass is actually a fairly full-blown computer. Or maybe like a smartphone that you never have to take out of your pocket.
This idea got a lot of people excited when Nick Bilton wrote about the glasses in February in The New York Times. Google first demonstrated it in April in a video. In May, at Google’s I/O conference, Glass got some more play as attendees watched a live video feed from the Glass as a sky diver leapt from a plane and parachuted onto the roof of the conference building. But so far, very few non-Googlers have been allowed to try them on.
Last week, I got a chance to try on a pair. I’m hosting a PBS series called “Nova ScienceNow” (it premieres Oct. 10), and one of the episodes is about the future of tech. Of course, projecting what’s yet to come in consumer tech is nearly impossible, but Google Glass seemed like a perfect example of a breakthrough on the verge. So last week the Nova crew and I met with Babak Parviz, head of the Glass project, to discuss and try out the prototypes.
Now, Google emphasized — and so do I — that Google Glass is still at a very, very early stage. Lots of factors still haven’t been finalized, including what Glass will do, what the interface will look like, how it will work, and so on. Google doesn’t want to get the public excited about some feature that may not materialize in the final version. (At the moment, Google is planning to offer the prototypes to developers next year — for $1,500 — in anticipation of selling Glass to the public in, perhaps, 2014.)
When you actually handle these things, you can’t believe how little they weigh. Less than a pair of sunglasses, in my estimation. Glass is an absolutely astonishing feat of miniaturization and integration.
Inside the right earpiece — that is, the horizontal support that goes over your ear — Google has packed memory, a processor, a camera, speaker and microphone, Bluetooth and Wi-Fi antennas, accelerometer, gyroscope, compass and a battery. All inside the earpiece.
Google has said that eventually, Glass will have a cellular radio, so it can get online; at this point, it hooks up wirelessly with your phone for an online connection.
And the mind-blowing thing is, this slim thing is the prototype. It’s only going to get smaller in future generations. “This is the bulkiest version of Glass we’ll ever make,” Babak told me.
The biggest triumph — and to me, the biggest surprise — is that the tiny screen is completely invisible when you’re talking or driving or reading. You just forget about it completely. There’s nothing at all between your eyes and whatever, or whomever, you’re looking at.
And yet when you do focus on the screen, shifting your gaze up and to the right, that tiny half-inch display is surprisingly immersive. It’s as though you’re looking at a big laptop screen or something.
(Even though I usually need reading glasses for close-up material, this very close-up display seemed to float far enough away that I didn’t need them. Because, yeah — wearing glasses under Glass might look weird.)
The hardware breakthrough, in other words, is there. Google is proceeding carefully to make sure it gets the rest of it as right as possible on the first try.
But the potential is already amazing. Mr. Parviz stressed that Glass is designed for two primary purposes — sharing and instant access to information — hands-free, without having to pull anything out of your pocket.
You can control the software by swiping a finger on that right earpiece in different directions; it’s a touchpad. Your swipes could guide you through simple menus. In various presentations, Google has proposed icons for things like taking a picture, recording video, making a phone call, navigating on Google Maps, checking your calendar and so on. A tap selects the option you want.
In recent demonstrations, Google has also shown that you can use speech recognition to control Glass. You say “O.K., Glass” to call up the menu.
To illustrate how Glass might change the game for sharing your life with others, I tried a demo in which a photo appeared — a jungly scene with a wooden footbridge just in front of me. The theme from “Jurassic Park” played crisply in my right ear. (Cute, real cute.)
But as I looked left, right, up or down, my view changed accordingly, as though I were wearing one of those old virtual-reality headsets. The tracking of my head angle and the response to the immersive photo were incredibly crisp and accurate. By swiping my finger on the touchpad, I could change to other scenes.
Now, there’s a lot of road between today’s prototype and the day when Google Glass will be on everyone’s faces. Google will have to nail down the design — and hammer down the price. Issues of privacy and distraction will have to be ironed out (although I’m not nearly as worried about distraction as I was before I tried them on). Glasses wearers may have to wait until Glass can be incorporated into actual glasses.
We may be waiting, too, for that one overwhelmingly compelling feature, something that you can’t do with your phone (beyond making it hands-free). We’ve seen that the masses can’t even be bothered to put on special glasses to watch 3-D TV; it may take some unimagined killer app to convince them to wear Google Glass headsets all day.
But already, a few things are clear. The speed and power, the tiny size and weight, the clarity and effectiveness of the audio and video, are beyond anything I could have imagined. The company is expending a lot of effort on design — hardware and software — which is absolutely the right approach for something as personal as a wearable gadget. And even in this early prototype, you already sense that Google is sweating over the clarity and simplicity of the experience — also a smart approach.
In short, it’s much too soon to predict Google Glass’s success or failure. But it’s easy to see that it has potential no other machine has ever had before — and that Google is shepherding its development in exactly the right way.


How to Beat Facial-Recognition Software


26 January 2012

 

Over the last decade, computers have become better at seeing faces. Software can tell if a camera has a face in its frame of vision, and law enforcement has been testing facial-recognition programs that can supposedly pick out suspects in a crowd. That's prompted an arms race between the people who build facial-recognition systems — and those seeking ways to defeat them.

 

Facial-recognition software is becoming a bigger issue for privacy advocates as well. Surveillance cameras are already ubiquitous in the U.K., are showing up in more places in the U.S. and may increasingly be connected to facial-recognition systems.

 

"I went to a Kinko's a while ago," said Alex Kilpatrick, chief technology officer and co-founder of Tactical Information Systems, a company in Austin, Texas, that sells facial-recognition software to law enforcement and the military. "I saw three cameras just while I was standing in line. You see them in all kinds of places now."

 

The American Civil Liberties Union (ACLU) has said it is deeply concerned with the way facial-recognition systems are used. Police use such systems to flag criminals in public places, the ACLU says, but it argues that the Transportation Security Administration's (TSA) use of the technology in Boston's Logan Airport and in T.F. Green Airport near Providence, R.I., doesn't seem to have helped catch any criminals or terrorists.

 

...
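The detection step the article mentions (software recognizing that a face is in the frame at all) is standard and easy to reproduce; OpenCV, for instance, ships a Haar-cascade frontal-face detector. The sketch below only locates faces in an image, it does not identify people, and it is not the recognition software discussed in the article; the input filename is a placeholder.

```python
# Minimal face-detection sketch with OpenCV's bundled Haar cascade.
# Detection only: it finds face regions, it does not recognize identities.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("crowd.jpg")  # placeholder input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:       # draw a box around each detected face
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces_marked.jpg", frame)
```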


Interactive 3D protein structures on a virtual reality wall


How do you get to know a protein? How about from the inside out? If you ask chemistry professor James Hinton, “It’s really important that scientists as well as students are able to touch, feel, see … embrace, if you like, these protein structures.” For decades, with funding from the National Science Foundation (NSF), Hinton has used nuclear magnetic resonance (NMR) to look at protein structure and function. But he wanted to find a way to educate and engage students about his discoveries.

 

The picture above shows an example of the interactive visualization of proteins from the Protein Data Bank (PDB), using PDB browser software on the C-Wall (virtual reality wall) at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. The work was performed by Jürgen P. Schulze, project scientist, in collaboration with Jeff Milton, Philip Weber and Professor Philip Bourne of the University of California, San Diego. The software supports collaborative viewing of proteins at multiple sites on the Internet.


Via Dr. Stefan Gruenwald
Sandys VR's curator insight, March 27, 2013 6:12 PM

Heard about this before, very cool use of VR!

Luis Carlos Peña Gordillo's curator insight, November 4, 2013 1:45 AM

Virtual reality for chemistry visualization.


Photosynth now available in the Windows Phone Marketplace

Using the latest in computer vision techniques, Photosynth is the acknowledged leader in mobile panorama creation. Think of it as the Instagram of panorama shots, if you will.
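Photosynth's own pipeline is proprietary, but the general recipe it relies on (registering overlapping frames and blending them into one wide image) can be illustrated with OpenCV's high-level stitcher. This is a generic stand-in with placeholder filenames, not Photosynth's algorithm.

```python
# Generic panorama-stitching sketch using OpenCV's high-level Stitcher (4.x API).
import cv2

# Placeholder filenames: overlapping shots swept across a scene.
frames = [cv2.imread(p) for p in ("pan_0.jpg", "pan_1.jpg", "pan_2.jpg")]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print("Stitching failed with status", status)
```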

New 3D imagery for Google Earth for mobile

Since 2006, we've had textured 3D buildings in Google Earth, and we're excited to announce that we'll begin adding 3D models to entire metropolitan areas to ...

How to Make your own sixth sense Device

The video stream captured by the camera is passed to the mobile computing device, which performs the appropriate computer vision computations.

With Kinect-like gestures, SoftKinetic leads the way to Intel’s perceptual computing

SoftKinetic's close-range gesture control technology is at the heart of Intel's upcoming non-touch control for laptops.

FREAK: Fast Retina Keypoints now in OpenCV 2.4.2

FREAK, developed at EPFL, is a novel keypoint descriptor inspired by the human visual system, and more precisely the retina, hence its name: Fast Retina Keypoint (FREAK).
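As a rough illustration of how such a descriptor is used, the sketch below pairs FREAK with a FAST detector and brute-force Hamming matching. Note that in current OpenCV builds FREAK lives in the opencv-contrib xfeatures2d module rather than the 2.4.2 core API the excerpt refers to; the image filenames are placeholders.

```python
# FREAK descriptor sketch: detect keypoints with FAST, describe them with FREAK,
# then match two images by brute-force Hamming distance (FREAK is a binary descriptor).
# Requires opencv-contrib-python for cv2.xfeatures2d.
import cv2

img1 = cv2.imread("scene_a.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder inputs
img2 = cv2.imread("scene_b.jpg", cv2.IMREAD_GRAYSCALE)

fast = cv2.FastFeatureDetector_create()
freak = cv2.xfeatures2d.FREAK_create()

kp1, des1 = freak.compute(img1, fast.detect(img1, None))
kp2, des2 = freak.compute(img2, fast.detect(img2, None))

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(len(matches), "matches; best distance:", matches[0].distance if matches else None)
```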

Volvo: Augmented Reality X-Ray iPad App

Perhaps the new Volvo has so much hidden technology, or maybe they just wanted to create some extra buzz at the show; either way, the Volvo X-Ray App released at the Geneva Auto Show is a pretty cool way for potential customers...