Digital Content Expo 2009: AR & VR Applications Come Together (2/2)

Continued from the previous blog post on Digital Content Expo 2009.

Pull-Navi (A head-mounted device that navigates its wearer by pulling his or her ears)
(Developed by Kajimoto Laboratory, the University of Electro-Communications)

The motion reminds us of a rider steering a horse by pulling on the reins. When an operator moves a yoke on the remote control device, the head-mounted device pulls its wearer by the ear closer to the commanded direction. The concept of the device was inspired by Japanese sweetheart Sazae-san[YouTube, J], the heroine of a TV anime series that has been running for 40 years, who pulls her younger brother by the ears when scolding him for his bad behavior.

Pull-Navi
(Picture: Engadget Japanese)
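
Just to illustrate the control mapping described above, here is a tiny Python sketch of how a yoke deflection might be turned into an ear-pull command. The function, dead zone, and scaling are my own assumptions for illustration, not the lab's actual firmware.

```python
# Hypothetical sketch of how a yoke command might be mapped to an ear pull,
# based on the description above (the ear closer to the commanded direction
# is pulled). None of these names come from the actual Pull-Navi hardware.

def ear_pull_command(yoke_x: float) -> tuple[str, float]:
    """Map a yoke deflection in [-1.0, 1.0] (negative = left, positive = right)
    to the ear to pull and a normalized pull strength."""
    if abs(yoke_x) < 0.05:          # small dead zone so the wearer isn't tugged at rest
        return ("none", 0.0)
    ear = "left" if yoke_x < 0 else "right"   # pull the ear on the commanded side
    strength = min(abs(yoke_x), 1.0)          # stronger deflection, stronger pull
    return (ear, strength)

if __name__ == "__main__":
    for x in (-0.8, -0.02, 0.4):
        print(x, "->", ear_pull_command(x))
```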

– – – – – – – – – –

Foldy (The laundry folding robot)
(Developed by Design UI Project, ERATO, Japan Science and Technology Agency)

First, choose a piece of laundry to be folded and capture an image of it with a camera. The system then determines how it should be folded and plans a motion path for the robot to follow.
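
To make that capture-then-plan pipeline a bit more concrete, here is a minimal, purely illustrative Python sketch. The garment categories, fold steps, and coordinates are invented; the actual Foldy system surely plans its folds differently.

```python
# A minimal, purely illustrative sketch of the capture -> fold-plan -> motion-path
# pipeline described above. The garment categories, fold steps, and waypoints are
# invented for illustration and are not taken from the actual Foldy system.

from dataclasses import dataclass

@dataclass
class FoldStep:
    grasp: tuple[float, float]    # where to pick the fabric up (normalized image coords)
    place: tuple[float, float]    # where to lay it down

# Invented lookup table: each garment type gets a fixed sequence of folds.
FOLD_PLANS = {
    "t-shirt": [FoldStep((0.1, 0.5), (0.5, 0.5)),   # fold left sleeve in
                FoldStep((0.9, 0.5), (0.5, 0.5)),   # fold right sleeve in
                FoldStep((0.5, 0.9), (0.5, 0.3))],  # fold bottom up
    "towel":   [FoldStep((0.5, 0.9), (0.5, 0.1)),   # fold in half
                FoldStep((0.9, 0.5), (0.1, 0.5))],  # fold in half again
}

def classify_garment(image) -> str:
    """Stand-in for the vision step that decides what kind of clothing was captured."""
    return "t-shirt"   # a real system would run image recognition here

def plan_motion(image):
    garment = classify_garment(image)
    return [(step.grasp, step.place) for step in FOLD_PLANS[garment]]

if __name__ == "__main__":
    for grasp, place in plan_motion(image=None):
        print(f"pick at {grasp}, place at {place}")
```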

– – – – – – – – – –

Photoelastic Search (The rubbery force-sensing interactive display)
(Developed by Koike Lab., the University of Electro-Communications)

Using an LCD and photo-elasticity, the lab team has developed a pressure-sensitive, two-and-a-half-dimensional interactive surface. If you press a point on the transparent gel, that spot is depressed. An overhead camera detects the change in refracted light caused by the depression, and the system then changes the facial expression projected onto the gel.
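
As a rough illustration of that sensing loop, here is a small Python sketch in which the pixel difference seen by the overhead camera stands in for pressure and picks the expression to project. The thresholds and expression names are made up; this is not the lab's actual pipeline.

```python
# A rough sketch, with invented thresholds, of the sensing loop described above:
# an overhead camera watches the gel, the amount of optical change stands in for
# pressure, and the projected facial expression is chosen from that pressure.

def pressure_from_frames(baseline: list[list[int]], current: list[list[int]]) -> float:
    """Use the total pixel difference between a baseline frame and the current
    frame as a crude proxy for how hard the gel is being pressed."""
    diff = sum(abs(c - b) for row_b, row_c in zip(baseline, current)
                          for b, c in zip(row_b, row_c))
    pixels = len(baseline) * len(baseline[0])
    return diff / (pixels * 255)      # normalize to roughly [0, 1]

def expression_for(pressure: float) -> str:
    if pressure < 0.05:
        return "neutral"
    if pressure < 0.3:
        return "surprised"
    return "grimacing"                # hardest press, most dramatic face

if __name__ == "__main__":
    baseline = [[100, 100], [100, 100]]
    pressed  = [[100, 180], [100, 200]]
    p = pressure_from_frames(baseline, pressed)
    print(f"pressure ~ {p:.2f}, show '{expression_for(p)}' face")
```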

– – – – – – – – – –

Bloxels (Illuminated blocks that change color depending on their distance from the light source)
(Developed by Yasuaki Kakehi Lab.[J], Keio Univ. and Naemura Lab.[J], Univ. of Tokyo)

Every cube has a photodiode on the bottom. The photodiode measures the intensity of the incoming light to estimate the distance from the light source, and the cube's illumination changes color according to that distance.
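
Here is a small Python sketch of that intensity-to-distance-to-color idea, assuming an inverse-square falloff of light. The calibration constant and color bands are guesses; the real Bloxels hardware may work quite differently.

```python
# A small sketch of the intensity -> distance -> color idea described above,
# assuming an inverse-square falloff of light. The calibration constant and the
# color bands are made up for illustration.

import math

K = 1.0   # hypothetical calibration constant: intensity measured at 1 unit distance

def distance_from_intensity(intensity: float) -> float:
    """Invert an assumed inverse-square law: intensity = K / distance**2."""
    return math.sqrt(K / intensity)

def color_for_distance(d: float) -> str:
    if d < 1.0:
        return "red"      # very close to the light source
    if d < 2.0:
        return "yellow"
    return "blue"         # far away

if __name__ == "__main__":
    for reading in (4.0, 0.5, 0.1):
        d = distance_from_intensity(reading)
        print(f"intensity {reading} -> distance {d:.2f} -> {color_for_distance(d)}")
```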

via 4Gamer.net[J], Phile-Web[J], Ikuya Takamori’s column on Wired Vision[J], Weekly Ascii[J], and Mainichi.jp[J]

Digital Content Expo 2009: AR & VR Applications Come Together (1/2)

Logo of Digital Content Expo 2009

Last week, the annual Digital Content Expo 2009 was held at Miraikan, Japan's national museum of emerging science and innovation in the Tokyo waterfront area. The event showcased approximately 50 artworks and R&D results in newly developed digital content technologies.

Let's look at some of the items exhibited at the event, along with video coverage from IT news media and the exhibitors' presentation videos.

– – – – – – – – – – –

Back To the Mouth (game title) & La Flèche de l'odeur (blowgun device; French for "the Arrow of Smell")
(Developed by Kosaka Laboratory[J], Dept. of Computer Engineering and International Communication, Kanazawa Technical College[J])

People usually do their best to suppress bad breath, but you don't need to worry about it when playing this game. The college exhibited a shooting game in which you attack monsters with your bad breath. During the game, players eat and drink strong-smelling foods such as cheese, chocolate, potato chips, cakes, beer, and wine, then blow into a smell-sensor-enabled blowgun in the REAL world to throw "smell balls" in the VIRTUAL one.

Each smell is effective only against a particular species of monster, so players have to keep changing their breath by eating different foods in order to defeat every type of monster.
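
That matching rule could look something like the following Python sketch. The food-to-smell and monster tables are invented for illustration; the game's real categories were not disclosed.

```python
# An illustrative sketch of the smell-vs-monster matching rule described above:
# each monster species is vulnerable to one kind of breath. The food and monster
# tables here are made up.

SMELL_OF_FOOD = {"cheese": "cheesy", "chocolate": "sweet", "beer": "boozy"}
MONSTER_WEAKNESS = {"slime": "cheesy", "ghost": "sweet", "dragon": "boozy"}

def attack(monster: str, last_food_eaten: str) -> bool:
    """A 'smell ball' only works if the player's current breath matches the
    monster's weakness, which is why players keep switching foods."""
    return SMELL_OF_FOOD.get(last_food_eaten) == MONSTER_WEAKNESS.get(monster)

if __name__ == "__main__":
    print(attack("slime", "cheese"))     # True: cheesy breath defeats slimes
    print(attack("dragon", "chocolate")) # False: need to drink beer first
```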

– – – – – – – – – – –

Ikabo[J] (A squid-shaped robot controllable with a Nintendo Wii remote controller)
(Developed by Future University – Hakodate to promote the sightseeing business in the city of Hakodate, Hokkaido)

The robot is 2.2 meters (7.2 feet) tall, weighs 220 kilograms (485 lbs.), and has nine joints in its arms, three in its head, and two in its eyes.

– – – – – – – – – – –

360 Degree-Viewable Display
(Developed by Sony)

A 30-centimeter-high (almost one foot) cylindrical display shows 3D motion pictures that can be viewed from any side without a pair of 3D glasses. For example, when someone's portrait is on screen, you see his face if you stand in front of the display; if you step around to the back, you see the back of his head.

– – – – – – – – – – –

Media Vehicle (A marshmallow-shaped capsule with a display and the capability to control your equilibrioception, i.e. your sense of balance)
(Developed by VR Lab., Tsukuba University)

According to the development team, it is defined as a vehicle for moving back and forth between the real world and the virtual world. When an operator standing outside moves and shakes a position-sensor-enabled camera, the passenger inside looks and feels as though they were inside a rolling ball, following the camera's motion. Be careful not to get virtually car-sick.

Media Vehicle
(Picture: Gizmodo)
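
To sketch the coupling between the hand-held camera and the capsule, here is a highly simplified Python example. The field names and tilt limits are assumptions, not the Tsukuba team's actual interface.

```python
# A highly simplified sketch of the coupling described above: the operator's
# camera orientation is forwarded to the capsule, which tilts the passenger the
# same way to create the sense of being inside a rolling ball. Field names and
# the clamping limits are assumptions.

from dataclasses import dataclass

@dataclass
class CameraPose:
    roll: float    # degrees, from the position sensor on the hand-held camera
    pitch: float

def capsule_command(pose: CameraPose, max_tilt: float = 20.0) -> dict:
    """Clamp the camera's roll/pitch to a safe tilt range for the capsule."""
    def clamp(v: float) -> float:
        return max(-max_tilt, min(max_tilt, v))
    return {"roll": clamp(pose.roll), "pitch": clamp(pose.pitch)}

if __name__ == "__main__":
    print(capsule_command(CameraPose(roll=35.0, pitch=-5.0)))
    # -> {'roll': 20.0, 'pitch': -5.0}: large camera swings are limited
```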

– – – – – – – – – – –

Table Interface
(Developed by Cyberdyne, the company well known for having developed the Robot Suit HAL)

I think this is very similar to Microsoft Surface.

– – – – – – – – – – –

Funbrella (Records a pattern of raindrop impacts on a sensor-enabled umbrella and plays it back later)
(Developed by Human Interface Engineering Lab., Osaka University)

Several sensors embedded in the umbrella record a pattern of real raindrop impacts, and you can play that pattern back later to relive the rain-shower experience and whatever memory is associated with it. With this, you could remember how it was raining when you said good-bye to your ex-girlfriend.
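
The record-and-replay idea could be sketched in Python like this; the vibration call is just a placeholder for the real actuator in the umbrella, and the sensor log is made up.

```python
# A toy sketch of the record-and-replay idea described above: impacts are stored
# as (timestamp, strength) pairs and later replayed in the same rhythm. The
# vibration call is a placeholder; the real Funbrella drives a physical actuator.

import time

def record(impacts):
    """Store raindrop impacts as (seconds since start, strength) pairs."""
    start = impacts[0][0]
    return [(t - start, strength) for t, strength in impacts]

def replay(pattern, vibrate=lambda s: print(f"bzzt (strength {s})")):
    """Play the recorded pattern back with the original timing."""
    last = 0.0
    for offset, strength in pattern:
        time.sleep(offset - last)     # wait until the next recorded impact
        vibrate(strength)
        last = offset

if __name__ == "__main__":
    raw = [(10.00, 0.3), (10.25, 0.8), (10.30, 0.5)]   # made-up sensor log
    replay(record(raw))
```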

To be continued in an upcoming blog post covering more exhibited items.

via 4Gamer.net[J], Phile-Web[J], Ikuya Takamori’s column on Wired Vision[J], Weekly Ascii[J], and Mainichi.jp[J]