Last week, the annual Digital Content Expo 2009 was held at Miraikan, Japan’s national museum of emerging science and innovation in the Tokyo waterfront area. The event showcased approximately 50 artistic works and R&D results built on newly developed digital content technologies.
Let’s take a look at some of the items exhibited at the event, along with video coverage from IT news media and the exhibitors’ presentation videos.
– – – – – – – – – – –
Back To the Mouth (game title) & La Flèche de l’odeur (blowgun device, meaning “the Arrow of Smell” in French)
(Developed by Kosaka Laboratory[J], Dept. of Computer Engineering and International Communication, Kanazawa Technical College[J])
People usually do their best to suppress bad breath, but you don’t need to worry about it when you play this game. The college exhibited a shooting game in which you attack monsters with your own breath. During the game, players eat and drink strong-smelling foods such as cheese, chocolate, potato chips, cakes, beer and wine, then blow into a smell sensor-enabled blowgun in the real world to hurl “smell balls” at monsters in the virtual one.
Each smell is effective only against a certain species of monster, so players are forced to keep changing their breath by eating different foods in order to defeat every type of monster, roughly as sketched below.
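In other words, the game boils down to a lookup from the sensed odor to the one enemy class it can damage. Here is a minimal sketch of that idea in Python; the smell labels, monster names, and function are hypothetical illustrations, since the exhibit’s actual sensor output and game logic were not published.

```python
# Hypothetical mapping from a detected breath odor to the monster type it can damage.
# The smell labels and monster names are illustrative, not the exhibit's real data.
SMELL_TO_MONSTER = {
    "cheese": "slime",
    "chocolate": "bat",
    "beer": "ghost",
    "wine": "dragon",
}

def attack_is_effective(detected_smell: str, target_monster: str) -> bool:
    """Return True if the blown smell can damage the targeted monster."""
    return SMELL_TO_MONSTER.get(detected_smell) == target_monster

# A cheesy breath only hurts slimes, so the player has to eat something else
# before attacking the remaining monster types.
print(attack_is_effective("cheese", "slime"))   # True
print(attack_is_effective("cheese", "dragon"))  # False
```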
– – – – – – – – – – –
Ikabo[J] (a squid-shaped robot controllable with a Nintendo Wii remote controller)
(Developed by Future University – Hakodate to promote the sightseeing business in the city, which is located in Hokkaido)
The robot is 2.2 meters (7.2 feet) tall, weighs 220 kilograms (485 lbs.), and has nine joints in its arms, three in its head and two in its eyes.
– – – – – – – – – – –
360 Degree-Viewable Display
(Developed by Sony)
A 30-centimeter-high (almost one foot) columnar display shows you 3D motion pictures without 3D glasses, no matter which side you stand on to watch it. For example, when someone’s portrait is on screen, you see his face if you stand in front of the display. If you step around to the back of it, you see the back of his head.
– – – – – – – – – – –
Media Vehicle (a marshmallow-shaped capsule equipped with a display and the ability to manipulate your equilibrioception)
(Developed by VR Lab., Tsukuba University)
According to the development team, it is a vehicle designed for moving back and forth between the real world and the virtual world. When an operator standing outside moves and shakes a position sensor-enabled camera, the passenger inside looks and feels as though they were in a rolling ball, following the camera’s motion. Be careful not to get virtually carsick.
(Picture: Gizmodo)
– – – – – – – – – – –
Table Interface
(Developed by Cyberdyne, which is well known for having developed the Robot Suit HAL)
I think this is very similar to Microsoft Surface.
– – – – – – – – – – –
Funbrella (records a pattern of raindrop impacts on a sensor-enabled umbrella and plays it back later)
(Developed by Human Interface Engineering Lab., Osaka University)
Several sensors embedded in the umbrella record a pattern of real raindrop impacts, and you can play that pattern back later to relive the rain shower, which may remind you of a certain memory associated with it. With this, you can remember exactly how it was raining when you said goodbye to your ex-girlfriend. A rough sketch of the record-and-playback idea follows.
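Conceptually this is just timestamped event recording and playback: log when and how strongly each drop hits, then replay those impulses through an actuator later. The Python sketch below is a toy model under that assumption; the class, sensor callback, and actuator interface are hypothetical, as the lab’s actual sensor layout and data format are not described here.

```python
import time

class Funbrella:
    """Toy record-and-playback model of timestamped raindrop impacts.

    The sensor/actuator details are assumptions for illustration only; the
    real device embeds vibration hardware in the umbrella itself.
    """

    def __init__(self):
        self.impacts = []   # list of (seconds since recording started, strength)
        self._start = None

    def start_recording(self):
        self._start = time.monotonic()
        self.impacts.clear()

    def on_raindrop(self, strength: float):
        # Called whenever a sensor detects a drop; store its relative time.
        self.impacts.append((time.monotonic() - self._start, strength))

    def play_back(self, actuate=lambda s: print(f"buzz {s:.2f}")):
        # Replay the recorded impacts with their original timing.
        previous = 0.0
        for t, strength in self.impacts:
            time.sleep(t - previous)
            actuate(strength)
            previous = t
```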
More exhibited items will be covered in an upcoming blog post.
via 4Gamer.net[J], Phile-Web[J], Ikuya Takamori’s column on Wired Vision[J], Weekly Ascii[J], and Mainichi.jp[J]