iPrediction from 2007

I wrote this for an iPhone contest that Gizmodo ran in July 2007. I decided to post it because it apparently is going to be a viable product someday.

Entry Name: iEyes, a visual communication and augmented-reality device.

Description: A set of contact lenses that sends and receives wireless data from various sources via a separate miniature “basestation” that the wearer carries. The basestation is not shown in these examples, but it could be pen-sized or wristwatch-shaped. There is also an optional control pad: either a FingerPad (about the size and shape of a poker chip) or a TonguePad, worn clipped behind the upper incisors.

Function: The iEyes essentially convey data from the wearer’s cell phone, Blackberry, media player, GPS device, etc. Visual data is presented to the user much as with the video goggles one can buy today to watch movies on a virtual widescreen.

Power Source: The iEyes have built-in solar cells to power themselves from ambient light, with capacitive electricity storage for use in dark rooms.

Feasibility: With emerging technologies like flexible electronic circuits, conductive plastics, nanotechnology-based supercapacitors, multi-element lenses, and the like, iEyes could be manufactured as early as 2009 for government use, with less expensive consumer-priced versions by 2012.

The iEye would appear to the casual observer to be nothing more than a contact lens; only on closer inspection would one see that there is something a bit different about it. The lenses would be worn in pairs. Built-in sensors would keep the screen aligned with the vertical and horizontal, so it does not matter how the lenses are put in the eye. A preferences setting could override this: if you were lying on your side watching a movie, for example, the virtual screen orientation would be rotated 90 degrees so the movie appears upright from your point of view.
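The orientation logic above can be sketched in a few lines. Everything here is illustrative guesswork, of course — the function name, the sensor reading, and the override preference are all hypothetical, since no such device exists:

```python
def screen_rotation(sensor_angle_deg, viewer_override_deg=None):
    """Return the rotation (degrees) to apply to the virtual screen.

    sensor_angle_deg: how far the lens sits rotated off vertical, as a
    hypothetical built-in orientation sensor might report it.
    viewer_override_deg: optional preference, e.g. 90 when lying on
    one's side so the movie stays upright from the viewer's viewpoint.
    """
    correction = -sensor_angle_deg  # cancel however the lens went in
    if viewer_override_deg is not None:
        correction += viewer_override_deg  # apply the preference setting
    return correction % 360

# A lens inserted 30 degrees off-vertical gets counter-rotated;
# the same lens with the "lying on your side" preference adds 90 degrees.
print(screen_rotation(30))      # 330
print(screen_rotation(30, 90))  # 60
```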

This is the optional TonguePad, worn behind the front teeth, which lets you control basic functions such as answering a phone, scrolling e-mail text, or playing a movie. As mentioned earlier, the basestation is not shown, but it could be pen-sized or wristwatch-shaped. The basestation would connect to various devices, probably by high-speed wireless modules somewhat akin to Bluetooth but without its limitations on battery life and data transmission rates.

When a phone call is received, the phone number and the name of the person calling are shown to the iEyes wearer. The yellow arrow pointing toward the phone means a call is coming in from that person; pointing away from the phone means the call is going out to that person from the iEyes wearer. The iEyes, using data from the built-in cameras, continually adjust text colors to contrast with the ambient environment. Calls can be answered by touching the phone keypad, pressing a button on the basestation, or blinking both eyes in rapid succession. If using the optional TonguePad, as this user is, one can click either the green circle (meaning “yes” or “accept”) or the red stop sign (meaning “no” or “ignore”). By default, the cursor indicates what is being used to control it, such as a tongue when using the TonguePad.
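The continual text-color adjustment might work something like the sketch below: sample the average background color behind the overlay and pick light or dark text. The luma formula is the standard ITU-R BT.601 approximation; the rest is a hypothetical illustration, not the device’s actual algorithm:

```python
def text_color_for_background(r, g, b):
    """Pick overlay text color that contrasts with the ambient scene.

    (r, g, b) is the average color behind the overlay, 0-255 each, as
    the built-in cameras might report it. Bright backgrounds get dark
    text; dark backgrounds get light text.
    """
    luma = 0.299 * r + 0.587 * g + 0.114 * b  # BT.601 perceived brightness
    return (255, 255, 255) if luma < 128 else (0, 0, 0)

# Dark alley -> white caller-ID text; sunlit wall -> black text.
print(text_color_for_background(20, 25, 30))     # (255, 255, 255)
print(text_color_for_background(230, 220, 200))  # (0, 0, 0)
```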

When reading e-mail received on a Blackberry or iPhone, the message is scrolled up and down using the blue arrows; the envelope means “reply,” and the “go back” arrow returns to the list of e-mails. The yellow triangle at the end of the message text means there is more text to scroll down to; when scrolling down, a similar triangle appears at the top. To reply with the FingerPad or TonguePad, there are user-defined phrases like “I will call you” or “Please forward to Ms. Lyons at my office.” To actually type a message, one could use the virtual keyboard text entry function, but this could be laborious. Ideally one would use a small optional keyboard or the keyboard on one’s phone or Blackberry. In this example the cursor is a finger, which indicates the FingerPad is being used.

Playing movies is one of the most basic yet amazing features of the iEyes. Movies appear at quality up to 1080p high definition, depending on the source. In this example the green triangle is less transparent, meaning the movie is playing. Features include the ability to make the movie very transparent so as not to obscure one’s view, or to make the background almost opaque. With the TonguePad, this is accomplished by clicking the contrast control and sliding the tongue to the left or right. In a darkened room the movies need no backlighting, as the images are formed by OLEDs (organic light-emitting diodes), which generate light themselves.
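The transparent-movie effect is just alpha compositing: each movie pixel is blended over the real-world scene, and the TonguePad slider would simply vary the opacity. A minimal sketch, with hypothetical names throughout:

```python
def composite_pixel(movie_rgb, scene_rgb, opacity):
    """Blend one movie pixel over the real-world scene behind it.

    opacity 0.0 = fully transparent overlay (the scene shows through);
    opacity near 1.0 = the almost-opaque playback described above.
    Standard alpha compositing: out = a * movie + (1 - a) * scene.
    """
    return tuple(
        round(opacity * m + (1.0 - opacity) * s)
        for m, s in zip(movie_rgb, scene_rgb)
    )

# A mostly transparent red movie pixel over a gray scene:
print(composite_pixel((255, 0, 0), (100, 100, 100), 0.25))  # (139, 75, 75)
```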

When using the GPS function for directions, the directions are overlaid on top of what you are seeing. Spoken directions can also be sent to either your phone’s speaker or a headset. One can also switch between the metric and imperial systems of measurement.
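The unit toggle is simple arithmetic; a sketch of how the overlay might format distances in the wearer’s preferred system (function and parameter names are made up for illustration):

```python
METERS_PER_MILE = 1609.344  # international mile, exact by definition

def format_distance(meters, units="metric"):
    """Format a GPS distance for the overlay in the preferred units."""
    if units == "metric":
        return f"{meters / 1000:.1f} km"
    return f"{meters / METERS_PER_MILE:.1f} mi"

print(format_distance(2500))              # 2.5 km
print(format_distance(2500, "imperial"))  # 1.6 mi
```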

Embedded in the lens are millions of tiny light-detecting elements, each with its own lens. The concept has been demonstrated in lenses that function like an insect’s compound eye. In this case (at least in the first version) there would be no zoom feature. A 5-megapixel picture is taken by clicking a button on the basestation, using the optional FingerPad or TonguePad (in this case) to click the onscreen “shutter release” button, or winking twice in rapid succession with one eye. When the picture is taken, software in the basestation decides which eye is seeing the better image and uses that data. An optional feature snaps a photo using both images; since this is a three-dimensional photo, the image can be viewed on a computer screen or TV with special polarized accessory glasses that preserve the 3D view.
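One plausible way the basestation could decide which eye “is seeing the better image” is a focus measure such as Laplacian variance — a crisper image has stronger local contrast. This is a standard sharpness heuristic, offered here purely as a sketch; nothing says the imagined device would work this way:

```python
def sharpness(img):
    """Variance of a simple Laplacian over a grayscale image (list of rows).

    Higher values mean more edge detail, i.e. a crisper photo.
    """
    h, w = len(img), len(img[0])
    lap = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbor Laplacian: center times 4 minus the neighbors
            lap.append(
                4 * img[y][x]
                - img[y - 1][x] - img[y + 1][x]
                - img[y][x - 1] - img[y][x + 1]
            )
    mean = sum(lap) / len(lap)
    return sum((v - mean) ** 2 for v in lap) / len(lap)

def pick_better_eye(left_img, right_img):
    """Choose which eye's photo to keep, as the basestation might."""
    return "left" if sharpness(left_img) >= sharpness(right_img) else "right"

blurry = [[50] * 4 for _ in range(4)]           # flat image, no detail
crisp = [[0, 255, 0, 255] for _ in range(4)]    # high-contrast edges
print(pick_better_eye(blurry, crisp))  # right
```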

One of the most amazing features of the iEyes works in conjunction with an Internet connection through the phone, as well as the yet-to-be-announced GoogleEyes service from Google. With GoogleEyes turned on, when you look at an object and request data, GoogleEyes attempts to identify the object (giving a confidence level for the identification) and provides a definition from a user-defined source, in this case Wikipedia.
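Since GoogleEyes is imaginary, here is only a toy stand-in for the idea of identification-with-confidence: match an image’s feature “signature” against known objects and report how strong the best match is. Every name and data value below is hypothetical:

```python
# Toy catalog of objects and their feature signatures (all made up).
KNOWN_OBJECTS = {
    "Eiffel Tower": {"lattice", "iron", "tower"},
    "Golden Gate Bridge": {"suspension", "red", "bridge"},
}

def identify(features):
    """Return (label, confidence) for the best match.

    Confidence here is Jaccard overlap between the observed features
    and each known signature -- a crude stand-in for real recognition.
    """
    best, best_score = None, 0.0
    for label, sig in KNOWN_OBJECTS.items():
        score = len(features & sig) / len(features | sig)
        if score > best_score:
            best, best_score = label, score
    return best, best_score

label, conf = identify({"lattice", "tower", "night"})
print(f"{label}: {conf:.0%} confidence")  # Eiffel Tower: 50% confidence
```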
