Panamorph AR/VR Display Module Development

Ultimately, the future success of wearable augmented and virtual reality display products will depend on operating with minimal battery power for wireless use of practical duration, while also working within wireless bandwidth limits to deliver the high resolution content that wide field of view displays need for ergonomic, realistic viewing environments. Development of core technologies in support of such products has been in the DNA of Panamorph and its affiliated pursuits since the early 1990s. Panamorph is now applying these patented and patent-pending technologies to create next generation optical display modules and licensable intellectual property for wearable products offered by the industry leaders driving AR/VR/XR/MR products and applications. For more information, industry-related inquiries can be directed to Shawn Kelly at cs@panamorph.com (direct technology/business/partner-related inquiries only, please).

First generation dual monocular optical system shown, supporting an 18mm x 11mm exit pupil and 22mm eye relief, using a 31mm x 17mm transmissive display to create a 76 x 46 degree (89 degree diagonal) fully-overlapped FOV. Modulated SubPupil Technology results in an 80% reduction in lighting power usage, for 5X wireless battery life and lower heat generation, and an 80% reduction in stray light for higher contrast. The rear lighting system will be replaced by a waveguide system for a more compact second generation design.


MODULATED SUBPUPIL TECHNOLOGY
FOR LOWER HEAT AND 10X+ AR/VR BATTERY LIFE

A typical near-eye display produces a system exit pupil large enough that the entire image remains visible regardless of eye rotation as the eye scans different parts of the image. The wider the field of view, the larger the system exit pupil needs to be relative to the size of the eye pupil and, consequently, the user “sees” only a small percentage of the light passing through the system exit pupil at any given time. The remaining, much larger percentage of light misses the eye pupil and is a waste of power.
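
To give a rough sense of scale, here is a minimal Python sketch assuming the 18mm x 11mm exit pupil of the first generation module described above and a nominal 4mm eye pupil (an illustrative value, not a specification):

    import math

    # Back-of-the-envelope estimate of how much light misses the eye pupil in a
    # conventional near-eye display. The 18mm x 11mm exit pupil comes from the
    # first generation module described above; the 4mm eye pupil is assumed.
    exit_pupil_area_mm2 = 18.0 * 11.0               # rectangular system exit pupil
    eye_pupil_area_mm2 = math.pi * (4.0 / 2) ** 2   # circular eye pupil

    captured = eye_pupil_area_mm2 / exit_pupil_area_mm2
    print(f"light entering the eye: ~{captured:.1%}")        # ~6.3%
    print(f"light missing the eye:  ~{1.0 - captured:.1%}")  # ~93.7%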

Modulated SubPupil (MSP) technology treats the near-eye display system as two cooperating optical systems. The first optical system uses any of a number of magnifier optic types in front of a pixel-based display to form a virtual image. The second optical system includes an array of small light sources, each of which floods the pixel-based display through conditioning optics and then forms an image of that array of light sources, through the magnifier optics, at the system exit pupil. Each light source therefore illuminates a subsection, or “subpupil,” of the system exit pupil. By controlling the intensity of each light source, as guided by eye pupil tracking, so that light is provided only through subpupils entering the eye pupil, over 90% of the remaining subpupils can be turned “off” for a similar reduction in power usage and associated heat generation. This technology is described more completely in a pending World Intellectual Property Organization patent application, to be published in mid-August and based primarily on US 63/222,978 Near-Eye Display System.
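
A minimal sketch of the subpupil selection step is shown below; the grid size, 2mm subpupil pitch, and 4mm eye pupil diameter are assumptions chosen for illustration and are not taken from the patent application:

    import math

    # Illustrative only: enable just the light sources whose subpupils fall
    # inside the eye-tracked pupil. Grid dimensions, subpupil pitch and eye
    # pupil diameter are assumptions for this sketch.
    SUBPUPIL_PITCH_MM = 2.0
    GRID_COLS, GRID_ROWS = 9, 6        # roughly tiles an 18mm x 11mm exit pupil
    EYE_PUPIL_DIAMETER_MM = 4.0

    def active_subpupils(eye_x_mm, eye_y_mm):
        """Grid indices of subpupils whose centers lie inside the eye pupil."""
        active = []
        for row in range(GRID_ROWS):
            for col in range(GRID_COLS):
                cx = (col - (GRID_COLS - 1) / 2) * SUBPUPIL_PITCH_MM
                cy = (row - (GRID_ROWS - 1) / 2) * SUBPUPIL_PITCH_MM
                if math.hypot(cx - eye_x_mm, cy - eye_y_mm) <= EYE_PUPIL_DIAMETER_MM / 2:
                    active.append((row, col))
        return active

    on = active_subpupils(eye_x_mm=1.0, eye_y_mm=-0.5)   # example eye tracker reading
    total = GRID_COLS * GRID_ROWS
    print(f"{len(on)} of {total} sources on; ~{1 - len(on) / total:.0%} switched off")

With these assumed numbers, only 4 of 54 sources are lit at this gaze position, consistent with the “over 90% off” figure above.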

While MSP technology supports a number of different magnifier and illumination arrangements, a very compact embodiment employs a pancake magnifier for the first optical system and a waveguide illuminator combined with a varifocal lens for the second. This combination provides the proven benefits of a pancake magnifier for a highly compact wearable display while compensating for the pancake's high light loss through a significant reduction in overall power, since only a small percentage of the bright light sources are on at any given time. In this way, MSP is similar to pupil steering concepts but without mechanical or other active components, relying instead on switching on specific light sources as a simpler alternative.
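
The power trade can be illustrated with a short back-of-the-envelope comparison; the throughput and duty-cycle numbers below are assumptions chosen only to show the shape of the argument, not measured values:

    # Assumed numbers only: a folded pancake path passes a small fraction of the
    # display light, but MSP lights only a small fraction of the sources, so net
    # illumination power can still fall well below a full-pupil baseline.
    BASELINE_THROUGHPUT = 0.8      # assumed throughput of a conventional magnifier
    PANCAKE_THROUGHPUT = 0.25      # assumed folded-path throughput of a pancake lens
    FRACTION_SOURCES_ON = 0.10     # assumed share of subpupil sources lit at once

    baseline_power = 1.0 / BASELINE_THROUGHPUT              # full-pupil illumination
    msp_pancake_power = FRACTION_SOURCES_ON / PANCAKE_THROUGHPUT

    print(f"MSP + pancake vs baseline: {msp_pancake_power / baseline_power:.0%}")  # ~32%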


Download PDF


ON-DEMAND RESOLUTION
FOR OPTIMUM BANDWIDTH USAGE

Wearable displays with a wide field of view require high resolution content to provide sufficient visual detail, placing a heavy burden on the limited bandwidth of a wireless connection. Similar to the processing-saving concept of foveated rendering of virtual worlds, Panamorph’s on-demand resolution technology delivers relatively low resolution imagery across the full field of view while incrementally increasing that resolution wherever the user’s eye is directed. Fundamentally, the available high resolution content is decimated at the server into sequentially lower resolution images plus sets of extra data that can be sent to, and reintegrated by, the wearable display using simple processing operations to rebuild higher resolution in a local region of the full field of view, as guided by eye tracking. Originally developed for multi-format video encoding and later for user-responsive enhancement of images in mobile applications, on-demand resolution is now also being applied to VR/AR applications.
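
A minimal sketch of the decimate-and-reintegrate idea is shown below, assuming a simple 2x average for decimation and a residual layer as the “extra data”; it is an illustration of the concept, not the patented encoding, and the function names are hypothetical:

    import numpy as np

    # Illustration only: the server splits a frame into a 2x-decimated base plus
    # a residual detail layer; the client shows the base everywhere and re-adds
    # the residual only inside an eye-tracked window.

    def encode(frame):
        """Decimate to a low resolution base and compute the residual extra data."""
        h, w = frame.shape
        base = frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))  # 2x2 average
        residual = frame - np.kron(base, np.ones((2, 2)))             # detail sent on demand
        return base, residual

    def decode(base, residual, gaze_rc, radius):
        """Rebuild full resolution only around the eye-tracked gaze point."""
        full = np.kron(base, np.ones((2, 2)))                         # low resolution everywhere
        rows = np.arange(full.shape[0])[:, None]
        cols = np.arange(full.shape[1])[None, :]
        fovea = (rows - gaze_rc[0]) ** 2 + (cols - gaze_rc[1]) ** 2 <= radius ** 2
        full[fovea] += residual[fovea]                                # full detail where the eye looks
        return full

    frame = np.random.default_rng(0).random((480, 640))   # stand-in for high resolution luma
    base, residual = encode(frame)
    rebuilt = decode(base, residual, gaze_rc=(240, 320), radius=100)
    print("exact within the fovea:", bool(np.isclose(rebuilt[240, 320], frame[240, 320])))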


Download PDF


This technology comprises a suite of technologies represented by the following issued and pending patents (US patent and application numbers unless otherwise noted), all building on a core technology of Multi Format Encoding/Decoding algorithms.

17/052,186 Image Processing System and Method (Server Side Bandwidth-Responsive Fade-in)
17/086,399 Image Processing System and Method (Client Side Bandwidth-Responsive Fade-in)
17/086,400 Image Processing System and Method (Lossless Progressive Image Encoding/Decoding)
17/216,557 Image Processing System and Method (On-Demand Resolution Responsive to User Interaction)
11,350,015 Image Processing System and Method (General Noise Removal)
11,297,203 Image Processing System and Method (Lossy Progressive Image Encoding/Decoding)
10,554,856 Image Processing System and Method (Noise Removal for Extended Bit Depth)
9,774,761 Image Processing System and Method (Deep Color Decoding)
9,584,701 Image Processing System and Method (Deep Color Encoding)
383,139 (India) Image Processing System and Method (MFD/E)
8,855,195 Image Processing System and Method (Multi Format Decoding)
8,798,136 Image Processing System and Method (Multi Format Encoding)

The above is provided for information purposes only and does not convey any rights
to the subject matter or related intellectual property.