This application relates to a see-through head-mounted display using recorded substrate-guided holographic continuous lens (SGHCL) and a microdisplay with laser illumination. The high diffraction efficiency of the volume SGHCL creates very high luminance of the virtual image.
As the extended reality (XR) landscape evolves, two standout projects, Project Aura and Project Moohan, are shaping the future of wearable technology. Both initiatives, backed by major tech players, aim to redefine user interaction through advanced features and seamless integration. This article delves into their specifications, functionalities, and potential impact.
US20250258375 - EYEWEAR DISPLAY HAVING A WAVEGUIDE WITH ADJUSTABLE REFLECTORS
Pupil expansion for waveguided light extraction often results in unwanted losses. Smart reflectors driven by eye tracking are a viable solution.
Among the possible functions described in the patent, one uses a pixelated area ("multi-state" scattering panels) to direct the user's sight, based on a trigger.
The principle itself is not that novel for smartglasses; it has been implemented, for example, with liquid crystals or the like, and other companies have proposed versions of the mechanism in the past. What is novel is Meta's implementation, at least so far, and the possible new algorithms to drive the pixelated layer.
***
"The multi-state scattering panels can be selectively activated based on a trigger, such as identification of an object in the real-world environment, based on an event occurring in an XR application executing on the XR display, based on detected audio, etc."
Aspects of the present disclosure integrate pixelated multi-state panels (e.g., liquid crystal dimming panels) into artificial reality (XR) displays (e.g., augmented reality (AR) glasses) or conventional glasses. Conventional visual displays for XR displays can be expensive, have a low field-of-view, and consume high power. On the other hand, pixelated multi-state panels consume lower power, have a wide field-of-view, and are light, inexpensive, and computationally simple. The multi-state panels can have 4 configurations: 1) included on the periphery of an XR display, 2) as a standalone display, 3) as a display that's overlaid onto an XR display, and/or 4) as a secondary externally facing display. The multi-state panels can be selectively activated based on a trigger, such as identification of an object in the real-world environment, based on an event occurring in an XR application executing on the XR display, based on detected audio, etc.
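The trigger-to-panel logic quoted above can be sketched as a small state machine that switches regions of a pixelated panel between transparent and scattering states. This is a hypothetical illustration, not Meta's implementation: the class names (`MultiStatePanel`, `PanelState`), the trigger strings, and the region coordinates are all invented for the example.

```python
from enum import Enum

class PanelState(Enum):
    TRANSPARENT = 0
    SCATTERING = 1

class MultiStatePanel:
    """Pixelated multi-state panel: each cell switches independently."""
    def __init__(self, rows, cols):
        self.grid = [[PanelState.TRANSPARENT] * cols for _ in range(rows)]

    def activate_region(self, r0, r1, c0, c1):
        """Set cells in rows [r0, r1) and cols [c0, c1) to scattering."""
        for r in range(r0, r1):
            for c in range(c0, c1):
                self.grid[r][c] = PanelState.SCATTERING

def on_trigger(panel, trigger):
    """Map a trigger (object detected, app event, audio cue) to a region."""
    region = {
        "object_detected": (0, 2, 0, 2),  # top-left periphery
        "app_event":       (2, 4, 2, 4),  # bottom-right periphery
        "audio_cue":       (0, 4, 3, 4),  # right edge
    }.get(trigger)
    if region:
        panel.activate_region(*region)

panel = MultiStatePanel(4, 4)
on_trigger(panel, "object_detected")
```

In a real device the dispatch table would be replaced by whatever perception or application events the XR runtime exposes; the point is only that activation is a per-region lookup keyed on a trigger.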
WO2025171242 - HOLOGRAPHIC TWO-DIMENSIONAL (2D) IMAGE PROJECTION FOR AN AUGMENTED REALITY (AR) WAVEGUIDE DISPLAY
Systems, methods, and/or apparatuses for a holographic projection module in a near-eye display device, which may display augmented reality/virtual reality (AR/VR) content to the user. In one aspect, a spatial light modulator (SLM) is illuminated by a planar wavefront, which the SLM modulates with a pattern and projects the patterned light through a projection lens to form a 2D hologram which is input to a waveguide display.
Some examples may include a high-order filter to form an aperture in the Fourier domain controllable by the phase pattern displayed on the SLM; other examples may have no projection lenses, where the displayed SLM image corresponds to the user-perceived Fourier-domain image representation.
Some examples may use two stacked SLMs, a complex wavefront modulation SLM, and/or a phase SLM with a mask (such as, e.g., a binary amplitude mask).
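The lens-free variant, where the user-perceived image is the Fourier-domain representation of the displayed SLM pattern, can be illustrated numerically under the Fraunhofer (far-field) approximation. The array size, the random phase pattern, and the square aperture below are all assumptions made for illustration, not values from the patent.

```python
import numpy as np

# Hypothetical 64x64 phase-only SLM pattern (values in [0, 2*pi)).
rng = np.random.default_rng(0)
phase = rng.uniform(0.0, 2.0 * np.pi, size=(64, 64))

# Planar (uniform-amplitude) illumination modulated by the phase pattern.
field_at_slm = np.exp(1j * phase)

# With no projection lens, the far field perceived by the user corresponds
# to the Fourier transform of the SLM plane (Fraunhofer approximation).
far_field = np.fft.fftshift(np.fft.fft2(field_at_slm))
perceived_intensity = np.abs(far_field) ** 2

# A "high-order filter" can be modeled as an aperture in this Fourier
# domain that blocks higher diffraction orders outside a central window.
aperture = np.zeros_like(perceived_intensity)
aperture[16:48, 16:48] = 1.0
filtered = perceived_intensity * aperture
```

The aperture here is fixed; in the patent's description it would be controllable through the phase pattern itself.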
https://youtu.be/wPKfrYrFjBM Meta's been on a roll, partnering with Ray-Ban and now with Oakley to debut the HSTN glasses, but remember when Google was sort of the de facto leader in the smart headset space? Sure, Google Glass and Google Cardboard were VERY early for their times.
Wearable device and method of operating the same - SAMSUNG ELECTRONICS CO., LTD.
Samsung has filed a patent for a wearable sensor that can read the user's veins to establish his or her identity. If implemented, this would allow Samsung's future wearables to double as keys for doors and cars. "The sensor takes a picture of the user's vein structure and characteristics, then compares it to a vein image in its memory that it knows belongs to the user. The sensor might also detect the user's pulse rate, which is also unique from person to person." Check also, from 2016: https://www.fastcompany.com/3056357/this-samsung-patent-lets-smartwatches-recognize-you-by-your-veins
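The quoted template-comparison step can be sketched as a naive pixel-wise match between an enrolled vein map and a freshly captured probe. This is only a toy illustration: real vein biometrics use far richer features (minutiae, curvature, pulse characteristics), and the function names, threshold, and images here are invented.

```python
import numpy as np

def vein_similarity(template, probe):
    """Fraction of matching pixels between two binarized vein maps."""
    assert template.shape == probe.shape
    return float(np.mean(template == probe))

def authenticate(template, probe, threshold=0.95):
    """Accept the probe if it is sufficiently close to the enrolled map."""
    return vein_similarity(template, probe) >= threshold

# Hypothetical enrolled template and probes captured at unlock time.
enrolled = np.zeros((8, 8), dtype=bool)
enrolled[2, :] = True                 # a "vein" running across the palm
probe_same = enrolled.copy()          # same user, same capture
probe_other = np.zeros((8, 8), dtype=bool)
probe_other[:, 5] = True              # a different vein pattern
```

A production system would also need alignment between captures; raw pixel equality only works here because the toy probe is perfectly registered.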
- Electronic Device Display With Array of Discrete Light-Emitting Diodes
An electronic device may include a display. The display may be formed by an array of light-emitting diodes mounted to the surface of a substrate. The substrate may be a silicon substrate. Circuitry may be located in spaces between the light-emitting diodes. Circuitry may also be located on the rear surface of the silicon substrate and may be coupled to the array of light-emitting diodes using through-silicon vias. The circuitry may include integrated circuits and other components that are attached to the substrate and may include transistors and other circuitry formed within the silicon substrate. Touch sensor electrodes, light sensors, and other components may be located in the spaces between the light-emitting diodes. The substrate may be formed from a transparent material that allows image light to reach a lens and image sensor mounted below the substrate.
TIMESTAMPS
0:00 Intro
0:45 Project Aura Explained in Plain Terms
1:54 Why Google Partnered Instead of Building Everything In-House
3:09 The Hardware Philosophy Behind Aura
4:23 Android XR Is the Real Product Here
4:42 Gemini Is What Finally Makes Smart Glasses Make Sense
7:00 How Google's Approach Stacks Up Against Apple and Meta
8:37 The Trust Problem Google Has To Get Right This Time
10:23 What the Timeline Really Tells Us About Google's Intentions
US20250306686 - Electrical Stimulation From A Wristband With Augmented Reality Visual Effects For Remote Haptic Sensations In The Hand, And Systems And Methods Of Use Thereof
A method of providing remote haptic feedback is described. The method includes applying, via a set of electrodes of a wearable device, a haptic signal to a first portion of a user. The haptic signal is configured to cause haptic feedback to be perceived at a second portion of the user that is distinct from the first portion of the user. The method further includes causing a visual indication of the haptic feedback at the second portion to be displayed to the user via a display of a head-wearable device.
EP4605795 - AUGMENTED REALITY DISPLAY WITH FREEFORM OPTICS AND INTEGRATED PRESCRIPTION LENS
A dual-component lightguide employs two freeform surfaces separated by a gap, with prescription-lens integration. A world-side component includes a spherical world-side surface and a freeform eye-side surface. An eye-side component includes a freeform world-side surface that conforms to the freeform eye-side surface of the world-side component, and an eye-side surface that is shaped to provide corrective optics based on a desired prescription.
"Haylo to Invest USD 100M in MicroLED Company Plessey via GoerTek loan" "Plessey, founded in 2000, specializes in developing ultra–high-resolution Micro LED displays for #AR and #VR devices. The company has collaborated with AR glasses maker Vuzix Corporation and tech giant Meta to jointly develop AR/VR #microdisplay technologies. In January this year, Plessey and Meta released their latest research results — the world’s brightest red Micro LED display, capable of delivering up to 6,000,000 nits of brightness at high resolution (<5 microns)."
Recently, Meta and Google presented two arm-based interfaces as a possible HMI for smartglasses (but not only), which appear strikingly similar. I was very surprised when both came out.
So, I started digging among patent assignments, partnerships and acquisitions related to the wearable display area (and related interfaces) of the two companies, sometimes including other key subcomponents such as, for example, microdisplay companies. The map is so dense that it reminds me of an MOCVD reactor: a complicated grid with plenty of numbered valves and thin "pipelines".
A very complex, meaningful structure emerged, along with intriguing dynamics at the level of patent details.
Referring to the map, each company's small label includes its founding date, while the dates inside the little green rectangles mark merger, acquisition or patent-assignment events.
It is quite intriguing that Thalmic Labs' armband-related IP and know-how were transferred to Meta "via" North Inc, which, in turn, was then acquired by Google, in 2019 and 2020 respectively (red colour).
That is where the overlap in arm-interface culture between the two companies comes from.
In the scheme, the dotted arrows indicate IP transfers, the blue continuous arrows mark complete acquisitions, and the violet continuous arrows point to the company's final reference product, prototype or project. Notable partnerships also appear here and there.
The scheme does not aim to be complete; the intention is to give a flavour of the complexity of ownership and IP transfer for Meta and Google around AR glasses products and related HMIs, including internal or experimental projects.
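The map's legend amounts to a typed, dated edge list over companies. A minimal sketch, encoding only the Thalmic/North chain mentioned in the text; the edge-kind names and the helper function are invented for illustration, and everything else on the map is omitted:

```python
# Edge kinds mirror the legend: "ip_transfer" (dotted arrows),
# "acquisition" (blue continuous arrows), "product" (violet arrows).
edges = [
    ("Thalmic Labs", "North Inc", "ip_transfer", 2019),
    ("North Inc", "Meta", "ip_transfer", 2019),
    ("North Inc", "Google", "acquisition", 2020),
]

def edges_of_kind(kind):
    """Return (source, destination, year) tuples for one edge kind."""
    return [(src, dst, year) for src, dst, k, year in edges if k == kind]
```

Representing the map this way makes the "culture overlap" visible mechanically: both Meta and Google are reachable from the same Thalmic Labs node.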
Hitachi’s Metaverse Platform for Nuclear Power Plants enhances collaboration among stakeholders, streamlines design and construction workflows, and supports the development of practical and effective investment plans by integrating metaverse and AI technologies.
Full-colour 3D holographic augmented-reality displays with metasurface waveguides
Emerging spatial computing systems seamlessly superimpose digital information on the physical environment observed by a user, enabling transformative experiences across various domains, such as entertainment, education, communication and training [1-3]. However, the widespread adoption of augmented-reality (AR) displays has been limited due to the bulky projection optics of their light engines and their inability to accurately portray three-dimensional (3D) depth cues for virtual content, among other factors [4,5]. Here we introduce a holographic AR system that overcomes these challenges using a unique combination of inverse-designed full-colour metasurface gratings, a compact dispersion-compensating waveguide geometry and artificial-intelligence-driven holography algorithms. These elements are co-designed to eliminate the need for bulky collimation optics between the spatial light modulator and the waveguide and to present vibrant, full-colour, 3D AR content in a compact device form factor. To deliver unprecedented visual quality with our prototype, we develop an innovative image formation model that combines a physically accurate waveguide model with learned components that are automatically calibrated using camera feedback. Our unique co-design of a nanophotonic metasurface waveguide and artificial-intelligence-driven holographic algorithms represents a significant advancement in creating visually compelling 3D AR experiences in a compact wearable device.
We develop a method for providing high-quality, holographic, three-dimensional augmented-reality images in a small form factor suitable for incorporation in eyeglass-scale wearables, using high-refraction-index glass waveguides with nanoscale metasurfaces, and incorporating artificial intelligence.
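The paper's holography algorithms are learned and calibrated with camera feedback; as a point of comparison, the classic non-AI baseline for computing a phase-only SLM pattern is Gerchberg-Saxton iteration between the SLM and image planes. The sketch below is that textbook baseline under a simple Fraunhofer propagation model, not the authors' method; the target image, array size and iteration count are arbitrary.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50, seed=0):
    """Textbook phase-retrieval baseline for a phase-only SLM pattern.

    Iterates between the SLM plane (unit amplitude, free phase) and the
    image plane (desired amplitude, free phase) under a Fourier-transform
    propagation model.
    """
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        # Propagate SLM plane -> image plane (Fraunhofer: a Fourier transform).
        image_field = np.fft.fft2(np.exp(1j * phase))
        # Keep the propagated phase, impose the desired amplitude.
        image_field = target_amplitude * np.exp(1j * np.angle(image_field))
        # Propagate back and keep only the phase (phase-only SLM constraint).
        phase = np.angle(np.fft.ifft2(image_field))
    return phase

# Hypothetical target: a bright square on a dark background.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
slm_phase = gerchberg_saxton(target)
reconstruction = np.abs(np.fft.fft2(np.exp(1j * slm_phase)))
```

The paper replaces this fixed physical model with a learned, camera-calibrated one precisely because real waveguides deviate from the idealized propagation assumed here.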
"Project Aria is a research program from Meta, to help build the future responsibly. Project Aria unlocks new possibilities of how we connect with and experience the world.