Among other possible functions described in the patent, one describes a pixelated area ("multi-state" scattering panels) used to direct the user's gaze, based on a trigger.
This principle is not especially novel on smartglasses; it has been implemented, for example, with liquid crystals or the like, and other companies have proposed versions of the mechanism in the past. What is novel is Meta's implementation to date, and the possibly novel algorithms that drive the pixelated layer.
***
"The multi-state scattering panels can be selectively activated based on a trigger, such as identification of an object in the real-world environment, based on an event occurring in an XR application executing on the XR display, based on detected audio, etc."
Aspects of the present disclosure integrate pixelated multi-state panels (e.g., liquid crystal dimming panels) into artificial reality (XR) displays (e.g., augmented reality (AR) glasses) or conventional glasses. Conventional visual displays for XR displays can be expensive, have a low field-of-view, and consume high power. On the other hand, pixelated multi-state panels consume lower power, have a wide field-of-view, and are light, inexpensive, and computationally simple. The multi-state panels can have 4 configurations: 1) included on the periphery of an XR display, 2) as a standalone display, 3) as a display that's overlaid onto an XR display, and/or 4) as a secondary externally facing display. The multi-state panels can be selectively activated based on a trigger, such as identification of an object in the real-world environment, based on an event occurring in an XR application executing on the XR display, based on detected audio, etc.
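The trigger logic in the abstract (object identified, app event, detected audio) amounts to an event-to-panel dispatch. Below is a minimal Python sketch of that reading; all class names, trigger strings, and region labels are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class MultiStatePanel:
    """Hypothetical stand-in for one pixelated scattering-panel region."""
    region: str
    active: bool = False

    def activate(self) -> None:
        self.active = True

@dataclass
class TriggerDispatcher:
    """Maps trigger events (object detected, app event, audio) to panel regions."""
    rules: dict[str, MultiStatePanel] = field(default_factory=dict)

    def on_trigger(self, event: str) -> bool:
        panel = self.rules.get(event)
        if panel is None:
            return False  # unrecognized trigger: leave all panels untouched
        panel.activate()
        return True

left = MultiStatePanel("periphery-left")
dispatcher = TriggerDispatcher({"object_detected": left})
dispatcher.on_trigger("object_detected")
print(left.active)  # → True
```

The same dispatch table could route app events or audio cues to other panel regions; the patent's four configurations (periphery, standalone, overlay, external) would just be different `region` values.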
US20250291100 - OPTICAL MATERIAL ENCAPSULATED DIFFRACTIVE GRATINGS
Due to the sets of grating structures of a lightguide directing light based on the parameters of the grating structures, changing how a set of grating structures directs light requires modifying one or more parameters of the grating structures. However, modifying the parameters of the grating structures requires fabricating a new set of grating structures with the modified parameters.[...]
As such, modifying the parameters of the grating structures to compensate for manufacturing tolerances, manufacturing errors, design changes, and the like requires new sets of grating structures to be fabricated, increasing the cost and time needed to manufacture lightguides including these sets of grating structures.
While fabricating a lightguide including such sets of diffractive grating structures, certain conditions arise requiring how a set of diffractive grating structures directs light to be modified (e.g., requiring the diffraction efficiency of the set of diffractive grating structures to be modified). For example, in response to certain tolerances in the manufacturing process of a lightguide, one or more components of a lightguide (e.g., incoupler, EPE, outcoupler, body) deviate from a desired function (e.g., one or more components of the lightguide disperse light differently than intended). To compensate for these deviations, the diffraction efficiencies of one or more sets of diffractive grating structures are modified such that the lightguide functions as intended. As another example, certain characteristics or parameters for a set of diffractive grating structures increase the chance of introducing deformities in one or more components of the lightguide during the manufacturing processes such as air bubbles, working stamp damage, or the like. Such deformities in the components of the lightguide prevent these components from operating as intended. To compensate for these deformities, the diffraction efficiencies at which one or more sets of diffractive grating structures direct light are modified such that the lightguide operates as desired.
However, because a set of diffractive grating structures is configured to direct light based on the physical parameters (e.g., angle, depth, period) of the diffractive grating structures in the set of diffractive grating structures, modifying how a set of diffractive gratings directs light (e.g., modifying the diffraction efficiency) requires changing one or more parameters of the diffractive grating structures. To change these physical parameters, the diffractive grating structures must be refabricated with the modified parameters, which increases the time and cost required to fabricate the lightguide. To this end, systems and techniques disclosed herein are directed to a lightguide including a deposited optical material configured to modify how a set of diffractive grating structures directs received light (e.g., to modify the diffraction efficiency at which a set of diffractive grating structures directs received light). For example, such a lightguide includes a substrate having opposing surfaces that is formed from an effectively transparent material so as to allow a user to view a real-world space in front of the user. Disposed on a first surface of this substrate is a set of diffractive grating structures having one or more predetermined parameters based on a pattern of a master stamp, such as predetermined angles, predetermined duty cycles, predetermined periods, predetermined heights, and the like. Additionally, the lightguide includes an optical material deposited on the first surface of the substrate such that one or more diffractive grating structures of the set of diffractive grating structures are at least partially encapsulated by the optical material. Such an optical material, for example, has a predetermined refractive index that causes a fraction of the light received by the optical material to be diffracted at an angle based on the predetermined refractive index.
As an example, the optical material includes a refractive index that causes a fraction of light received by the optical material to be directed toward the set of diffractive grating structures, a surface of the lightguide, or both such that the fraction of light is incident upon a surface of the lightguide, the diffractive grating structures of the set of diffractive grating structures, or both at a predetermined angle. The surfaces of the lightguide, the set of diffractive grating structures, or both then direct the received fraction of light based on one or more parameters of the diffractive grating structures, the predetermined angle upon which the received light is incident, or both.
In this way, a lightguide allows how a set of diffractive grating structures directs light to be modified without changing the physical parameters of the diffractive grating structures. That is to say, the lightguide allows the diffraction efficiency of a set of diffractive gratings to be modified without changing the physical parameters of the diffractive grating structures.
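The underlying mechanism can be illustrated with the standard diffraction-grating equation, a textbook relation rather than anything specific to this patent: the diffracted angle depends on the refractive index of the medium the grating couples into, so swapping the encapsulating optical material shifts how light is directed without refabricating the grating. A sketch with purely illustrative numbers:

```python
import math

def diffraction_angle(wavelength_nm, period_nm, n_in, theta_in_deg, n_out, order=1):
    """Grating equation: n_out * sin(theta_m) = n_in * sin(theta_in) + m * lambda / period.
    Returns the diffracted angle in degrees, or None if the order is evanescent."""
    s = (n_in * math.sin(math.radians(theta_in_deg))
         + order * wavelength_nm / period_nm) / n_out
    if abs(s) > 1.0:
        return None  # no propagating diffracted order in this medium
    return math.degrees(math.asin(s))

# Same grating geometry (400 nm period, green light, normal incidence),
# two different encapsulant refractive indices:
bare = diffraction_angle(532, 400, n_in=1.0, theta_in_deg=0.0, n_out=1.5)
encapsulated = diffraction_angle(532, 400, n_in=1.0, theta_in_deg=0.0, n_out=1.7)
print(bare, encapsulated)  # higher-index encapsulant -> shallower diffracted angle
```

The higher the refractive index of the deposited material, the smaller the diffraction angle inside it, which is the lever the disclosure uses to tune behavior post-fabrication.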
TIMESTAMPS
0:00 Intro
0:45 Project Aura Explained in Plain Terms
1:54 Why Google Partnered Instead of Building Everything In-House
3:09 The Hardware Philosophy Behind Aura
4:23 Android XR Is the Real Product Here
4:42 Gemini Is What Finally Makes Smart Glasses Make Sense
7:00 How Google's Approach Stacks Up Against Apple and Meta
8:37 The Trust Problem Google Has To Get Right This Time
10:23 What the Timeline Really Tells Us About Google's Intentions
US20250306686 - Electrical Stimulation From A Wristband With Augmented Reality Visual Effects For Remote Haptic Sensations In The Hand, And Systems And Methods Of Use Thereof
A method of providing remote haptic feedback is described. The method includes applying, via a set of electrodes of a wearable device, a haptic signal to a first portion of a user. The haptic signal is configured to cause haptic feedback to be perceived at a second portion of the user that is distinct from the first portion of the user. The method further includes causing a visual indication of the haptic feedback at the second portion to be displayed to the user via a display of a head-wearable device.
EP4605795 - AUGMENTED REALITY DISPLAY WITH FREEFORM OPTICS AND INTEGRATED PRESCRIPTION LENS
A dual-component lightguide employs two freeform surfaces separated by a gap with prescription lens integration. A world-side component includes a spherical world-side surface and a freeform eye-side surface. An eye-side component includes a freeform world-side surface that conforms to the freeform eye-side surface of the world-side component and an eye-side surface that is shaped to provide corrective optics based on a desired prescription.
"Haylo to Invest USD 100M in MicroLED Company Plessey via GoerTek loan" "Plessey, founded in 2000, specializes in developing ultra–high-resolution Micro LED displays for #AR and #VR devices. The company has collaborated with AR glasses maker Vuzix Corporation and tech giant Meta to jointly develop AR/VR #microdisplay technologies. In January this year, Plessey and Meta released their latest research results — the world’s brightest red Micro LED display, capable of delivering up to 6,000,000 nits of brightness at high resolution (<5 microns)."
Recently, Meta and Google presented two arm-based interfaces as a possible HMI for smartglasses (but not only), which appear strikingly similar. I was very surprised when both came out.
So, I started digging among the patent assignments, partnerships, and acquisitions related to the wearable display area of the two companies, and the related interfaces, sometimes including other key subcomponents such as, for example, microdisplay companies. The map is so dense that it reminds me of an MOCVD reactor: a complicated grid with plenty of numbered valves and thin "pipelines".
A very complex, meaningful structure emerged, along with intriguing dynamics at the level of patent details.
Referring to the map, each company's small label includes its founding date, while the dates inside the small green rectangles mark merger, acquisition, or patent-assignment events.
It is quite intriguing that Thalmic Labs' armband-related IP and know-how were transferred to Meta "via" North Inc., which, in turn, was acquired by Google, in 2019 and 2020, respectively (red colour).
That is where the overlap between the two companies' arm-interface cultures comes from.
In the scheme, the dotted arrows indicate IP transfers, the solid blue arrows indicate complete acquisitions, and the solid violet arrows point to each company's final reference product, prototype, or project. Some notable partnerships also appear.
The scheme does not aim to be complete; the intention is to give a flavour of the complexity of ownership and IP transfer for Meta and Google around AR glasses products and related HMI, including internal or experimental projects.
Hitachi’s Metaverse Platform for Nuclear Power Plants enhances collaboration among stakeholders, streamlines design and construction workflows, and supports the development of practical and effective investment plans by integrating metaverse and AI technologies.
Full-colour 3D holographic augmented-reality displays with metasurface waveguides
Emerging spatial computing systems seamlessly superimpose digital information on the physical environment observed by a user, enabling transformative experiences across various domains, such as entertainment, education, communication and training [1-3]. However, the widespread adoption of augmented-reality (AR) displays has been limited due to the bulky projection optics of their light engines and their inability to accurately portray three-dimensional (3D) depth cues for virtual content, among other factors [4,5]. Here we introduce a holographic AR system that overcomes these challenges using a unique combination of inverse-designed full-colour metasurface gratings, a compact dispersion-compensating waveguide geometry and artificial-intelligence-driven holography algorithms. These elements are co-designed to eliminate the need for bulky collimation optics between the spatial light modulator and the waveguide and to present vibrant, full-colour, 3D AR content in a compact device form factor. To deliver unprecedented visual quality with our prototype, we develop an innovative image formation model that combines a physically accurate waveguide model with learned components that are automatically calibrated using camera feedback. Our unique co-design of a nanophotonic metasurface waveguide and artificial-intelligence-driven holographic algorithms represents a significant advancement in creating visually compelling 3D AR experiences in a compact wearable device. We develop a method for providing high-quality, holographic, three-dimensional augmented-reality images in a small form factor suitable for incorporation in eyeglass-scale wearables, using high-refraction-index glass waveguides with nanoscale metasurfaces, and incorporating artificial intelligence.
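As background on the "holography algorithms" the abstract refers to: the classical baseline for computing a phase-only hologram is Gerchberg-Saxton iteration, which finds an SLM phase pattern whose far field approximates a target image. This is a generic textbook method, sketched here in numpy; it is emphatically not the paper's learned, camera-calibrated image-formation model:

```python
import numpy as np

def gerchberg_saxton(target_amplitude: np.ndarray, iterations: int = 50) -> np.ndarray:
    """Iteratively find a phase pattern whose 2D Fourier transform
    (far field) approximates the target amplitude."""
    rng = np.random.default_rng(0)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        far = np.fft.fft2(np.exp(1j * phase))                # propagate to far field
        far = target_amplitude * np.exp(1j * np.angle(far))  # impose target amplitude
        near = np.fft.ifft2(far)                             # propagate back
        phase = np.angle(near)                               # keep phase only (SLM constraint)
    return phase

# Toy target: a bright square on a dark background.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
phase = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * phase)))
recon /= recon.max()
```

Learned approaches like the one in the paper replace the idealized propagation model above with one fitted to the real optics via camera feedback, which is what closes the gap between simulated and observed image quality.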
According to yet other embodiments, methods for forming an antireflection coating on a metasurface are provided. In some embodiments, a method may comprise providing an optically transmissive substrate comprising a metasurface, the metasurface comprising a plurality of nanostructures, depositing a layer of an optically transparent material over the plurality of nanostructures, wherein the layer of optically transparent material forms the antireflection coating.
According to some embodiments, the optically transparent material comprises a polymer. In some embodiments, the optically transparent material comprises photoresist. In some embodiments, a distance from a topmost surface of the nanostructures to a topmost surface of the formed antireflection coating is from about 10 nm to about 1 micron. In some embodiments, conformally depositing the optically transparent material comprises spin coating the optically transparent material over the nanostructures. In some embodiments, conformally depositing the optically transparent material comprises performing a chemical vapor deposition (CVD) process.
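For context on why coating thicknesses in that 10 nm to 1 micron range make sense, standard single-layer thin-film optics (not figures from this disclosure) gives the ideal antireflection layer an index equal to the geometric mean of the two bounding media and an optical thickness of a quarter wavelength:

```python
import math

def quarter_wave_coating(wavelength_nm: float, n_substrate: float, n_ambient: float = 1.0):
    """Ideal single-layer AR coating: index = sqrt(n0 * ns),
    physical thickness = lambda / (4 * n_coat)."""
    n_coat = math.sqrt(n_ambient * n_substrate)
    thickness_nm = wavelength_nm / (4.0 * n_coat)
    return n_coat, thickness_nm

# Illustrative numbers: green light on a high-index substrate.
n_coat, t = quarter_wave_coating(550.0, n_substrate=1.8)
print(round(n_coat, 3), round(t, 1))  # → 1.342 102.5
```

A resulting thickness on the order of 100 nm sits comfortably inside the 10 nm to 1 micron window the embodiments describe.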
On December 31, 2025, Jinxin Technology Holding Company, a leading digital content service provider in China, introduced the NAMI INSIGHT One.
The new wearable device, which integrates artificial intelligence with lightweight AR technology, is designed specifically for educational purposes. The product was officially launched through the company’s flagship stores via Chinese e-commerce platforms.
From the website we learn that the company focuses on AI-Powered Education: Making Learning More Efficient and Engaging.
Founded in 2014 by a team of former executives from Huawei and Microsoft, Shanghai Jinxin Network Technology Co., Ltd. has been dedicated to building a powerful digital education content production engine using advanced AI/AR/digital human technologies for the past 10 years.
Through its brand Nanobox and partners, it provides users with educational content products and services.
Currently, it has become a leading provider in China.
Based on its experience in digital textbooks and its insights into the education of primary and secondary school students in China, Jinxin Technology has built a new ecosystem of intelligent education content and services, including immersive learning experiences that stimulate learning interest and aim to improve learning efficiency.
As the extended reality (XR) landscape evolves, two standout projects, Project Aura and Project Moohan, are shaping the future of wearable technology. Both initiatives, backed by major tech players, aim to redefine user interaction through advanced features and seamless integration. This article delves into their specifications, functionalities, and potential impact.
US20250258375 - EYEWEAR DISPLAY HAVING A WAVEGUIDE WITH ADJUSTABLE REFLECTORS
Pupil expansion for waveguided light extraction often results in unwanted losses. Smart reflectors driven by eye tracking are a viable solution.
WO2025171242 - HOLOGRAPHIC TWO-DIMENSIONAL (2D) IMAGE PROJECTION FOR AN AUGMENTED REALITY (AR) WAVEGUIDE DISPLAY
Systems, methods, and/or apparatuses for a holographic projection module in a near-eye display device, which may display augmented reality/virtual reality (AR/VR) content to the user. In one aspect, a spatial light modulator (SLM) is illuminated by a planar wavefront, which the SLM modulates with a pattern and projects the patterned light through a projection lens to form a 2D hologram which is input to a waveguide display. Some examples may include a high-order filter to form an aperture in the Fourier domain controllable by the phase pattern displayed on the SLM; other examples may have no projection lenses, where the displayed SLM image corresponds to the user-perceived Fourier-domain image representation. Some examples may use two stacked SLMs, a complex wavefront modulation SLM, and/or a phase SLM with a mask (such as, e.g., a binary amplitude mask).
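The "aperture in the Fourier domain" idea, blocking unwanted higher diffraction orders while passing the intended image content, has a direct signal-processing analogue. The sketch below is that generic analogue in numpy, not the patent's optical implementation, and all numbers are illustrative:

```python
import numpy as np

def fourier_aperture(field: np.ndarray, radius: float) -> np.ndarray:
    """Apply a circular aperture in the Fourier plane of a complex field,
    blocking spatial frequencies outside `radius` (in cycles/sample)."""
    fy = np.fft.fftfreq(field.shape[0])
    fx = np.fft.fftfreq(field.shape[1])
    mask = (fy[:, None] ** 2 + fx[None, :] ** 2) <= radius ** 2
    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * mask)

# A field carrying a coarse grating (kept) and a fine grating (blocked).
x = np.arange(128)
row = (np.exp(1j * 2 * np.pi * (8 / 128) * x)     # low spatial frequency
       + np.exp(1j * 2 * np.pi * (52 / 128) * x)) # high spatial frequency
field = np.tile(row, (128, 1))
filtered = fourier_aperture(field, radius=0.1)    # only the coarse grating survives
```

After filtering, only the low-frequency component remains, so the field has uniform unit modulus; in the optical version, the aperture's size and position would be set by the phase pattern on the SLM rather than by a fixed mask.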
https://youtu.be/wPKfrYrFjBM Meta's been on a roll, partnering with Ray-Ban and now with Oakley to debut the HSTN glasses, but remember when Google was sort of the de-facto leader in the smart headset space? Sure, Google Glass and Google Cardboard were VERY early for their times
Wearable device and method of operating the same - SAMSUNG ELECTRONICS CO., LTD.
Samsung has filed a patent for a wearable sensor that can read the user's veins to establish his or her identity. If implemented, this allows Samsung's future wearables to double as keys for doors and cars. "The sensor takes a picture of the user's vein structure and characteristics, then compares it to a vein image in its memory that it knows belongs to the user. The sensor might also detect the user's pulse rate, which is also unique from person to person." Check also, from 2016: https://www.fastcompany.com/3056357/this-samsung-patent-lets-smartwatches-recognize-you-by-your-veins