*To receive email announcements and live stream information for upcoming seminars, please subscribe to the SystemX Seminar/EE310 Mailing list here.
For the past decade, display and sensor hardware development for mixed reality and smart glasses has been largely exploratory, providing just enough display immersion and visual comfort for developers to build apps, especially for the enterprise field. On the sensor side, emphasis was placed on 6DOF head tracking and spatial mapping, gesture sensing, and later eye tracking. Today, as universal consumer use cases emerge, such as co-presence, digital twins, and remote conferencing, new requirements are appearing in product requirement documents (PRDs) to enable such experiences, on both the display and sensing sides. It is not only a race toward smaller form factor, lighter-weight devices with large field of view (FOV) and lower power; the requirements also call for additional display and sensing features specifically tuned to implement these new universal use cases. Broad acceptance of wearable displays, especially in the consumer market, is contingent on meeting these new display and sensing requirements in small form factors and at low power.
Bernard has been involved in optics and photonics for the past 25 years, in areas such as optical data storage, optical computing, optical telecom, optical metrology, and more recently wearable displays. He joined Google [X] Labs in 2010 as the Principal Optical Architect on the Google Glass project, and he moved to Microsoft in 2015 as the Partner Optical Architect on the HoloLens project. He has written several books on micro-optics and is listed as the main inventor on more than 50 patents, especially on optical metrology and wearable displays. He is also an SPIE short course instructor and co-chairs SPIE conferences on digital optics and immersive displays. Bernard has been an SPIE Fellow since 2013, served on the SPIE Board from 2016 to 2019, and is the current SPIE Vice President.