ARefin Lab - Research Problems

Perceptual Focus Issues in Augmented Reality

In optical see-through (OST) augmented reality (AR) displays (e.g., Microsoft HoloLens, Google Glass), information is often distributed between real-world and virtual contexts and often appears at different distances from the user, so the user must repeatedly switch contexts and refocus the eyes to integrate the information. Integrating information between real and virtual contexts therefore raises the issues of (1) context switching, where users must switch visual and cognitive attention between information sources; (2) focal distance switching, where users must accommodate (change the shape of the eye's lens) to see, in sharp focus, information at a new distance; and (3) transient focal blur, the focal blur the user perceives while switching the focal distance (a small worked example of focal switching demand follows the citation below). These problems involve the display's optical design and how it interacts with human perception and vision. If they are not handled properly, users can suffer visual fatigue and incorrect distance perception, leading to reduced performance. These issues impact many OST AR applications, including medical procedures, battlefield and tactical awareness applications, and heads-up displays in cars and aircraft, among others. If you want to learn more about this research, please read the following paper:

Mohammed Safayet Arefin, Nate Phillips, Alexander Plopski, Joseph L. Gabbard, and J. Edward Swan II. The Effect of Context Switching, Focal Switching Distance, Binocular and Monocular Viewing, and Transient Focal Blur on Human Performance in Optical See-Through Augmented Reality. IEEE Transactions on Visualization and Computer Graphics, Special Issue on IEEE Virtual Reality and 3D User Interfaces (VR 2022), 28(5):2014–2025, Mar 2022. DOI: 10.1109/TVCG.2022.3150503. Download: [Pre-Print] [Appendix]
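
As a quick worked example of the focal distance switching demand: accommodation demand at a viewing distance of d meters is 1/d diopters (D), so switching between two distances requires an accommodative change of |1/d2 − 1/d1| D. The sketch below is a minimal illustration (the helper name is ours, not code from the paper); the distances match the SharpView demo described later on this page.

```python
def accommodation_change_diopters(d_from_m: float, d_to_m: float) -> float:
    """Dioptric change needed to refocus between two viewing distances.
    Accommodation demand at distance d (in meters) is 1/d diopters."""
    return abs(1.0 / d_to_m - 1.0 / d_from_m)

# Refocusing from AR symbology at 0.25 m to the real world at 4.0 m:
print(accommodation_change_diopters(0.25, 4.0))  # 3.75 D
```

The larger this dioptric change, the longer and more effortful the refocus; transient focal blur is what the user perceives while that change is in progress.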



Example of perceptual focus issues in an AR system. When the user focuses on the background, the AR symbology becomes blurry; when the user focuses on the AR symbology, the background becomes blurry. This occurs because the AR symbology and the real-world background are at different focal distances. Frames are taken from a YouTube video showing the Google Glass AR display in daily use.

Link: https://www.youtube.com/watch?v=Vb2uojqKvFM  

Out-of-focus Issue in Augmented Reality

When users continuously switch their eye focus from one distance to another in an augmented or virtual reality system to integrate information, only one piece of information (either real or virtual) is in focus at a time. The other information becomes blurred for a brief period (around 360 milliseconds). We term this situation "out-of-focus."


In our analysis, we observed that participants' performance in the OST AR system decreased due to the out-of-focus issue. This motivates generating a sharper representation of out-of-focus visual information: virtual information must be rendered so that it looks sharper when seen out of focus, i.e., under incorrect accommodation demand. We term this sharper rendering of virtual information "SharpView AR." Accomplishing it requires mathematically modeling out-of-focus blur (the blurred retinal image) with Zernike polynomials, which model focal deficiencies of human vision, and developing a focus correction algorithm based on total variation optimization, which corrects the out-of-focus blur. Validating the approach requires synthetic simulation, optical camera-based measurement, and user-based studies.
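
As a concrete illustration of the modeling step, the sketch below builds a defocus point spread function (PSF) from the Zernike defocus term Z_2^0 using a standard Fourier-optics pupil model. This is a minimal sketch under stated assumptions, not our full pipeline; the function name and grid size are ours, and the default defocus (+4.57 D) and pupil diameter (5 mm) match the demo described below.

```python
import numpy as np

def defocus_psf(pupil_diameter_mm=5.0, defocus_diopters=4.57,
                wavelength_nm=550.0, grid=256):
    """Point spread function (PSF) for pure defocus, built from the
    Zernike defocus term Z_2^0 over a circular pupil (Fourier optics)."""
    # Zernike defocus coefficient in microns: c = D * r^2 / (4*sqrt(3)),
    # with defocus D in diopters and pupil radius r in mm.
    r_mm = pupil_diameter_mm / 2.0
    c20_um = defocus_diopters * r_mm**2 / (4.0 * np.sqrt(3.0))

    # Unit-disk pupil coordinates.
    x = np.linspace(-1.0, 1.0, grid)
    xx, yy = np.meshgrid(x, x)
    rho2 = xx**2 + yy**2
    pupil = (rho2 <= 1.0).astype(float)

    # Zernike defocus polynomial: Z_2^0 = sqrt(3) * (2*rho^2 - 1).
    wavefront_um = c20_um * np.sqrt(3.0) * (2.0 * rho2 - 1.0)

    # Generalized pupil function; the PSF is |FT(pupil function)|^2.
    phase = 2.0 * np.pi * wavefront_um / (wavelength_nm / 1000.0)
    gpf = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(gpf))) ** 2
    return psf / psf.sum()
```

Convolving a rendered text image with this PSF simulates its blurred retinal appearance; the SharpView pre-correction step (not shown here) then uses total variation optimization to find a rendering whose convolution with the same PSF appears as sharp as possible.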


If you want to learn more about this research, please read the following paper and my PhD dissertation:

Mohammed Safayet Arefin, Carlos Montalto, Alexander Plopski, and J. Edward Swan II. A SharpView Font with Enhanced Out-of-Focus Text Legibility for Augmented Reality Systems. In Proceedings of IEEE Virtual Reality and 3D User Interfaces (IEEE VR), Orlando, FL, USA, March 2024, pp. 31–41. DOI: 10.1109/VR58804.2024.00027. Download: [Pre-Print] [Video]


Augmented Reality Fonts with Enhanced Out-of-Focus Text Legibility [Download]


This video shows an example of SharpView AR. We considered textual information as the primary AR component, specifically short AR text labels. This novel AR font is termed the "SharpView font," and it promises to mitigate the effect of the out-of-focus issue.


This video was captured through the optics of an optical see-through augmented reality system. The real information (cross: X) was presented at 4.0 m, and the SharpView virtual information (word: 'TEXT') was rendered at 0.25 m. The SharpView virtual information (pre-corrected font) was generated for an out-of-focus aberration of +4.57 D and a pupil diameter of 5 mm. When the camera lens focused on the real "X" at 4.0 m, our SharpView-rendered information exhibited a sharper representation and improved visual acuity even though it was out of focus. However, when the camera focused on our rendered virtual information, the real information (X) became blurred, as it was not rendered according to our algorithm.

Visual Perception in XR

The extended reality (XR) space can be divided into three environmental realities: the real world, augmented reality (AR), and virtual reality (VR). In all three realities, information can be presented at different distances from the user, and users would ideally perceive that information at the same depth in all three. However, perceptual researchers have found that users overestimate or underestimate virtual object distances in AR and VR compared to real-world objects. From the depth perception perspective, there is little literature comparing human visual system behavior between the real world and AR/VR. This body of research is essential for many applications, such as military, medical, and maintenance tasks. In addition, commercial AR and VR devices with integrated eye trackers (e.g., Microsoft HoloLens 2, HTC Vive Pro Eye) have only recently become available, enabling a novel set of research questions. In this research, we are exploring one such question: How does our visual system respond to depth changes in extended reality?


For this research, we considered two depth-dependent components of the binocular human visual system: eye vergence angle (EVA) and interpupillary distance (IPD). The eye vergence angle is the angle through which the eyes rotate to fixate an object at a particular depth under binocular vision. The IPD is the distance between the centers of the left and right pupils. We calculated these components from eye tracker data for eye fixations on real and simulated virtual objects, specifically from the 3D eye gaze direction and 3D eye gaze origin vectors (a minimal computation sketch follows the figure caption below). We are investigating how these depth-dependent components behave as depth changes across XR environments: real objects in the real world (real), virtual objects in the real world (AR), and virtual objects in a virtual world (VR). If you want to learn more about this research, please read the following paper:


Mohammed Safayet Arefin, J. Edward Swan II, Russell A. Cohen-Hoffing, and Steven M. Thurman. Estimating Perceptual Depth Changes with Eye Vergence and Interpupillary Distance using an Eye Tracker in Virtual Reality. In ACM Symposium on Eye Tracking Research and Applications (ETRA), ACM, June 2022. DOI: 10.1145/3517031.3529632 

Download: [Pre-Print]

The eye vergence angle (EVA) and interpupillary distance (IPD) are defined according to binocular human vision when the eyes verge on a near or far target. EVA and IPD are calculated from eye tracker data using the 3D gaze direction and 3D gaze origin vectors.
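
As a minimal sketch of that calculation (variable names and the fixation example are ours, not from the paper): IPD is the distance between the two 3D gaze origins, and EVA is the angle between the two 3D gaze direction vectors.

```python
import numpy as np

def eva_and_ipd(origin_l, origin_r, dir_l, dir_r):
    """Eye vergence angle (degrees) and interpupillary distance (in the
    units of the origins) from per-eye 3D gaze origins and directions."""
    origin_l = np.asarray(origin_l, float)
    origin_r = np.asarray(origin_r, float)
    dir_l = np.asarray(dir_l, float) / np.linalg.norm(dir_l)
    dir_r = np.asarray(dir_r, float) / np.linalg.norm(dir_r)

    # IPD: distance between the gaze origins (pupil centers).
    ipd = np.linalg.norm(origin_l - origin_r)
    # EVA: angle between the two gaze direction vectors.
    cos_theta = np.clip(np.dot(dir_l, dir_r), -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta)), ipd

# Example: eyes 64 mm apart fixating a midline point 1 m away.
fixation = np.array([0.0, 0.0, 1.0])  # meters
o_l, o_r = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
d_l = fixation - o_l
d_r = fixation - o_r
print(eva_and_ipd(o_l, o_r, d_l, d_r))  # ~(3.67 deg, 0.064 m)
```

For a fixation on the midline at distance d, geometry gives EVA = 2·arctan(IPD / 2d), so the vergence angle shrinks as the fixated object moves farther away, which is what makes EVA a usable signal of perceived depth changes.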