BACKGROUND: Real-world illumination challenges both autonomous sensing and displays, because scene luminance can vary by up to 10⁹-to-1, whereas vision models have limited ability to generalize beyond 100-to-1 luminance contrast. Brain mechanisms automatically normalize the visual input based on feature context, but they remain poorly understood because of the limitations of commercially available displays.

NEW METHOD: Here, we describe procedures for the setup, calibration, and precision checking of an HDR display system, built around a JVC DLA-RS600U reference projector, with over 100,000-to-1 luminance dynamic range (636 to 0.006055 cd/m²), pseudo-11-bit grayscale precision, and 3 ms temporal precision in the MATLAB/Psychtoolbox software environment. The setup is synchronized with electroencephalography (EEG) and infrared eye-tracking measurements.

RESULTS: We report display metrics including light scatter versus average display luminance (ADL) and spatial uniformity, both overall and at high spatial frequency. We also report a luminance normalization phenomenon, contextual facilitation of a high-contrast target, whose discovery required an HDR display.

COMPARISON WITH EXISTING METHODS: This system provides 100-fold greater dynamic range than standard 1000-to-1 contrast displays and increases the number of gray levels from 256 or 1024 (8 or 10 bits) to 2048 (pseudo 11 bits), enabling the study of mesopic-to-photopic vision at the expense of some spatial non-uniformity.

CONCLUSIONS: This HDR research capability opens new questions about how visual perception remains resilient to real-world luminance dynamics and will lead to improved visual modeling of dense urban and forest environments and of mixed indoor-outdoor environments such as cockpits and augmented reality. Our display metrics code can be found at https://github.com/USArmyResearchLab/ARL-Display-Metrics-and-Average-Display-Luminance.

Published by Elsevier B.V.
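As a rough illustration of the ADL metric named in the Results, the MATLAB sketch below assumes ADL is the spatial mean of per-pixel luminance over a single displayed frame, with gray levels mapped to luminance through a simple power-law calibration. The gamma value and frame size are hypothetical placeholders; only the 636 and 0.006055 cd/m² luminance endpoints come from the abstract, and this is not the authors' released metrics code (see the repository linked above).

```matlab
% Minimal sketch: average display luminance (ADL) of one frame.
% Assumes a power-law gray-to-luminance calibration; the gamma value
% below is a placeholder, not a measured value from the paper.
Lmax  = 636;        % cd/m^2, maximum luminance reported in the abstract
Lmin  = 0.006055;   % cd/m^2, minimum luminance reported in the abstract
gam   = 2.2;        % hypothetical display gamma (replace with a calibration fit)

frame = rand(1080, 1920);                     % example frame, gray levels in [0, 1]
lum   = Lmin + (Lmax - Lmin) .* frame.^gam;   % per-pixel luminance, cd/m^2
ADL   = mean(lum(:));                         % average display luminance

fprintf('ADL = %.2f cd/m^2 (dynamic range %.0f:1)\n', ADL, Lmax / Lmin);
```

In practice, the gray-to-luminance mapping would be replaced by the measured calibration of the projector, and light scatter would then be characterized as a function of this ADL value across test patterns.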