Color constancy mechanisms in virtual reality environments

dc.contributor.author: Gil Rodríguez, Raquel
dc.contributor.author: Hedjar, Laysa
dc.contributor.author: Toscani, Matteo
dc.contributor.author: Guarnera, Dar'ya
dc.contributor.author: Guarnera, Giuseppe Claudio
dc.contributor.author: Gegenfurtner, Karl R.
dc.date.accessioned: 2024-10-31T12:16:20Z
dc.date.available: 2024-10-31T12:16:20Z
dc.date.issued: 2024
dc.description.abstract: Prior research has demonstrated high levels of color constancy in real-world scenarios featuring single light sources, extensive fields of view, and prolonged adaptation periods. However, exploring the specific cues humans rely on becomes challenging, if not unfeasible, with actual objects and lighting conditions. To circumvent these obstacles, we employed virtual reality technology to craft immersive, realistic settings that can be manipulated in real time. We designed forest and office scenes illuminated by five colors. Participants selected a test object most resembling a previously shown achromatic reference. To study color constancy mechanisms, we modified scenes to neutralize three contributors: local surround (placing a uniform-colored leaf under test objects), maximum flux (keeping the brightest object constant), and spatial mean (maintaining a neutral average light reflectance), employing two methods for the latter: changing object reflectances or introducing new elements. We found that color constancy was high in conditions with all cues present, aligning with past research. However, removing individual cues led to varied impacts on constancy. Local surrounds significantly reduced performance, especially under green illumination, showing strong interaction between greenish light and rose-colored contexts. In contrast, the maximum flux mechanism barely affected performance, challenging assumptions used in white balancing algorithms. The spatial mean experiment showed disparate effects: Adding objects slightly impacted performance, while changing reflectances nearly eliminated constancy, suggesting human color constancy relies more on scene interpretation than pixel-based calculations.
dc.identifier.uri: https://jlupub.ub.uni-giessen.de/handle/jlupub/19730
dc.identifier.uri: https://doi.org/10.22029/jlupub-19087
dc.language.iso: en
dc.rights: Attribution - NonCommercial - NoDerivatives 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject.ddc: ddc:150
dc.title: Color constancy mechanisms in virtual reality environments
dc.type: article
local.affiliation: FB 06 - Psychologie und Sportwissenschaft
local.source.articlenumber: 5
local.source.epage: 28
local.source.journaltitle: Journal of vision
local.source.number: 6
local.source.spage: 1
local.source.uri: https://doi.org/10.1167/jov.24.5.6
local.source.volume: 24

Files

Original bundle
Name: 10.1167_jov.24.5.6.pdf
Size: 7.61 MB
Format: Adobe Portable Document Format