Hi all, I have a question regarding the transformation of degrees of visual angle (e.g. from the pre-computed RF positions, which are given in degrees) to monitor pixels.
Using the module allensdk.brain_observatory.stimulus_info, specifically its object BrainObservatoryMonitor, I get a px2deg ratio of 9.68 using the method visual_degrees_to_pixels. However, in the whitepaper, it says that the monitor size in pixels was 1920x1200, corresponding to 120° x 95° visual angle. Using this, I get a px2deg ratio of 1920/120 = 16, which is quite different. I’m not sure why I get this discrepancy - thank you very much in advance for any help!
Sorry, just realized: Does this discrepancy have to do with the stimulus warping? If so, does the value of the variable visual_degrees_to_pixels then represent the approximate relationship between degrees and pixels from the mouse’s perspective after warping?
Hi! You’re on the right track!
The visual size of the monitor (120° x 95°) is based on the physical size of the monitor and its distance from the mouse's eye. The 9.68 px/deg value holds at the center of the monitor; 1920/120 = 16 is the average over the whole screen. Because the stimulus is warped, that relationship is not uniform across the monitor.
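The two numbers can be reproduced from the geometry alone. This is a sketch assuming the whitepaper's setup (a flat 1920 x 1200 monitor spanning 120° of azimuth; the ~15 cm viewing distance is an assumption here, and it actually cancels out of the center ratio), not the SDK's own code:

```python
import math

# Assumed whitepaper geometry: flat 1920-px-wide monitor spanning 120 deg
# of azimuth, viewed from ~15 cm (the distance cancels in the center ratio).
N_PX_X = 1920
SPAN_DEG_X = 120.0
DIST_CM = 15.0

# Physical monitor width implied by a 120 deg span at that distance.
width_cm = 2 * DIST_CM * math.tan(math.radians(SPAN_DEG_X / 2))
px_per_cm = N_PX_X / width_cm

# Pixels covered by 1 deg of visual angle centered on the screen center.
center_px_per_deg = px_per_cm * 2 * DIST_CM * math.tan(math.radians(0.5))

# Naive average over the whole screen.
average_px_per_deg = N_PX_X / SPAN_DEG_X

print(center_px_per_deg)   # ~9.67, close to the SDK's 9.68
print(average_px_per_deg)  # 16.0
```

So the SDK's 9.68 is the tangent-screen value at the center pixel, while 16 spreads the same 1920 pixels evenly over 120°.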
Thank you very much - but can’t we then assume that, if the warping perfectly corrects for the distortion caused by placing the monitor close to the mouse’s eye, the px2deg translation is constant from the perspective of the mouse? I’m asking because I’d like to extract local patches of the natural movies in the receptive field of a given unit, and thus need to translate both the RF position and RF area from degrees to pixels. (I’m using the method map_stimulus_coordinate_to_monitor_coordinate to relate template pixels to monitor pixels.)
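To see why a single ratio can't hold on the raw (unwarped) screen, here is a hypothetical sketch (again assuming the whitepaper geometry, not the SDK's implementation) of how many monitor pixels one degree of azimuth covers at different eccentricities on a flat screen:

```python
import math

# Assumed geometry: 1920 px spanning 120 deg of azimuth on a flat monitor.
N_PX_X = 1920
SPAN_DEG_X = 120.0

def px_per_degree(ecc_deg, dist_cm=15.0):
    """Pixels spanned by 1 deg of azimuth centered at `ecc_deg` eccentricity
    on the unwarped flat screen (hypothetical helper, not an SDK function)."""
    width_cm = 2 * dist_cm * math.tan(math.radians(SPAN_DEG_X / 2))
    px_per_cm = N_PX_X / width_cm
    lo = dist_cm * math.tan(math.radians(ecc_deg - 0.5))
    hi = dist_cm * math.tan(math.radians(ecc_deg + 0.5))
    return px_per_cm * (hi - lo)

print(px_per_degree(0))    # ~9.7 px/deg at the center
print(px_per_degree(50))   # ~23 px/deg far in the periphery
```

On the flat screen a degree covers more than twice as many pixels at 50° eccentricity as at the center. If the warping perfectly undoes this, then in the warped stimulus image one degree does map to an approximately constant number of pixels from the mouse's perspective, so using the center px/deg value uniformly is a reasonable approximation for cutting out RF patches.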
The BrainObservatoryMonitor module has the functions "lsn_image_to_screen" and "natural_movie_image_to_screen", which put the stimulus templates of each stimulus into the same screen coordinates, allowing you to compare them directly.
We actually made a function as part of a course to do this that might help: