This may be a bit more of an observation than a question, but I am interested to hear what others have seen. I have made a stereo photo viewer that will take any photo and present it properly scaled and positioned to each eye, and I have used it to view regular 2D photos. I have noticed that items which are red in color appear to be at a greater distance than blue items. White and black items are also shifted relative to red and blue, appearing somewhere in between depending on context (the colors nearby).

I am guessing this is caused by the strong lenses and chromatic aberration, which would tend to make blue image components larger than red because blue light is refracted more strongly. Sometimes this adds a very real and significant depth perception depending on which colors happen to be placed in the scene. Unfortunately, it can be disruptive when these contrasting colors sit near each other on a flat surface, a floral print curtain for instance.

Does the SDK have a facility to artificially shift an object's red, green, and blue color channels separately in 3D space, much like the pincushion correction, or will this effect always contribute some blurriness to the final perceived image? It seems the lens correction only stretches pixel coordinates as whole pixels, with the red, green, and blue components of each pixel moved equally by the transform in the horizontal plane. This would have a deleterious effect on image clarity and might have drastic effects on depth perception in game-style graphics or toon-shaded animation.

I haven't had much luck getting LibOVR to work with my VS2013; it reports errors about missing DLLs that are not included in Win7 by default. But soon enough I hope to use the distortion correction and a horizontal perspective algorithm to produce a SIM3D photo album viewer.
Making samples in a photo editor has been promising, although the 3D effects of the R/G/B refractive differences seem to be far more powerful than the perceived effects of perspective.