
[0.4.3 & 0.4.4] Blue Peripherals

Carandiru
Honored Guest
I had to change the code in Util_Render_Stereo.cpp to get acceptable results (no blue lines at the periphery of the eye distortion mesh).
I have attached some images showing the problem (you may need to right-click and open the image in a new window to see the whole picture).

Before Code Change: (image attachment)

After Code Change: (image attachment)

Vignette Disabled: (image attachment)

Does this only affect AMD cards?


// When does the fade-to-black edge start? Chosen heuristically.
float fadeOutBorderFractionTexture = 0.05f;
float fadeOutBorderFractionTextureInnerEdge = 0.05f;
float fadeOutBorderFractionScreen = 0.05f;
// float fadeOutFloor = 0.6f;  // The floor controls how much black is in the fade region.

if (hmdRenderInfo.HmdType == HmdType_DK1)
{
    fadeOutBorderFractionTexture = 0.3f;
    fadeOutBorderFractionTextureInnerEdge = 0.075f;
    fadeOutBorderFractionScreen = 0.075f;
    // fadeOutFloor = 0.25f;
}

// Fade out at texture edges.
// The furthest out will be the blue channel, because of chromatic aberration (true of any standard lens).
Vector2f sourceTexCoordBlueNDC = TransformTanFovSpaceToRendertargetNDC ( eyeToSourceNDC, tanEyeAnglesB );
if (rightEye)
{
    // The inner edge of the eye texture is usually much more magnified, because it's right
    // against the middle of the screen, not the FOV edge. So we want a different scaling
    // factor for that. This code flips the texture NDC so that +1.0 is the inner edge.
    sourceTexCoordBlueNDC.x = -sourceTexCoordBlueNDC.x;
}
float edgeFadeIn = ( 1.0f / fadeOutBorderFractionTextureInnerEdge ) * ( 1.0f - sourceTexCoordBlueNDC.x );  // Inner
edgeFadeIn = Alg::Min ( edgeFadeIn, ( 1.0f / fadeOutBorderFractionTexture ) * ( 1.0f + sourceTexCoordBlueNDC.x ) );  // Outer
edgeFadeIn = Alg::Min ( edgeFadeIn, ( 1.0f / fadeOutBorderFractionTexture ) * ( 1.0f - sourceTexCoordBlueNDC.y ) );  // Upper
edgeFadeIn = Alg::Min ( edgeFadeIn, ( 1.0f / fadeOutBorderFractionTexture ) * ( 1.0f + sourceTexCoordBlueNDC.y ) );  // Lower

// Also fade out at screen edges. Since this is in pixel space, no need to treat the inner edge specially.
float edgeFadeInScreen = ( 1.0f / fadeOutBorderFractionScreen ) *
                         ( 1.0f - Alg::Max ( Alg::Abs ( screenNDC.x ), Alg::Abs ( screenNDC.y ) ) );
edgeFadeIn = Alg::Min ( edgeFadeInScreen, edgeFadeIn );  // + fadeOutFloor;

// Note - this is NOT clamped negatively.
// For rendering methods that interpolate over a coarse grid, we need the values
// to go negative for correct intersection with zero.
result.Shade = Alg::Min ( edgeFadeIn, 1.0f );
result.ScreenPosNDC.x = 0.5f * screenNDC.x - 0.5f + xOffset;
result.ScreenPosNDC.y = -screenNDC.y;
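For reference, the fade logic above can be isolated into a standalone function. This is a minimal sketch, not the SDK function itself: it hard-codes the post-change DK1 constants (0.3 / 0.075), uses `std::min`/`std::max` in place of `Alg::Min`/`Alg::Max`, and the name `EdgeFade` is hypothetical:

```cpp
#include <algorithm>
#include <cmath>

// Standalone sketch of the vignette edge-fade computation, using the
// post-change DK1 constants. Returns the Shade value: 1.0 in the interior,
// ramping down toward the texture and screen edges; deliberately not clamped
// below zero so coarse-grid interpolation finds the correct zero crossing.
float EdgeFade(float texNdcX, float texNdcY,
               float screenNdcX, float screenNdcY, bool rightEye)
{
    const float borderTexture   = 0.3f;   // outer and vertical texture edges
    const float borderInnerEdge = 0.075f; // nasal (inner) texture edge
    const float borderScreen    = 0.075f; // physical screen edges

    // Flip so that +1.0 is always the inner (nasal) edge of the eye texture.
    float x = rightEye ? -texNdcX : texNdcX;

    float fade = (1.0f / borderInnerEdge) * (1.0f - x);               // inner
    fade = std::min(fade, (1.0f / borderTexture) * (1.0f + x));       // outer
    fade = std::min(fade, (1.0f / borderTexture) * (1.0f - texNdcY)); // upper
    fade = std::min(fade, (1.0f / borderTexture) * (1.0f + texNdcY)); // lower

    // Screen-edge fade; pixel space, so no special case for the inner edge.
    float screenFade = (1.0f / borderScreen) *
        (1.0f - std::max(std::fabs(screenNdcX), std::fabs(screenNdcY)));
    fade = std::min(fade, screenFade);

    return std::min(fade, 1.0f);
}
```

With the DK1 values, the fade only reaches full brightness well inside the edges; the original 0.05 borders are what left the blue channel's samples exposed at the periphery.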
http://www.supersinfulsilicon.com/ supersinfulsilicon - software Home of the MaxVR Oculus Rift Video Player https://twitter.com/Carandiru
4 REPLIES

TWhite
Explorer
"Carandiru" wrote:

Does this only affect AMD cards?


Nope. It affects everyone using runtime 0.4.3 and up so far; Nvidia and AMD are getting hit equally. There's another thread on it, but so far you're the only one I've seen post a fix.
CPU: i7 3770k 4.6Ghz GPU: EVGA GTX 780

Carandiru
Honored Guest
Well, I hope the code change helps others out until Oculus comes up with a better solution.

Cheers!

cybereality
Grand Champion
I've sent a message to see if anyone here knows more about this.
AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV

brantlew
Adventurer
This is a chroma-bleed bug that occurs when using a shared render target texture. It was there before but was occluded by the vignette; as we have pulled the vignette back, it has become more exposed. The code fix above restores the older (lower-FOV) vignette to hide it. If you can use separate eye textures, that should fix it. We are looking into a fix for the shared-texture case.
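The mechanism behind the bleed can be illustrated with a minimal sketch (all names here, such as `ClampToEyeRect` and `UV`, are hypothetical, and this is not SDK code). With one shared render target, each eye owns half of the texture; chromatic correction offsets the blue channel's sample coordinates the furthest, so near the shared boundary they can cross into the other eye's half. One common mitigation is clamping each channel's UVs to the eye's own sub-rectangle:

```cpp
#include <algorithm>

// Texture coordinate pair for sampling the shared render target.
struct UV { float u, v; };

// Clamp a (possibly chromatic-aberration-shifted) sample coordinate to the
// half of the shared texture that belongs to this eye, so the blue channel
// can never sample the other eye's pixels.
UV ClampToEyeRect(UV uv, bool rightEye)
{
    // Left eye owns u in [0.0, 0.5]; right eye owns u in [0.5, 1.0].
    const float uMin = rightEye ? 0.5f : 0.0f;
    const float uMax = rightEye ? 1.0f : 0.5f;
    return { std::clamp(uv.u, uMin, uMax), std::clamp(uv.v, 0.0f, 1.0f) };
}
```

Separate eye textures sidestep the problem entirely because out-of-range samples then hit the sampler's own border/clamp behavior instead of the other eye's image.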