

TheEmu
Joined in Jul 2012
7424 post(s)

Discussions for Scenes for Version 1.2.X Fullscreen Mode here

Everything about iStripper
July 18, 2021, 5054 answers
Could lack of a "Swizzle" feature explain why some other members' rigs are not compiling some shaders
where the code "boxes" (so to speak) don't balance

No. The "swizzle" operation is a very basic feature of the GLSL shader language and is supported in hardware by all GPUs (there may be some that are not, but if so the compiler has to generate extra code, making them very slow, just as it would if everything were done in software). The inclusion of "Swizzle" as a texture feature indicates that the hardware can also perform swizzling directly when accessing texture memory, but that is not relevant to the simple case under discussion.
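For context, "swizzling" in GLSL is just selecting, reordering or replicating a vector's components by name. A minimal sketch (the variable names are illustrative only, and the statements belong inside a shader function):

```glsl
vec4 c = vec4(1.0, 2.0, 3.0, 4.0);

vec3 rgb  = c.rgb;  // select:    (1.0, 2.0, 3.0)
vec4 bgra = c.bgra; // reorder:   (3.0, 2.0, 1.0, 4.0)
vec2 xx   = c.xx;   // replicate: (1.0, 1.0)
```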

The problem is that the compilers in some GPU drivers accept as valid statements that the published GLSL standard says are invalid and, instead of issuing an error message and rejecting the program, they try to be helpful by doing something that may or may not be what the programmer intended.

For example, if v2 and v3 are of type vec2 and vec3 respectively the statement

v2 = v3;

which is short for

v2.xy = v3.xyz;

should be rejected because you cannot put three values into something that can only hold two. But some of the compilers treat the above as if it were

v2.xy = v3.xy;

Similarly, they accept v3 = v2, i.e.

v3.xyz = v2.xy;

as valid and just partially update the value of v3, leaving v3.z unchanged, even though the language standard says otherwise.
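Written out in full, the standard-conforming versions of both assignments make the intent explicit. A sketch (which components to keep, drop or supply is, of course, up to the programmer):

```glsl
vec3 v3 = vec3(1.0, 2.0, 3.0);
vec2 v2 = vec2(0.0);

// Instead of the invalid v2 = v3, state which two components you want:
v2 = v3.xy;

// Instead of the invalid v3 = v2, say what happens to the third component:
v3 = vec3(v2, 0.0);  // supply a new .z explicitly, or
v3.xy = v2;          // update .xy only, deliberately leaving v3.z alone
```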

This behaviour is never useful in new code, as it hides bugs. All it ever does is save a few characters in a statement at the expense of making its meaning obscure. The only excuse for it is that the earliest compilers did it, so the behaviour is kept so that old programs keep working - but in that case it would be far better to at least issue a warning rather than silently accept what the language standard defines to be invalid code.
EverthangForever
Joined in Oct 2009
4470 post(s)

Some GPUs will ignore this error and just swizzle the vectors to match
@WA thank you so much for this explanation (above)

I noticed that if I delete my vghd.log, then quit iStripper and re-open it
to play one scene file, the newly generated vghd.log provides
a summary of fullscreen OpenGL dumps showing data specific to my graphics card,
including:

OPENGL GL_VENDOR: NVIDIA Corporation]
OPENGL GL_RENDERER: GeForce GT 730/PCIe/SSE2]
OPENGL GL_VERSION: 4.6.0 NVIDIA 462.30]
OPENGL GL_SHADING_LANGUAGE_VERSION: 4.60 NVIDIA]
OPENGL version: "4.6"]

OPENGL features: ("Multitexture", "Shaders", "Buffers", "Framebuffers", "BlendColor", "BlendEquation", "BlendEquationSeparate", "BlendFuncSeparate", "BlendSubtract", "CompressedTextures", "Multisample", "StencilSeparate", "NPOTTextures", "NPOTTextureRepeat", "FixedFunctionPipeline")]

OPENGL texture features: ("ImmutableStorage", "ImmutableMultisampleStorage", "TextureRectangle", "TextureArrays", "Texture3D", "TextureMultisample", "TextureBuffer", "TextureCubeMapArrays", "Swizzle", "StencilTexturing", "AnisotropicFiltering", "NPOTTextures", "NPOTTextureRepeat", "Texture1D")]

OPENGL texture units: 32]
OPENGL texture max size: 16384]

I noticed that the 'texture features' list includes "Swizzle"

Could lack of a "Swizzle" feature explain why some other members' rigs are not compiling some shaders
where the code "boxes" (so to speak) don't balance... yet the shader on my graphics card still compiles OK
without reporting an 'ERROR'?