blender - How does OpenGL texturing work?
I have been playing with OpenGL for some time and found texturing interesting. With a cube face selected, I can project that face onto a 2D image and map the colors in the image onto the 3D model. How does this work? What algorithm is involved? In software like Blender you can live-edit vertices in the 2D projection and the edits get automatically mapped onto the 3D model. There are options such as Unwrap, Cube Projection, Cylinder Projection, etc.
I am not sure if this is the right place to ask such a question. I am asking out of curiosity.
Texturing is the mapping of texels (a texture's pixels) onto fragments (the pixels that make up the image of the rendered geometry on screen). Which texel is mapped onto a given fragment depends on the fragment's texture coordinates. In the simplest case the coordinates are associated with the geometry on a per-vertex basis and are interpolated across the fragments after the geometry is projected onto the screen and rasterized.

The coordinates are normalized, meaning they have values between 0 and 1. A 2D texture has two coordinates, u and v: one is aligned with the horizontal axis of the texture image, the other with the vertical axis. When the texture is sampled in the fragment shader via a texture sampler at given coordinates, the sampler returns a color value interpolated from a number of texels chosen according to the coordinates, the mipmap level, and the interpolation method.

How the coordinates are applied can vary depending on the type of texturing required. The interpolation (filtering) method changes the final image quality, and mipmapping is used to change the texture's level of detail depending on view distance.
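To make the core idea concrete, here is a minimal CPU-side sketch, not actual OpenGL or Blender code, with all names invented for illustration. It interpolates per-vertex uv coordinates across one screen-space triangle using barycentric weights (the role the rasterizer plays) and looks up a texel with nearest-neighbour sampling. A real GPU additionally performs perspective-correct interpolation, mipmap selection, and bilinear or trilinear filtering inside the sampler.

```cpp
// Sketch of the texturing idea: interpolate per-vertex uvs per pixel, then
// fetch a texel. Build with: g++ -std=c++17 uv_sketch.cpp -o uv_sketch
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// A tiny RGB texture held in CPU memory (a GPU keeps this in video memory).
struct Texture {
    int w, h;
    std::vector<std::uint8_t> rgb;  // w * h * 3 bytes, row-major

    // Nearest-neighbour sampling: clamp normalized (u, v) to [0, 1],
    // scale to texel indices, and return that single texel's color.
    void sampleNearest(float u, float v, std::uint8_t out[3]) const {
        u = std::clamp(u, 0.0f, 1.0f);
        v = std::clamp(v, 0.0f, 1.0f);
        int x = std::min(int(u * w), w - 1);
        int y = std::min(int(v * h), h - 1);
        const std::uint8_t* t = &rgb[(y * w + x) * 3];
        out[0] = t[0]; out[1] = t[1]; out[2] = t[2];
    }
};

// Barycentric weights of point p with respect to triangle (a, b, c).
// The rasterizer uses weights like these to interpolate every per-vertex
// attribute (here: the uv coordinates) across the triangle's pixels.
static bool barycentric(Vec2 p, Vec2 a, Vec2 b, Vec2 c,
                        float& wa, float& wb, float& wc) {
    float area = (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    if (area == 0.0f) return false;                 // degenerate triangle
    wa = ((b.x - p.x) * (c.y - p.y) - (b.y - p.y) * (c.x - p.x)) / area;
    wb = ((c.x - p.x) * (a.y - p.y) - (c.y - p.y) * (a.x - p.x)) / area;
    wc = 1.0f - wa - wb;
    return wa >= 0 && wb >= 0 && wc >= 0;           // p is inside the triangle
}

int main() {
    // 2x2 black/white checkerboard texture.
    Texture tex{2, 2, {  0,   0,   0,  255, 255, 255,
                       255, 255, 255,    0,   0,   0}};

    // One screen-space triangle with a uv coordinate per vertex, e.g. as
    // produced by an unwrap / cube-projection step in a tool like Blender.
    Vec2 pos[3] = {{1, 1}, {14, 2}, {7, 13}};   // pixel positions
    Vec2 uv [3] = {{0, 0}, {1, 0},  {0.5f, 1}}; // normalized texture coords

    const int W = 16, H = 16;                   // tiny "framebuffer"
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            float wa, wb, wc;
            Vec2 p{x + 0.5f, y + 0.5f};         // sample at the pixel center
            if (!barycentric(p, pos[0], pos[1], pos[2], wa, wb, wc)) {
                std::putchar('.');              // pixel not covered
                continue;
            }
            // Interpolate the per-vertex uvs to this fragment, then sample.
            float u = wa * uv[0].x + wb * uv[1].x + wc * uv[2].x;
            float v = wa * uv[0].y + wb * uv[1].y + wc * uv[2].y;
            std::uint8_t c[3];
            tex.sampleNearest(u, v, c);
            std::putchar(c[0] > 127 ? '#' : 'o');
        }
        std::putchar('\n');
    }
    return 0;
}
```

The program prints the triangle as ASCII art, with '#' and 'o' marking which checkerboard texel each covered pixel landed on. Swapping sampleNearest for a bilinear version (blending the four surrounding texels) is what the interpolation-method setting changes, and choosing a smaller prefiltered copy of the texture for distant pixels is what mipmapping adds.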
Some links with more information:
https://www.cs.utexas.edu/~fussell/courses/cs384g/lectures/lecture12-texture_mapping.pdf
http://cg.informatik.uni-freiburg.de/course_notes/graphics_06_texturing.pdf
http://www.cs.cmu.edu/~djames/15-462/fall03/notes/09-texture.pdf