iOS - OpenGL ES 2.0 textures for retina display?


I have a GLKView and am trying to draw a couple of cubes: I create textures from views and map them onto the cubes. However, when I start the app on a retina device, the textures are correctly sized but look terrible. I have tried setting the contentScaleFactor of the GLKView to the scale of the main screen - to no avail. I have also tried multiplying the buffer's dimensions by the scale, which resulted in textures that looked crisp, but were 1/4 of their original size...

Without further ado, here is what I have done (without the multiplication indicated above):

GLKView setup:

- (void)setupGL {
    UIScreen *mainScreen = [UIScreen mainScreen];
    const CGFloat scale = mainScreen.scale;
    self.contentScaleFactor = scale;
    self.layer.contentsScale = scale;

    glGenFramebuffers(1, &defaultFramebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);

    glGenRenderbuffers(1, &depthBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16,
                          self.bounds.size.width, self.bounds.size.height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depthBuffer);

    glGenRenderbuffers(1, &colorBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4,
                          self.bounds.size.width, self.bounds.size.height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, colorBuffer);

    glEnable(GL_DEPTH_TEST);
}

And here is how I load the textures:

// Make space for an RGBA image of the view
GLubyte *pixelBuffer = (GLubyte *)malloc(
                                         4 *
                                         cv.bounds.size.width *
                                         cv.bounds.size.height);

// Create a suitable CoreGraphics context
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixelBuffer,
                      cv.bounds.size.width, cv.bounds.size.height,
                      8, 4 * cv.bounds.size.width,
                      colourSpace,
                      kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colourSpace);

// Draw the view into the buffer
[cv.layer renderInContext:context];

// Upload to OpenGL
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGBA,
             cv.bounds.size.width, cv.bounds.size.height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);

glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

The answer to this question can be found here:

How to create a CGBitmapContext which works for Retina display and not wasting space for regular display?

What I did was multiply the texture and buffer dimensions by the screen's scale factor, and because that yielded a texture 1/4 of the original size, I had to scale the context by the scale factor as well:

CGContextScaleCTM(context, scaleFactor, scaleFactor);

