View Full Version : Surface::CopyFromMemory does not work on OpenGL ES in iOS platforms

02-29-2012, 12:23 PM
Hi all.

For my application I need to take screenshots to make textures.
I am developing with Microsoft Visual Studio and I used CopyFromScreen from the Surface class.
It works fine.
But when I compiled it on a Mac and ran it in the iPad simulator, the textures that work fine on Windows are all black on iOS.
So I searched the iOS developer forums and found the following code:

- (UIImage*)snapshot
{
    GLint l_backingWidth, l_backingHeight;

    // Bind the color renderbuffer used to render the OpenGL ES view
    // If your application only creates a single color renderbuffer which is already bound at this point,
    // this call is redundant, but it is needed if you're dealing with multiple renderbuffers.
    // Note, replace "viewRenderbuffer" with the actual name of the renderbuffer object defined in your class.
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);

    // Get the size of the backing CAEAGLLayer
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &l_backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &l_backingHeight);

    NSInteger x = 0, y = 0, width = l_backingWidth, height = l_backingHeight;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    // Create a CGImage with the pixel data
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel
    // otherwise, use kCGImageAlphaPremultipliedLast
    CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace,
                                    kCGBitmapByteOrder32Big | kCGImageAlphaNoneSkipLast,
                                    ref, NULL, true, kCGRenderingIntentDefault);

    // OpenGL ES measures data in PIXELS
    // Create a graphics context with the target size measured in POINTS
    NSInteger widthInPoints, heightInPoints;
    if (NULL != UIGraphicsBeginImageContextWithOptions)
    {
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
        // Set the scale parameter to your OpenGL ES view's contentScaleFactor
        // so that you get a high-resolution snapshot when its value is greater than 1.0
        CGFloat scale = self.contentScaleFactor;
        widthInPoints = width / scale;
        heightInPoints = height / scale;
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
    }
    else
    {
        // On iOS prior to 4, fall back to UIGraphicsBeginImageContext
        widthInPoints = width;
        heightInPoints = height;
        UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
    }

    CGContextRef cgcontext = UIGraphicsGetCurrentContext();

    // UIKit coordinate system is upside down to GL/Quartz coordinate system
    // Flip the CGImage by rendering it to the flipped bitmap context
    // The size of the destination area is measured in POINTS
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

    // Retrieve the UIImage from the current context
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Clean up
    free(data);
    CFRelease(ref);
    CFRelease(colorspace);
    CGImageRelease(iref);

    return image;
}

I put this code in the EAGLView.mm file, and this declaration

- (UIImage*)snapshot;

in the EAGLView.h file.
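As the comments in the snapshot method note, glReadPixels returns rows bottom-to-top while UIKit draws top-to-bottom; the CGContextDrawImage call takes care of the flip. If you ever need the raw pixel buffer upright without going through Core Graphics, a plain row flip is enough. This is just a sketch; FlipRowsInPlace is a hypothetical helper, not part of the code above, and it assumes a tightly packed RGBA buffer:

```cpp
#include <cstddef>
#include <cstring>
#include <vector>

// Flip a tightly packed RGBA pixel buffer vertically, in place.
// width and height are in pixels; each pixel is 4 bytes.
void FlipRowsInPlace(unsigned char* pixels, int width, int height)
{
    const std::size_t rowBytes = static_cast<std::size_t>(width) * 4;
    std::vector<unsigned char> tmp(rowBytes); // scratch space for one row

    // Swap rows from the outside in until the pointers meet in the middle.
    for (int top = 0, bottom = height - 1; top < bottom; ++top, --bottom)
    {
        std::memcpy(tmp.data(), pixels + top * rowBytes, rowBytes);
        std::memcpy(pixels + top * rowBytes, pixels + bottom * rowBytes, rowBytes);
        std::memcpy(pixels + bottom * rowBytes, tmp.data(), rowBytes);
    }
}
```

You would call it on the `data` buffer right after glReadPixels, before handing the pixels to anything that expects a top-down image.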

Then I added this code

void* TakeScreenShot()
{
    MyAppDelegate *appDelegate = (MyAppDelegate *)[[UIApplication sharedApplication] delegate];

    UIImage *pImage = [appDelegate.glView snapshot];

    NSData *data = UIImageJPEGRepresentation(pImage, 1.0f);

    if (data == nil)
        return NULL;

    // Note: the caller owns this buffer and must free() it
    unsigned int l = data.length;
    byte* pData = (byte*)malloc(l * sizeof(byte));

    memcpy(pData, data.bytes, l * sizeof(byte));
    return pData;
}

to iOSUtils.mm file.

And last, I added this declaration

void* TakeScreenShot();

to the PlatformsEnum.h file.
Now I can take a screenshot and load a SoftSurface from its raw bytes like this:

byte* ppito = (byte*)TakeScreenShot();
SoftSurface surf;
surf.LoadFileFromMemory(ppito, SoftSurface::COLOR_KEY_NONE);
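One thing to watch here: TakeScreenShot() returns a malloc'd buffer that the caller owns, so it should be freed once the SoftSurface has consumed the data. A small RAII wrapper avoids the leak. This is only a sketch: GrabScreenShot and ScreenShotBytes are names I made up, and the TakeScreenShot body below is a stub standing in for the real Objective-C one:

```cpp
#include <cstdlib>
#include <cstring>
#include <memory>

// Stub standing in for the real TakeScreenShot() from iOSUtils.mm:
// returns a malloc'd byte buffer (here filled with dummy bytes).
void* TakeScreenShot()
{
    void* p = std::malloc(16);
    if (p) std::memset(p, 0x7F, 16);
    return p;
}

// Deleter so unique_ptr releases the buffer with free(), matching malloc().
struct FreeDeleter { void operator()(void* p) const { std::free(p); } };
using ScreenShotBytes = std::unique_ptr<unsigned char, FreeDeleter>;

// Wrap the raw pointer so the buffer is freed automatically at scope exit.
ScreenShotBytes GrabScreenShot()
{
    return ScreenShotBytes(static_cast<unsigned char*>(TakeScreenShot()));
}
```

With that in place the caller becomes `ScreenShotBytes shot = GrabScreenShot();` followed by `surf.LoadFileFromMemory(shot.get(), SoftSurface::COLOR_KEY_NONE);`, and the buffer is released when `shot` goes out of scope.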

The only drawback, if you can call it that, is that this is only for iOS 5.0, because it is based on UI toolkit changes introduced in that version of iOS.

If this is totally wrong, I apologize.
But if it isn't, someone else may find it useful.


03-01-2012, 09:36 AM
Thanks for posting code, it may help someone.

I am developing with Microsoft Visual Studio and I used CopyFromMemory from the Surface class.

Hmm... I don't see that... you mean Surface::CopyFromScreen()? (It grabs whatever is on the primary GLES rendering surface and creates a texture out of it, binding to that surface.)

Strange, I am currently using that in my application Dink Smallwood on iPad and iPhone and haven't had any problems. Maybe we use it differently, or it was recently broken; I haven't built Dink for a month or more.

03-01-2012, 09:43 AM
Maybe OpenGL ES behavior has changed as of iOS 5.0.
I have read that with the iOS 5 update Apple changed UIKit in many ways: alert message handling, snapshotting GLES surfaces, and other things that I don't remember right now :(

Yes, CopyFromScreen(). Not CopyFromMemory :sweatdrop:

And yes, I promise you that I have used CopyFromScreen() in my application, and when I run it on my iPad I get black textures.
My iPad is on iOS 5, so I searched and found that source code on the developer forums.