2012-03-11
5

I am currently working on a drawing application based on GLPaint. Saving the current screen has become a tedious pain for me. I have a ViewController, and on top of it I have loaded my UIImageView and a UIView (the PaintingView). It now looks as if I am drawing on top of the UIImageView.

GLPaint save functionality (save current screen together with the background image)

I have already managed to capture my current drawing with the help of this question: GLPaint save image. When I try to capture my current drawing, I do get the drawing, but on a black screen. What I want is the drawing on top of my background image (the UIImageView). Should I overlay the UIView with the UIImageView?

Answers

2

You should load the image with OpenGL rather than with UIKit (i.e., not with a UIImageView). Otherwise you can only capture the OpenGL view as one image and the UIKit views as a separate image.

To do this, you have to render the image into a texture in the PaintingView class provided in the GLPaint sample, and then load it by drawing a textured quad over the drawing view.
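A minimal sketch of that texture upload, using Core Graphics to decode the UIImage into RGBA bytes first. The method name loadBackgroundTexture: is illustrative, not part of the GLPaint sample:

-(GLuint)loadBackgroundTexture:(UIImage *)image { 
    CGImageRef cgImage = image.CGImage; 
    size_t width  = CGImageGetWidth(cgImage); 
    size_t height = CGImageGetHeight(cgImage); 

    // Draw the image into a plain byte buffer so glTexImage2D can read it. 
    GLubyte *pixels = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte)); 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef context = CGBitmapContextCreate(pixels, width, height, 8, 
                                                 width * 4, colorSpace, 
                                                 (CGBitmapInfo)kCGImageAlphaPremultipliedLast); 
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage); 
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace); 

    // Upload the pixels as an OpenGL ES texture. 
    GLuint textureID; 
    glGenTextures(1, &textureID); 
    glBindTexture(GL_TEXTURE_2D, textureID); 
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 
                 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels); 
    free(pixels); 
    return textureID; 
} 

You would then bind this texture and draw a full-screen quad before rendering the brush strokes, so the background ends up in the same framebuffer that glReadPixels later reads.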

0

I used this code to grab my image from OpenGL:

-(UIImage*)mergeImage:(UIImage*)image1 withImage:(UIImage*)image2{ 

    CGSize size = image1.size; 

    UIGraphicsBeginImageContextWithOptions(size, NO, 0); 

    [image1 drawAtPoint:CGPointMake(0.0f, 0.0f)]; 
    [image2 drawAtPoint:CGPointMake(0.0f, 0.0f)]; 

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext(); 
    UIGraphicsEndImageContext(); 

    return result; 
} 

It goes like this:

finalImage = [self glToUIImage];

-(BOOL)iPhoneRetina{ 
    return ([[UIScreen mainScreen] respondsToSelector:@selector(displayLinkWithTarget:selector:)] && ([UIScreen mainScreen].scale == 2.0))?YES:NO; 
} 

void releasePixels(void *info, const void *data, size_t size) { 
    free((void*)data); 
} 

-(UIImage *) glToUIImage{ 

    int imageWidth, imageHeight; 

    int scale = [self iPhoneRetina]?2:1; 

    imageWidth = self.frame.size.width*scale; 
    imageHeight = self.frame.size.height*scale; 

    NSInteger myDataLength = imageWidth * imageHeight * 4; 

    // allocate array and read pixels into it. 
    GLubyte *buffer = (GLubyte *) malloc(myDataLength); 
    glReadPixels(0, 0, imageWidth, imageHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer); 

    // make data provider with data. 
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, releasePixels); 

    // prep the ingredients 
    int bitsPerComponent = 8; 
    int bitsPerPixel = 32; 
    int bytesPerRow = 4 * imageWidth; 
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB(); 
    CGBitmapInfo bitmapInfo = kCGImageAlphaPremultipliedLast; 
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault; 

    // make the cgimage 

    CGImageRef imageRef = CGImageCreate(imageWidth, imageHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent); 

    UIImage *myImage = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationDownMirrored]; //Render image flipped, since OpenGL's data is mirrored 

    CGImageRelease(imageRef); 
    CGColorSpaceRelease(colorSpaceRef); 

    CGDataProviderRelease(provider); 

    return myImage; 
} 
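One caveat with glToUIImage: glReadPixels reads from the currently bound framebuffer, so if another FBO is bound when you call it you get garbage or black. A hedged sketch; viewFramebuffer is the name the GLPaint sample's PaintingView uses for its framebuffer:

// Make sure the painting view's framebuffer is the read target 
// before grabbing the pixels. 
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer); 
UIImage *glImage = [self glToUIImage]; 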

And this is then merged with the background image: [self mergeImage:BackgroundImage withImage:[self glToUIImage]];
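Putting the pieces together, the whole save step might look like this. Persisting to the photo album is my addition (not shown in the answer above); BackgroundImage and the two methods refer to the code above:

// Capture the OpenGL drawing, composite it over the background, 
// and save the result to the photo album. 
UIImage *glImage    = [self glToUIImage]; 
UIImage *finalImage = [self mergeImage:BackgroundImage withImage:glImage]; 
UIImageWriteToSavedPhotosAlbum(finalImage, nil, NULL, NULL); 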