Background

I have recently been learning image-processing techniques. As an iOS developer I find OpenCV's default UI hard to live with, so I decided to use Flutter as the UI framework and wire it up to the native image-processing code. So far the app skeleton and the adjustment features for the H, S, and L channels are done.

Hooking Flutter up to OpenGL Offscreen Rendering (macOS)

Topic

This post is about combining Flutter with OpenGL offscreen rendering on macOS, so that images processed in native code can be displayed on a Flutter page. It mainly covers:

  • How to do OpenGL offscreen rendering on macOS
  • How to hand the offscreen rendering result over to Flutter

How to do OpenGL offscreen rendering on macOS

Configuring the OpenGL context

On macOS, NSOpenGLContext is used to configure and activate OpenGL:

static NSOpenGLPixelFormatAttribute kDefaultAttributes[] = {
    NSOpenGLPFADoubleBuffer,                                    // double buffering
    NSOpenGLPFADepthSize, 24,                                   // depth buffer bit depth
    NSOpenGLPFAStencilSize, 8,                                  // stencil buffer bit depth
    NSOpenGLPFAMultisample,                                     // multisampling
    NSOpenGLPFASampleBuffers, (NSOpenGLPixelFormatAttribute)1,  // number of multisample buffers
    NSOpenGLPFASamples, (NSOpenGLPixelFormatAttribute)4,        // samples per pixel
    NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,    // OpenGL 3.2 Core Profile
    0};
NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:kDefaultAttributes];
_openglContext = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];
[_openglContext makeCurrentContext];

Setting up the FBO

Because we render offscreen, we have to create our own FBO as the render target and use a texture as its color attachment:

// With a 3.2 Core Profile context, use the core FBO entry points (not the EXT variants)
glGenFramebuffers(1, &framebuffer);
glGenTextures(1, &fboTexture);
int TEXWIDE = textureInfo.width;
int TEXHIGH = textureInfo.height;
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glBindTexture(GL_TEXTURE_2D, fboTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
// Allocate storage for the color texture; its contents are produced by rendering
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, TEXWIDE, TEXHIGH, 0,
                GL_BGRA, GL_UNSIGNED_BYTE, NULL);
// Attach the texture as the FBO's color attachment
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                GL_TEXTURE_2D, fboTexture, 0);
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
NSAssert(status == GL_FRAMEBUFFER_COMPLETE, @"framebuffer is not complete");

Reading back the rendering result

The rendering result is read back with glReadPixels and stored in _tempImageCache:

// Render into the offscreen FBO
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glClearColor(1.0, 0, 0, 1);
glClear(GL_COLOR_BUFFER_BIT);
glViewport(0, 0, textureInfo.width, textureInfo.height);
// Activate the shader program and pass in the current H/S/L adjustments
[_glContext active];
[_glContext setUniform1f:@"hueAdjust" value:self.hueAdjust];
[_glContext setUniform1f:@"saturationAdjust" value:self.saturationAdjust];
[_glContext setUniform1f:@"lightnessAdjust" value:self.lightnessAdjust];
// Draw the image quad, then read the pixels back from the FBO
[_imagePlane draw:_glContext];
glFlush();
glReadPixels(0, 0, textureInfo.width, textureInfo.height, GL_BGRA, GL_UNSIGNED_BYTE, _tempImageCache);

How to hand the offscreen rendering result over to Flutter

The offscreen rendering result is shared with Flutter through Flutter's external texture mechanism.
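
Concretely, the macOS side goes through the FlutterTexture protocol and the FlutterTextureRegistry: the engine pulls frames by calling copyPixelBuffer, and the id returned at registration is what the Dart side hands to the Texture widget. The post does not show this wiring, so the following is only a minimal sketch under assumptions; the HSLTexture class, the helper functions, and where registration happens are all illustrative.

#import <FlutterMacOS/FlutterMacOS.h>

// Hypothetical texture object (name and structure are assumptions, not from the post).
// It adopts FlutterTexture; the engine pulls frames by calling copyPixelBuffer,
// which the next sections flesh out.
@interface HSLTexture : NSObject <FlutterTexture>
@end

@implementation HSLTexture {
    CVPixelBufferRef _pixelBuffer;
}
- (CVPixelBufferRef)copyPixelBuffer {
    // (Re)create and fill _pixelBuffer as shown in the next sections, then
    // hand it over retained; the engine releases the returned reference.
    CVPixelBufferRetain(_pixelBuffer);
    return _pixelBuffer;
}
@end

// Registration, done wherever the FlutterTextureRegistry is available,
// e.g. in a plugin's +registerWithRegistrar:.
static int64_t RegisterHSLTexture(id<FlutterTextureRegistry> textures,
                                  HSLTexture *texture) {
    // The returned id is what the Dart side passes to the Texture widget.
    return [textures registerTexture:texture];
}

// Call this after every offscreen render so Flutter asks for a new frame,
// i.e. calls copyPixelBuffer again.
static void NotifyFrameAvailable(id<FlutterTextureRegistry> textures,
                                 int64_t textureId) {
    [textures textureFrameAvailable:textureId];
}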

Creating a CVPixelBuffer matching the image size

Inside - (CVPixelBufferRef)copyPixelBuffer, if _pixelBuffer has not been created yet or the image size has changed, the CVPixelBuffer is (re)created:

if (!_pixelBuffer
    || CVPixelBufferGetWidth(_pixelBuffer) != imageWidth
    || CVPixelBufferGetHeight(_pixelBuffer) != imageHeight) {
    // Drop the old buffer and CPU-side cache when the image size changes
    if (_pixelBuffer) {
        CVPixelBufferRelease(_pixelBuffer);
    }
    if (_cacheImagePixels) {
        free(_cacheImagePixels);
    }
    _cacheImagePixels = (uint8_t *)malloc(imageWidth * imageHeight * 4);
    memset(_cacheImagePixels, 0xff, imageWidth * imageHeight * 4);
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault, imageWidth, imageHeight, kCVPixelFormatType_32BGRA, (__bridge CFDictionaryRef)pixelAttributes, &_pixelBuffer);
    if (result != kCVReturnSuccess) return nil;
    // Extra retain so the cached buffer survives the engine's release of the returned frame
    CVPixelBufferRetain(_pixelBuffer);
}
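
The snippet above references a pixelAttributes dictionary that the post does not show. A minimal sketch of what it might contain, assuming an IOSurface-backed buffer (commonly recommended so the buffer can be shared with the GPU without an extra copy):

#import <CoreVideo/CoreVideo.h>

// Hypothetical attributes for CVPixelBufferCreate: an empty IOSurface
// properties dictionary asks CoreVideo for an IOSurface-backed buffer.
NSDictionary *pixelAttributes = @{
    (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{},
    (NSString *)kCVPixelBufferCGImageCompatibilityKey : @YES,
    (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES,
};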

Filling in the data

The data returned by glReadPixels is then written into the CVPixelBuffer. One thing to watch out for: CVPixelBuffer rows are padded for byte alignment, so a row's stride can be larger than the actual number of bytes in a row of pixels. The glReadPixels result therefore has to be copied row by row:

CVPixelBufferLockBaseAddress(pixelBuffer, 0);
size_t lines = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer); // padded row stride
int srcBytesPerRow = textureInfo.width * 4;                    // tightly packed row length
uint8_t *addr = CVPixelBufferGetBaseAddress(pixelBuffer);
glReadPixels(0, 0, textureInfo.width, textureInfo.height, GL_BGRA, GL_UNSIGNED_BYTE, _tempImageCache);
// Copy row by row because bytesPerRow may be larger than srcBytesPerRow
for (int line = 0; line < lines; ++line) {
    memcpy(addr + bytesPerRow * line, _tempImageCache + srcBytesPerRow * line, srcBytesPerRow);
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Displaying on the Flutter side

On the Flutter side, all that is needed is the Texture widget wired to the registered texture id; it is the standard external texture flow. A sketch of how the native side gets driven follows.
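
To close the loop, here is a sketch of how an H/S/L slider change could travel from Dart to the native renderer and trigger a new frame. The channel and method names, renderOffscreen, and the textures/textureId members are all hypothetical, not from the post; only the hueAdjust/saturationAdjust/lightnessAdjust properties appear in the rendering code above.

// Hypothetical glue: a method channel through which the Dart sliders push new
// H/S/L values; each update re-renders offscreen and notifies Flutter.
FlutterMethodChannel *channel =
    [FlutterMethodChannel methodChannelWithName:@"hsl_adjust"
                                binaryMessenger:registrar.messenger];
__weak typeof(self) weakSelf = self;
[channel setMethodCallHandler:^(FlutterMethodCall *call, FlutterResult result) {
    if ([call.method isEqualToString:@"setHSL"]) {
        NSDictionary *args = call.arguments;
        weakSelf.hueAdjust = [args[@"h"] floatValue];
        weakSelf.saturationAdjust = [args[@"s"] floatValue];
        weakSelf.lightnessAdjust = [args[@"l"] floatValue];
        [weakSelf renderOffscreen];                                    // the OpenGL pass shown earlier
        [weakSelf.textures textureFrameAvailable:weakSelf.textureId];  // triggers copyPixelBuffer
        result(nil);
    } else {
        result(FlutterMethodNotImplemented);
    }
}];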

Going further

  • Flutter's CVPixelBuffer could be used directly as the FBO's color attachment. I currently go through glReadPixels; the processing speed is barely good enough, so I have not optimized this yet (a rough sketch of the idea follows below).
  • Replace OpenGL with Metal for better efficiency.
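
For the first point, one possible route (a hedged sketch, not what the post implements) is CoreVideo's CVOpenGLTextureCache: it wraps an IOSurface-backed CVPixelBuffer in an OpenGL texture, which can then be attached to the FBO so the GPU renders straight into the buffer that copyPixelBuffer returns, removing the glReadPixels round trip. The CGL objects come from the NSOpenGLContext and NSOpenGLPixelFormat created earlier; _pixelBuffer and framebuffer refer to the variables used above.

#import <CoreVideo/CoreVideo.h>
#import <OpenGL/OpenGL.h>

// Sketch: render directly into the Flutter-facing CVPixelBuffer.
// Assumes _pixelBuffer was created with kCVPixelBufferIOSurfacePropertiesKey.
CVOpenGLTextureCacheRef textureCache = NULL;
CVOpenGLTextureCacheCreate(kCFAllocatorDefault, NULL,
                           _openglContext.CGLContextObj,
                           pixelFormat.CGLPixelFormatObj,
                           NULL, &textureCache);

CVOpenGLTextureRef cvTexture = NULL;
CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
                                           _pixelBuffer, NULL, &cvTexture);

// On macOS the wrapped texture's target is typically GL_TEXTURE_RECTANGLE.
GLenum target = CVOpenGLTextureGetTarget(cvTexture);
GLuint name = CVOpenGLTextureGetName(cvTexture);

// Attach it as the FBO's color attachment and render as before;
// the pixels land in _pixelBuffer without a glReadPixels copy.
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, target, name, 0);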