OpenGL Stickers and Skin-Smoothing Theory
OpenGL Image Blending
Attaching decorations such as ears or a nose at key positions on a face simply means overlaying the decoration onto a region of the original image. Before doing that, we need to enable blending:
// Enable blending
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
Blending mixes two colors: the source color and the destination color. Here the source is the sticker we are about to draw, and the destination is the camera image that has already been drawn.
The source and destination factors are set with glBlendFunc, and many combinations are possible. The combination used here is:
GL_ONE: use 1.0 as the source factor, i.e. take the source color as-is (Android bitmaps are premultiplied by alpha, so the sticker's color already carries its transparency);
GL_ONE_MINUS_SRC_ALPHA: use 1.0 minus the source color's alpha as the destination factor.
The more transparent the decoration, the more the blended color is dominated by the camera image.
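With this factor pair and the default additive blend equation, the color written per pixel works out to:
result.rgb = src.rgb * 1.0 + dst.rgb * (1.0 - src.a)
which is the standard premultiplied-alpha "over" compositing formula.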
Sticker Implementation
Implementing a sticker only requires computing the sticker's coordinates on the canvas and then drawing it into the layer with blending enabled. The coordinate handling can be done inside OpenGL (in the shader), or directly in Java with:
GLES20.glViewport(x, y, width, height);
Sticker Coordinates
The face rectangle and landmark coordinates returned by face detection and landmark detection take the top-left corner of the submitted image as the origin and are expressed in that image's width and height.
The canvas shown on screen does not necessarily have the same width and height as that image, so the drawing region has to be re-mapped to the canvas dimensions. Because the sticker size is derived from the landmark positions, this also produces the near-big-far-small effect: the sticker grows as the face moves closer to the camera.
Always keep OpenGL's coordinate system in mind: its origin is the bottom-left corner, so the y axis must be flipped relative to image coordinates.
Two ways to do the mapping:
in Java, relative to the canvas coordinates (e.g. 720x1180), as in the sketch below;
in the shader on the OpenGL side, working in normalized (world) coordinates, which is relatively more efficient.
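A minimal Java sketch of that image-to-canvas mapping, matching what StickFilter does below (the class, method, and parameter names here are illustrative, not from the original project):

// Illustrative helper: maps a landmark from image space (origin at the top-left of
// the detected image) to canvas space (origin at the bottom-left, as glViewport expects).
final class StickerCoords {
    static float[] imageToCanvas(float imgX, float imgY, float imgWidth, float imgHeight,
                                 int canvasWidth, int canvasHeight) {
        float x = imgX / imgWidth * canvasWidth;            // scale x to the canvas
        float y = (1.0f - imgY / imgHeight) * canvasHeight; // scale and flip y
        return new float[]{x, y};
    }
}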
public class StickFilter extends AbstractFrameFilter {
    private Bitmap bizi;
    private int[] textures;

    public StickFilter(Context context) {
        super(context, R.raw.base_vert, R.raw.base_frag);
        textures = new int[1];
        OpenGLUtils.glGenTextures(textures);
        // Upload the sticker bitmap into the newly created texture
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
        //....
        bizi = BitmapFactory.decodeResource(context.getResources(), R.drawable.bizi);
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bizi, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
    }
    @Override
    public int onDraw(int texture, FilterChain filterChain) {
        return super.onDraw(texture, filterChain);
    }
    @Override
    public void afterDraw(FilterContext filterContext) {
        super.afterDraw(filterContext);
        // Draw the nose sticker
        Face face = filterContext.face;
        if (face == null) {
            return;
        }
        // Enable blending
        GLES20.glEnable(GLES20.GL_BLEND);
        GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
        // Compute coordinates
        // x of the nose center, mapped onto the canvas
        float x = face.nose_x / face.imgWidth * filterContext.width;
        float y = (1.0f - face.nose_y / face.imgHeight) * filterContext.height;
        // Width and height of the nose sticker
        // Width: the x distance between the left and right mouth corners
        float mrx = face.mouseRight_x / face.imgWidth * filterContext.width;
        float mlx = face.mouseLeft_x / face.imgWidth * filterContext.width;
        int width = (int) (mrx - mlx);
        // Height: the difference between the mouth-corner y and the nose-center y
        float mry = (1.0f - face.mouseRight_y / face.imgHeight) * filterContext.height;
        int height = (int) (y - mry);
        GLES20.glViewport((int) x - width / 2, (int) y - height / 2, width, height);
        // Draw the nose
        GLES20.glUseProgram(program);
        vertexBuffer.position(0);
        GLES20.glVertexAttribPointer(vPosition, 2, GLES20.GL_FLOAT, false, 0, vertexBuffer);
        GLES20.glEnableVertexAttribArray(vPosition);
        textureBuffer.position(0);
        GLES20.glVertexAttribPointer(vCoord, 2, GLES20.GL_FLOAT, false, 0, textureBuffer);
        GLES20.glEnableVertexAttribArray(vCoord);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
        GLES20.glUniform1i(vTexture, 0);
        // Issue the draw call
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
        // Disable blending
        GLES20.glDisable(GLES20.GL_BLEND);
    }
}
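Note the trick here: glViewport restricts rendering to the sticker's rectangle, so the same full-screen quad (vertexBuffer / textureBuffer) used by the other filters stretches the sticker bitmap exactly over that rectangle, and no per-sticker vertex data is needed. The full-frame viewport then has to be restored before the next full-frame pass (presumably handled by the next filter's own setup).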
Beauty: Skin Smoothing
Similar to a Gaussian blur, what we use here is a mean (box) blur: sample the surrounding points and average them.
The beauty effect is a composite of several passes; let's look at the shaders involved.
Blur pass: beauty_blur.frag
It simply averages samples along the horizontal and vertical directions (one direction per pass).
precision mediump float;
uniform sampler2D vTexture;
varying vec2 aCoord;
// Texel offsets (1/width, 1/height); one of them is 0 depending on the pass direction
uniform float texelWidthOffset;
uniform float texelHeightOffset;
vec4 blurCoord[5];
void main(){
    // 1. Blur pass
    // Step offset, e.g. (0.0, 1.0/height) for the vertical pass
    vec2 singleStepOffset = vec2(texelWidthOffset, texelHeightOffset);
    blurCoord[0] = vec4(aCoord - singleStepOffset, aCoord + singleStepOffset);
    blurCoord[1] = vec4(aCoord - 2.0 * singleStepOffset, aCoord + 2.0 * singleStepOffset);
    blurCoord[2] = vec4(aCoord - 3.0 * singleStepOffset, aCoord + 3.0 * singleStepOffset);
    blurCoord[3] = vec4(aCoord - 4.0 * singleStepOffset, aCoord + 4.0 * singleStepOffset);
    blurCoord[4] = vec4(aCoord - 5.0 * singleStepOffset, aCoord + 5.0 * singleStepOffset);
    // Color at the current coordinate
    vec4 currentColor = texture2D(vTexture, aCoord);
    vec3 sum = currentColor.rgb;
    // Accumulate the colors at the offset coordinates
    for (int i = 0; i < 5; i++) {
        sum += texture2D(vTexture, blurCoord[i].xy).rgb;
        sum += texture2D(vTexture, blurCoord[i].zw).rgb;
    }
    // Average of the 11 samples gives the blur
    vec4 blur = vec4(sum / 11.0, currentColor.a);
    gl_FragColor = blur;
}
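Running this 1-D kernel twice (once vertically, once horizontally, as BeautyFilter does below) is the usual separable approximation: two passes cost 2 × 11 = 22 texture reads per pixel, while an equivalent single-pass 2-D kernel would need 11 × 11 = 121 reads.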
BeautyblurFilter
public class BeautyblurFilter extends AbstractFrameFilter {
    private int texelWidthOffset;
    private int texelHeightOffset;
    private float mTexelWidth;
    private float mTexelHeight;

    public BeautyblurFilter(Context context) {
        super(context, R.raw.base_vert, R.raw.beauty_blur);
    }

    @Override
    public void initGL(Context context, int vertexShaderId, int fragmentShaderId) {
        super.initGL(context, vertexShaderId, fragmentShaderId);
        texelWidthOffset = GLES20.glGetUniformLocation(program, "texelWidthOffset");
        texelHeightOffset = GLES20.glGetUniformLocation(program, "texelHeightOffset");
    }

    @Override
    public void beforeDraw(FilterContext filterContext) {
        super.beforeDraw(filterContext);
        GLES20.glUniform1f(texelWidthOffset, mTexelWidth);
        GLES20.glUniform1f(texelHeightOffset, mTexelHeight);
    }

    /**
     * Set the blur size in pixels; stored internally as texel offsets (1/width, 1/height).
     */
    public void setTexelOffsetSize(float width, float height) {
        mTexelWidth = width;
        mTexelHeight = height;
        if (mTexelWidth != 0) {
            mTexelWidth = 1.0f / mTexelWidth;
        } else {
            mTexelWidth = 0;
        }
        if (mTexelHeight != 0) {
            mTexelHeight = 1.0f / mTexelHeight;
        } else {
            mTexelHeight = 0;
        }
    }
}
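For example, setTexelOffsetSize(0, 1080) on a 1080-pixel-tall canvas stores (0, 1/1080 ≈ 0.000926), so each blur step in the shader moves exactly one texel vertically; passing 0 for one axis disables blurring along that axis, which is how the two directional passes are selected.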
High-Pass Retention (Edge Sharpening)
beauty_highpass.frag
precision mediump float;
uniform sampler2D vTexture;
varying vec2 aCoord;
uniform sampler2D vBlurTexture;
void main(){
    // 2. High-pass retention (like Photoshop's High Pass) for edge sharpening
    vec4 currentColor = texture2D(vTexture, aCoord);
    vec4 blurColor = texture2D(vBlurTexture, aCoord);
    // High-pass = original - blurred
    vec4 highPassColor = currentColor - blurColor;
    // clamp constrains the value to the [0.0, 1.0] range
    float intensity = 24.0; // strong-light intensity
    // Strong-light blend of the high-pass with itself: color = 2 * color1 * color2
    highPassColor.r = clamp(2.0 * highPassColor.r * highPassColor.r * intensity, 0.0, 1.0);
    highPassColor.g = clamp(2.0 * highPassColor.g * highPassColor.g * intensity, 0.0, 1.0);
    highPassColor.b = clamp(2.0 * highPassColor.b * highPassColor.b * intensity, 0.0, 1.0);
    vec4 highPassBlur = vec4(highPassColor.rgb, 1.0);
    gl_FragColor = highPassBlur;
}
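Why this amplifies edges: on smooth skin the original and the blur are almost identical, so the high-pass difference is near zero; for a difference of 0.02 the result is 2 × 0.02² × 24 ≈ 0.019. At an edge the difference is large; 0.2 gives 2 × 0.2² × 24 = 1.92, which clamps to 1.0. The output is therefore close to black on plain skin and bright along edges and fine details.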
BeautyHighpassFilter
public class BeautyHighpassFilter extends AbstractFrameFilter {
    private int vBlurTexture;
    private int blurTexture;

    public BeautyHighpassFilter(Context context) {
        super(context, R.raw.base_vert, R.raw.beauty_highpass);
    }

    @Override
    public void initGL(Context context, int vertexShaderId, int fragmentShaderId) {
        super.initGL(context, vertexShaderId, fragmentShaderId);
        vBlurTexture = GLES20.glGetUniformLocation(program, "vBlurTexture");
    }

    @Override
    public void beforeDraw(FilterContext filterContext) {
        super.beforeDraw(filterContext);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, blurTexture);
        GLES20.glUniform1i(vBlurTexture, 1);
    }

    public void setBlurTexture(int blurTexture) {
        this.blurTexture = blurTexture;
    }
}
Edge-Preserving Pre-Processing (so the facial contours are not smoothed away)
beauty_highpass_blur.frag
precision mediump float;
uniform sampler2D vTexture; // the high-pass (high-frequency) texture from the previous pass
varying vec2 aCoord;
uniform int width;
uniform int height;
vec4 blurCoord[2];
void main(){
    // 3. Edge-preserving pre-processing: blur the high-pass map so edge details are kept later
    vec4 currentColor = texture2D(vTexture, aCoord);
    // One-texel step: width/height are pixel dimensions, so the offset is their reciprocal
    vec2 singleStepOffset = vec2(1.0 / float(width), 1.0 / float(height));
    blurCoord[0] = vec4(aCoord - singleStepOffset, aCoord + singleStepOffset);
    blurCoord[1] = vec4(aCoord - 2.0 * singleStepOffset, aCoord + 2.0 * singleStepOffset);
    vec3 sum = currentColor.rgb;
    for (int i = 0; i < 2; i++) {
        sum += texture2D(vTexture, blurCoord[i].xy).rgb;
        sum += texture2D(vTexture, blurCoord[i].zw).rgb;
    }
    // Average of the 5 samples
    vec4 highPassBlur = vec4(sum / 5.0, currentColor.a);
    gl_FragColor = highPassBlur;
}
BeautyHighpassBlurFilter
public class BeautyHighpassBlurFilter extends AbstractFrameFilter {
    private int widthIndex;
    private int heightIndex;

    public BeautyHighpassBlurFilter(Context context) {
        super(context, R.raw.base_vert, R.raw.beauty_highpass_blur);
    }

    @Override
    public void initGL(Context context, int vertexShaderId, int fragmentShaderId) {
        super.initGL(context, vertexShaderId, fragmentShaderId);
        widthIndex = GLES20.glGetUniformLocation(program, "width");
        heightIndex = GLES20.glGetUniformLocation(program, "height");
    }

    @Override
    public void beforeDraw(FilterContext filterContext) {
        super.beforeDraw(filterContext);
        GLES20.glUniform1i(widthIndex, filterContext.width);
        GLES20.glUniform1i(heightIndex, filterContext.height);
    }
}
Skin Smoothing
This step simply blends colors: the original and its blur are mixed according to a per-pixel intensity.
beauty_adjust.frag
precision mediump float;
uniform sampler2D vTexture;             // original image
varying vec2 aCoord;
uniform sampler2D blurTexture;          // blurred original
uniform sampler2D highpassBlurTexture;  // blurred high-pass map
// Smoothing level, 0.0 - 1.0
uniform float level;
void main(){
    // 4. Skin smoothing
    vec4 currentColor = texture2D(vTexture, aCoord);
    vec4 blurColor = texture2D(blurTexture, aCoord);
    vec4 highpassBlurColor = texture2D(highpassBlurTexture, aCoord);
    // Rough skin mask from the blue channel: dark pixels get little smoothing
    float value = clamp((min(currentColor.b, blurColor.b) - 0.2) * 5.0, 0.0, 1.0);
    // The brighter the high-pass map (i.e. the closer to an edge), the lower the intensity
    float maxChannelColor = max(max(highpassBlurColor.r, highpassBlurColor.g), highpassBlurColor.b);
    float currentIntensity = (1.0 - maxChannelColor / (maxChannelColor + 0.2)) * value * level;
    // Blend the original with its blur
    vec3 resultColor = mix(currentColor.rgb, blurColor.rgb, currentIntensity);
    gl_FragColor = vec4(resultColor, 1.0);
}
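mix(a, b, t) is a * (1.0 - t) + b * t, so the final pixel is original * (1 - currentIntensity) + blur * currentIntensity. Near edges the blurred high-pass map is bright, which pulls currentIntensity toward 0 and keeps the original detail; on flat skin it is close to 0, so the pixel leans toward the blurred (smoothed) color, scaled overall by the user-controlled level.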
BeautyAdjustFilter
public class BeautyAdjustFilter extends AbstractFrameFilter {
    private int level;
    private int vBlurTexture;
    private int vHighpassBlurTexture;
    private int blurTexture;
    private int highpassBlurTexture;

    public BeautyAdjustFilter(Context context) {
        super(context, R.raw.base_vert, R.raw.beauty_adjust);
    }

    @Override
    public void initGL(Context context, int vertexShaderId, int fragmentShaderId) {
        super.initGL(context, vertexShaderId, fragmentShaderId);
        level = GLES20.glGetUniformLocation(program, "level");
        vBlurTexture = GLES20.glGetUniformLocation(program, "blurTexture");
        vHighpassBlurTexture = GLES20.glGetUniformLocation(program, "highpassBlurTexture");
    }

    @Override
    public void beforeDraw(FilterContext filterContext) {
        super.beforeDraw(filterContext);
        GLES20.glUniform1f(level, filterContext.beautyLevel);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, blurTexture);
        GLES20.glUniform1i(vBlurTexture, 1);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, highpassBlurTexture);
        GLES20.glUniform1i(vHighpassBlurTexture, 2);
    }

    public void setBlurTexture(int blurTexture) {
        this.blurTexture = blurTexture;
    }

    public void setHighpassBlurTexture(int highpassBlurTexture) {
        this.highpassBlurTexture = highpassBlurTexture;
    }
}
Note: when a single shader samples multiple textures, pay attention to the texture units used when passing them in: GL_TEXTURE1, GL_TEXTURE2, ... Activate a unit, bind the texture to it, then pass that unit's index to the sampler uniform (the glUniform1i value must match the GL_TEXTUREn index):
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, blurTexture);
GLES20.glUniform1i(vBlurTexture, 1);
GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, highpassBlurTexture);
GLES20.glUniform1i(vHighpassBlurTexture, 2);
Putting the four shaders above together, the processing runs as a single pipeline of four steps.
BeautyFilter is the composite: it draws nothing itself, it only drives the sub-filters and then invokes the next filter in the responsibility chain. BeautyFilter.java:
/**
 * Composite filter: it does not draw anything itself, it just invokes the next filter.
 */
public class BeautyFilter extends AbstractFilter {
    private BeautyblurFilter beautyVerticalblurFilter;
    private BeautyblurFilter beautyHorizontalblurFilter;
    private BeautyHighpassFilter beautyHighpassFilter;
    private BeautyHighpassBlurFilter beautyHighpassBlurFilter;
    private BeautyAdjustFilter beautyAdjustFilter;

    public BeautyFilter(Context context) {
        super(context, -1, -1);
        beautyVerticalblurFilter = new BeautyblurFilter(context);
        beautyHorizontalblurFilter = new BeautyblurFilter(context);
        beautyHighpassFilter = new BeautyHighpassFilter(context);
        beautyHighpassBlurFilter = new BeautyHighpassBlurFilter(context);
        beautyAdjustFilter = new BeautyAdjustFilter(context);
    }

    @Override
    public int onDraw(int texture, FilterChain filterChain) {
        filterChain.setPause(true);
        // 1. Blur (two separable passes: vertical, then horizontal)
        beautyVerticalblurFilter.setTexelOffsetSize(0, filterChain.filterContext.height);
        int blurTexture = beautyVerticalblurFilter.onDraw(texture, filterChain);
        beautyHorizontalblurFilter.setTexelOffsetSize(filterChain.filterContext.width, 0);
        blurTexture = beautyHorizontalblurFilter.onDraw(blurTexture, filterChain);
        // 2. High-pass retention (edge sharpening)
        beautyHighpassFilter.setBlurTexture(blurTexture);
        int highpassTexture = beautyHighpassFilter.onDraw(texture, filterChain);
        // 3. Edge-preserving pre-processing: blur the high-pass map
        int highpassBlurTexture = beautyHighpassBlurFilter.onDraw(highpassTexture, filterChain);
        // 4. Skin-smoothing adjustment
        beautyAdjustFilter.setBlurTexture(blurTexture);
        beautyAdjustFilter.setHighpassBlurTexture(highpassBlurTexture);
        int beautyTexture = beautyAdjustFilter.onDraw(texture, filterChain);
        filterChain.setPause(false);
        return filterChain.proceed(beautyTexture);
    }
}
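For context, here is a minimal sketch of what a responsibility chain like the FilterChain used above might look like. This is an assumption for illustration only; the classes in the original project are more elaborate (they also carry the FilterContext, FBO handling, and the pause flag used by BeautyFilter):

// Hypothetical, simplified chain-of-responsibility for filters.
import java.util.List;

class SimpleFilterChain {
    private final List<SimpleFilter> filters; // ordered list of filters
    private int index;                        // next filter to run

    SimpleFilterChain(List<SimpleFilter> filters) {
        this.filters = filters;
    }

    // Hands the texture produced so far to the next filter; returns the final texture id.
    int proceed(int texture) {
        if (index >= filters.size()) {
            return texture; // end of the chain
        }
        return filters.get(index++).onDraw(texture, this);
    }

    interface SimpleFilter {
        int onDraw(int texture, SimpleFilterChain chain);
    }
}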
Original article: https://www.jianshu.com/p/bcb9971690e5
-- END --