http://www.educity.cn/wenda/92368.html

OpenGL ES教程VI之纹理贴图(原文对照)

  OpenGL ES Tutorial for Android – Part VI – Textures

  December 30th, 2010 by Per-Erik Bergman — Android, Embedded, Java

  In the last tutorial we worked a bit more on meshes, and we also talked about adding colors to our mesh. The most common way of adding color to your mesh is to apply a texture. There are a couple of different steps involved in adding a texture to a mesh; I will try to go through them all and explain the basics of each.

  上一教程我们生成了一些模型,而且我们已经知道如何给模型着色。但最常用的着色方式还是添加纹理。给模型添加纹理有几个不同的操作步骤。下面我将一一展开。

  Loading bitmaps

  The first step is to get a bitmap to generate the texture from. You can get hold of a bitmap in many different ways: downloading it, generating it, or simply loading it from the resources. I'm going with the simplest one for this example, which is loading from the resources.

  第一步,我们需要得到贴图的图片,这有许多方式。你可以下载,生成,或是简单地从资源中加载,我使用了最后一种:从一个资源文件中加载。

  Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(),

  R.drawable.icon);

  One other thing about textures is that some hardware requires the height and width to be powers of 2 (1, 2, 4, 8, 16, 32, 64...). If you use a texture with a size of 30x30 pixels on hardware that doesn't support it, you will just get a white square (unless you change the default color).

  需要注意的是,在某些硬件上,贴图需要的图片尺寸必须是2的n次方(1,2,4,8,16,32…)。如果你的图片是30X30的话,而且硬件不支持的话,那么你只能看到一个白色的方框(除非,你更改了默认颜色)
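To make this concrete, here is a minimal sketch (the helper name is my own, not from the tutorial) of computing the smallest power-of-two size a bitmap could be scaled up to before uploading:

```java
// Hypothetical helper -- not part of the original tutorial code.
class TextureSizes {
    // Returns the smallest power of two >= n (for n >= 1).
    static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) {
            p <<= 1;
        }
        return p;
    }

    public static void main(String[] args) {
        // A 30x30 bitmap would be scaled up to 32x32 before upload,
        // e.g. with Bitmap.createScaledBitmap(bitmap, 32, 32, true).
        System.out.println(nextPowerOfTwo(30)); // 32
    }
}
```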

  Generating a texture

  After we have loaded the bitmap we need to tell OpenGL to actually create the texture.

  图片加载之后,就可以告诉OpenGL 来产生纹理了。

  The first thing we need to do is to let OpenGL generate some texture ids that we will use as handles to the textures later on. In this example we will only have one texture.

  首先要做的是让OpenGL产生纹理ID,这些ID将在后面用到。例子中我们只有一个纹理。

  // Create an int array with the number of textures we want,

  // in this case 1.

  int[] textures = new int[1];

  // Tell OpenGL to generate textures.

  gl.glGenTextures(1, textures, 0);

  With the same parameters you can delete the textures:

  // Delete a texture.

  gl.glDeleteTextures(1, textures, 0);

  Now that the texture ids are generated we need to, just like with everything else, tell OpenGL what to work with. With textures we use the command glBindTexture:

  ID产生之后,我们需要将这些ID使用glBindTexture方式进行绑定

  gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);

  From this point on, all texture-related commands we call will be applied to the texture with the generated id.

  那么在此之后,我们后面将使用产生的ID来调用纹理

  glTexParameter

  There are a couple of parameters we need to set on the texture. The first one tells OpenGL what to do if the texture needs to be shrunk or magnified to match the rendered image.

  在纹理映射,我们需要设置几个参数,第一个是告诉OpenGL在渲染图片时,怎么缩小或放大以适合大小。
If the texture is smaller it needs to be magnified; that is done with the magnification filter:

  如果贴图小的话,那我们需要使用放大函数进行放大操作。

  // Scale up if the texture is smaller.

  gl.glTexParameterf(GL10.GL_TEXTURE_2D,

  GL10.GL_TEXTURE_MAG_FILTER,

  GL10.GL_LINEAR);

  And this is how to scale if the texture needs to be scaled down, using the minification filter:

  类似,在贴图需要缩小时,使用缩小过滤函数。

  // Scale linearly when the image is smaller than the texture.

  gl.glTexParameterf(GL10.GL_TEXTURE_2D,

  GL10.GL_TEXTURE_MIN_FILTER,

  GL10.GL_LINEAR);

  You need to pass an argument to these functions. I'm only going to show you two of them; the rest you can investigate yourself.

  请看上面的函数,你可以自己研究一下,该给它传递什么参数。

  If you want a crisp and clean rendering like this image, you need to use the GL10.GL_NEAREST parameter.

  如果你想要清晰的渲染效果,你可以使用GL10.GL_NEAREST。

  

  If you would rather have a blurred image, you should use the GL10.GL_LINEAR parameter.

  如果你喜欢模糊一点,应该使用GL10.GL_LINEAR
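To illustrate the difference between the two parameters, here is a CPU-side sketch (my own illustration, not tutorial code; OpenGL does this per fragment on the GPU) of one-dimensional nearest and linear sampling over a row of texels:

```java
// Hypothetical CPU-side illustration of the two magnification filters.
class FilterDemo {
    // GL_NEAREST: pick the single closest texel -- crisp, blocky edges.
    static float nearest(float[] texels, float u) {
        int i = Math.round(u * (texels.length - 1));
        return texels[i];
    }

    // GL_LINEAR: blend the two closest texels -- smooth, blurred edges.
    static float linear(float[] texels, float u) {
        float pos = u * (texels.length - 1);
        int i0 = (int) Math.floor(pos);
        int i1 = Math.min(i0 + 1, texels.length - 1);
        float frac = pos - i0;
        return texels[i0] * (1 - frac) + texels[i1] * frac;
    }

    public static void main(String[] args) {
        float[] row = {0f, 1f}; // a black texel and a white texel
        // Sampling halfway between them:
        System.out.println(nearest(row, 0.5f)); // snaps to one texel
        System.out.println(linear(row, 0.5f));  // 0.5, a gray blend
    }
}
```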

  

  UV Mapping

  We also need to tell OpenGL how to map this image onto the mesh. This is done in two steps; first we need to assign the UV coordinates.

  下面我们需要告诉OpenGL怎样将图片映射到模型上,有两个步骤。首先我们指定一个UV坐标

  UV mapping is the way we map the pixels of the bitmap to the vertices of our mesh. The UV coordinates are 0,0 in the upper left and 1,1 in the bottom right, like the left image below. The right image below illustrates how our plane is built. To get the texture mapped correctly we need to map the lower left part of the texture (0,1) to the lower left vertex (0) in our plane, the bottom right (1,1) of the texture to the bottom right vertex (1) of our plane, and... you get the idea.

  我们使用UV映射将图片的每一像素映射到模型的顶点上。UV坐标中,左上角为0,0,右下角为1,1,请看下图的左半部分。右半部分是我们要创建的平面。为保证映射正确,我们将纹理左下角映射到左下角顶点0,右下角映射到顶点1…依此类推。

  注:在OpenGL教程里讲道,图片左下角为0,0坐标。不过我们这里是Android的OpenGL ES。或许Android在接口封装上,有些许改动吧。

  We put this mapping into a float array like this:

  纹理坐标数组的定义如下:

  float textureCoordinates[] = {0.0f, 1.0f,

  1.0f, 1.0f,

  0.0f, 0.0f,

  1.0f, 0.0f };
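As a sanity check of this mapping (my own illustration, not from the original tutorial): for the 1x1 plane built later in SimplePlane, each vertex (x, y) maps to UV (x + 0.5, 0.5 - y), which reproduces exactly this array:

```java
// Hypothetical check -- derives the UV array above from the plane's corners.
class UvCheck {
    // For a 1x1 plane centered on the origin: x maps to u directly,
    // while y is flipped because v grows downward (v = 0 at the top).
    static float[] uvForVertex(float x, float y) {
        return new float[] { x + 0.5f, 0.5f - y };
    }

    public static void main(String[] args) {
        // Same vertex order as the tutorial: lower left, lower right,
        // upper left, upper right.
        float[][] corners = { {-0.5f, -0.5f}, {0.5f, -0.5f},
                              {-0.5f, 0.5f}, {0.5f, 0.5f} };
        for (float[] c : corners) {
            float[] uv = uvForVertex(c[0], c[1]);
            System.out.println(uv[0] + ", " + uv[1]);
        }
        // Reproduces the textureCoordinates array above:
        // (0,1) (1,1) (0,0) (1,0)
    }
}
```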

  

  If we use 0.5 instead of 1.0, like this:

  float textureCoordinates[] = {0.0f, 0.5f,

  0.5f, 0.5f,

  0.0f, 0.0f,

  0.5f, 0.0f };

  
The texture will be mapped so that the plane shows only the upper left quarter of it.

  那么将映射图片的左上角到平面中

  

  Back to glTexParameterf: if we go the other way and use values higher than 1.0, like this:

  请回想一下glTexParameterf函数。如果我们将1.0放大到2.0

  float textureCoordinates[] = {0.0f, 2.0f,

  2.0f, 2.0f,

  0.0f, 0.0f,

  2.0f, 0.0f };

  We are actually telling OpenGL to use a part of the texture that does not exist, so we need to tell OpenGL what to do with that part.

  那么超过图片的位置,OpenGL该如何处理呢?这正是下面我们讨论的。

  We use the glTexParameterf function to tell OpenGL what to do with the texture. By default OpenGL uses something called GL_REPEAT.

  我们使用glTexParameterf函数来告诉OpenGL该如何进行贴图,默认使用的参数项为GL_REPEAT

  GL_REPEAT means that OpenGL should repeat the texture beyond 1.0.

  GL_REPEAT意味着OpenGL应该重复纹理超过1.0的部分
GL_CLAMP_TO_EDGE means that OpenGL will only draw the image once and after that just repeat the last row or column of pixels for the rest of the surface.

  GL_CLAMP_TO_EDGE表示OpenGL只画图片一次,剩下的部分将使用图片最后一行像素重复

  Since we are working with a 2D texture, we need to tell OpenGL what to do in two directions: GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T.

  对于2D纹理,我们需要在两个方向上分别设置:GL_TEXTURE_WRAP_S 和 GL_TEXTURE_WRAP_T。
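Here is a CPU-side sketch of what the two wrap modes do with a coordinate outside [0, 1] (my own illustration, not tutorial code; the GPU applies the same idea per sample):

```java
// Hypothetical sketch of the two wrap modes along one axis.
class WrapDemo {
    // GL_REPEAT: keep only the fractional part, so 1.3 samples like 0.3
    // and the texture tiles.
    static float repeat(float t) {
        return t - (float) Math.floor(t);
    }

    // GL_CLAMP_TO_EDGE: anything outside [0, 1] sticks to the edge texel,
    // so the last row/column of pixels is smeared outward.
    static float clampToEdge(float t) {
        return Math.max(0f, Math.min(1f, t));
    }

    public static void main(String[] args) {
        System.out.println(repeat(1.3f));      // about 0.3 -> tiles
        System.out.println(clampToEdge(1.3f)); // 1.0 -> edge pixels repeat
    }
}
```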

  Below you see a chart with the 4 combinations of GL_REPEAT and GL_CLAMP_TO_EDGE.

  下面请看它们的四种组合。

  

WRAP_S: GL_REPEAT
WRAP_T: GL_REPEAT
WRAP_S: GL_REPEAT
WRAP_T: GL_CLAMP_TO_EDGE
WRAP_S: GL_CLAMP_TO_EDGE
WRAP_T: GL_REPEAT
WRAP_S: GL_CLAMP_TO_EDGE
WRAP_T: GL_CLAMP_TO_EDGE

  This is how we use the glTexParameterf function:

  gl.glTexParameterf(GL10.GL_TEXTURE_2D,

  GL10.GL_TEXTURE_WRAP_S,

  GL10.GL_REPEAT);

  gl.glTexParameterf(GL10.GL_TEXTURE_2D,

  GL10.GL_TEXTURE_WRAP_T,

  GL10.GL_REPEAT);

  The last thing we need to do is to upload the bitmap we loaded to the texture id we created:

  GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);

  Using the texture

  To be able to use the texture we need, just like with everything else, to create a buffer with the UV coordinates:

  对于UV坐标,我们同样使用字节缓冲

  ByteBuffer byteBuf = ByteBuffer.allocateDirect(textureCoordinates.length * 4);

  byteBuf.order(ByteOrder.nativeOrder());

  textureBuffer = byteBuf.asFloatBuffer();

  textureBuffer.put(textureCoordinates);

  textureBuffer.position(0);
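The buffer-setup lines above can be wrapped in a small helper (the helper name is my own, not from the tutorial); since it is pure java.nio it can even be tested without any OpenGL context:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Hypothetical helper wrapping the buffer setup shown in the tutorial.
class Buffers {
    static FloatBuffer makeFloatBuffer(float[] values) {
        // Allocate 4 bytes per float in a direct buffer with native byte
        // order, which is what the OpenGL ES bindings expect.
        ByteBuffer byteBuf = ByteBuffer.allocateDirect(values.length * 4);
        byteBuf.order(ByteOrder.nativeOrder());
        FloatBuffer floatBuf = byteBuf.asFloatBuffer();
        floatBuf.put(values);
        floatBuf.position(0); // rewind so OpenGL reads from the start
        return floatBuf;
    }

    public static void main(String[] args) {
        float[] uv = {0.0f, 1.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f, 0.0f};
        FloatBuffer buf = makeFloatBuffer(uv);
        System.out.println(buf.capacity()); // 8
        System.out.println(buf.position()); // 0
    }
}
```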

  Rendering

  // Telling OpenGL to enable textures.

  gl.glEnable(GL10.GL_TEXTURE_2D);

  // Tell OpenGL where our texture is located.

  gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);

  // Tell OpenGL to enable the use of UV coordinates.

  gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

  // Telling OpenGL where our UV coordinates are.

  gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);

  // ... here goes the rendering of the mesh ...

  // Disable the use of UV coordinates.

  gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

  // Disable the use of textures.

  gl.glDisable(GL10.GL_TEXTURE_2D);

  Putting it all together

  I'm using a modified version of the code from the previous tutorial. The difference is mostly that I renamed some variables and functions, added more comments, and put all the code under the Apache License. To make the code easier to understand I removed the previous plane and added a new, simpler one called SimplePlane.

  Updating the Mesh class

  The first thing we need to do is to update the Mesh class (se.jaywash.Mesh). We need to add the functionality to load and render a texture.

  We need to be able to set and store the UV coordinates.

  // Our UV texture buffer.

  private FloatBuffer mTextureBuffer;

  /**

   * Set the texture coordinates.

   *

   * @param textureCoords

   */

  protected void setTextureCoordinates(float[] textureCoords) {

  // Each float is 4 bytes, therefore we multiply the number of

  // coordinates by 4.

  ByteBuffer byteBuf = ByteBuffer.allocateDirect(

  textureCoords.length * 4);

  byteBuf.order(ByteOrder.nativeOrder());

  mTextureBuffer = byteBuf.asFloatBuffer();

  mTextureBuffer.put(textureCoords);

  mTextureBuffer.position(0);

  }

  We also need to add functions to set the bitmap and create the texture.

  // Our texture id.

  private int mTextureId = -1;

  // The bitmap we want to load as a texture.

  private Bitmap mBitmap;

  /**

   * Set the bitmap to load into a texture.

   *

   * @param bitmap

   */

  public void loadBitmap(Bitmap bitmap) {

  this.mBitmap = bitmap;

  mShouldLoadTexture = true;

  }

  /**

   * Loads the texture.

   *

   * @param gl

   */

  private void loadGLTexture(GL10 gl) {

  // Generate one texture pointer...

  int[] textures = new int[1];

  gl.glGenTextures(1, textures, 0);

  mTextureId = textures[0];

  // ...and bind it to our array

  gl.glBindTexture(GL10.GL_TEXTURE_2D, mTextureId);

  // Create a linear filtered texture.

  gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER,

  GL10.GL_LINEAR);

  gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER,

  GL10.GL_LINEAR);

  // Different possible texture parameters, e.g. GL10.GL_CLAMP_TO_EDGE

  gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S,

  GL10.GL_CLAMP_TO_EDGE);

  gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T,

  GL10.GL_REPEAT);

  // Use the Android GLUtils to specify a two-dimensional texture image

  // from our bitmap

  GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, mBitmap, 0);

  }

  And finally we need to add the call to the texture loading and actually tell OpenGL to render with this texture. I removed some code so the page would not be so long, but you will find the complete code in the attached zip file.

  // Indicates if we need to load the texture.

  private boolean mShouldLoadTexture = false;

  /**

   * Render the mesh.

   *

   * @param gl

   *      the OpenGL context to render to.

   */

  public void draw(GL10 gl) {

  ...

  // Smooth color

  if (mColorBuffer != null) {

  // Enable the color array buffer to be used during rendering.

  gl.glEnableClientState(GL10.GL_COLOR_ARRAY);

  gl.glColorPointer(4, GL10.GL_FLOAT, 0, mColorBuffer);

  }

  if (mShouldLoadTexture) {

  loadGLTexture(gl);

  mShouldLoadTexture = false;

  }

  if (mTextureId != -1 && mTextureBuffer != null) {

  gl.glEnable(GL10.GL_TEXTURE_2D);

  // Enable the texture state

  gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

  // Point to our buffers

  gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, mTextureBuffer);

  gl.glBindTexture(GL10.GL_TEXTURE_2D, mTextureId);

  }

  gl.glTranslatef(x, y, z);

  ...

  // Draw the mesh using the index buffer.

  gl.glDrawElements(GL10.GL_TRIANGLES, mNumOfIndices,

  GL10.GL_UNSIGNED_SHORT, mIndicesBuffer);

  ...

  if (mTextureId != -1 && mTextureBuffer != null) {

  gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

  }

  ...

  }

  Creating the SimplePlane class

  We also need to create SimplePlane.java. The code is pretty simple and self-explanatory if you have read my previous tutorials. The new element is the textureCoordinates variable.

  package se.jaywash;

  /**

   * SimplePlane is a setup class for Mesh that creates a plane mesh.

   *

   * @author Per-Erik Bergman (per-erik.)

   *

   */

  public class SimplePlane extends Mesh {

  /**

   * Create a plane with a default width and height of 1 unit.

      */

  public SimplePlane() {

  this(1, 1);

  }

  /**

      * Create a plane.

      *

      * @param width

      *      the width of the plane.

      * @param height

      *      the height of the plane.

      */

  public SimplePlane(float width, float height) {

  // Mapping coordinates for the vertices

  float textureCoordinates[] = { 0.0f, 2.0f, //

  2.0f, 2.0f, //

  0.0f, 0.0f, //

  2.0f, 0.0f, //

  };

  short[] indices = new short[] { 0, 1, 2, 1, 3, 2 };

  float[] vertices = new float[] { -0.5f, -0.5f, 0.0f,

  0.5f, -0.5f, 0.0f,

  -0.5f, 0.5f, 0.0f,

  0.5f, 0.5f, 0.0f };

  setIndices(indices);

  setVertices(vertices);

  setTextureCoordinates(textureCoordinates);

  }

  }

  References

  The info used in this tutorial is collected from:
Android Developers
OpenGL ES 1.1 Reference Pages

  You can download the source for this tutorial here: Tutorial_Part_VI
You can also checkout the code from: cod

  Previous tutorial: OpenGL ES Tutorial for Android – Part V – More on Meshes

  Per-Erik Bergman 
Consultant at Jayway
