Computer Generated
Angular Fisheye Projections

Written by Paul Bourke
May 2001

There are two main idealised fisheye projections common in
computer graphics rendering: the hemispherical and the
angular fisheye. They are two of an
infinite number of ways of mapping a wide angle of view onto
an image plane.

Hemispherical fisheye

A hemispherical fisheye, see figure 1, is a parallel projection of a
hemisphere onto a plane; the resulting image is circular.
The widest view angle is 180 degrees (as shown), with extreme distortion
at +/- 90 degrees.
Because of the distortion introduced radially (the radius on the image plane
varies as the sine of the angle from the view direction, so the image is
increasingly compressed towards the rim) it is used less often than
the angular fisheye projection.

Angular fisheye

An angular fisheye projection is defined so that the distance from the
center of the image is proportional to the angle from the camera
view direction. In particular, in an angular fisheye image (also called
an f-theta lens) the resolution is approximately equal across the whole image.
For example, in a 180 degree angular fisheye a point 45 degrees off axis
appears at half the image radius.
An angular fisheye projection can be used for angles
all the way up to a full 360 degrees. A cross section of a 360 degree
angular fisheye is shown in figure 2 and a 180 degree angular fisheye
in figure 3. The main point to note is that the distance from the
center of the image (a circle on the image plane) maps directly onto
the angle around the projection sphere.

A 180 degree fisheye projects half the environment onto a circular
image, a 360 degree fisheye projects the whole environment onto a
circular image. As an aside, while it is quite easy to get a 180 degree
fisheye lens, and even up to 220 degrees, it is also feasible to photograph
almost a 360 degree fisheye. This can be accomplished by photographing
a silvered ball using a telephoto lens. Note that images captured with
a real fisheye lens will have distortions in addition to the ideal lens
described here.

The image below is a 180 degree angular fisheye projection rendered
using the PovRay raytracer.

Creating an angular fisheye

To create an angular fisheye projection one needs to determine the
vector from the camera into the scene for every point on the image plane.
Figures 4 through 7 outline one possible procedure. First the image
coordinates are transformed from pixel
coordinates (i,j) into normalised coordinates (x,y) ranging from -1 to 1,
figure 5.
Next the radius r and the angle phi to the x axis are calculated, figure 6. Note
that if atan2() is supported in your maths library then it can be
used as a more direct way of calculating the angle phi. At this stage
any pixels where r > 1 are ignored (drawn black or some other background
colour). Finally, r is mapped onto theta; theta and phi are then used as
the polar coordinates of the direction vector from the camera into the scene.
The angle theta is just r multiplied by half the intended fisheye angle,
which may be anything up to 360 degrees.
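
As a concrete illustration, here is a minimal sketch of the procedure in C. The function name, the pixel centering and the convention that the camera looks down the positive z axis are assumptions made for the sketch, not taken from the figures.

#include <math.h>

/*
   Compute the direction vector into the scene for pixel (i,j) of an
   N x N angular fisheye image. "aperture" is the full fisheye angle in
   radians (anything up to 2*pi). Returns 0 if the pixel lies outside
   the fisheye circle (r > 1), in which case it should be drawn in the
   background colour.
   Assumed convention: the camera looks down the positive z axis and
   phi is measured from the x axis of the image plane.
*/
int FisheyeDirection(int i, int j, int N, double aperture, double dir[3])
{
   double x, y, r, phi, theta;

   /* Pixel coordinates to normalised coordinates, -1 to 1 */
   x = 2.0 * (i + 0.5) / N - 1.0;
   y = 2.0 * (j + 0.5) / N - 1.0;

   /* Polar coordinates on the image plane */
   r = sqrt(x * x + y * y);
   if (r > 1)
      return 0;
   phi = atan2(y, x);

   /* Radius maps linearly onto the angle from the view direction */
   theta = r * aperture / 2;

   /* Direction vector from the camera into the scene */
   dir[0] = sin(theta) * cos(phi);
   dir[1] = sin(theta) * sin(phi);
   dir[2] = cos(theta);
   return 1;
}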

Inverse transform

One way to use fisheye projections is as environment maps. In this case
one generally uses a 180 degree angular fisheye image and maps it onto
a hemisphere centered at the virtual camera. The general requirement
is to create the correct u,v texture coordinates for each vertex making
up the polygons on a unit radius hemisphere. Most texture mapping requires u,v
coordinates between 0 and 1 in each direction, the formula based upon
the conventions used here is as follows for a unit vector (x,y,z).

u = r cos(phi) + 0.5

v = r sin(phi) + 0.5

where

r = atan2(sqrt(x*x+y*y), z) / pi

phi = atan2(y,x)

There are many ways to create a polygonal representation of a unit
hemisphere. Two examples are given below in pseudo C code. The first
uses a grid in the x,y plane and calculates the z value using the
equation of the sphere x^2 + y^2 + z^2 = r^2.
The second creates a sphere using polar coordinates with
one pole directly in front of the camera. In both cases the sphere is
of unit radius and centered at the origin.

Method 1 - grid
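
The original listing is not reproduced here; the following is a sketch of the grid approach. The XYZ structure and the DrawFacet() routine (which stands in for whatever the application does with a quad) are illustrative names, not part of the original code.

#include <math.h>

typedef struct { double x, y, z; } XYZ;
extern void DrawFacet(XYZ *p, int n);   /* Application supplied */

/*
   Method 1 - grid. Walk a regular 2N x 2N grid in the x,y plane and
   lift each corner onto the unit hemisphere with z = sqrt(1 - x^2 - y^2).
   A facet is only emitted if all four corners lie inside the unit circle.
*/
void CreateHemisphereGrid(int N)
{
   int i, j, k;
   double x, y;
   XYZ p[4];

   for (i = -N; i < N; i++) {
      for (j = -N; j < N; j++) {
         p[0].x = (double)i / N;       p[0].y = (double)j / N;
         p[1].x = (double)(i + 1) / N; p[1].y = (double)j / N;
         p[2].x = (double)(i + 1) / N; p[2].y = (double)(j + 1) / N;
         p[3].x = (double)i / N;       p[3].y = (double)(j + 1) / N;
         for (k = 0; k < 4; k++) {
            x = p[k].x;
            y = p[k].y;
            if (x * x + y * y > 1)
               break;
            p[k].z = sqrt(1 - x * x - y * y);
         }
         if (k < 4)                     /* A corner fell outside the circle */
            continue;
         DrawFacet(p, 4);
      }
   }
}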

Method 2 - polar
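
Again the original listing is not reproduced here; the sketch below uses the same illustrative XYZ structure and DrawFacet() routine as Method 1, with the pole of the polar coordinates (theta = 0) directly in front of the camera along the z axis.

#include <math.h>

#define PID2  1.5707963267948966      /* pi / 2 */
#define TWOPI 6.2831853071795865      /* 2 pi   */

typedef struct { double x, y, z; } XYZ;
extern void DrawFacet(XYZ *p, int n);   /* Application supplied */

/*
   Method 2 - polar. Build the unit hemisphere from bands of quads in
   polar coordinates: theta runs from 0 at the pole (straight ahead of
   the camera) to pi/2 at the rim, phi runs right around the z axis.
*/
void CreateHemispherePolar(int N)
{
   int i, j;
   double t1, t2, ph1, ph2;
   XYZ p[4];

   for (i = 0; i < N; i++) {            /* Latitude bands, pole to rim */
      t1 = PID2 * i / N;
      t2 = PID2 * (i + 1) / N;
      for (j = 0; j < 2 * N; j++) {     /* Longitude, around the axis  */
         ph1 = TWOPI * j / (2 * N);
         ph2 = TWOPI * (j + 1) / (2 * N);
         p[0].x = sin(t1) * cos(ph1);  p[0].y = sin(t1) * sin(ph1);  p[0].z = cos(t1);
         p[1].x = sin(t2) * cos(ph1);  p[1].y = sin(t2) * sin(ph1);  p[1].z = cos(t2);
         p[2].x = sin(t2) * cos(ph2);  p[2].y = sin(t2) * sin(ph2);  p[2].z = cos(t2);
         p[3].x = sin(t1) * cos(ph2);  p[3].y = sin(t1) * sin(ph2);  p[3].z = cos(t1);
         DrawFacet(p, 4);
      }
   }
}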

Note that for the above the normal at each point is the same
as the point itself. The value of N above relates to the resolution of the
sphere; the higher it is, the more polygons are created. In reality one might
choose more efficient methods than the above since they compute the same point
multiple times (for shared vertices). For example both methods can
easily be modified to use quad or triangle strips for OpenGL.
The texture coordinate for any point on the sphere
might be calculated as follows.
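
A sketch of that calculation in C, directly following the formulas above (the function name is illustrative):

#include <math.h>

#define PI 3.1415926535897932

/*
   Texture (u,v) coordinates for a point (x,y,z) on the unit hemisphere,
   using the conventions above: the 180 degree fisheye texture is a
   circle of radius 0.5 centred at (0.5,0.5), and the pole of the
   hemisphere lies along the z axis.
*/
void FisheyeTexCoord(double x, double y, double z, double *u, double *v)
{
   double r, phi;

   r = atan2(sqrt(x * x + y * y), z) / PI;
   phi = atan2(y, x);

   *u = r * cos(phi) + 0.5;
   *v = r * sin(phi) + 0.5;
}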

Source Code

Paraboloid

For completeness it should be mentioned that another way to record
wide angle views for environment maps is to use a paraboloid. The
main advantage is that there are efficient numerical methods for
unmapping the image for standard perspective views. Figure 8 shows
the cross section; the formula is generally something of the form:

z = 0.5 - 0.5 (x^2 + y^2)
for x^2 + y^2 <= 1


Figure 1

Figure 2

Figure 3

Figure 4

Figure 5

Figure 6

Figure 7

Figure 8

Non linearity

Real fisheye lenses rarely follow the precise linear relationship between radius
on the fisheye image plane and latitude. The most common form of deviation is a compression of the
image towards the rim of the fisheye circle.
Of course if this non-linearity is a problem it can be corrected for, at the
expense of some loss of resolution at the rim.

Appendix: Approximations

Some raytracing packages don't support a native fisheye camera, the question
is often asked "can a spherical mirror be used to create a fisheye view?".
The basic idea is to render the view of the scene by pointing the virtual
camera at a perfectly reflective spherical surface. The answer is "the
result is close but not exactly correct". Three rendering are shown below,
the scene is simply a hemisphere represented as a regular grid of lines
of latitude and longitude.

Equiangular fisheye

This is the correct result rendered with an equiangular fisheye lens. Note
that the spacing of the lines of latitude is constant.


  Aperture: 180 degrees
Camera at the origin
Unit polar grid at origin
Lines of longitude: 10 degree steps
Lines of latitude: 5 degree steps


3D polar frame hemisphere


Mirror sphere, perspective projection

  Aperture: 60 degrees
Mirror radius: 0.001
Camera positioned for 180 degree reflection


Mirror sphere, orthographic projection

  Mirror radius: 0.001
Camera position very close to mirror radius

Overlap of fisheye and orthographic mirror

The following is the perfect fisheye superimposed on the orthographic
rendering. The biggest difference is around the mid latitudes; for example,
if such a projection were used in a planetarium then objects that remain at the
same distance from the camera but move from the pole to the horizon would
appear to change size.

It should be noted that a mirror (not spherical) can be designed such that
the correct result is achieved. In the author's opinion it is vastly better
to render cubic maps from which correct fisheye projections can be derived.

Offaxis fisheye projection

Written by Paul Bourke
October 2001, modified January 2004

Introduction

There exist a number of full dome environments that project angular fisheye
or other radial functions. If the angular fisheye is created correctly then
the resulting image after projection onto the dome surface appears
undistorted. Unfortunately, if the projector uses a single fisheye lens
then the standard fisheye projection as described
above requires that both the lens and the
viewer are located at the center of the dome, which is obviously impossible.
As the viewer moves away from the center the image appears increasingly
distorted. This isn't normally a problem for large planetariums
where the radius of the dome is very much larger than the seating area, and
in any case nothing can be done since there are multiple viewers.
For smaller domes the effect is more marked and the viewer can easily
move significant distances from the center. It is possible to create
a modified fisheye image so that if viewed from a particular position
the projected image will appear undistorted; this is called an off-axis
fisheye image. This is the same principle employed when creating stereo
pairs, where one uses an off-axis (asymmetric) perspective frustum for
each eye.

Algorithm

Creating an off-axis fisheye is quite straightforward. First, create the
fisheye projection world vector
p as described above.
The vector
p' is derived as shown in the diagram on the right.
The new ray p' is the ray
p minus the vector to the
view position.
While the diagram on the right shows the view position
along the y axis, the same correction applies
to any point in the x,y plane, as the dome viewer
will most commonly move around on that plane.
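
A minimal sketch of this step in C follows; it assumes the standard fisheye direction for the pixel has already been computed (for example by the FisheyeDirection() sketch earlier), and the final normalisation is simply there because most renderers expect unit length ray directions.

#include <math.h>

/*
   Off-axis ray for one pixel. "dir" is the standard fisheye direction
   for the pixel (a point on the unit radius dome), "view" is the viewer
   position relative to the centre of the dome, and "ray" is the ray
   actually traced for that pixel: p' = p - view position.
*/
void OffAxisRay(const double dir[3], const double view[3], double ray[3])
{
   double len;
   int k;

   for (k = 0; k < 3; k++)
      ray[k] = dir[k] - view[k];

   /* Normalise to unit length */
   len = sqrt(ray[0] * ray[0] + ray[1] * ray[1] + ray[2] * ray[2]);
   for (k = 0; k < 3; k++)
      ray[k] /= len;
}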

Note that as the viewer moves towards the rim the image gets increasingly
stretched in that direction. This violates the usual characteristic of angular
fisheye images, where distance in image space is proportional to angle in
fisheye space. Put another way, in a standard angular fisheye all pixels have
the same dimensions in the projected image; the stretching that violates this
can be seen in the example below.

Viewer at (0,0)
Viewer at (0.5,0.5)
 
Figure 1

A common approach to creating content for dome environments is to
render standard projections onto the faces of a cube. The off-axis
fisheye can be computed from these at interactive rates and so a
single viewer with a head tracking device can be presented with
a corrected image as they walk around the base of the dome.
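
As an illustration of the lookup involved, the sketch below finds which cube face a ray direction falls on and where on that face. The face numbering and the orientation of (u,v) on each face are assumptions; in practice they must match the way the cube faces were rendered.

#include <math.h>

/*
   Given a (non zero) direction d, return the index of the cube face it
   passes through and the (u,v) coordinates on that face, both in the
   range 0 to 1. Faces: 0:+x 1:-x 2:+y 3:-y 4:+z 5:-z (assumed ordering).
*/
int CubeFaceLookup(const double d[3], double *u, double *v)
{
   double ax = fabs(d[0]), ay = fabs(d[1]), az = fabs(d[2]);
   double s, t, m;
   int face;

   if (ax >= ay && ax >= az) {          /* x dominant */
      m = ax;
      face = d[0] > 0 ? 0 : 1;
      s = d[0] > 0 ? -d[2] : d[2];
      t = d[1];
   } else if (ay >= az) {               /* y dominant */
      m = ay;
      face = d[1] > 0 ? 2 : 3;
      s = d[0];
      t = d[1] > 0 ? -d[2] : d[2];
   } else {                             /* z dominant */
      m = az;
      face = d[2] > 0 ? 4 : 5;
      s = d[2] > 0 ? d[0] : -d[0];
      t = d[1];
   }

   *u = 0.5 * (s / m + 1);
   *v = 0.5 * (t / m + 1);
   return face;
}

For each pixel of the off-axis fisheye one would compute the ray p' as above, look up the face and (u,v), and sample the corresponding face image; this is cheap enough to run at interactive rates.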

Test Pattern

The following test pattern was used to test the correctness of the
off-axis fisheye. It is a rendering of a
cubic room; each wall is a
different colour and is constructed with a regular grid of bars. If
the fisheye generation is correct, the curved lines in the fisheye
image should appear straight when projected onto the dome. The
coordinates above each image give the position of the viewer.

Offset = (0,0)
Offset = (0,0.5)
Offset = (0,0.75)
Offset = (0,0.95)
Offset = (0.5,0.5)
Offset = (0.7,0.7)

This example is rendered in PovRay onto a
5 wall cubic environment map; the off-axis fisheye
is created from these 5 images
(5view.ini,
5view.pov,
5viewcamera.pov).
This has the advantage of being
able to create multiple off-axis fisheye images from one set of
cubic environment maps, which would not be possible if the off-axis
fisheye were rendered directly in PovRay (something that is easy to
arrange by modifying the PovRay source code).

   

Creating offaxis projections from existing fisheye images

Given an existing fisheye image, an off-axis fisheye can be created
for any off-axis position. There is a full mapping for 180 degree
fisheye images; smaller angle images result in an incomplete
mapping.
The code accompanying this article is a simple command line based
UNIX utility: it reads a TGA image representing a fisheye image and
creates an off-axis fisheye as another TGA file. Note that there are
important resolution issues to consider; the process works best when
going from very high resolution fisheye images to lower
resolution ones. The utility supports supersampling antialiasing
and variable output image size.
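
For illustration, the sketch below shows the remapping such a utility performs for a single output pixel, assuming square images, a 180 degree aperture and the FisheyeDirection() sketch from earlier; it is not the original source of the utility.

#include <math.h>

#define PI 3.1415926535897932

extern int FisheyeDirection(int i, int j, int N, double aperture, double dir[3]);

/*
   For output pixel (i,j) of an Nout x Nout off-axis fisheye, compute the
   continuous pixel coordinates (isrc,jsrc) in the Nin x Nin source
   fisheye it should be sampled from (nearest neighbour, bilinear or
   supersampled). "view" is the off-axis viewer position.
   Returns 0 if the pixel has no source.
*/
int OffAxisSource(int i, int j, int Nout, int Nin,
                  const double view[3], double *isrc, double *jsrc)
{
   double p[3], q[3], len, theta, phi, r;
   int k;

   /* Standard fisheye direction for this output pixel */
   if (!FisheyeDirection(i, j, Nout, PI, p))
      return 0;

   /* Off-axis ray p' = p - view position, normalised */
   for (k = 0; k < 3; k++)
      q[k] = p[k] - view[k];
   len = sqrt(q[0] * q[0] + q[1] * q[1] + q[2] * q[2]);
   for (k = 0; k < 3; k++)
      q[k] /= len;

   /* Back to polar coordinates in the source fisheye */
   theta = acos(q[2]);
   phi = atan2(q[1], q[0]);
   r = theta / (PI / 2);               /* 1 at the rim of a 180 degree image */
   if (r > 1)
      return 0;

   *isrc = 0.5 * Nin * (1 + r * cos(phi));
   *jsrc = 0.5 * Nin * (1 + r * sin(phi));
   return 1;
}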

offaxis tgafilename [options]
Options:
-a n set antialias level (Default: 1)
-w n width of the output image (Default: 500)
-h n height of the output image (Default: width)
-dx n x component of the offaxis vector (Default: 0)
-dy n y component of the offaxis vector (Default: 0)
-v debug/verbose mode (Default: off)
Original fisheye image
X axis offset of 0.3
Y axis offset of 0.3
X and Y axis offset of 0.3

Photos of a Prototype Dome





Example Images used in the Prototype Dome





Note that for these tests an Elumens projector was used. This is a
standard projector fitted with their fisheye lens, and as such it
projects a 4:3 width to height ratio while fisheye images are 1:1.
This is addressed by clipping off 25% of the image; in the case of the
Elumens dome this is normally the bottom 25%. In our case we chose to
clip the top 25%, which conveniently gave the viewers a position behind
the projector from which to view the images without being blinded.

Figure 2

Addendum

The off-axis correction as described above only applies to shifting the
observer around on the rim plane of the hemisphere. A similar correction
can be made to compensate for the observer being within the dome or, in
the more likely case, away from the rim of the dome (along the negative z axis,
see figure 1). This correction is
applied in exactly the same way, but note that it results in a reduced
field of view, unlike offset positions on the rim plane which result in
stretching distortion but the same field of view.

URL:

http://paulbourke.net/dome/fisheye/
