Computer Generated
Angular Fisheye Projections

Written by Paul Bourke
May 2001

There are two main idealised fisheye projections common in
computer graphics rendering: the hemispherical and the
angular fisheye. They are just two of an
infinite number of ways of mapping a wide angle of view onto
an image plane.

Hemispherical fisheye

A hemispherical fisheye, see figure 1, is a parallel projection of a
hemisphere onto a plane; the resulting image is circular.
The widest view angle is 180 degrees (as shown), with extreme distortion
at +/- 90 degrees.
Because of the distortion introduced radially it is used less often than
the angular fisheye projection.

Angular fisheye

An angular fisheye projection is defined so that the distance from the
center of the image is proportional to the angle from the camera
view direction. In particular, in an angular fisheye (also called an f-theta lens)
image the resolution is approximately equal across the whole image.
An angular fisheye projection can be used for angles
all the way up to a full 360 degrees. A cross section of a 360 degree
angular fisheye is shown in figure 2 and a 180 degree angular fisheye
in figure 3. The main point to note is that the distance from the
center of the image (a circle on the image plane) maps directly onto
the angle around the projection sphere.

A 180 degree fisheye projects half the environment onto a circular
image, while a 360 degree fisheye projects the whole environment onto a
circular image. As an aside, while it is quite easy to get 180 degree
fisheye lenses and even lenses up to 220 degrees, it is also feasible to photograph
almost a full 360 degree fisheye. This can be accomplished by photographing
a silvered ball with a telephoto lens. Note that images captured with a
real fisheye lens will have distortions other than those of the ideal lens described
here.

The image below is a 180 degree angular fisheye projection rendered
using the PovRay raytracer.

Creating an angular fisheye

To create an angular fisheye projection one needs to determine the
vector from the camera into the scene for every point on the image plane.
Figures 4 through 7 outline one possible method. First the image
coordinates are transformed from pixel
coordinates (i,j) into normalised coordinates (x,y) ranging from -1 to 1,
figure 5.
Next the radius r and the angle phi to the x axis are calculated, figure 6. Note
that if atan2() is supported in your maths library then it can be
used as a more direct way of calculating the angle phi. At this stage
any pixels where r > 1 are ignored (drawn black or in some other background
colour). Finally, r is mapped onto theta, and theta and phi are used as
the polar coordinates of the direction vector from the camera into the scene.
The angle theta is simply r multiplied by half the intended fisheye angle,
which may be anything up to 360 degrees.
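
As a minimal sketch of these steps in C (the pixel centering and the
convention that the camera looks along the positive z axis are choices made
here, not taken from the figures):

#include <math.h>

/* Map pixel (i,j) of a width x height image to a unit direction vector
   (dx,dy,dz) using an angular fisheye of the given aperture (in radians).
   Returns 0 for pixels outside the fisheye circle (r > 1), 1 otherwise.
   Conventions assumed: x to the right, y up, camera looking along +z. */
int FisheyeRay(int i, int j, int width, int height, double aperture,
               double *dx, double *dy, double *dz)
{
   /* Pixel coordinates to normalised coordinates, -1 to 1 */
   double x = 2.0 * (i + 0.5) / width  - 1;
   double y = 2.0 * (j + 0.5) / height - 1;

   double r = sqrt(x * x + y * y);
   if (r > 1)
      return 0;                       /* background pixel */

   double phi   = atan2(y, x);        /* angle about the view direction    */
   double theta = r * aperture / 2;   /* angle away from the view direction */

   *dx = sin(theta) * cos(phi);
   *dy = sin(theta) * sin(phi);
   *dz = cos(theta);
   return 1;
}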

Inverse transform

One way to use fisheye projections is as environment maps. In this case
one generally uses a 180 degree angular fisheye image and maps it onto
a hemisphere centered at the virtual camera. The general requirement
is to create the correct u,v texture coordinates for each vertex making
up the polygons on a unit radius hemisphere. Most texture mapping requires u,v
coordinates between 0 and 1 in each direction; the formulae, based upon
the conventions used here, are as follows for a unit vector (x,y,z).

u = r cos(phi) + 0.5

v = r sin(phi) + 0.5

where

r = atan2(sqrt(x*x+y*y), z) / pi

phi = atan2(y,x)

There are many ways to create a polygonal representation of a unit
hemisphere. Two examples are given below in pseudo C code. The first
uses a grid in the x,y plane and calculates the z value using the
equation of the sphere x² + y² + z² = r². The second creates a
sphere using polar coordinates with
one pole directly in front of the camera. In both cases the sphere is
of unit radius and centered at the origin.

Method 1 - grid
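
The original pseudo code is not reproduced here; the following is a sketch
of the grid approach, with EmitQuad() standing in as a hypothetical routine
that adds one quad of the mesh:

#include <math.h>

void EmitQuad(double *x, double *y, double *z);   /* hypothetical output routine */

/* Step an N x N grid of cells across the x,y plane and lift each grid
   point onto the unit sphere with z = sqrt(1 - x*x - y*y).  Cells with
   any corner outside the unit circle are skipped, so the rim of the
   hemisphere is slightly ragged for small N. */
void CreateHemisphereGrid(int N)
{
   int i, j, k;
   double x[4], y[4], z[4];

   for (i = 0; i < N; i++) {
      for (j = 0; j < N; j++) {
         /* The four corners of this cell in normalised coordinates */
         x[0] = x[3] = 2.0 * i / N - 1;         y[0] = y[1] = 2.0 * j / N - 1;
         x[1] = x[2] = 2.0 * (i + 1) / N - 1;   y[2] = y[3] = 2.0 * (j + 1) / N - 1;

         for (k = 0; k < 4; k++) {
            if (x[k] * x[k] + y[k] * y[k] > 1)
               break;                            /* corner outside the circle */
            z[k] = sqrt(1 - x[k] * x[k] - y[k] * y[k]);
         }
         if (k < 4)
            continue;

         EmitQuad(x, y, z);   /* normals are the vertices themselves */
      }
   }
}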

Method 2 - polar
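
Again a sketch rather than the original listing, with a hypothetical
EmitVertex() routine; each latitude band is written out as the two rows of a strip:

#include <math.h>

#define PI 3.141592653589793

void EmitVertex(double x, double y, double z);   /* hypothetical output routine */

/* Latitude/longitude sweep with the pole along the view direction (+z).
   Each pass of the outer loop produces one band of the hemisphere as a
   strip of vertex pairs; as before the normal equals the vertex. */
void CreateHemispherePolar(int N)
{
   int i, j;
   double theta0, theta1, phi;

   for (i = 0; i < N / 2; i++) {          /* latitude, 0 to 90 degrees      */
      theta0 = i * PI / N;
      theta1 = (i + 1) * PI / N;
      for (j = 0; j <= N; j++) {          /* longitude, right around the pole */
         phi = 2 * PI * j / N;
         EmitVertex(sin(theta0) * cos(phi), sin(theta0) * sin(phi), cos(theta0));
         EmitVertex(sin(theta1) * cos(phi), sin(theta1) * sin(phi), cos(theta1));
      }
   }
}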

Note that for the above the normal at each point is the same
as the point itself. The value of N above relates to the resolution of the
sphere: the higher it is, the more polygons are created. In practice one might choose
more efficient methods than the above since they compute the same point
multiple times (for shared vertices). For example, both methods can easily
be modified to use quad or triangle strips in OpenGL.
The texture coordinate for any point on the sphere
might be calculated as follows.
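
As a sketch (this simply applies the u,v formulae of the previous section
to a vertex (x,y,z) of the unit hemisphere):

#include <math.h>

#define PI 3.141592653589793

/* u,v texture coordinates for a point (x,y,z) on the unit hemisphere,
   using the 180 degree angular fisheye mapping given earlier:
   r = theta / pi, with theta and phi measured relative to the view (z) axis. */
void FisheyeUV(double x, double y, double z, double *u, double *v)
{
   double r   = atan2(sqrt(x * x + y * y), z) / PI;
   double phi = atan2(y, x);

   *u = r * cos(phi) + 0.5;
   *v = r * sin(phi) + 0.5;
}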

Source Code

Paraboloid

For completeness it should be mentioned that another way to record
wide angle views for environment maps is to use a paraboloid. The
main advantage is that there are efficient numerical methods for
unmapping the image into standard perspective views. Figure 8 shows
the cross section; the formula is generally of the form

z = 0.5 - 0.5 (x² + y²),   for x² + y² <= 1
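
As an illustration of the mapping (this follows the standard dual-paraboloid
construction and is not taken from the original text): a point (x,y) on the
map corresponds to the direction obtained by reflecting a ray travelling
parallel to the z axis off the paraboloid, whose surface normal is
proportional to (x, y, 1).

/* Environment direction for map point (x,y), x*x + y*y <= 1, of the
   paraboloid z = 0.5 - 0.5*(x*x + y*y).  Reflecting an axial ray about
   the normal (x,y,1)/|(x,y,1)| gives a unit vector directly: */
void ParaboloidDir(double x, double y, double *dx, double *dy, double *dz)
{
   double d = x * x + y * y + 1;

   *dx = 2 * x / d;
   *dy = 2 * y / d;
   *dz = (1 - x * x - y * y) / d;
}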


Figures 1 to 8 (referred to in the text above: the hemispherical fisheye,
the 360 and 180 degree angular fisheye cross sections, the fisheye
construction steps, and the paraboloid cross section) are not reproduced here.

Non linearity

Real fisheye lenses rarely follow the precise linear relationship between radius
on the fisheye image plane and latitude. The most common form of deviation is a compression of the
image towards the rim of the fisheye circle.
Of course, if this non-linearity is a problem it can be corrected for, at the
expense of some loss of resolution at the rim; a sketch of such a correction
is given below.
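
This sketch assumes the real lens curve is known as a normalised function of
latitude; the cubic used below is purely illustrative and not a measurement
of any particular lens.

/* Illustrative lens curve: radius on the image plane (0 at the centre,
   1 at the rim) as a function of the angle from the view direction.
   It compresses towards the rim; a real lens would be calibrated and
   this function replaced accordingly. */
static double LensCurve(double theta, double thetamax)
{
   double t = theta / thetamax;
   return 1.3 * t - 0.3 * t * t * t;        /* equals 1 when theta = thetamax */
}

/* For a pixel of the corrected (equiangular) image at normalised radius
   rout, return the normalised radius at which to sample the original
   non-linear fisheye image, along the same angle phi. */
double SourceRadius(double rout, double thetamax)
{
   double theta = rout * thetamax;          /* equiangular: radius maps to angle */
   return LensCurve(theta, thetamax);
}

The loss of resolution at the rim shows up here as the derivative of
LensCurve() falling below one near the rim, so neighbouring output pixels
sample the compressed source region less than one pixel apart.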

Appendix: Approximations

Some raytracing packages don't support a native fisheye camera, so the question
is often asked: "can a spherical mirror be used to create a fisheye view?"
The basic idea is to render the view of the scene by pointing the virtual
camera at a perfectly reflective spherical surface. The answer is that the
result is close but not exactly correct. Three renderings are shown below;
the scene is simply a hemisphere represented as a regular grid of lines
of latitude and longitude.

Equiangular fisheye

This is the correct result, rendered with an equiangular fisheye lens. Note
that the spacing of the lines of latitude is constant.


  Aperture: 180 degrees
Camera at the origin
Unit polar grid at origin
Lines of longitude: 10 degree steps
Lines of latitude: 5 degree steps


3D polar frame hemisphere


Mirror sphere, perspective projection

  Aperture: 60 degrees
Mirror radius: 0.001
Camera positioned for 180 degree reflection


Mirror sphere, orthographic projection

  Mirror radius: 0.001
Camera position very close to mirror radius

Overlap of fisheye and orthographic mirror

The following is the perfect fisheye superimposed on the orthographic
rendering. The biggest difference is around the mid latitudes. For example,
if such a projection were used in a planetarium, then objects that remain at the
same distance from the camera but move from the pole to the horizon would
appear to change size.

It should be noted that a mirror (not spherical) can be designed such that
the correct result is achieved. In the author's opinion it is vastly better
to render cubic maps from which correct fisheye projections can be derived.

Offaxis fisheye projection

Written by Paul Bourke
October 2001, modified January 2004

Introduction

There exist a number of full dome environments that project angular fisheye
images or images based on other radial functions. If the angular fisheye is created correctly then
the resulting image after projection onto the dome surface appears
undistorted. Unfortunately, if the projector uses a single fisheye lens
then the standard fisheye projection as described
here requires that both the lens and the
viewer be located at the center of the dome, which is obviously impossible.
As the viewer moves away from the center the image appears increasingly
distorted. This isn't normally a problem for large planetariums
where the radius of the dome is very much larger than the seating area and
in any case nothing can be done since there are multiple viewers.
For smaller domes the effect is more marked and the viewer can easily
move significant distances from the center. It is possible to create
a modified fisheye image so that, when viewed from a particular position,
the projected image appears undistorted; this is called an off-axis
fisheye image. This is the same principle employed when creating stereo
pairs, where one uses an off-axis (asymmetric) perspective frustum for
each eye.

Algorithm

Creating an off-axis fisheye is quite straightforward. First, create the
fisheye projection world vector
p as described here.
The vector
p' is derived as shown in the diagram on the right:
the new ray p' is the ray
p minus the vector to the
view position.
While the diagram shows the view position
along the y axis, the same construction applies
to any point in the x,y plane, as the dome viewer
will most commonly move around on that plane.
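
As a sketch, with p taken to be the point on the unit radius dome struck by
the standard fisheye ray and the view position expressed in the same
coordinates (the names and conventions are mine, not from the diagram):

#include <math.h>

/* Off-axis ray for one fisheye pixel: p = (px,py,pz) is the point on the
   unit dome given by the standard fisheye construction, (vx,vy,vz) is the
   viewer position.  The new ray direction is p minus the view position,
   normalised. */
void OffaxisRay(double px, double py, double pz,
                double vx, double vy, double vz,
                double *dx, double *dy, double *dz)
{
   double x = px - vx, y = py - vy, z = pz - vz;
   double len = sqrt(x * x + y * y + z * z);

   *dx = x / len;
   *dy = y / len;
   *dz = z / len;
}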

Note that as the viewer moves towards the rim the image gets increasingly
stretched in that direction. This violates the usual characteristic of angular
fisheye images, where distance in image space is proportional to angle in
fisheye space. Put another way, in a standard fisheye image all pixels have
the same dimensions in the projected image; this can be seen in the example
below.

Viewer at (0,0)
Viewer at (0.5,0.5)
 
Figure 1

A common approach to creating content for dome environments is to
render standard projections onto the faces of a cube. The off-axis
fisheye can be computed from these at interactive rates and so a
single viewer with a head tracking device can be presented with
a corrected image as they walk around the base of the dome.
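
A sketch of that lookup follows (the face naming, the hypothetical
SampleFace() routine, and the particular (s,t) conventions are assumptions
here; a real implementation must match the orientation of the rendered cube
faces):

#include <math.h>

enum { FACE_PX, FACE_NX, FACE_PY, FACE_NY, FACE_PZ, FACE_NZ };

/* Hypothetical routine returning the colour of face 'face' at (s,t) in 0..1 */
void SampleFace(int face, double s, double t, double rgb[3]);

/* Sample the cubic environment map in direction (x,y,z): pick the face
   with the dominant axis, then project onto that face. */
void SampleCube(double x, double y, double z, double rgb[3])
{
   double ax = fabs(x), ay = fabs(y), az = fabs(z);
   double s, t;
   int face;

   if (ax >= ay && ax >= az) {
      face = x > 0 ? FACE_PX : FACE_NX;
      s = 0.5 * (1 + y / ax);  t = 0.5 * (1 + z / ax);
   } else if (ay >= az) {
      face = y > 0 ? FACE_PY : FACE_NY;
      s = 0.5 * (1 + x / ay);  t = 0.5 * (1 + z / ay);
   } else {
      face = z > 0 ? FACE_PZ : FACE_NZ;
      s = 0.5 * (1 + x / az);  t = 0.5 * (1 + y / az);
   }
   SampleFace(face, s, t, rgb);
}

For each pixel of the off-axis fisheye one computes the ray as described
above and passes it to SampleCube().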

Test Pattern

The following test pattern was used to test the correctness of the
off-axis fisheye. It is a rendering of a
cubic room; each wall is a
different colour and is constructed with a regular grid of bars. If
the fisheye generation is correct the curved lines in the fisheye
image should appear straight when projected onto the dome. The
coordinate above each image is the position of the viewer.

Offset = (0,0)
Offset = (0,0.5)
Offset = (0,0.75)
Offset = (0,0.95)
Offset = (0.5,0.5)
Offset = (0.7,0.7)

This example is rendered in PovRay onto a
5 wall cubic environment map, and the off-axis fisheye
is created from these 5 images
(5view.ini,
5view.pov,
5viewcamera.pov).
This has the advantage that multiple off-axis fisheye images can be created
from one set of cubic environment maps; this would not be possible if the
off-axis fisheye were rendered directly in PovRay (something that is easy to
arrange by modifying the PovRay source code).

   

Creating offaxis projections from existing fisheye images

Given an existing fisheye image, an off-axis fisheye can be created
for any off-axis position. There is a full mapping for 180 degree
fisheye images; smaller angle images result in an incomplete
mapping.
The utility described below reads a TGA file containing a fisheye image and creates
an off-axis fisheye as another TGA file. Note that there are
important resolution issues to consider: this works best when
going from very high resolution fisheye images to lower
resolution ones. The code is a simple command line
based UNIX utility; it supports supersampling antialiasing
and a variable output image size.

offaxis tgafilename [options]
Options:
-a n set antialias level (Default: 1)
-w n width of the output image (Default: 500)
-h n height of the output image (Default: width)
-dx n x component of the offaxis vector (Default: 0)
-dy n y component of the offaxis vector (Default: 0)
-v debug/verbose mode (Default: off)
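For example, a hypothetical invocation (the file name is illustrative) producing
a 1024 pixel wide, antialiased image for a viewer offset along the x axis:
offaxis fisheye.tga -w 1024 -a 2 -dx 0.3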
Original fisheye image
X axis offset of 0.3
Y axis offset of 0.3
X and Y axis offset of 0.3

Photos of a Prototype Dome





Example Images used in the Prototype Dome





Note that for these tests an Elumens projector was used. This is a
standard projector fitted with their fisheye lens, and as such the projector
projects a 4:3 width to height ratio while fisheye images are 1:1.
This is addressed by clipping off 25% of the image (the 4:3 frame is only
three quarters as tall as it is wide, so one quarter of the square fisheye
must be discarded); in the case of the
Elumens dome this is normally the bottom 25%. In our case we chose to
clip the top 25%, which conveniently gave the viewers a position behind
the projector from which to view the images without being blinded.

Figure 2

Addendum

The off-axis correction as described above applies only to shifting the
observer around on the rim plane of the hemisphere. A similar correction
can be made to compensate for the observer being within the dome or, in
the more likely case, away from the rim of the dome (along the negative z axis,
see figure 1). This correction is
applied in exactly the same way, but note that it results in a reduced
field of view, unlike offset positions on the rim plane, which result in
stretching distortion but the same field of view.

Source URL:

http://paulbourke.net/dome/fisheye/
