Interaxial separation

Here is a stereo 360 of a Sydney Harbour view


Here the separation between the lenses of my twin DSLR rig is 30cm. This large value exaggerates the depth visible in the scene, so even quite distant areas show some depth. If there were areas close to the camera there would in fact be too much apparent depth in those areas, so the camera needs to be up high and away from any close poles, walls etc. for such a large camera interaxial to work.
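As a rough sanity check on how interaxial trades off against distance (the figures here are my own illustration, not measurements from this shoot): the angular disparity between the two views of a distant point is roughly the interaxial divided by the distance.

```python
import math

def angular_disparity_deg(interaxial_m, distance_m):
    """Approximate angular disparity (in degrees) between left and right
    views of a point at distance_m, for lenses interaxial_m apart."""
    return math.degrees(2 * math.atan((interaxial_m / 2) / distance_m))

# A 30cm interaxial still gives measurable disparity ~500m away...
print(round(angular_disparity_deg(0.30, 500), 4))   # ~0.034 degrees
# ...while normal ~6.5cm eye spacing gives far less at that range.
print(round(angular_disparity_deg(0.065, 500), 4))  # ~0.007 degrees
```

This is why the wide rig pulls depth out of the distant harbour, and also why anything close to the camera would show excessive disparity.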

The screen window is set at about 30m, so there are some window violations in the foreground grassy area beneath the camera, but these are low contrast areas, and depth discrimination overall is usually better if there is not too much overall disparity in the more important areas of the scene. (It is the range of visible depth differences, rather than total depth, that creates the strongest 3d impression. Consider the furthest features in the scene, for example the Harbour Bridge: there are wide anaglyph fringes along it in the panorama, but they are the same width all along the bridge, so no actual depth structure is seen within the bridge itself.)

With a high, isolated viewpoint, though, you can be at risk of losing some feeling of 3d immersion in the scene, even with a high interaxial separation. Compare “immersivity” in this panorama with the depth in this close up view of some zombies (taken with a 4 camera miniature rig) where the interaxial between adjacent pairs of cameras is only about 3.5cm.

Some technical details of the shooting and stitching process of the panorama: Canon 5DMkIIs with 10.5mm Nikkors, using the sRAW1 setting. 40 frames per camera in 16 seconds with a rotating turntable, i.e. one exposure per camera every 0.4 seconds (using an intervalometer and a spliced cable release). This, btw, is the fastest rate possible for shooting Raw files with these cameras in long continuous sequences.
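The timing works out as follows (trivial arithmetic on the figures above):

```python
frames = 40          # shots per camera for a full 360
total_time_s = 16.0  # time for one full rotation

interval_s = total_time_s / frames   # intervalometer period
step_deg = 360.0 / frames            # yaw between adjacent exposures

print(interval_s)  # 0.4 seconds per exposure, as in the text
print(step_deg)    # 9.0 degrees of rotation between frames
```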

Here the cameras are symmetrically arranged on the camera rotator. For calibrating the primary camera (the one whose yaw, pitch and roll values I will use for the other camera) I used the masking feature in PTGui to force the program to find common points between frames only in very distant parts of the scene. This meant masking all the foreground areas, the trees and the closer building. Very often you wouldn't be able to do this and still have common areas for point finding between frames, but here, with the distant views and the high viewpoint, it was possible.

Twin camera configurations — options

The usual approach for shooting stereo panoramas with twin camera rigs is to have both cameras equidistant from the rotation point. When stitching regular (monocular) panoramas there is the basic concept of the No Parallax Point (NPP). This is a point in space, usually some distance behind the front of the lens; a camera rotating around that point will make images that can be stitched very precisely.

In an equidistant layout neither camera rotates around its own NPP, so the equidistant concept is bound to produce stitching issues. This parallax problem with stereo rigs is why stereo panos with twin rigs require lots of shots for each camera: more shots means smaller angular steps, so the errors between adjacent pairs are reduced.

An alternative arrangement is to have one camera rotating on axis, in NPP fashion, with the other camera swinging around it at the full interaxial radius, i.e. with a greater parallax error than before. So the first camera can have perfect stitching, and the second camera will require more shots (maybe twice as many) in a sequence to get the same quality of stitching as in an equidistant setup. Here is an example of this kind of rig.


The camera on the right here will rotate in an NPP manner. The advantage of this approach is that one camera, the NPP one, can be calibrated perfectly, so that we know the roll, pitch and yaw values of every camera position in a 360 rotation accurately. And since the L and R cameras are in a rigid assembly those values can be used in the stitching of the other camera. If the cameras are in an equidistant layout it is often impossible to get an accurate calibration of these values from either camera (unless there are distant, static areas in each shot of a sequence). I discuss these issues in this thread on the Yahoo Panotools NG forum.
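The shot-count tradeoff between the equidistant and on-axis (NPP) layouts can be sketched geometrically (the function and figures below are my own illustration, assuming a 30cm interaxial): the stitching parallax at a seam is proportional to how far the entrance pupil travels between adjacent shots.

```python
import math

def pupil_chord_m(radius_m, shots_per_360):
    """Distance the entrance pupil moves between adjacent shots when it
    orbits the rotation axis at radius_m. Seam parallax error is
    proportional to this chord (divided by the scene distance)."""
    step = 2 * math.pi / shots_per_360
    return 2 * radius_m * math.sin(step / 2)

b = 0.30  # 30cm interaxial

# Equidistant rig: both cameras orbit at b/2, both have seam parallax.
equidistant = pupil_chord_m(b / 2, 40)

# NPP rig: the on-axis camera has zero parallax, but the off-axis camera
# orbits at the full interaxial b.
off_axis_40 = pupil_chord_m(b, 40)
off_axis_80 = pupil_chord_m(b, 80)

print(off_axis_40 / equidistant)  # ~2x the error at the same shot count
print(off_axis_80 / equidistant)  # back to ~1x with double the shots
```

Which matches the rule of thumb above: the off-axis camera needs roughly twice as many shots to stitch as well as either camera of an equidistant pair.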

So here is an example of a stereo panorama shot with the rig in the picture (twin Canon 5DMkII DSLRs with 10.5mm Nikkors with adapters).

An extension of the idea is to have 3 cameras, with the center one rotating in an NPP manner, and use its calibration for the L and R cameras. The center camera is just used for calibrating the motion of the rotation, and you have minimal parallax for the L/R cameras. This is a bulky rig if the cameras are large. (If the center camera is a Gopro the rig is much smaller, even with L/R DSLRs.) Here is an (interactive scroller) panorama of a very ferny part of the Blue Mountains near Sydney, done with a 3 DSLR rig.



“Flat” scrolling 3d panorama viewers

One of the main problems with most current methods of generating and displaying stereo panoramas is that a stitched, vertically wide 360 panorama pair in the usual formats (eg. equirectangular, cubic or cylindrical) cannot work properly in 3d when dewarped in an interactive viewer with views that are very wide angle and/or much tilted up or down. In fact, with spherical 3d panoramas the view goes from stereo to pseudostereo as you rotate around the view while looking at the nadir or zenith. (For this reason many people set the zero parallax point to be where the ground is with spherical stereo panoramas, and the zenith is often featureless in outdoor shots anyway.)
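A toy model shows why the effect collapses and then reverses (this is my own simplification, not a full treatment of the viewing geometry): only the component of the capture baseline that is horizontal in the displayed view contributes usable disparity, and near the poles rotating the view rotates that component right through zero and out the other side.

```python
import math

def effective_baseline(b_m, view_rotation_deg):
    """Horizontal-baseline component seen in the displayed view, for a
    stereo pair captured with baseline b_m, after rotating the view by
    view_rotation_deg relative to the capture orientation. Negative
    values mean left/right are effectively swapped (pseudostereo).
    A simplified model for illustration only."""
    return b_m * math.cos(math.radians(view_rotation_deg))

b = 0.30
print(effective_baseline(b, 0))              # full stereo
print(round(effective_baseline(b, 90), 10))  # ~0: depth collapses
print(effective_baseline(b, 180))            # negative: pseudostereo
```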

One way around this is to display such panoramas in unwarped form, i.e. with simple scrolling viewers. For example, this Gopro 3d panorama I shot recently of a boat race preparation scene:


(You can drag the view left or right, use the mouse wheel to scroll the view sideways, or use the controls.) This is done using a Jquery script. It is much the best Javascript scroller I have found, btw, and it is meant to work on mobile devices too. Such scrollers are also good for interactive interlaced 3d panorama presentations on passive 3d monitors and TVs.

Another way to view scrolling 3d panoramas is with Stereophoto Maker (free, Windows only). Just load the panorama as a stereo image (File/OpenStereoImage), then go View/PanoramaMode. You can have the panorama scroll automatically with View/HorizontalAutoscroll. This can be used with a variety of stereo format panoramas.

Here is a Gopro panorama image you can try this out with if you like.

This one, btw, was done with a 4 second rotation. One thing to keep in mind with anaglyph jpgs is that normal compression methods lose 3d quality (as the color information is degraded by the chroma subsampling most JPEG encoders apply), but you can use the lossless color output of Stereophotomaker, or convert from tif etc. to jpg with Irfanview with its “Disable chroma color subsampling” option checked.
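If you are scripting the conversion instead, Pillow's JPEG encoder can also be told to skip chroma subsampling (this is an alternative of my own to the Irfanview route above, not part of the original workflow):

```python
from PIL import Image

# Stand-in 64x64 solid red image in place of a real red/cyan anaglyph frame.
img = Image.new("RGB", (64, 64), (255, 0, 0))

# subsampling=0 forces 4:4:4, i.e. full-resolution colour channels,
# which anaglyphs need to keep the left/right colour separation clean.
img.save("anaglyph.jpg", "JPEG", quality=95, subsampling=0)
```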


Stereo panorama links

Here are some links to stereo panorama related stuff:

Here is a camera rotator rig I built for twin Canon 5DMkII DSLRs, and here are a couple of 3d panoramas I have shot with it:

Here is a HMD art project (Conversations) I made the stereo panoramas for a while back. The application had some cool features: head tracking, video characters in the panoramas, communication between multiple users, directional sound:

Some 3d multimedia viewers that have some support for stereo textures (which is what you need for “dewarped” stereo panorama viewers): Panda3d (I haven't played with this) is free.

When I first got interested in stereo panoramas I played a lot with Michel Husak's viewer (free).
This works great on XP for regular 3d panoramas with a Quadro card and a CRT monitor, but it doesn't work on Win7. The source code is available.

There is a plugin for Unity3d which could be used for headtracked HMD stereo panoramas, I think (free).

Bitmanagement's BS Contact Stereo is a VRML viewer with support for custom extensions providing stereo textures, which can be used to create a 3d skybox. This viewer has a native panorama navigation mode so you don't have to code that separately. I will post a VRML file showing how to script that for 3d panoramas soon. This viewer is not free, but the demo version works ok if you don't mind flying logos across the panoramas. The viewer also supports stereo video textures, I think, which makes it one of the few to support stereo video panoramas.

WorldViz's Vizard supports stereo textures (they added it at my request), as does Virtools.

Virtools was used for scripting stereo panoramas I shot at Angkor in 2003 for a large scale vr installation at Melbourne Museum (the VROOM — a 6 screen back-projected fishtank concept)

A very nice rig for twin camera DSLR stereo capture by Aaron Spence

This looks like it would be an interesting article on runtime stereo panorama parallax adjustment

Archibald Fountain in Hyde Park, Sydney


I have been shooting 3d panoramas lately with a pair of Gopro Hero2 cameras equipped with the Gopro 3d kit. I have built some motorized rigs (spinners) for shooting variable interaxial panoramas with these cameras, and this panorama was done with one of those rigs. It was shot in video mode at 24 fps using the Protune setting, which Gopro refers to as “1080p 24 T”. The rotator I used here does a 360 rotation in about 4 seconds, so this is about 100 frames from each camera for a 360. It is stitched with PTGui. The lens separation here is about 20cm. I will write some tutorials for this site soon about the PTGui workflow.
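For reference, the frame cadence implied by those figures (simple arithmetic on the numbers above):

```python
fps = 24           # Gopro Protune "1080p 24 T" video mode
rotation_s = 4.0   # the spinner's time for a full 360

frames_per_360 = fps * rotation_s   # 96 frames, i.e. "about 100"
step_deg = 360.0 / frames_per_360   # yaw between adjacent video frames

print(frames_per_360)  # 96.0
print(step_deg)        # 3.75 degrees per frame
```

The small angular step is part of why video-mode spinning stitches well: adjacent frames overlap heavily, keeping per-seam parallax errors small.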