Courses

Q. How many classes have used the book previously? At which universities?

A. To our knowledge, the book has been used as the primary textbook in over eighty different graphics courses at over twenty different universities. It has been used as a secondary textbook in a number of other courses. Most of these courses have been advanced undergraduate or graduate courses focusing on rendering. (We are no longer actively maintaining a list of such courses.)

Q. Are solutions available to the exercises in the book?

A. Unfortunately, no. Many of the exercises are substantial programming projects, or even open research questions.

Q. Are digital versions of the figures of the book available for use in course lectures?

A. Unfortunately, no. Feel free to use images from the gallery if they're helpful.

Using the system

Q. Is it possible to view the image as it is being rendered?

A. With the latest version of pbrt, pbrt-v4, there are two options. First, the --display-server command-line option can be used to specify an IP address and port for pbrt to connect to. During rendering, the in-progress image is then periodically transmitted using a custom protocol that is supported by Thomas Müller's excellent tev image viewer for display. Note that this approach also makes it possible to watch images as they are rendered on a remote machine.
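For example, something along these lines, where the scene file name is a placeholder and the address and port should match whatever tev reports it is listening on (127.0.0.1:14158 is its usual default):

    # Start tev so that it is listening for incoming image connections.
    tev &
    # Render, periodically sending the in-progress image to tev for display.
    pbrt --display-server 127.0.0.1:14158 scene.pbrt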

Alternatively, the --interactive command-line option can be specified, in which case pbrt will open a window on the local system to display the image. "WASD" keyboard navigation can be used to move the camera and the arrow keys can be used to adjust the camera orientation. (Either a CPU with many cores or GPU rendering is necessary for reasonable performance for anything other than very small images.)
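For example, again with a placeholder scene file name:

    pbrt --interactive scene.pbrt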

Q. How do I disable multi-threading in pbrt?

A. When debugging intermittent bugs that may be caused by race conditions, it can be useful to run without multi-threading; supply the command-line argument --nthreads=1 when running pbrt to do so.
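For example (the scene file name is a placeholder):

    pbrt --nthreads=1 scene.pbrt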

Q. Does pbrt ensure that the BSDFs used in a scene are energy conserving?

A. It depends. In the latest version of pbrt, pbrt-v4, all built-in BSDFs ensure energy conservation, clamping input reflectivities as necessary to do so. Previous versions of pbrt do not make this guarantee.

Q. What changes have been made to the file format between the second and third editions of the book?

A. Most existing pbrt scene files will work unmodified with the third version of the system. Please see the User's Guide for further details.

Also see the README.md file in the pbrt source distribution for more details on changes to the system's functionality (and corresponding file format changes).

Q. What changes have been made to the file format between the third and fourth editions of the book?

A. The file format itself has a handful of small changes; see the pbrt-v4 User's Guide for details. New with pbrt-v4, there is now a --upgrade command-line option that takes an input file in pbrt-v3 format and rewrites it as necessary to conform to pbrt-v4. (In certain obscure cases it will be unable to do so but will issue an error message that points to what must be fixed manually.)
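For example, assuming that the upgraded scene description is written to standard output, a pbrt-v3 scene file can be converted along these lines (file names are placeholders):

    pbrt --upgrade scene-v3.pbrt > scene-v4.pbrt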

Q. Why is my image noisy with pbrt-v4 if I render with one sample per pixel, even if there is no indirect lighting and the light sources are all point lights?

A. There are a number of possible reasons. First, all of pbrt-v4's integrators take only a single sample from a single light source at each ray intersection; it is no longer possible to sample all light sources at each intersection. This will generally lead to noise at low sample counts in scenes with multiple lights.

pbrt-v4 is also a spectral renderer. This implies that the RGB values in output images must be computed by integrating color response functions against the individual wavelength samples taken by the renderer. By default, pbrt-v4 uses four such samples, which is insufficient to give images without color noise at 1spp. (To reduce this noise, pbrt can be recompiled with a larger value of NSpectrumSamples in the file src/pbrt/util/spectrum.h in the pbrt-v4 source code distribution.)
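The declaration in question looks roughly like the following (its exact form may differ across versions of the source); increasing the count trades additional rendering time for reduced color noise:

    // In src/pbrt/util/spectrum.h: number of wavelengths sampled along each ray path.
    static constexpr int NSpectrumSamples = 8;  // the shipped default is 4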

Finally, pbrt-v4's "coateddiffuse" and "coatedconductor" materials are stochastic: evaluating and sampling them is done using Monte Carlo techniques. The noise from these materials is generally low, though it, too, may be visible at low sampling rates. If a greater number of pixel samples is not otherwise needed for image quality, this noise can instead be reduced by increasing the "nsamples" parameter of those materials.
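In a scene description, that might look like the following sketch, where the reflectance value is a placeholder and "integer nsamples" requests additional samples for the stochastic layered-material evaluation:

    Material "coateddiffuse"
        "rgb reflectance" [ 0.5 0.5 0.5 ]
        "integer nsamples" [ 4 ]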

Q. What facilities are available to help with debugging the system?

A. The latest version of pbrt, pbrt-v4, has a number of improvements that make debugging easier. Please see the debugging section in the pbrt-v4 User's Guide for more details.

Q. Is there any way to see what types are stored in TaggedPointer types in the debugger?

A. If you are using the lldb debugger, the file src/pbrt/pbrt_lldbdataformatters.py makes it possible to nicely print both TaggedPointer types (including the member variables of the actual type they hold) and various template classes from the pstd library.
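One way to load it is to import the script from your .lldbinit file; the path below is a placeholder for wherever the pbrt source is checked out:

    # In ~/.lldbinit, or entered at the lldb prompt:
    command script import /path/to/pbrt-v4/src/pbrt/pbrt_lldbdataformatters.py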

System implementation and internals

Q. How can a Camera implementation indicate that no ray should be traced for the given CameraSample value passed to its GenerateRay() method?

A. For some Cameras, not all rays will make it through the lens system. (For example, for a camera that models a realistic lens system, some rays will be blocked by elements inside the lens system.) In order to prevent rays from being traced for these samples, the camera should return a value of zero for the ray's weight. The main rendering loop skips rays with a weight of zero.
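As a rough sketch, following the pbrt-v3 interface in which GenerateRay() returns the ray's weight as a Float (the MyCamera class and the blocking test here are hypothetical):

    Float MyCamera::GenerateRay(const CameraSample &sample, Ray *ray) const {
        // Compute the candidate ray for this CameraSample...
        bool blocked = false;  // set to true if, e.g., the ray hits the aperture stop
        if (blocked)
            return 0;  // zero weight: the main rendering loop will skip this ray
        // Otherwise, initialize *ray and return its (non-zero) weight.
        return 1;
    }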

Q. [pbrt-v3] How come the range of pixel values that the Sampler needs to compute sample values for is more than the range from (0,0) to (xResolution, yResolution)?

A. There are two factors that affect the range of pixels for which Samplers need to generate samples: the crop window, if any, and the pixel reconstruction filter's extent. The first of these can reduce the number of pixels that need samples generated for them, and the second can increase it. In particular, it is often necessary to generate samples for pixels with negative coordinates and pixels with coordinates slightly greater than x/yResolution. (See discussion on p. 487 of the third edition of Physically Based Rendering.)
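For reference, pbrt-v3's Film::GetSampleBounds() computes this range by padding the (possibly cropped) pixel bounds by the filter's radius; in outline, it is roughly:

    // Expand the cropped pixel bounds by the filter radius so that samples near the
    // image edges still contribute to every pixel their filter footprint overlaps.
    Bounds2i Film::GetSampleBounds() const {
        Bounds2f floatBounds(
            Floor(Point2f(croppedPixelBounds.pMin) + Vector2f(0.5f, 0.5f) - filter->radius),
            Ceil(Point2f(croppedPixelBounds.pMax) - Vector2f(0.5f, 0.5f) + filter->radius));
        return (Bounds2i)floatBounds;
    }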

Q. How do I add a new Shape (or Camera, or Material, or ...) to the system?

A. For pbrt-v3:

For pbrt-v4:

Q. How do I add a new BSDF to the system?

A. Recall that Material implementations create BxDFs and add them to the BSDFs they return. Therefore, you must either implement a new Material class that returns your BSDF or modify an existing material to use your BSDF based on the parameter values passed to it. See the previous FAQ for how to add a new material to pbrt.
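As a rough pbrt-v3-style sketch, a material that wraps a single custom BxDF might look something like the following, where MyMaterial and MyBxDF are hypothetical placeholders:

    void MyMaterial::ComputeScatteringFunctions(SurfaceInteraction *si, MemoryArena &arena,
                                                TransportMode mode, bool allowMultipleLobes) const {
        // Allocate a BSDF for this intersection and add the custom BxDF to it.
        si->bsdf = ARENA_ALLOC(arena, BSDF)(*si);
        si->bsdf->Add(ARENA_ALLOC(arena, MyBxDF)(/* parameters, e.g., evaluated from textures */));
    }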

Q. How come only NVIDIA GPUs are supported for GPU rendering with pbrt-v4?

A. GPU rendering support in pbrt-v4 requires a GPU that is capable of executing C++17 code and that provides unified memory that can be addressed by both the CPU and GPU—capabilities that are available today through CUDA on NVIDIA GPUs. If other vendors' GPUs have similar capabilities, we would be delighted to include support for them, so long as significant changes aren't required in the core C++ rendering code. Pull requests are welcome!