Rendering (computer graphics)
== Scientific and mathematical basis ==
{{Main|Unbiased rendering}}
{{Unreferenced section|date=December 2024}}
The implementation of a realistic renderer always has some basic element of physical simulation or emulation{{snd}} some computation which resembles or abstracts a real physical process. The term "''[[physically based rendering|physically based]]''" indicates the use of physical models and approximations that are more general and widely accepted outside rendering. A particular set of related techniques has gradually become established in the rendering community.

The basic concepts are moderately straightforward, but intractable to calculate, and a single elegant algorithm or approach has been elusive for more general-purpose renderers. To meet the demands of robustness, accuracy, and practicality, an implementation will be a complex combination of different techniques.

Rendering research is concerned with both the adaptation of scientific models and their efficient application.

Mathematics used in rendering includes: [[linear algebra]], [[calculus]], [[numerical analysis|numerical mathematics]], [[digital signal processing|signal processing]], and [[Monte Carlo methods]].

=== The rendering equation ===
{{Main|Rendering equation}}
This is the key academic/theoretical concept in rendering. It serves as the most abstract formal expression of the non-perceptual aspect of rendering. All more complete algorithms can be seen as solutions to particular formulations of this equation.

: <math>L_o(x, \omega) = L_e(x, \omega) + \int_\Omega L_i(x, \omega') f_r(x, \omega', \omega) (\omega' \cdot n) \, \mathrm d \omega'</math>

Meaning: at a particular position and direction, the outgoing light (L<sub>o</sub>) is the sum of the emitted light (L<sub>e</sub>) and the reflected light. The reflected light is the sum of the incoming light (L<sub>i</sub>) from all directions, multiplied by the surface reflection and the cosine of the incoming angle.
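In practice the integral over the hemisphere has no closed form for general scenes, so renderers estimate it numerically with Monte Carlo methods. The following sketch (illustrative only, not any particular renderer's implementation) estimates the reflection integral for a Lambertian (ideal diffuse) surface under a caller-supplied incoming-radiance function; the function and parameter names are assumptions made for the example.

```python
import math
import random

def sample_hemisphere():
    """Uniformly sample a direction on the unit hemisphere around n = (0, 0, 1)."""
    u, v = random.random(), random.random()
    z = u  # cos(theta), uniform in [0, 1)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * v
    return (r * math.cos(phi), r * math.sin(phi), z)

def reflected_radiance(incoming_radiance, albedo, num_samples=100_000):
    """Monte Carlo estimate of the reflection term of the rendering equation,
    integral of L_i(w') * f_r * (w' . n) over the hemisphere, for a Lambertian
    BRDF f_r = albedo / pi. Uniform hemisphere sampling has pdf 1 / (2*pi),
    so each sample is weighted by 2*pi."""
    f_r = albedo / math.pi
    total = 0.0
    for _ in range(num_samples):
        w = sample_hemisphere()
        cos_theta = w[2]  # w' . n, with n = (0, 0, 1)
        total += incoming_radiance(w) * f_r * cos_theta
    return total * 2.0 * math.pi / num_samples

# Constant incoming radiance L_i = 1 from every direction: the estimate
# converges to the albedo (the Lambertian BRDF conserves energy).
random.seed(0)
estimate = reflected_radiance(lambda w: 1.0, albedo=0.8)
```

With enough samples the estimate approaches 0.8, since for constant incoming light the integral of (albedo/π)·cos θ over the hemisphere equals the albedo exactly.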
By connecting outward light to inward light, via an interaction point, this equation stands for the whole 'light transport'{{snd}} all the movement of light{{snd}} in a scene.

=== The bidirectional reflectance distribution function ===
The '''[[bidirectional reflectance distribution function]]''' (BRDF) expresses a simple model of light interaction with a surface as follows:

: <math>f_r(x, \omega', \omega) = \frac{\mathrm d L_r(x, \omega)}{L_i(x, \omega')(\omega' \cdot \vec n) \mathrm d \omega'}</math>

Light interaction is often approximated by the even simpler models of diffuse reflection and specular reflection, although both can also be described as BRDFs.

=== Geometric optics ===
Rendering is practically exclusively concerned with the particle aspect of light physics{{snd}} known as [[geometrical optics]]. Treating light, at its basic level, as particles bouncing around is a simplification, but an appropriate one: the wave aspects of light are negligible in most scenes and are significantly more difficult to simulate. Notable wave-aspect phenomena include diffraction (as seen in the colours of [[Compact disc|CDs]] and [[DVD]]s) and polarisation (as seen in [[Liquid-crystal display|LCDs]]). Both types of effect, if needed, are simulated by appearance-oriented adjustment of the reflection model.

=== Visual perception ===
Though it receives less attention, an understanding of [[human visual perception]] is valuable to rendering, mainly because image displays and human perception have restricted ranges. A renderer can simulate a wide range of light brightness and color, but current displays{{snd}} movie screen, computer monitor, etc.{{snd}} cannot handle so much, and something must be discarded or compressed. Human perception also has limits, and so does not need to be given large-range images to create realism.
This can help solve the problem of fitting images into displays and, furthermore, suggest which short-cuts could be used in the rendering simulation, since certain subtleties will not be noticeable. This related subject is [[tone mapping]].

=== Sampling and filtering ===
One problem that any rendering system must deal with, no matter which approach it takes, is the '''sampling problem'''. Essentially, the rendering process tries to depict a [[continuous function]] from image space to colors using a finite number of pixels. As a consequence of the [[Nyquist–Shannon sampling theorem]] (or Kotelnikov theorem), any spatial waveform that can be displayed must span at least two pixels; the finest representable detail is therefore proportional to [[image resolution]]. In simpler terms, an image cannot display details (peaks or troughs in color or intensity) that are smaller than one pixel.

If a naive rendering algorithm is used without any filtering, high frequencies in the image function will cause ugly [[aliasing]] in the final image. Aliasing typically manifests itself as [[jaggies]]: jagged edges on objects where the pixel grid is visible. To remove aliasing, all rendering algorithms (if they are to produce good-looking images) must apply some kind of [[low-pass filter]] to the image function to remove high frequencies, a process called [[Spatial anti-aliasing|antialiasing]].
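The tone mapping mentioned above can be illustrated with the classic Reinhard global operator, one well-known way (among many) of compressing a renderer's unbounded luminance range into a display's limited one; the code below is a minimal sketch, not a complete tone-mapping pipeline.

```python
def reinhard_tonemap(luminance):
    """Reinhard global operator: maps luminance in [0, inf) into [0, 1).
    Bright values are compressed far more than dark ones, reflecting the
    fact that displays cannot reproduce the full simulated range."""
    return luminance / (1.0 + luminance)

# High-dynamic-range luminances spanning several orders of magnitude...
hdr = [0.01, 0.5, 1.0, 10.0, 1000.0]
# ...all land in the displayable [0, 1) range after tone mapping,
# while their ordering (relative brightness) is preserved.
ldr = [reinhard_tonemap(l) for l in hdr]
```

Note the operator is monotonic, so brighter scene points remain brighter on screen even though the absolute differences are heavily compressed.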
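The aliasing problem described above can be demonstrated in one dimension. The sketch below (an illustrative toy, with all names invented for the example) renders a stripe pattern whose period matches the pixel size: naive point sampling hits only one stripe color, while supersampling{{snd}} averaging many sub-pixel samples, a simple box low-pass filter{{snd}} recovers the correct average.

```python
def checker(x):
    """A 1-D 'scene': alternating black/white stripes of width 0.5 units."""
    return float(int(x // 0.5) % 2)

def render(num_pixels, width, samples_per_pixel):
    """Render the stripe pattern into num_pixels pixels covering [0, width).
    samples_per_pixel = 1 is naive point sampling; larger values average
    evenly spaced sub-samples (supersampling), a basic form of antialiasing."""
    pixel_size = width / num_pixels
    image = []
    for p in range(num_pixels):
        total = 0.0
        for s in range(samples_per_pixel):
            x = (p + (s + 0.5) / samples_per_pixel) * pixel_size
            total += checker(x)
        image.append(total / samples_per_pixel)
    return image

# Each pixel covers one full black+white period, so the true average
# is 0.5 everywhere. Point sampling lands on the white stripe each time
# and reports pure white; supersampling recovers the correct gray.
aliased = render(4, 4.0, samples_per_pixel=1)    # -> [1.0, 1.0, 1.0, 1.0]
filtered = render(4, 4.0, samples_per_pixel=64)  # -> [0.5, 0.5, 0.5, 0.5]
```

The point-sampled image is uniformly white{{snd}} a gross misrepresentation of a half-black scene{{snd}} which is exactly the kind of error that appears as jaggies and moiré patterns in two dimensions.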