A material is an asset that can be applied to geometry to define its shading. This shading is usually only visible when light hits the surface, so lighting plays a huge role in how materials look. The exception is unlit shading models, which ignore lighting entirely.
In a material you can set the values of the parameters exposed by the shader, such as color, roughness, and so on. Most of the time when we talk about a shader, it’s a BSDF based on the work of Brian Karis. You can read his paper here:
2013_Siggraph_UE4_BrianKaris.pdf
This collection of papers does a great job of explaining the groundwork of real-time PBR: https://blog.selfshadow.com/publications/s2012-shading-course/
Mind you, not all shaders are BSDFs! https://www.shadertoy.com/ has some great examples of crazy shaders written in GLSL.
BSDF is a superset and generalization of the BRDF and BTDF. Any BxDF can be described as a four-dimensional function of the light direction (l) and view direction (v), where each direction is expressed as two angles in the local frame defined by the surface normal (n) and surface tangent (t).
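In symbols, writing each direction as a polar angle θ and an azimuthal angle φ measured in the local frame given by n and t, the four-dimensional nature of a BxDF is:

```latex
f(\mathbf{l}, \mathbf{v}) = f(\theta_l, \phi_l, \theta_v, \phi_v)
```

Two angles for the light direction plus two for the view direction give the four dimensions.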
The Base Color, Roughness, and Normal inputs of a BSDF are parameters that feed its calculations. These inputs are always numbers, but we can supply those numbers as flat values, math expressions, or textures. Textures are really just two- or three-dimensional collections of numbers.
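As a sketch of that idea (illustrative names, not any engine’s API), a texture can be modeled as a plain 2D grid of numbers that a shader samples with UV coordinates:

```python
# A texture is just a grid of numbers. Here, a tiny 2x2 grayscale
# "texture" is sampled with UV coordinates in [0, 1] using
# nearest-neighbor lookup. All names are illustrative.

def sample_nearest(texture, u, v):
    """Return the texel nearest to the UV coordinate (u, v)."""
    height = len(texture)
    width = len(texture[0])
    # Map UVs to texel indices, clamping to the valid range.
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

# A 2x2 checkerboard: just four numbers.
checker = [
    [0.0, 1.0],
    [1.0, 0.0],
]

print(sample_nearest(checker, 0.1, 0.1))  # top-left texel -> 0.0
print(sample_nearest(checker, 0.9, 0.1))  # top-right texel -> 1.0
```

Real engines add filtering (bilinear, mipmaps) on top, but the underlying data is still just numbers indexed by coordinates.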
The bidirectional scattering distribution function (BSDF) radiometrically characterizes the scatter of optical radiation from a surface as a function of the angular positions of the incident and scattered beams. By definition, it is the ratio of the scattered radiance to the incident irradiance; the unit is inverse steradian. The term bidirectional reflectance distribution function (BRDF) is used when specifically referring to reflected scatter. Likewise, the bidirectional transmittance distribution function (BTDF) refers to scatter transmitted through a material.
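Written out, that ratio definition is:

```latex
f(\mathbf{l}, \mathbf{v}) = \frac{\mathrm{d}L_o(\mathbf{v})}{\mathrm{d}E_i(\mathbf{l})} \quad \left[\mathrm{sr}^{-1}\right]
```

where \(L_o\) is the scattered (outgoing) radiance toward the view direction and \(E_i\) is the irradiance arriving from the light direction, which is why the unit comes out as inverse steradian.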
A shader is a small program that tells the GPU how to shade an object on the screen and the calculations required to get there.
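To make that concrete, here is a CPU-side sketch of what a pixel shader computes for one pixel, using simple Lambertian diffuse lighting (N·L) as the shading model. The function names and the choice of Lambert are illustrative assumptions; real shaders run on the GPU in languages like HLSL or GLSL.

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    """Dot product of two vectors."""
    return sum(x * y for x, y in zip(a, b))

def pixel_shader(base_color, normal, light_dir):
    """Lambertian diffuse: output = base_color * max(N.L, 0)."""
    n = normalize(normal)
    l = normalize(light_dir)
    n_dot_l = max(dot(n, l), 0.0)
    return tuple(c * n_dot_l for c in base_color)

# Light hitting the surface head-on: full base color comes through.
print(pixel_shader((1.0, 0.5, 0.25), (0, 0, 1), (0, 0, 1)))
```

The GPU runs a program like this once per pixel, in parallel, which is why shaders are kept small and purely computational.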
Shaders and materials are used in different ways depending on the game engine. In Unreal Engine, for example, the shader that gets compiled can depend on the selected target platform.
There are generally two approaches to materials: PBR (physically based rendering/shading) and NPR (non-photorealistic rendering/shading). The type of shader you use doesn’t imply one or the other, because there are plenty of ways to “break” a PBR shader by giving it “wrong” inputs (more on that later).
Physically Based Rendering (PBR) is a method of shading and rendering that provides a more accurate representation of how light interacts with surfaces. It can be referred to as Physically Based Rendering (PBR) or Physically Based Shading (PBS). Depending on what aspect of the pipeline is being discussed, PBS is usually specific to shading concepts and PBR specific to rendering and lighting. However, both terms describe, on the whole, the process of representing assets from a physically accurate standpoint. - Wes McDermott