Blender is very friendly to mixing multiple layers and Scenes in the compositor. It would open up even more options if this were also possible in material nodes. Today this can only be done manually, with the following steps:
1) Manually render a Render Layer (usually via the compositor)
2) Output the desired passes of the render layer to a file
3) Bring in the rendered image as a texture in a material in a second render layer
4) Manually render the second layer.
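For reference, the manual workflow above can be scripted with Blender's Python API (run inside Blender). This is only a sketch: the layer names, material name, and file path are hypothetical placeholders, and which socket you plug the texture into depends on your material setup.

```python
import bpy

scene = bpy.context.scene
scene.render.filepath = "/tmp/first_layer.png"  # hypothetical output path

# 1-2) Render the first render layer and write the result to disk.
bpy.ops.render.render(write_still=True, layer="MaskLayer")

# 3) Load the rendered image and feed it into a material
#    used by the second render layer.
img = bpy.data.images.load("/tmp/first_layer.png")
mat = bpy.data.materials["SecondPassMaterial"]  # hypothetical material
mat.use_nodes = True
tex = mat.node_tree.nodes.new("ShaderNodeTexImage")
tex.image = img

# For example, drive a Mix Shader's Fac from the pre-rendered lighting,
# which is exactly what Cycles can't do with a live shader output:
mix = mat.node_tree.nodes.new("ShaderNodeMixShader")
mat.node_tree.links.new(tex.outputs["Color"], mix.inputs["Fac"])

# 4) Render the second layer, which now samples the first one.
bpy.ops.render.render(write_still=True, layer="BeautyLayer")
```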
This workflow lets you conveniently render lots of useful masks and data. More importantly, it lets you work around a big limitation of Cycles: that you can't use a shader output as a fac/mask within a material. By routing it through a previous render layer, you can then drive material effects based on lighting and shading.
While there likely wasn't much demand for this sort of thing in the past, the massive speed advances from the new Denoiser mean that multi-layer rendering is now much more viable. It would be a good time to make this sort of workflow more accessible and convenient.
I suggest adding a Render Layer/Scene node to material nodes that exposes its passes as outputs, exactly like the Render Layers node in the compositor. Also let the user control the order in which Render Layers and Scenes are rendered. When a render layer is completed, it would be saved to disk or cached in memory, and that data would then be available to later layers. The node can output black if its source layer has not been rendered yet. The render layer node would need a vector input to take a texture coordinate (defaulting to Camera).
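The ordering, caching, and black-fallback behavior proposed above can be sketched in plain Python (no Blender API; every name below is hypothetical, just to show the control flow, not an implementation):

```python
# A minimal sketch of the proposed scheduling: layers render in a
# user-defined order, each finished layer is cached, and a Render Layer
# node samples the cache, falling back to black if its source layer
# has not been rendered yet.

BLACK = (0.0, 0.0, 0.0)

class LayerScheduler:
    def __init__(self, order):
        self.order = order   # user-controlled render order
        self.cache = {}      # layer name -> rendered result (callable on a UV)

    def sample(self, layer_name, uv):
        # What a Render Layer node inside a material would do:
        # look up the cached layer at a texture coordinate,
        # or return black if that layer isn't rendered yet.
        result = self.cache.get(layer_name)
        if result is None:
            return BLACK
        return result(uv)

    def render_all(self, render_fn):
        for name in self.order:
            # render_fn may itself call sample() to read earlier layers
            self.cache[name] = render_fn(name, self.sample)

def demo_render(name, sample):
    if name == "mask":
        return lambda uv: (1.0, 1.0, 1.0)   # a plain white mask pass
    # "beauty" uses the previously rendered mask as a factor
    return lambda uv: tuple(0.5 * c for c in sample("mask", uv))

sched = LayerScheduler(["mask", "beauty"])
sched.render_all(demo_render)
print(sched.sample("beauty", (0.5, 0.5)))   # mask was cached first
print(sched.sample("missing", (0.5, 0.5))) # un-rendered layer: black
```

The key design point is the black fallback: it keeps the node graph evaluable even when the user orders layers so that a dependency hasn't been rendered yet, instead of failing the render.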