Three.js: NodeMaterial

Created on 3 Nov 2015  ·  161 Comments  ·  Source: mrdoob/three.js

Hi.

I started developing a THREE.NodeMaterial to reconcile the material differences between 3D authoring tools. SEA3D Studio has options to create layers in albedo with masks and various blend modes, rim shaders and more, without needing custom shader code. I would like to bring this to Three.JS with a node-based shader.

I think that MeshPhongMaterial, MeshPhysicalMaterial and others can easily be based on NodeMaterial, through an interface for backward compatibility or a simple proxy.

UPDATED
http://sunag.github.io/sea3d/Labs/Three.JS-NodeMaterial/webgl_materials_nodes.html
http://sunag.github.io/sea3d/Labs/Three.JS-NodeMaterial/webgl_postprocessing_nodes.html

Syntax example that uses UV1 or UV2 for a texture:

var usesUv2 = true;
var isLightmap = false;

var t = new THREE.NodeTexture( texture, new THREE.NodeUV( usesUv2 ) );

var nodemat = new THREE.NodePhongMaterial();
if (isLightmap) nodemat.light = t;
else nodemat.color = t;
nodemat.build(); // build shader

I am making an editor too; currently this is the interface. Color is the albedo and transform is the vertex position.
[image: editor interface]

I am also making sure that it can be used in deferred shading. Next I will create reflection and refraction inputs.

I will be sharing news in the PR; suggestions, tests and enhancements are welcome :+1:

Enhancement

Most helpful comment

Okay, TS support is ready with the next release R017 🙌

All 161 comments

Interesting!

@sunag We are very interested in something like this. How can I help? I can add support for Standard at least.

I am interested in creating fairly arbitrary graphs, so that the intermediate nodes also take inputs. So you can have a graph that looks like this:

A Texture(tex1, uv1)
B Texture(tex2, uv2)
C Blend(A,B, mode)
D Noise(param1, param2)
E Blend(C,D, mode)

And then use that final node E as an input to a Material.

So it is very arbitrary, not limited to just textures.
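For illustration, a rough sketch of that graph in the node syntax from this thread (NodeBlend and NodeNoise are hypothetical here; only NodeTexture and NodeUV exist in the current code):

var a = new THREE.NodeTexture( tex1, new THREE.NodeUV( false ) ); // uv1
var b = new THREE.NodeTexture( tex2, new THREE.NodeUV( true ) );  // uv2
var c = new THREE.NodeBlend( a, b, 'multiply' );                  // hypothetical
var d = new THREE.NodeNoise( param1, param2 );                    // hypothetical
var e = new THREE.NodeBlend( c, d, 'multiply' );

var nodemat = new THREE.NodePhongMaterial();
nodemat.color = e; // the final node E feeds the material
nodemat.build();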

My goal would be to help create this in the next few weeks, hopefully collaborating with you in the next week or so if possible. I was looking at doing this via the shadergraph library here: https://github.com/mrdoob/three.js/issues/7339 But I am fine with creating it within ThreeJS directly. Really any solution is good as long as it is flexible and it works.

I'm reading through your code; it is quite nicely designed. I have some initial feedback. Could I start a PR using your code as a base to start collaborating on it?

(1) I'd have the material resolve the references. Basically references would have names that they would ask their material to resolve, and the material would give back a snippet of code on how to access that data. This would also allow the material to know what variables (uniforms/varyings) are used by the nodes, so it can optimize appropriately. This also allows different materials to resolve references differently, thus making the nodes more portable, rather than having to know how the materials implement things, especially when there are differences between fragment and vertex implementations.

(2) I'd try to use the GeometryContext object I created in the lights refactor; it gives consistent access to a lot of the required local variables. But of course that can be resolved by the material itself.

(3) I'd have UV just be another reference, which is resolved by the material. And I would have NodeTexture actually take a Node input, thus allowing for procedurally generated UVs.

(4) I would call NodeCube NodeTextureCube, to be consistent with the rest of Three.JS. And I would remove the logic for how to actually do the ray casts from it. But I like the idea of standard cube maps, so I would actually not put the environment query for specular or diffuse in the nodes, but have that in the base phong material, and only let you control the normal used for the query, or the cubemap result itself (thus allowing it to be a procedurally determined color.) Does that make sense? So I would have the normal pluggable in the material and the cube texture pluggable (queryable by a direction and bias/LOD, returning a color). Thus one can provide one cube texture to the irradiance map and another to the specular map. We can swap out a true sampleCube with @tschw's cubeToUV2 function as just a node swap.
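Roughly, the pluggable pieces could look like this (a sketch; the normal and environment slot names are hypothetical):

var nodemat = new THREE.NodePhongMaterial();

// only the normal used for the query is pluggable...
nodemat.normal = new THREE.NodeTexture( normalMap, new THREE.NodeUV( false ) );

// ...and the cube texture itself, queried by direction + bias/LOD, returning a color
nodemat.environment = new THREE.NodeTextureCube( cubeTexture );

nodemat.build();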

(5) I'd try to add a NodeFunction that allows one to call arbitrary functions with parameters as an addition to your NodeOp (or maybe they could be merged in some fashion.)

(6) I'd get rid of all of the verbose NodeNormal, NodeTransform, NormalMap, etc. individual classes and just have some simple constructors that create a NodeReference with a name that is resolved by the material as appropriate. NodeReference could resolve uniforms and varyings as well as computed values in the shader.

(7) I do not understand the difference between NodeEnvironment and NodeCube. I think NodeEnvironment may be incomplete?

(8) It is confusing to have NodePhong not be derived from NodeMaterial. Although I see that NodeMaterial is derived from ShaderMaterial. I wonder if you called the direct derivative from ShaderMaterial, GraphMaterial (or NodeGraphMaterial) that would make more sense -- because all together the nodes form a graph, and it is the graph that becomes the material, not an individual node.

(9) I would suggest maybe some more varied terminology. I'd call the root node MaterialNode, and one could derive PhongMaterialNode from it. I'd have Vector3Node, FloatNode, etc. derived from ValueNode -- not necessarily constant, but just a value. Thus one could pipe three FloatNodes into a Vector3Node. I think you can have a helper that would make declaring each of these one line or so rather than the 10 or so currently.

(10) I would move the name "Node" from the start of the class names to the back because that is how it is in the rest of the ThreeJS project.

(11) I would create the new MaterialNode class and it would be initialized with a list of uniforms and varyings. It would by default be able to resolve these, and also track which it has resolved, so one can track which features are needed. One could thus have a limited resolve in the derived PhongMaterialNode that would resolve the special cases and rely on the underlying class to do the simple ones (varyings, uniforms.)

(12) I am sort of confused about the difference between NodePhong and NodePhongMaterial. I didn't realize there were both until now.

(13) There is code like this:

THREE.NodeGLPosition.prototype = Object.create( THREE.Node.prototype );
THREE.NodeGLPosition.prototype.constructor = THREE.NodeGLPosition;

THREE.NodeGL.prototype.generate = function( material, shader ) {

But above this snippet you already defined generate for NodeGL and you didn't define one for NodeGLPosition -- thus I think it is a copy-paste-edit error.

(14) I would get rid of NodeReflectUVW and NodeRefractVector and instead just make this something one can request from the material via a Reference resolve. Calculating a reflection vector is straightforward. I have added it to GeometryContext in my unmerged experimental ThreeJS branches.

(15) The way I would implement reflection and refraction would be to have them as color inputs on the Material. One would resolve Reflect, ReflectLOD, and Refract, RefractLOD in the simple way you would resolve any variable, then pass them into one's texture cube equivalent (procedural or samplerCube-based), and then pass the resulting color into the Material. Is that how you were doing it?

(16) I'm confused about the light input -- usually one doesn't have lights being pluggable; rather, the light parameters are fully defined in the light class. I guess you need this additional flexibility? How do you envision it?

@bhouston wow, thank you very much for the feedback.
I will need several posts to answer :)

I am interested in creating fairly arbitrary graphs, so that the intermediate nodes also take inputs. So you can have a graph that looks like this:
A Texture(tex1, uv1)
B Texture(tex2, uv2)
C Blend(A,B, mode)
D Noise(param1, param2)
E Blend(C,D, mode)

I think that renaming NodeMaterial to MaterialNode, with THREE.PhongMaterialNode, would be better too.

Currently the syntax is like this, a UV1 offset animation example:

var uv2 = false;
var uv_offset = new THREE.NodeFloat( 0 );
var uv = new THREE.NodeOperator( '+', new THREE.NodeUV( uv2 ), uv_offset );
var texture = new THREE.NodeTexture( imgTexture, uv );

nodematerial.color = texture;

// onUpdate
uv_offset.number += .01;

I think reversing the order as you suggest is better: (mode, A, B) becomes (A, B, mode). I am in the process of creating the reflection maps, cubemaps and others...

The environment and Cubemap are incomplete.

Currently bugs are more likely to happen because the format converter is still unfinished. It is responsible for vector conversion, e.g. vec3 to vec4 or vec4 to vec3.

https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/index.html#L365

A Blend "texture" for example: ( I have not tested this code )
It can be implemented in the same of a THREE.NodeOperator

https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/index.html#L1105

THREE.NodeBlend = function( a, b, mode ) {

    THREE.NodeInput.call( this, 'blend' );

    this.mode = mode;
    this.a = a;
    this.b = b;

};

THREE.NodeBlend.prototype = Object.create( THREE.NodeInput.prototype );
THREE.NodeBlend.prototype.constructor = THREE.NodeBlend;

THREE.NodeBlend.prototype.generate = function( material, shader, output ) {

    var a = this.a.build( material, shader, output );
    var b = this.b.build( material, shader, output );

    switch ( this.mode ) {

        case 'multiply':

            return this.format( '(' + a + '*' + b + ')', this.a.type, output );

    }

    return a;

};

.generate() is responsible for the code generation. The generated code is stored in a cache if you want to use a node in more than one input without losing performance.
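For example (a sketch; the specular slot is the one the Phong node exposes later in this thread), the same node instance can feed more than one input, and its code is generated only once:

var tex = new THREE.NodeTexture( imgTexture, new THREE.NodeUV( false ) );

var nodemat = new THREE.NodePhongMaterial();
nodemat.color = tex;    // first use generates the code
nodemat.specular = tex; // second use reads the cached result
nodemat.build();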

I still have not set up pointers or constants for optimization...

The compilation is done by propagation in build(), for both vertex and fragment code.

Can I add you as a collaborator? Feel free to edit the code in any way; I will be working on it as well.

Can I add you as a collaborator? Feel free to edit the code in any way; I will be working on it as well.

Thanks! I'll make PRs to yours so you can approve the changes.

I've added you (as well as @mrdoob, @WestLangley and @tschw) to a side project of mine that is attempting to define a set of reusable nodes and material definitions that can be transferable between various renderers. It is mappable onto this shader graph system you've created.

I do not think you have to pay attention to the repo I just gave you access to if you do not want to. It is what I am interested in implementing on top of this.

(2) I'd try to use the GeometryContext object I created in the lights refactor; it gives consistent access to a lot of the required local variables. But of course that can be resolved by the material itself.

I wish the lights were a LightNode. My concern is to reuse the code already developed for Three.JS.

(3) I'd have UV just be another reference, which is resolved by the material. And I would have NodeTexture actually take a Node input, thus allowing for procedurally generated UVs.

You mean replacing the UV with a vec2, is that it?

(5) I'd try to add a NodeFunction that allows one to call arbitrary functions with parameters as an addition to your NodeOp (or maybe they could be merged in some fashion.)

This would be great, mainly for a BlendNode.

(6) I'd get rid of all of the verbose NodeNormal, NodeTransform, NormalMap, etc. individual classes and just have some simple constructors that create a NodeReference with a name that is resolved by the material as appropriate. NodeReference could resolve uniforms and varyings as well as computed values in the shader.

Along this line of thought, I think the MaterialNode could be the base of the Phong and Physical materials.

(7) I do not understand the difference between NodeEnvironment and NodeCube. I think NodeEnvironment may be incomplete?

I have not been able to finish these nodes yet.

(8) It is confusing to have NodePhong not be derived from NodeMaterial. Although I see that NodeMaterial is derived from ShaderMaterial. I wonder if you called the direct derivative from ShaderMaterial, GraphMaterial (or NodeGraphMaterial) that would make more sense -- because all together the nodes form a graph, and it is the graph that becomes the material, not an individual node.

NodeMaterial would be the root node material; it is necessary to use a node for vertex and fragment. NodePhong is hybrid and NodePhongMaterial is only a proxy class. These can be merged later.

(9) I would suggest maybe some more varied terminology. I'd call the root node MaterialNode, and one could derive PhongMaterialNode from it. I'd have Vector3Node, FloatNode, etc. derived from ValueNode -- not necessarily constant, but just a value. Thus one could pipe three FloatNodes into a Vector3Node. I think you can have a helper that would make declaring each of these one line or so rather than the 10 or so currently.

Sounds good.

(16) I'm confused about the light input -- usually one doesn't have lights being pluggable; rather, the light parameters are fully defined in the light class. I guess you need this additional flexibility? How do you envision it?

This would be for the lightmap or a possible LightNode.

This would be for the lightmap or a possible LightNode.

I like the idea of a pluggable lightmap because one could define the UVs for it explicitly. :)

@bhouston I made several corrections today in this file, but there is still a lot to do:
https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/three.node.js

This is the playground that I am creating :art: Textures and buttons are drag and drop; works in Chrome only.

http://sea3d.poonya.com/flow/

Amazing stuff! Holy crap! It is beautiful.

Would it be possible to share the code in a way that I can also contribute? As a public PR or something?

You are working with the Sea3D project here right?

https://github.com/sunag/sea3d/tree/gh-pages/Labs/Three.JS-NodeMaterial

So I can just fork it and start contributing? Would you accept PRs? How can we effectively collaborate?

I haven't asked, but @mrdoob would probably love to have this within the ThreeJS project itself.

Definitely!

So I can just fork it and start contributing? Would you accept PRs? How can we effectively collaborate?

Of course, I think your help would be amazing. I also have to bring in other node types, like saturation and noise, as you suggested.

You are working with the Sea3D project here right?

I am thinking of making a PR for Three.JS with examples so all this gets defined.

@mrdoob What do you think about the material names, THREE.MaterialNode or THREE.NodeMaterial?

It's a type of material, so it should be THREE.NodeMaterial.

A rim shader example:
[image: Flow rim shader example]

An area reflection example:
[image: Flow area reflection example]

This is so awesome @sunag!

@bhouston thanks! Do you think it will be difficult to convert to r74?

This is very impressive work! Reminds me of Shader Forge.

Very good job so far!

@sunag It will be a bit of work, but I would like to help, and most of the big structural changes in the r74 shader code are my fault. :)

Looks beautiful, man, and fun to play with. Might I offer for inspiration the Blender3D node editor? I find it super efficient; there are even some great videos on PBR via Blender's node system and on which nodes would be most useful to create for making PBR from scratch:

https://www.youtube.com/playlist?list=PLlH00768JwqG4__RRtKACofTztc0Owys8

It will be a bit of work, but I would like to help, and most of the big structural changes in the r74 shader code are my fault. :)

@bhouston Wow. It is much cleaner with the new r74 changes. I finished the first part, still missing StandardMaterial. Any problems I will post here : ) Thanks
https://github.com/sunag/sea3d/commit/d544ad7993272348f8bbea2337cdceb52159a6a8

Another thing we are still missing is refraction. I would really like to use the rendering buffer in place of a cubemap RTT by default. It would be much more efficient.

@GGAlanSmithee I have some references from ShaderFX, of which I am a big fan. Shader Forge and UE4 are also great references. Thanks!

@richardanaya Amazing videos. Thanks!

@mrdoob In which folder do you recommend putting these files? The three.js root ( src/materials/node ) or in examples?
https://github.com/sunag/sea3d/tree/gh-pages/Labs/Three.JS-NodeMaterial/node

I'd still recommend calling this a "Material Graph" or in inverted ThreeJS style, a "GraphMaterial." Or if you insist on using the term "Node", I'd call it "NodeBasedMaterial". Both of these names make it clear that the material contains nodes, rather than being a Node itself.

I'd still recommend calling this a "Material Graph" or in inverted ThreeJS style, a "GraphMaterial." Or if you insist on using the term "Node", I'd call it "NodeBasedMaterial". Both of these names make it clear that the material contains nodes, rather than being a Node itself.

For me both look good. I leave the decision to @mrdoob; what do you think?

BTW @sunag I'd stick with cubemaps for refractions if possible, it is easier and more accurate. I think that is how nearly everyone else does it and we need the RTT stuff for accurate reflections as well. I think it just needs to be a fast render at 128^2 or 256^2.

BTW @sunag I'd stick with cubemaps for refractions if possible, it is easier and more accurate. I think that is how nearly everyone else does it and we need the RTT stuff for accurate reflections as well. I think it just needs to be a fast render at 128^2 or 256^2.

Yes, we can keep both; it would be a trade-off between performance and accuracy. Still, for planar refraction (glass, water) I recommend the buffer in place of a cubemap (for most cases).

@mrdoob In which folder do you recommend putting these files? The three.js root ( src/materials/node ) or in examples?

I would put it in examples to start with. Once it gets well defined we can later move it to src 😊

I'd still recommend calling this a "Material Graph" or in inverted ThreeJS style, a "GraphMaterial." Or if you insist on using the term "Node", I'd call it "NodeBasedMaterial".

For me both look good. I leave the decision to @mrdoob; what do you think?

I kind of like NodeMaterial already... Maybe NodesMaterial? NodeGraphMaterial? @WestLangley any suggestions?

I would rename the current nodes though... NodeColor, NodeFloat, NodeTexture, ... to ColorNode, FloatNode, TextureNode

http://sea3d.poonya.dev/flow/

I can't get this to load 😐

I would put it in examples to start with. Once it gets well defined we can later move it to src :blush:

@mrdoob that will be great.

I would rename the current nodes though... NodeColor, NodeFloat, NodeTexture, ... to ColorNode, FloatNode, TextureNode

I will go ahead with that then.

That is the local URL :blush:, try this:
http://sea3d.poonya.com/flow/

@WestLangley any suggestions?

I suggest THREE.FlowMaterial.

My second choice would be THREE.CustomMaterial.

As a completely random bystander: NodeMaterial sounds very intuitive to me, because that's what they are called in Blender3D.

If @mrdoob likes NodeMaterial, we can stick with it. :)

NodeMaterial it is then 😁

This project is awesome! Just saw the demo on Twitter and it's very impressive.

I wonder if the nodes should be something that goes into the Three core. They're very implementation specific. I too am building a Three.js shader graph editor (not yet released) for ShaderFrog.com, and the solution I have is to just export the GLSL code and all the needed metadata, like uniform names, into a little JSON file, and load it with an external runtime library.

[screenshot: ShaderFrog editor]

This graph editor can work with full shaders by analyzing their source code, meaning no specific shader node types are required. Could this NodeMaterial type be handled entirely outside of Three's core as well? All you really have to output is a RawShaderMaterial for someone to use it in their own project.
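For example, consuming such an export needs nothing beyond the existing three.js API (a sketch; the field names on the exported object are illustrative):

var material = new THREE.RawShaderMaterial( {
    vertexShader: exported.vertexShader,     // raw GLSL string from the JSON file
    fragmentShader: exported.fragmentShader, // raw GLSL string from the JSON file
    uniforms: exported.uniforms              // uniform values from the metadata
} );
mesh.material = material;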

Could this NodeMaterial type be handled entirely outside of Three's core as well? All you really have to output is a RawShaderMaterial for someone to use it in their own project.

@DelvarWorld Hi. Yes, in theory, but it is too early to make a good shader from raw code. For now it is better to have an initial interface. It also helps to keep compatibility with skin/morph and the other native components of Three.JS.

I thought I'd note a minor issue:
In "Flow" the connectors don't pop to the top level when dragging nodes over each other.
Maybe this issue is waiting for layers or something still.

I wonder if there is a way to unify both approaches? I am interested in a multi-layered shader and for that you need to have multiple BSDFs that contribute towards a final result. This means that one needs to separate out the shading model more -- right now it is pretty tightly coupled in @sunag's current design. I think we should head in a direction where one doesn't need to have a Standard or Phong material specified, it could be raw like what @DelvarWorld has. I think we can move there incrementally though, so what @sunag has is a good start.

This is not a fleshed out answer, but a while ago I created a rough proposal for a portable shader format that includes metadata, such as uniform names, their types, their type in Three.js, etc. https://github.com/DelvarWorld/ShaderFrog-Runtime/blob/master/THREE_SHADER_FORMAT.md

All a shader really needs to run in a real environment is the raw shader source code (since the GPU compiles it) and, for convenience, what uniforms the user can set with what values. This rough proposal does not include any notion of building a shader programmatically, it's just a simple delivery for GLSL and metadata. It's compatible with Three.js because you just have to put it into a RawShaderMaterial in the end.
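A rough sketch of what such a delivery payload might carry, following that proposal (the field names here are illustrative, not the actual spec):

var shaderDefinition = {
    name: 'Reflection Rim',
    vertex: '/* raw vertex shader source */',
    fragment: '/* raw fragment shader source */',
    uniforms: {
        rimColor: {
            type: 'vec3',              // GLSL type
            value: [ 1.0, 0.0, 0.0 ],  // default value
            description: 'Color of the rim highlight'
        }
    }
};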

Currently there is no way to make shaders portable in Three.js or export them from any application, which is what gave me the idea to propose a standard, and I think it could solve both of our problems, since we're building external applications that in the end spit out a predetermined material. It also means the implementation details of compiling specific combinations of a graph are left up to applications, not Three.js.

Does this proposed shader standard have a place in Three? I have no idea. Right now it's probably more useful for me than for Three's core, since Three builds its own shaders its own way.

@DelvarWorld I did start this project as a way to create a standardized set of shader graph nodes:

https://github.com/OpenMaterialGraph/OpenMaterialGraph

Node specifications here:

https://github.com/OpenMaterialGraph/OpenMaterialGraph/tree/master/spec/nodes

Minimalist BSDF specifications here:

https://github.com/OpenMaterialGraph/OpenMaterialGraph/tree/master/spec/bsdfs

This is oriented towards Physically-based rendering though.

My feeling is that one needs to have a shell of a shader for ThreeJS that is higher level than a raw shader but lower level than a Phong shader. Basically the default shader would be able to do morph targets, bones, etc. And then all these specific lighting-scheme shaders (Basic, Lambert, Phong, Standard) would use that template. Right now there is an implied template shader (we include the same things in each shader) -- but I think we could make it clearer where to plug things in. You could plug in lighting schemes (phong, lambert, basic, or multilayered) and you could plug in properties to those lighting schemes, which are your general nodes.

@bhouston oh nice, looks like we have a lot of overlapping requirements: default values, display names, human-readable descriptions, etc. I don't know what the future holds, but it would be a moderately easy change for me to use a format more like what you propose.

A formal shader format makes a lot of sense. It should live with the lib though, so that it can evolve with it more easily. For someone who is rather new to three.js: is there any specific reason such a thing does not already exist? It should be interesting for the editor if nothing else.

*EDIT: Maybe having the spec in three.js would defeat its purpose, if it's meant to be consumed by other rendering pipes.

Right now it's probably more useful for me than for Three's core, since Three builds its own shaders its own way.

You can generate the same material in another language if you make modifications inside THREE.NodeGL.prototype.generate, since a NodeMaterial (*Flow) is a visual language. The NodeBuilder can be the door to it; currently it is the intermediary of data between nodes.

I cleaned up a "raw" NodeMaterial example; it can still produce several bugs:
https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/index.html#L179
https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/node/NodeMaterial.js#L9

I am thinking of advancing this with root nodes created for particles: NodePass for multi-pass and NodeMaterialD, after the PR, which I should do in a few days.

I suggested a THREE.ShaderFrogLoader instead of a runtime shader.

In "Flow" the connectors don't pop to top level when dragging nodes over each other.

@MasterJames Thank you! I'll be looking at it.

:+1:

I started to create the examples. I hope to finish this week. :sweat_smile:
http://sunag.github.io/sea3d/Labs/Three.JS-NodeMaterial/webgl_materials_nodes.html

Thoughts?

@sunag +1 : - )

+1!

This is so awesome. I'd default to the plants+wall texture rather than the cloudy-like displacement. :) That is just beautiful.

I'd default to the plants+wall texture rather than the cloudy-like displacement.

You mean the 'layers' example? I will still post more examples for us.

GPU soft-body:
[image: soft-body demo]

For comparison with a different approach, the Shader Frog graph editor is now live, and it's in the "kick the tires" phase. I'd like to underscore that it doesn't require any changes to Three.js core.

ShaderFrog currently has more power than traditional node editors. It has infinite node types to work with because it can parse and understand any shader. For example, to apply a reflection only to the edges of an object, you could simply take a glow shader, which is an entirely standalone shader with a fragment and vertex shader:

[screenshot: glow shader]

...and multiply it by a reflection shader...

[screenshot: reflection shader]

...using the shader graph...

[screenshot: shader graph]

And ta-da! A reflection rim shader.

You can also, for example, mask any two shaders using any other shader, and so on:

[screenshot: masked shaders]

Note that the only "node type" required by this process is a multiply node (there are others in ShaderFrog), which is almost entirely independent of GLSL. ShaderFrog can work with any shaders, which is why it has infinite node types. This advanced technology approach allows for this manipulation independent of the Three.js changes.

ShaderFrog has first-class export support for Three.js, for free, all without modifying the Three.js source code. Sea3D is a third-party product, like ShaderFrog. Putting third-party product mechanics into Three's core seems like an unfair advantage and I don't understand it politically.

@DelvarWorld I really don't see a node-based shader construction abstraction as third-party-specific technology. If anything, it allows more competitors to become involved in node-based shader tech and all to benefit from each other's work (more nodes, optimized construction, etc.).

As a graphics programmer who understands shaders at a high level (from tools like Blender 3D) but not at a low level, node-based abstraction seems like a great way for me to programmatically interact with shaders using my 3D modeling knowledge. I understand the node setup for PBR, for instance, but God help me if I ever wanted to write that in GLSL.

@DelvarWorld Amazing. I love the new editor.

@DelvarWorld +1 Very nice! : - )

Thanks! :)

I really don't see a node-based shader construction abstraction as third-party-specific technology.

I understand the node setup for PBR, for instance

Part of my counterexample here is that you can get both of these things for free, independent of Three.js. All ShaderFrog shaders are open source, so any nodes that become part of a PBR implementation can also be learned from, edited and improved over time (as well as composed with any other shaders, not just some subset in Three.js core). To me, the fact that all of this can be accomplished without Three changes means that it's unnecessary, and blurs the line between Three.js being a WebGL wrapper API and driving implementation-specific features. Storing shader node data is highly app specific, and as far as I know there's currently no common node editor standard format for any software? Enforcing one might be harmful to other implementations.

@DelvarWorld As I understand this change, ShaderFrog would still be able to function and could continue to go beyond GraphMaterial if it so desired. I looked at the example source code for GraphMaterial, and I was able to understand the simplicity of how it worked without even having to think about a third-party solution/editor. As a graphics developer, I am concerned with how to programmatically make materials that impress quickly; I feel these are a proper concern of ThreeJS.

I think that NodeMaterial and Flow have two different focuses. Flow and ShaderFrog are both third-party visual editors, currently closed source (I think aimed mainly at artists), while NodeMaterial is an open-source shader editor for programmers made for Three.JS. I agree with @richardanaya in the sense that ShaderFrog can go beyond NodeMaterial if desired, just as I think Flow need not be just a material editor. I really hope for great improvements to ShaderFrog in the future, as usual.

wip: caustic-voronoi
http://sunag.github.io/sea3d/Labs/Three.JS-NodeMaterial/webgl_materials_nodes.html
[image: caustic example]

@sunag I was going through your source code tonight. I really love it. Some quick feedback from my readings this evening:

  • I would like to see a PR sooner rather than later. I can help. :)
  • It may be cool to allow for arbitrary attributes rather than just the standard ones (uv, uv2, normal, position, etc.) I know that a lot of shaders can benefit from tangent vec4s, more than just 2 UVs, and also multiple color attributes. Thus I would generalize needsPosition into a requestedAttributes array or something.
  • I would suggest some organization of the nodes into the main NodeMaterial, the BRDFs (Phong, Standard, etc), texture nodes (NodeCubeTexture, NodeTexture), accessor (?) nodes (NodePosition, NodeNormal), math nodes (NodeOperator, etc..), utility nodes (NodeSwitch, ...), and extra (?) (NodeVelocity, ...) I think this would help a lot.
  • It would be nice to have matrix accessors for Projection, View, and World matrices.
  • You are missing some of the operators: % ^ & << >> ~ |
  • I'm not sure I see the value in NodePhongMaterial when the class is so small; I'd just have users create it using the NodePhong node combined with a NodeMaterial (although I still do not like that name; GraphMaterial feels a lot more natural to me, but I can live with it.)
  • I'd call NodeTimer -> NodeTime... but again personal preference.
  • I wonder if there is some way to combine many of the variable nodes into one like you've done with NodeMath#, NodeOperator. These seem so similar: NodeViewPosition, NodeWorldPosition, NodeViewNormal, NodeTransformedNormal, NodeTransformedPosition, NodeProjectPosition...
  • I would like the set of optional arguments for NodeMath#, NodeOperator to be in a form that can be a list, so that one can easily populate a dropdown box with it. Right now, because they are just capitalized members on the classes, it is a bit harder to identify them. I am not sure of a good solution that is still convenient for coding. Maybe just collect them into lists after their definitions as members of the class?
  • NodeTransformedNormal, NodeTransformedPosition are difficult for me to figure out what they are without looking at the code. Are they in world space, view space, object space? "Transformed" is a very generic term, which just means multiplied by a matrix. I'll look at the code for them now....
  • I would allow NodeUV and NodeColor to take an index rather than being limited to 2 channels and 1 channel respectively. Again I would like to see these made generic: really you pass in a string or one of a number of presets rather than having it hard coded to a set of specific channels. You can still have some dedicated code paths for some variables, but it would fall back to something generic for names it doesn't recognize, for special-case handling; see the sketch after this list.
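For example (hypothetical signatures illustrating the indexed accessors suggested above):

var uv3 = new THREE.NodeUV( 2 );       // third UV set, selected by index
var color2 = new THREE.NodeColor( 1 ); // second vertex color channel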

More feedback:

  • I would like to have a graph with more than one BRDF, basically two Standard BRDFs, and then do energy preservation between them. Once you get the PR in, I would like to add this functionality. It will make the system more generic and more powerful.

@bhouston Wow! Thank you!

The requested attributes, folder organization, NodeTime, NodePosition, NodeNormal and NodeView were revised. Added 3 more examples too.

For operators and matrices we have to add compatibility with integer and matrix conversions, so this is done automatically as with floats/vectors#. We have enough work for the next updates... :sweat_smile:

Tomorrow I will make the PR. I still have to review some things.

@sunag:

I believe this is a bug:

addGui( 'metalnessA', roughnessA.number, function( val ) {

    roughnessA.number = val;

}, false, 0, 1 );

I believe this is a bug:

Thanks!

I just got giddy going to the example page and seeing a PBR node graph <3

Niice!

I was going to suggest using the nodes for post-processing but I was waiting for the first part to be done. Damn, that is beautiful!

NodeMaterial rev 5 + LightNode

http://sunag.github.io/sea3d/Labs/Three.JS-NodeMaterial/webgl_materials_nodes.html

Skin
[image: skin example]

Toon Shading
[image: toon shading example]

Plus color-adjustment and plush examples.

Niiice!

subsurface scattering :sweat_smile:

This is a kind of shader that could work in future updates.

http://sunag.github.io/sea3d/Labs/Three.JS-NodeMaterial/webgl_materials_nodes.html

[images: subsurface scattering examples]

😮!

Thanks for the awesome NodeMaterial!

I have several comments/bugs:

  • leading whitespace breaks FunctionNode
var n1 = new THREE.FunctionNode(" float mul_2_float(float x) { return x*2.0; }"); // problem
var n2 = new THREE.FunctionNode("float mul_2_float(float x) { return x*2.0; }"); // ok
  • problem with an int parameter in FunctionNode

working example with a float parameter:

var floatFuncNode= new THREE.FunctionNode("float mul_2_float(float x) { return x*2.0; }");

var funcCallNode = new THREE.FunctionCallNode(floatFuncNode);
funcCallNode.inputs.x = new THREE.FloatNode(0.2);

var colorResNode = new THREE.OperatorNode(new THREE.ColorNode(0x00ff00),
    funcCallNode, THREE.OperatorNode.MUL);

var mat = new THREE.PhongNodeMaterial();
mat.color = colorResNode;

broken example with an int parameter:

var intFuncNode= new THREE.FunctionNode("float mul_2_int(int x) { return float(x)*2.0; }");

var funcCallNode = new THREE.FunctionCallNode(intFuncNode);
funcCallNode.inputs.x = new THREE.IntNode(1);

var colorResNode = new THREE.OperatorNode(new THREE.ColorNode(0x00ff00),
    funcCallNode, THREE.OperatorNode.MUL);

var mat = new THREE.PhongNodeMaterial();
mat.color = colorResNode;
  • suggestion: pass function parameters in the FunctionCallNode constructor
//current:
var funcCallNode = new THREE.FunctionCallNode(floatFuncNode);
funcCallNode.inputs.param1 = new THREE.FloatNode(1);
funcCallNode.inputs.param2 = new THREE.ColorNode(0x0f0f0f);

// proposed:
var funcCallNode = new THREE.FunctionCallNode(floatFuncNode, 
    { param1: new THREE.FloatNode(1), param2: new THREE.ColorNode(0x0f0f0f) } );

// and, I think, suitable in some cases: 
var funcCallNode = new THREE.FunctionCallNode(floatFuncNode, 
    [ new THREE.FloatNode(1), new THREE.ColorNode(0x0f0f0f) ] );
  • is PhongNodeMaterial compatible with bump maps?

A few more notes:

  • problem with PositionNode.LOCAL in the fragment shader in a "raw" NodeMaterial:
var material = new THREE.NodeMaterial(
        new THREE.RawNode( new THREE.PositionNode( THREE.PositionNode.PROJECTION ) ),
        new THREE.RawNode( new THREE.PositionNode( THREE.PositionNode.LOCAL ) )
    );

resulting shaders:

varying vec3 vPosition;
void main(){
gl_Position = (projectionMatrix * modelViewMatrix * vec4( position, 1.0 ));
vPosition = transformed;
}

varying vec3 vPosition;
void main(){
gl_FragColor = vec4(vPosition,0.0);
}

Probably "transformed" functionality should be moved from Phong/Standart node materials to NodeMaterial

  • ConstNode usage
    var material = new THREE.NodeMaterial(
        new THREE.RawNode( 
            new THREE.OperatorNode(
                new THREE.PositionNode( THREE.PositionNode.PROJECTION ),
                new THREE.ConstNode("float TWO = 2.0;"),
                THREE.OperatorNode.MUL)
            ),
        new THREE.RawNode( new THREE.ColorNode( 0xff0000 ) )
    );

In that case the resulting shader contains
gl_Position = ((projectionMatrix * modelViewMatrix * vec4( position, 1.0 ))*TWO);
but the TWO constant is not declared. Is it my mistake or a bug?
Also, ConstNode is semicolon sensitive, so "float TWO = 2.0" is not parsed.

Thanks @dimarudol! Feel free to keep sending notes; I will make the new revision as soon as possible.

but the TWO constant is not declared. Is it my mistake or a bug?

I will consider this way of use, and in the material, for example material.include( node ).

For now, try a global constant:

var TWO = new THREE.ConstNode("float TWO = 2.0;");
THREE.NodeLib.add( TWO ); // global

 var material = new THREE.NodeMaterial(
        new THREE.RawNode( 
            new THREE.OperatorNode(
                new THREE.PositionNode( THREE.PositionNode.PROJECTION ),
                TWO,
                THREE.OperatorNode.MUL)
            ),
        new THREE.RawNode( new THREE.ColorNode( 0xff0000 ) )
    );

Hi @sunag, I've been playing with Sea3d Flow and it looks pretty cool. Just wanted to ask whether there is an official repo somewhere for it.
Regards.

Hi @rraallvv, not yet, but it is almost ready. As soon as possible I will post the news.

Thanks @sunag, it’s very much appreciated.

Hi @sunag! You did a really good job!
I've got a question: for my project it would be great to use your node structure, but I need per-vertex colors (and custom per-vertex float values), while in your node system you only have ColorNode and FloatNode applied to the entire mesh. Do you think there is a way to easily implement something like a BufferColorNode (using THREE.BufferAttribute)?

For example, I've got this scene:

[screenshot: per-vertex colored mesh]

Each vertex has its own color. It is possible with THREE.BufferAttribute and a THREE.ShaderMaterial, but there's no equivalent in your code.

I would need something like :
let material = new THREE.StandardNodeMaterial();
let texColorMap = new THREE.TextureNode( new THREE.TextureLoader().load("colorMap.jpg") );
let customData = new THREE.BufferFloatNode( new Float32Array(...), "data" ); // The second parameter is the name of the attribute

let colorMap = new THREE.FunctionNode([
"vec3 colorMap(sampler2D texColorMap, float data){",
" return vec3(texture2D(texColorMap, customFunc(data)));", // customFunc returns a vec2 depending on the data
"}"
].join("\n"));

let colorMapCall = new THREE.FunctionCallNode(colorMap);
colorMapCall.inputs.texColorMap = texColorMap;

material.color = colorMapCall;
material.build();

By the way, it seems like I can't use sampler2D parameters with FunctionNode...

Am I missing something ?
I think I could help if needed :)

@martinRenou Thanks! Hmm, maybe something like this for now:

bufferGeometry.addAttribute( 'color', new THREE.BufferAttribute( colors, 4 ) );
...
colorMapCall.inputs.data = new THREE.ColorsNode(); // send color.x to float slot

I had not yet thought about assigning a dynamic geometry attribute with a node. But I think this is a good idea... maybe something like THREE.BufferAttributeNode...
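Something like this, perhaps (THREE.BufferAttributeNode is hypothetical at this point):

bufferGeometry.addAttribute( 'data', new THREE.BufferAttribute( temperatures, 1 ) );

var data = new THREE.BufferAttributeNode( 'data', 'float' ); // hypothetical node reading a custom attribute
colorMapCall.inputs.data = data;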

@sunag thanks for your answer :)
I'll try that. If it's OK with you I'll start working on a THREE.BufferAttributeNode; for now I'm still reading your code and trying to understand the structure.

  1. Concerning THREE.FunctionNode, as I said, I really need to set a sampler2D in my parameters, as it's used as a color map. The pixel color is calculated with texture2D(), and the second parameter of texture2D() is calculated using my custom data (which is, for example, a temperature).
    I understand that the idea of using vec3 parameters was to be friendly to users, but FunctionNode could also support sampler2D parameters, don't you think?
  2. Furthermore, THREE.FunctionNode only writes to the fragment shader, but wouldn't it be nice to be able to write to the vertex shader too?

This would be my use case. It's called IsoColor, because the color corresponds to a datum (temperature, pressure...): red -> high temperature, blue -> low temperature:
[screenshot: IsoColor example]

  3. I'm thinking about writing a THREE.IsoColorNode (that would replace the FunctionNode of my diagram), and other nodes that would be very interesting for scientific visualizations. Are you interested in that? @mrdoob would you be interested too?

Concerning THREE.FunctionNode, as I said, I really need to set a sampler2D in my parameters, as it's used as a color map. The pixel color is calculated with texture2D(), and the second parameter of texture2D() is calculated using my custom data (which is, for example, a temperature). I understand that the idea of using vec3 parameters was to be friendly to users, but FunctionNode could also support sampler2D parameters, don't you think?

I understand your need now! I will create a PR for this and the other notes from @dimarudol ...

Furthermore, THREE.FunctionNode only writes to the fragment shader, but wouldn't it be nice to be able to write to the vertex shader too?

I have not tested this, but I suppose so.

I added two more examples:

  • sampler2D is in the triangle-blur example
  • custom BufferAttribute is in the custom-attribute example

It is working, but I will still give the code some polish...

https://github.com/mrdoob/three.js/pull/9636

Wow! You really did a good job, it's working for me too :smile:

Thanks again for your work!

Hi @sunag,

For now we can't have a FunctionNode with "void" type, since we need to send it to the material as a parameter (color, alpha...). For example, if I want to implement a clip-plane FunctionNode, all I want to do is write a shader part like this:

void clipPlane(vec4 plane){
 if(dot(position, plane.xyz) > plane.w) discard;
}

But that's not possible to give to the material... A solution would be to return a float used as the alpha channel, but that's not clean or optimized. Don't you think it would be cool to be able to add some void functions to a material, like this:

var clipPlane = new FunctionNode([
"void clipPlane(vec4 plane){",
" if (dot(position, plane.xyz) > plane.w) discard;",
"}"].join("\n"));
var clipPlaneCall = new FunctionCallNode(clipPlane);
clipPlaneCall.inputs.plane = myVec4Node;

var threshold = new FunctionNode([
"void threshold(float upperBound, float lowerBound, float data){",
" if(data < lowerBound) discard;",
" if(data > upperBound) discard;",
"}"].join("\n"));
var thresholdCall = new FunctionCallNode(threshold);
thresholdCall.inputs.upperBound = myFloatNode1;
thresholdCall.inputs.lowerBound = myFloatNode2;
thresholdCall.inputs.data = myAttributeNode;

var myMaterial = new StandardNodeMaterial();
myMaterial.color = ...
myMaterial.alpha = ...

// voidFunctions is not a good name I'm not inspired...
myMaterial.voidFunctions = [clipPlaneCall, thresholdCall];

For a more "design" example, if I want to make holes on my teapot with this texture:
[image: wood texture with holes]
I would like to do something like that:

var holes = new FunctionNode([
"void holes(vec3 texColor){",
" if (texColor.r + texColor.g + texColor.b < 0.5) discard;", // discard when texColor is too dark
"}"].join("\n"));
var holesCall = new FunctionCallNode(holes);
holesCall.inputs.texColor = new TextureNode(LoadTexture("holes-text.jpg"));

var myMaterial = new StandardNodeMaterial();
myMaterial.voidFunctions = [holesCall];
myMaterial.side = THREE.DoubleSide;

Another thing is that we don't have control over the order of functions in shaders. If I have functions that are not commutative, I don't have control over the result...

Let's see this example:

var transparencyPlane = new FunctionNode([
"float transparencyPlane(vec4 plane){",
" if (dot(position, plane.xyz) > plane.w) return 0.5;",
" return 1.;",
"}"].join("\n"));
var transparencyPlaneCall = new FunctionCallNode(transparencyPlane);
transparencyPlaneCall.inputs.plane = myVec4Node;

var displacement = new FunctionNode([
"vec3 displacement(vec3 vector){",
" return position + vector;",
"}"].join("\n"));
var displacementCall = new FunctionCallNode(displacement);
displacementCall.inputs.vector = myVec3Node;

var myMaterial = new StandardNodeMaterial();
myMaterial.transform = displacementCall;
myMaterial.alpha = transparencyPlaneCall;

If a point is behind the transparency plane and moves past it due to the displacement, then in the case where "transparencyPlaneCall" is called before "displacementCall" alpha = 1., and in the case where it is called after, alpha = 0.5.
So I would like to set the order of function calls... Do you know what I mean?

Hi @martinRenou

Hmm, about void functions: maybe a ProxyNode to use in a slot, because this needs to follow an order, or be initialized at the start or end of the code; maybe functionsStart or functionsEnd slots...

Using the alpha slot, if alpha is 0 the pixel is discarded automatically.
https://github.com/mrdoob/three.js/blob/dev/examples/js/nodes/materials/PhongNode.js#L180
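So the threshold case above can be expressed without a void function by returning the alpha instead, something like this (a sketch reusing the names from your example):

var threshold = new THREE.FunctionNode([
"float threshold(float upperBound, float lowerBound, float data){",
" if (data < lowerBound || data > upperBound) return 0.0;", // alpha 0 is discarded
" return 1.0;",
"}"].join("\n"));
var thresholdCall = new THREE.FunctionCallNode(threshold);
thresholdCall.inputs.upperBound = myFloatNode1;
thresholdCall.inputs.lowerBound = myFloatNode2;
thresholdCall.inputs.data = myAttributeNode;

myMaterial.alpha = thresholdCall;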

So I would like to set the order of function calls...

There is a sequence, see here:
https://github.com/mrdoob/three.js/blob/dev/examples/js/nodes/materials/PhongNode.js#L122

The function order is determined automatically using a simple dependency algorithm; you should put the FunctionCall in an earlier slot (color is first, specular is second, for example).

This example is two different shader codes: fragment for transparencyPlaneCall and vertex for displacementCall.

It makes me think that using variables (varyings and locals) to expand, not only inputs and consts, will be very interesting. Maybe a VarNode...

Using the alpha slot, if alpha is 0 the pixel is discarded automatically.

Sorry for that, I didn't see it.

The function order is determined automatically using a simple dependency algorithm; you should put the FunctionCall in an earlier slot (color is first, specular is second, for example).

I understand; in fact I just wanted to be sure that the function that discards pixels was called first. Sounds good to me to use alpha if you discard just after my functions :smiley:

This example is two different shader codes: fragment for transparencyPlaneCall and vertex for displacementCall.

Right! So that was not a good example. My mistake.

Done r6 - #9636

Cool! Thanks for your answers.

It makes me think that using variables (varyings and locals) to expand, not only inputs and consts, will be very interesting. Maybe a VarNode...

Sounds good too. It could be interesting to be able to explicitly write a function in the vertex shader, use a VarNode, and use it with a function in the fragment shader. Don't you think?

@sunag
I think there's an issue with using AttributeNode with FunctionNode in the vertex shader...

If I have code like this:

var customData = new THREE.AttributeNode("data", "float");

var myFunc = new THREE.FunctionNode([
 "vec3 myFunc(float data){",
 " return vec3(data, 0., 0.);",
 "}"].join("\n"));
var myFuncCall = new THREE.FunctionCallNode(myFunc);
myFuncCall.inputs.data = customData;

material.transform = myFuncCall;

The shader is written in this order :

...
varying float nVdata;
attribute float data;
...
vec3 myFunc(float data){
 return vec3(data, 0., 0.);
}

void main(){
...
transformed = myFunc(nVdata); // We use nVdata but it is not initialized
...
nVdata = data;
...
}

A simple fix would be to initialize nVdata with data before it is used?

Sounds good too. It could be interesting to be able to explicitly write a function in the vertex shader, use a VarNode, and use it with a function in the fragment shader. Don't you think?

Yeah! The idea is better communication between the vertex/fragment shaders using varyings via VarNode. This fits great now with the keywords feature.

I created a minimal varying example, but there is no solution for void functions yet.

I think there's an issue with using AttributeNode with FunctionNode in the vertex shader...

Fixed - https://github.com/mrdoob/three.js/pull/9681

Hi @sunag, I saw your varying example, and it seems weird to me to put the varying in mtl.transform like you did... Maybe you will change this when it becomes possible to create void functions?
However, that keywords feature sounds like a good idea :smiley:

In fact, I think it would be a good idea to put those varyings in mtl.varyings (an array), like this: mtl.varyings.push(myVar). Using the same idea, we could put functions in the vertex and fragment shaders like this: mtl.vertexFunctions.push(myVertexFunctionCall), and the same way with mtl.fragmentFunctions.

This way we could do a lot of computations on those varyings and then use them for effects. Those functions would be void functions.

As in the screenshots above where nodes are connected visually, how do we re-create this mentally with this API? I see that constructors accept other nodes. Is that the way that output from one node (passed into a constructor) becomes input for another node (it receives the nodes in its constructor)? Does a node have at most one output? How many nodes can be inputs to another node via the constructor, or is it limited?

@trusktr It is unlimited; of course, the more nodes, the more CPU is consumed at build time and GPU at runtime. The nodes are adjusted according to the input and output values; the conversion of float to vec2 or vec3, for example, is automatic, and stored in a cache if used in more than one input, for optimization (see TempNode).
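For example (a sketch of the automatic conversion):

var gray = new THREE.FloatNode( 0.5 );

var mtl = new THREE.PhongNodeMaterial();
mtl.color = gray; // the float is converted to the color slot's vec3 automatically
mtl.build();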

In the screenshots each node is a class, e.g. PositionNode in the display is:
https://github.com/mrdoob/three.js/blob/789efa65bafe022e178c7e93e0985a7607a54403/examples/js/nodes/accessors/PositionNode.js#L5

@mrdoob @sunag would you be willing to comment on the current status of the NodeMaterial code, and how it fits with the three.js roadmap?

In theory, we could use nodes for several things now, like:

  • Implement glTF spec/gloss PBR materials as NodeStandardSGMaterial.
  • Support per-map UV sets and transforms
  • Simplify use of builtin materials with instancing

But it is hard to justify adding a dependency in GLTFLoader on all of the node files. And it is probably hard to justify putting nodes into src/* without more active usage. Chicken/egg. 😅

It would be helpful to know the long-term plan for NodeMaterial, to guess what features are (or are not) related... does this need more development? Are there things other contributors could be doing to help?

@sunag yeah, I would be curious to know too if you have any further plans at this point.

In theory, we could use nodes for several things now, like:

  • Implement glTF spec/gloss PBR materials as NodeStandardSGMaterial.
  • Support per-map UV sets and transforms
  • Simplify use of builtin materials with instancing

These particular issues now have examples of how they can be solved with the existing APIs, without the need to modify the core:

Instancing:
https://github.com/mrdoob/three.js/pull/10750 (full blown, touches core, but could be monkey-patched)
https://github.com/mrdoob/three.js/pull/14166 (simple, material extensions via onBeforeCompile)
https://github.com/mrdoob/three.js/pull/14012 (simple, library extension via monkey-patching)

per-map-uv sets
https://github.com/mrdoob/three.js/pull/14174

spec-gloss as its own class
https://github.com/mrdoob/three.js/pull/14099

Perhaps it can serve to relieve the sense of urgency for getting NodeMaterial into the core :).

For those following this topic, there is some ongoing discussion in https://github.com/mrdoob/three.js/pull/14149.

Joining threads from #14149:

tl;dr — NodeMaterial seems to me like a very promising next step for the material system. I suspect NodeMaterial adoption is currently limited by a few things:

  • friction of including many individual files
  • lack of a visual editor
  • relatively few examples, docs

I would leave the question of when/whether NodeMaterial should be in src/ to @mrdoob and @sunag, but I think we should move forward with (at least) addressing the three issues above. It would also be helpful to know whether others see NodeMaterial as an alternative to the current material system or a full replacement.



The critical mass seems to be forming around GLTFLoader stakeholders, and my (subjective) observation is that the need arose because of the improperly documented onBeforeRender (few examples) and the even less documented onBeforeCompile (one example).

@pailhead I began maintaining GLTFLoader to address workflow problems in A-Frame, of which I'm also a developer. I agree a more useful material system is the goal, not loading a particular format. Deciding how to integrate (or not integrate) node materials should be based on merits for end-users of the three.js library — features, flexibility, an understandable API.

You're right that onBeforeCompile and onBeforeRender hooks are infinitely flexible; there's no practical limit to how far you can manipulate or replace the underlying shaders. But both are similarly flawed as a primary material API:

  1. Forces WebGL syntax into the material API even for relatively simple features
  2. Depends on string manipulation of core material shaders, making changes to the core materials harder over time, and more fragile
  3. Less approachable to artists
  4. Practically unserializable

Decorators (e.g. #14206) certainly improve this from a user's perspective — but if future feature development of materials will be based on increasingly complicated decorators atop the existing material shaders, that's a maintainability hazard in my opinion.

Node materials are not perfect, but they are widely adopted in other graphics tools, and have key advantages:

  1. Flexible and composeable
  2. Preserves a layer of abstraction from the library's internal WebGL shaders without removing access to plain shaders where needed
  3. Widely popular material representation for artists and game developers
  4. At least somewhat serializable (although conversion to other formats is still hard)

This probably sounds like I am saying onBeforeCompile should be deprecated, and I don't mean to — I'd be glad to see it stick around, and examples are welcome. But it's not an API I want to point A-Frame or three.js users to for more cases than necessary; I don't think it compares well with the material systems in other 3D tools.

  • friction of including many individual files

This shouldn't be friction. I think it would be useful to do some kind of survey, to see how many people use bundlers and build tools. Importing JS files into an HTML file manually should not really be an approach in 2018. This is the only number I can find, and I think an assumption can be made that all these users are using it with something that runs on node.

I'd put more effort into tree shaking and properly being able to import/export this than worry about how this would work when manually including script files.

  • lack of a visual editor
  • relatively few examples, docs

I think these should happen before this makes it into the core. The specular gloss material should be an example that is not coupled with GLTFLoader. The fact that it's in /src or /examples should be irrelevant?

@donmccurdy

You're right that onBeforeCompile and onBeforeRender hooks are infinitely flexible; there's no practical limit to how far you can manipulate or replace the underlying shaders.

I'm not actually claiming this; I'm claiming that a hook like onBeforeRender allows for a different approach to structuring code, but that it shouldn't be used as it is used now in GLTFLoader (hence #14099).

onBeforeCompile definitely has limitations, and might be improperly named. A good thing to point out in this context is a recent PR, #14214. As soon as I saw what went away (build() in favor of needsUpdate) I knew I was going to see onBeforeCompile in there. I used it as onBeforeParse, since if I don't replace any #include <foo> statements, they still get parsed after this function and before the actual compilation. In this PR it's being used as onWillRefreshMaterial.

The philosophy I'm trying to get across is that there should probably be more granularity with these kinds of callbacks. There should be more of them, and great care should be taken when naming them.

#14214 is a great example of why NodeMaterial doesn't have to be in the core. At least not coupled with anything in /renderer, and not bundled with it.

  1. Forces WebGL syntax into the material API even for relatively simple features

I've heard this concern before, but it was not elaborated on. Why is this a problem? What is the "material API" anyway? I always saw it as "here are some common surface-shading materials".

If you want a dynamic map, I see no reason why material.map or material.color couldn't take a NodeTexture, GLSLTexture, TextureTexture.

myMaterial.color = new SomeNodeGraph(); // outputs a vec4 in the end
myMaterial.color = new GLSLInput(); // inputs various documented variables, outputs some vec4
myMaterial.color = new THREE.Color();
myMaterial.color = new CustomColorGradientTopBottom(); // Node, GLSL, regular color, whatever; I just know it's compatible with the slot

  2. Depends on string manipulation of core material shaders, making changes to the core materials harder over time, and more fragile

My point is that this is already somewhat possible with onBeforeCompile, which makes these changes even harder and more fragile than what you're describing. I have already reached for this in my professional life and have some code that relies on it. How fragile it is causes a lot of stress :(

The decorators you pointed out in #14206 have stalled for more than a year because of this PR, which itself has stalled for three years. My subjective feeling is that #14206 would make writing these decorators possible, less tedious, and less fragile. It's 30 lines of code that impact absolutely nothing unless these phantom properties are defined. If we're already given a rope to hang ourselves with in the form of onBeforeCompile, why not make it slightly more humane? :smile:

In the end, I'd like to decide for myself what is an acceptable level of fragility and what is not.

  3. Less approachable to artists

This is why artists should have a visual tool, and some kind of format to save the graphs to. Consuming them as "baked" GLSL or as something that builds the NodeMaterial at runtime should be a choice given to the user.

An artist could create a shader, and an engineer could ask for it to be "compiled" in order to optimize the loading process. An artist working alone could just load the graph, assuming that is for some reason easier ("compiled" takes more effort).

Also, why and how are artists being considered here? Are there statistics? What does an average "artist" user look like? Why are they working with three.js and not Unity, i.e. what kind of pipeline is it? Are there engineers involved? Why not a standalone tool that communicates with three.js somehow, which some tech director adopts, etc.? I'm assuming the 40k npm downloads per week are not artists. The assumption there is that that sort of install is driven by build tools and highly unlikely to be imported manually.

Both on this and point 2: my use case involves no artists, and usually no other users. I'm growing a material system built on top of the built-in materials. I use my own normal unpacking with tangent attributes, depth peeling, different clipping, etc. All I care about is structuring that code in-house and exposing as few internals as possible to the classes above.

  4. Practically unserializable

I don't know much about this, and I've heard this argument before. I'd be more than happy to look at some code if someone could point me to the problematic areas. With my very limited understanding (next to none) of the problem, I don't understand why it's simply not enough to write:

"materialFoo": {
  "color":"#ff0000",
  "specularMap": "some_path.jpg",
  "glossiness": "0.5"
}

Say you have a spec/gloss material built with NodeMaterial: you wouldn't want to serialize the entire graph, only the inputs? Whatever the serialization of a SpecGloss material is, it should look the same? I'm probably missing a lot here and would appreciate some pointers.

Re: serialization

Would it be enough to add:

https://github.com/mrdoob/three.js/blob/dev/src/materials/Material.js#L160

customSerializer( data )

And then something like

SpecularGlossMaterial.prototype.toJSON = function () {
  // regular function (not an arrow) so `this` is the material;
  // the callback passes extra properties into the serialized data
  return Material.prototype.toJSON.call( this, undefined, ( data ) => {
    if ( this.roughness !== undefined ) data.roughness = this.roughness;
  } );
};
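On the Material side, the hypothetical hook might be threaded through roughly like this (a sketch only, not the actual three.js implementation):

// Sketch: Material.prototype.toJSON accepting the proposed
// customSerializer callback as an extra argument.
Material.prototype.toJSON = function ( meta, customSerializer ) {

  var data = { uuid: this.uuid, type: this.type };

  // ...collect the standard, hardcoded properties here...

  if ( customSerializer !== undefined ) customSerializer( data );

  return data;

};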

specularMap would actually be serialized because it's hardcoded:
https://github.com/mrdoob/three.js/blob/dev/src/materials/Material.js#L213

But this feels kind of weird; the Material class seems to be aware of the internals of all the built-in materials?

@donmccurdy if you want to have this discussion, it might be better to do it in the decorator PR, #14206. Offline, I'm available for a working session over beer or coffee anywhere in the Bay Area :slightly_smiling_face:

Agreed that multiple files _shouldn't_ be friction, and that it's not an argument by itself for putting NodeMaterial into src/. It's simply one of the several reasons NodeMaterial isn't heavily used yet, and there are other ways to solve that. And I certainly agree that more docs would be a prerequisite for putting NodeMaterial in src/, or even for pushing adoption in examples/js or a separate build output.

I am optimistic about NodeMaterial as a general direction, although I suspect there are a few more things needed before it could make sense in src/. I'm very worried about using onBeforeCompile and string manipulation as a path to put new features in the material system, as described above, and especially as we move toward supporting WebGL2 and (eventually) WebGPU. That's not to say these two choices are the only options, but (of the two) these are my thoughts.

On the other points, it seems like we understand one another's concerns and preferences, but still disagree. I've written all that I have the time to write at this point, and will sit this out for a while and let others weigh in. Thank you for a thorough and polite discussion on this. 🙂

I would like to put an ambient occlusion texture with Multiply on a diffuse texture, so I'm trying this:

const aoNode = new OperatorNode(
  new Math1Node( aoTexture, Math1Node.INVERT ),
  aoScale,
  OperatorNode.MUL
);
material.ao = aoNode;

This is my AO texture

Unfortunately I haven't been seeing any ambient occlusion on my model.

Do you have any idea how to do that?
I hope this is the right place to post my question.

Thanks!

I would like to be able to think in the direction of a mapping from existing node-based tools to materials in three.js. Consequently, I would love to see NodeMaterial in src/ along with proper documentation.

@IARI

Can you please list the existing node based tools?

@donmccurdy

Thank you for the reply :)

Unfortunately I think we still don't understand each other. If you have time to read this post, consider this bit:

I'm very worried about using onBeforeCompile and string manipulation as a path to put new features in the material system,

onBeforeCompile is horrible, I agree; the only reason I'm involved in any of these threads is that I want #13198, which I see as superior to onBeforeCompile's string manipulation.

This shouldn't be a pattern for adding new features to three.js. It should allow users to add new features to their applications. I'm sad that @bhouston didn't share any findings from maintaining a fork to do some of this stuff. I'd be curious whether they made a better framework for changing the shaders, or actually hacked the core for this feature.

In the case of per-channel UV transforms, I'd make that an example, or an npm module. If 3 people download it every week, it's probably not something that's supposed to be in the core. If 20k people download it, it's probably how three.js should have been designed in the first place.

onBeforeCompile seems like it will be crucial to node materials, https://github.com/mrdoob/three.js/pull/14214, so it's not only used for string manipulation.

I have onBeforeCompile and NodeMaterial to do these things; I'm requesting a third option, much more convenient than onBeforeCompile and much, much smaller than NodeMaterial.

With recent examples:
https://github.com/mrdoob/three.js/pull/14206

Old discussion:
https://github.com/mrdoob/three.js/pull/13198

@pailhead For starters I would think about a few basic Blender Cycles Nodes.
I would also find it interesting to play around with Unitys Shadergraph.

@claudioviola You can add it in material.shadow, or multiply it with the diffuse: material.color = diffuse * occlusion. Occlusion depends on indirect light: AmbientLight, HemisphereLight, ...
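For illustration, a minimal sketch of the multiply approach (assuming the TextureNode, OperatorNode and StandardNodeMaterial classes from examples/js/nodes; diffuseTexture and aoTexture are placeholder textures):

const diffuse = new TextureNode( diffuseTexture );
const occlusion = new TextureNode( aoTexture );

const material = new StandardNodeMaterial();

// color = diffuse * occlusion, so occluded areas darken the albedo
material.color = new OperatorNode( diffuse, occlusion, OperatorNode.MUL );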

@IARI

I'm downloading 7.5 gigs of Unity to check out the ShaderGraph :slightly_smiling_face: I'm not sure that will qualify as a tool for building shader graphs; I think it's a tool for building shader graphs for Unity. Are there any standalone tools for authoring graphs? I want to see what kind of file it saves the graph to, and how that would transfer over to three.js and a NodeMaterial in /src.

I think that @pailhead is in isolation against this move of NodeMaterials into core at this current point. I do not think it is useful for everyone to argue with @pailhead for days. I think the project shouldn't be beholden to a single individual.

This has not been rushed and it has not been forced. Sunag added this years ago. There is right now a need for this and other alternatives have not worked out (e.g. merged) so let's try this path.

Every major tool (3DS Max, UE4, Blender, Unity, Maya, Sketchfab) uses shader graphs. Arguing that three.js should be at a permanent disadvantage in this regard seems unfair to three.js.

I am not sure if this graph-based shader system is perfect at this time, but once it is in core, we can all continue to work with it.

I do insist that we continue to support the API of PhongMaterial, StandardMaterial, etc. in some fashion going forward, so that we do not break 95% of three.js projects; doing so would be irresponsible. It is also nice to work with such simplified, straightforward interfaces.

I have neither the capability nor the desire to prevent this from moving into the core. I'm just hoping it could unblock some unrelated PRs, because it has been referenced as a blocker. @bhouston, I never meant to argue with you either; I was just curious about your fork and your experiences with it.

When I made ends meet as a starving 3D artist, I liked all things node-based. Nuke was my favorite compositor, and I'd like to learn how to make tools like that. I just wish this would unblock #14231 in particular. I'm using this as a learning platform for myself, but if it's disruptive I'll stop posting. Thank you.

@pailhead I have never worked with it, but you've probably heard of Substance Designer? As far as I understand, that tool also goes beyond authoring graphs/shaders, but it's primarily focused on the procedural design of materials. Maybe that's an interesting clue?

Other than that, I don't know standalone tools off the top of my head, but a quick google search yielded these two:

However, there's also this tool for the three.js nodes, which I found in a twitter post by @mrdoob back from 2015, though it seems to never have been updated. I wonder why...

That tool seems to have been referenced here, but I don't see a way to export anything. The first link looks amazing, but the output is pretty gnarly, not very readable compared to GLSL. I wonder if it can already be used to reassemble the NodeMaterial, or even a ShaderMaterial.

I took a look at Unity, but I couldn't actually find the shader graph tool.

Btw, if I somehow blocked this, by all means let it be unblocked. The PR is from 2015; I first responded 8 days ago.

(Sorry, this is going a bit off-topic.)
@pailhead For Unity shadergraph: You need to start a unity project with the lightweight render pipeline template. Make sure you open the Unity Package Manager and check that you are using the latest version of the Render-pipelines.light package.
Here's a tutorial to set it up starting with a new project being set up: https://youtu.be/Ar9eIn4z6XE?t=98

That Sea3D Flow material node editor (http://sea3d.poonya.com/flow/) is by @sunag, the same guy that wrote NodeMaterial.

Thanks so much @sunag
How can I get the indirect light? Is it a value in the renderer, or how can I calculate it?

I would like to apply a postprocessing effect to a selection of objects only (i.e. change the saturation).
However, I cannot find anything related to masking (if there's not another way) with nodes.
Currently I'm reading through the (non-node) OutlinePass, which writes a bunch of stuff to a bunch of buffers; I can see that as a feasible method for achieving what I want, but I have not the slightest idea of how to approach this with nodes.

Will the Sea3D Flow material node editor be standalone like the three.js editor, or will it be a part of it? Will it still be named Sea3D, or something else three.js-related? Thank you @sunag for releasing that visual editor to the three.js community.

but I have not the slightest idea of how to approach this with nodes.

@IARI

It seems like you know GLSL, since you are able to read the OutlinePass and understand what it does. You also seem to understand it well enough to see that you can't achieve it with nodes. I thought if you know how to code it, you'll know how to node it. Even more, if you don't know how to code it, you might still know how to node it. Where is the disconnect?

I didn't see these posts from 2016 at all. Interesting to see Shader Frog. I wonder if this can be seen as vim vs emacs or something. I like to use Sublime Text, you like to use something else. One likes to use shader frog, another likes to use three.js's Sea3D editor.

I like how Shader Frog just outputs a shader, but it isn't open source like Sea3D.

Then there's this.

Putting third party product mechanics into Three's core seems like an unfair advantage and I don't understand it politically.

To me, the fact that all of this can be accomplished without changes to Three means that it's unnecessary, and it blurs the line between Three.js being a WebGL wrapper API and it driving implementation-specific features. Storing shader node data is highly app-specific, and as far as I know there's currently no common node editor standard format for any software? Enforcing one might be harmful to other implementations.

@bhouston
I can't wait for NodeMaterial to land, since it may unblock other PRs and life can go back to normal. It seems other voices have tried to block this in the past.

I think that @pailhead is in isolation against this move of NodeMaterials into core at this current point.
I think the project shouldn't be beholden to a single individual.

Thank you.

I don't see any reason why it could not be achieved with nodes exactly the same way as OutlinePass does it, but in a way that is readable, understandable, and modular, such that parts can be reused.
Why shouldn't there be, for instance, output nodes that write to buffers and such?

And I really don't know GLSL very well, and I also find it (pardon my wording) a super pain in the arse to read and write shader code. And on top of that, it's extra painful when it's wrapped in a stupid array of strings in JavaScript code.

Do you use webpack or something?

stupid array

It's only stupid if you have to import it into an .html file. When you work with webpack and modern JS syntax, you can keep your GLSL in .glsl, .vs, or .fs files and have syntax highlighting and such.

shader.fs:

void main(){
  gl_FragColor = vec4(1.);
} 

javascript (look mom, no stupid arrays)

import mShader from './shader.fs'
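(This relies on a loader that turns the file into a string; a minimal sketch of the webpack side, assuming the raw-loader package:)

// webpack.config.js: treat GLSL files as plain strings
module.exports = {
  module: {
    rules: [
      { test: /\.(glsl|vs|fs)$/, use: 'raw-loader' }
    ]
  }
};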

My concern is that I don't see much value in your use case, since you're asking specifically how to create an effect, not how to code something. Will NodeMaterial open the door to thousands of "how do I shade this effect" questions that will just get a generic answer:

Use the nodes, Luke!

No, having to read (let alone write) code in two different languages within the same file is always a pain for me to see. Also, webpack helps nothing when you're trying to study three.js example shader code, which is all wrapped in ugly string arrays joined by line breaks.
Anyway, this issue is about the node system, and I asked a simple question about nodes, not for a discussion about personal coding preferences; that doesn't lead anywhere.

@IARI

Apologies, I somehow didn't see any question marks in your post, so I missed your question.

This should definitely not be a discussion about personal coding preferences, but the fact that there are different preferences should be considered. I'm also a bit concerned about the flurry of "how do I make this shader" questions.

Before, this bar was pretty high; people just gave up with "I don't know GLSL" or when they saw "a stupid array of strings". But if it's lowered, then everyone will try to write shaders.

I didn't find your question, but I'm curious what it is. Is it:

How do I make a ToonShader with NodeMaterial?

In which case I'm curious what the answer is.

You can't, because NodeMaterial doesn't support some abstraction.
You can; just use NodeMaterial to connect nodes.

Maybe this discussion could benefit from being broken out? It's a high number of posts, going really far back (to 2015). NodeMaterial is in /examples, so you can probably build your toon shader from it. Maybe you could open another issue or post on Stack Overflow for that specifically, and the rest of the conversation could move into a new issue?

We can gladly take this discussion somewhere else, like some Slack channel or whatever. I really have a feeling it doesn't belong here and will mostly annoy people.

The forum might be better than Slack for this particular case.

@pailhead, this whole discussion has been derailed from being productive. Might as well close this issue and make a new one focused on the actual issue at hand.

@bhouston

Agreed, except that I think it’s multiple issues.

  • how to implement nodematerial
  • should other related and unrelated PRs be blocked
  • how to allow people who don’t know how to write shaders to implement certain shader effects

How do I make a ToonShader with NodeMaterial?
In which case I'm curious what the answer is.
You can't, because NodeMaterial doesn't support some abstraction.
You can; just use NodeMaterial to connect nodes.

LoL. I agree with @bhouston... this can only be a joke. You could have seen the example webgl_materials_nodes before you said that.

And if you want to use code, see the expression support using FunctionNode.

(screenshot: toon shading example built with nodes)
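For illustration, a rough sketch of that approach (assuming the FunctionNode, FunctionCallNode, TextureNode and FloatNode classes from examples/js/nodes; toonify is a made-up function, not the code behind the screenshot):

// A GLSL snippet wrapped as a node:
var toonify = new FunctionNode( [
  'vec3 toonify( vec3 color, float steps ) {',
  '  return floor( color * steps ) / steps;',
  '}'
].join( '\n' ) );

var material = new PhongNodeMaterial();

// Quantize the diffuse texture into discrete bands:
material.color = new FunctionCallNode( toonify, [
  new TextureNode( texture ),
  new FloatNode( 4.0 )
] );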

@sunag I think you misunderstood me. I was curious as to why, if this is currently in /examples and not /src, it is being made into a first-class citizen. This issue seems to have been raised by @AndrewRayCode back in 2015.

I just said that this issue is overloaded, both because it extends far back into 2015 and because it discusses various different topics. If someone asked "how do I make a whatever", they would be referred to Stack Overflow.

If this is going to be a massive overhaul of three.js _(the fact that Unity has to be configured specifically for this somewhat scares me)_, then I think there should be a dedicated, ongoing issue here on GitHub for such questions: usage of NodeMaterial. As it is, people who are asking for random things (#14232) are being referred to this thread, and it makes the communication messy :)

I recently had to write a toon shader, but it was based on lines, more specifically @WestLangley's fat lines. So even with a need for a "toon" shader, mine would have been completely different. It would also have involved porting fat lines to NodeMaterial, which could take who knows how long. These are all different topics that are conflated here.

And if you want to use code, see expression using FunctionNode.

If I code, I'll use Sublime Text; ShaderMaterial is fine for me, no need for FunctionNode. If I create visual effects, I'll probably look into a visual editor, but this is my own preference. I'm only here because this blocked other PRs :) I realized not so long ago that it is actually in three's best interest for this to land as quickly as possible. Untangling this conversation would probably help hasten things. In this I agree with @bhouston: this should be yolo'd asap!

Thank you.

For the record: I am not interested in a toon shader or anything like that.
I'm implementing a system which allows people who don't write code to plug simple predefined effects into certain positions.

Sadly, I'm afraid I won't have the time to study how to write nodes right now, and I have to catch up on basic GLSL coding anyway, so for now I will be implementing things without nodes: it won't be extendable, it will use multiple render passes where a single one would probably suffice with a good node implementation, it will be less readable, ...

All in all, sadly, I'm just too new to the general topic and need to get a lot more basic understanding, but judging by what I know, nodes would be the best way to do this in a truly elegant and extendable fashion.
I'd totally love to have some information on the implementation of nodes, like a blog article or tutorial.

Disclaimer: I'm not demanding anything, just hope to clarify my vague request from above.
Thank all of you for all the work being done here - it is absolutely amazing.

I was asked in thread https://github.com/mrdoob/three.js/issues/14232 to comment on Node-based shader implementation. I think it's very poor. I love the idea, but the implementation is entirely inadequate in my view, so, since I'm being so negative, I'm going to provide some concrete reasons as to why.

Understandability

After reading some of the code for the Nodes and various utility classes - I can decidedly say that it's a huge challenge to understand.

  • variable names are non-descriptive
  • no comments at all
  • method names are obscure or downright misleading. (e.g. what do you think method "build" does? - make a guess)

Code structure is downright weird. Consider this snippet:
https://github.com/mrdoob/three.js/blob/e2b49c017b2d1f37ab34317033a27df7b5a71d4d/examples/js/nodes/FunctionNode.js#L39-L50

You could achieve the same with:

return this.inputs.find(input => input.name === name);

"Emergent Architecture"

What is a "Node"? I'm asking for a definition of the data structure, because it seems that it's just a nebulous concept as there is little commonality between different "Nodes", it looks like ducktyping, if it quacks like a Node - it's a Node.

Separation of concerns

How do you go from a Node to a ShaderMaterial? Right now it's a mix of responsibilities between various pieces of logic from each node, some builders, some cache, parsers, etc. There is no clear owner of this process. Node uses NodeBuilder? What the ...? From the name I would have thought NodeBuilder was a factory that produces or constructs nodes; as it turns out, it's actually an auxiliary structure. As far as I can tell there are fairly distinct jobs to be done here:

  • Build a graph
  • Validate
  • Build intermediate representation
  • Optimize
  • Compile to GLSL

That's not at all what you will find in current code.

Conclusion

Without good documentation and comments I see this implementation as entirely inadequate. Even with those things in place, there remains the matter of spaghetti code and a lack of design. With that being said, I acknowledge that it's a powerful tool and a working implementation of a very useful concept.

@Usnul we will never code like this in Three.JS "return this.inputs.find(input => input.name === name);"

The reason is that the "=>" creates a function that must be called on each input. Function calls are very slow in JavaScript compared to a for-loop or while-loop. Thus we always do explicit loops instead of callback oriented code. I know it is more verbose but it is necessary. All of Three..JS is like this and on purpose.

@Usnul, your design concerns are very useful, could you help with some refactoring then? I think everyone would appreciate your design input and contributions. :)

The reason is that the "=>" creates a function that must be called on each input. Function calls are very slow in JavaScript compared to a for-loop or while-loop. Thus we always write explicit loops instead of callback-oriented code. I know it is more verbose, but it is necessary. All of Three.JS is like this, on purpose.

That's a fair point. Personally, I'm of the philosophy that performance and readability should be on a scale, not one or the other, but consistency is good. My main issue with that specific example is that it is not a common way of iterating over an array. Here's an idiomatic way:

for(var i=0, count=array.length; i<count; i++){
    var element = array[i];
    ...
}
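Applied to the find() one-liner above, that idiom would read roughly as follows (a sketch, not the actual FunctionNode code):

// The same lookup as the find() one-liner, written with an explicit
// loop in the house style described above.
FunctionNode.prototype.getInputByName = function ( name ) {
  for ( var i = 0, count = this.inputs.length; i < count; i ++ ) {
    if ( this.inputs[ i ].name === name ) return this.inputs[ i ];
  }
};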

or let's say you have a piece of code that is 50 lines long of the form:

function complex(a){
  if(a){
      //do a ton of stuff
      ...
      return result;
  }
}

that's not as readable as:

function complex(a){
  if(!a){
      //omg! not A?!11 
      return;
  }
  //do a ton of stuff
  ...
  return result;
}

@Usnul could you help refactor the code to take these into account? I think refactoring should be easy, right? I am a big supporter of Code Complete, that book influenced me a lot.

@Usnul if you don't like this, you're free to refactor it? It doesn't matter if it looks like this or not; it's just important for it to be built on top of this, as I understand it.
I don't know, though, whether refactors should happen while this is in examples (or possibly another repo), or whether it should be moved to src straightaway as @bhouston is suggesting.

@IARI

I really don't know GLSL very well, and I also find it (pardon my wording) a super pain in the arse to read and write shader code.

but judging by what I know, nodes would be the best way to do this in a truly elegant and extendable fashion.

I can't claim that I know GLSL very well either, but I know it well enough to solve my problems. From what I know, I mostly agree with what @Usnul wrote here, and with what @AndrewRayCode wrote before. I find the node approach verbose compared to GLSL.

If you don't know GLSL, I think it would be very hard to compare it with something else. As you wrote yourself, if we judge by what you know, and you said yourself that you don't know GLSL, I think your input should be taken with a disclaimer :)

It's like if I said:

Japan is the worst place on this planet... but I've never actually been to Japan.

Thank you.

method names are obscure or downright misleading. (e.g. what do you think method "build" does? - make a guess)

Wow, I didn't even realize that this is a ShaderMaterial underneath. My guess is that this does all the stuff that THREE.WebGLRenderer does with shader templates.

I really don't know GLSL very well, and I also find it (pardon my wording) a super pain in the arse to read and write shader code.
but judging by what I know, nodes would be the best way to do this in a truly elegant and extendable fashion.

@pailhead What he means is that there are intermediate levels between things.

I do not understand this alarm. @mrdoob will never merge anything other than a great solution. All great PRs and improvements from anyone, in NodeMaterial or not, are very useful for the community. If this still has not been merged into the core, it is because we need to improve it with more work.

@sunag is there anything particular others can do to help you with this project? I would be willing to write documentation, if you’re ready for docs and could review them.

@sunag is there anything particular others can do to help you with this project? I would be willing to write documentation, if you’re ready for docs and could review them.

This will be great. No doubt it will help a lot.

A node-based material system looks amazing; it'd expand the power of expression.

How will we switch from the existing material system to the node-based one in core? Will we keep both systems, or replace the existing one with the node one?

Sorry, I don't think I've caught up on this thread, because it is too huge...

Will we keep both systems, or replace the existing one with the node one?

I see replacement as the ultimate goal; it is better both for performance and maintenance. The next step is to make MeshStandardNodeMaterial and the other node materials outside the core run exactly like the core materials in visuals, syntax and performance.
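For example, a sketch of that compatibility goal (assuming MeshStandardNodeMaterial keeps the core material's slots, and that FloatNode is available as in examples/js/nodes):

// The familiar MeshStandardMaterial-style inputs...
var material = new MeshStandardNodeMaterial();
material.color = new THREE.Color( 0xff0000 );
material.roughness = 0.5;
material.metalness = 0.0;

// ...while any slot can still be replaced by a node graph:
material.roughness = new FloatNode( 0.5 );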

I have to show shadow maps in my project.

I have a scene with a directional light and two geometries using MeshStandardNodeMaterial.

I've switched on:
castShadow = true for the light,
castShadow and receiveShadow = true for both objects,
shadowMap.enabled = true for the renderer.

Why doesn't it work?
Do I have to use the shadow property of the MeshStandardNodeMaterial? Is that its purpose?

Can anyone help me?
Thanks!

@claudioviola
Support questions go to the forum or Stack Overflow.

@sunag Do you think NodeMaterial is stable enough so we can start with providing TypeScript declaration files? If you are not planning major API changes in the near future, I would start with this task since NodeMaterial is the last group of modules with no TS support.

@sunag Do you think NodeMaterial is stable enough so we can start with providing TypeScript declaration files? If you are not planning major API changes in the near future, I would start with this task since NodeMaterial is the last group of modules with no TS support.

@Mugen87 Yes, there should be more additions than changes. Adding TS support will be great. 👍

Okay, TS support is ready with the next release R017 🙌
