Qt
Internal/Contributor docs for the Qt SDK. Note: These are NOT official API docs; those are found at https://doc.qt.io/
qtquick3d-custom.qdoc
// Copyright (C) 2020 The Qt Company Ltd.
// SPDX-License-Identifier: LicenseRef-Qt-Commercial OR GFDL-1.3-no-invariants-only

/*!
\page qtquick3d-custom.html
\title Programmable Materials, Effects, Geometry, and Texture data
\brief Custom materials, effects, geometry and texture data providers in Qt Quick 3D

While the built-in materials of Qt Quick 3D, \l DefaultMaterial and \l PrincipledMaterial,
allow a wide degree of customization via their properties, they do not provide
programmability on the vertex and fragment shader level. To allow that, the \l
CustomMaterial type is provided.

\table
\header
\li A model with PrincipledMaterial
\li With a CustomMaterial transforming the vertices
\row
\li \image quick3d-custom-mat1.jpg {Teapot rendered with standard material}
\li \image quick3d-custom-mat2.jpg
{Teapot with vertices transformed by custom material}
\endtable

Post-processing effects, where one or more processing passes are performed on the color
buffer, optionally taking the depth buffer into account, before the View3D's output is
passed on to Qt Quick, also come in two varieties:
\list
\li built-in post-processing steps that can be configured via \l ExtendedSceneEnvironment, such as
glow/bloom, depth of field, vignette, and lens flare,
\li \c custom effects implemented by the application in the form of fragment shader code and a
specification of the processing passes in an \l Effect object.
\endlist

In practice there is a third category of post-processing effects: 2D effects
implemented via Qt Quick, operating on the output of the \l View3D item without
any involvement from the 3D renderer. For example, to apply a blur to a \l
View3D item, the simplest approach is to use Qt Quick's existing facilities,
such as \l MultiEffect. The 3D post-processing system becomes beneficial for
complex effects that involve 3D scene concepts such as the depth buffer or the
screen texture, need to deal with HDR tonemapping, or need multiple passes
with intermediate buffers. Simple 2D effects that do not require any
insight into the 3D scene and renderer can always be implemented with \l
ShaderEffect or \l MultiEffect instead.
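
For the 2D route, a minimal sketch could look like the following. This assumes Qt Quick's
\l MultiEffect from the \c{QtQuick.Effects} module and a sibling \c View3D item with the
id \c view3D (a made-up id for this illustration):

\qml
import QtQuick
import QtQuick.Effects

// view3D refers to an existing View3D item elsewhere in the scene
MultiEffect {
    source: view3D
    anchors.fill: view3D
    blurEnabled: true
    blur: 1.0
}
\endqml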

\table
\header
\li Scene without effect
\li The same scene with a custom post-processing effect applied
\row
\li \image quick3d-custom-effect1.jpg
{Scene with sphere, cone, and cube without post-processing effect}
\li \image quick3d-custom-effect2.jpg
{Scene with warped geometry from custom post-processing effect}
\endtable

In addition to programmable materials and post-processing, there are two types of data that
are normally provided in the form of files (\c{.mesh} files or images such as \c{.png}):

\list

\li vertex data, including the geometry for the mesh to be rendered, texture coordinates,
normals, colors, and other data,

\li the content for textures that are then used as texture maps for the rendered
objects, or used with skybox or image based lighting.

\endlist

If they so wish, applications can provide such data from C++ in the form of a QByteArray.
Such data can also be changed over time, making it possible to procedurally generate and
later alter the data for a \l Model or \l Texture.

\table
\header
\li A grid, rendered by specifying vertex data dynamically from C++
\li A cube textured with image data generated from C++
\row
\li \image quick3d-custom-geom.jpg {Grid generated from custom geometry data}
\li \image quick3d-custom-tex.jpg
{Cube with procedurally generated gradient texture}
\endtable

These four approaches to making materials, effects, geometry, and textures dynamic
enable programmable shading and procedural generation of the data the shaders get as
their input. The following sections provide an overview of these
features. The full reference is available in the documentation pages for the respective
types:

\table
\header
\li Feature
\li Reference Documentation
\li Relevant Examples
\row
\li Custom materials
\li \l CustomMaterial
\li \l {Qt Quick 3D - Custom Shaders Example}, \l {Qt Quick 3D - Custom Materials
Example}
\row
\li Custom post-processing effects
\li \l Effect
\li \l {Qt Quick 3D - Custom Effect Example}
\row
\li Custom geometry
\li \l QQuick3DGeometry, \l{Model::geometry}
\li \l {Qt Quick 3D - Custom Geometry Example}
\row
\li Custom texture data
\li \l QQuick3DTextureData, \l{Texture::textureData}
\li \l {Qt Quick 3D - Procedural Texture Example}
\endtable

\section1 Programmability for Materials

Let's have a scene with a cube, and start with a default \l PrincipledMaterial and
\l CustomMaterial:

\table
\header
\li PrincipledMaterial
\li CustomMaterial
\row
\li
\qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Cube"
            scale: Qt.vector3d(2, 2, 2)
            eulerRotation.x: 30
            materials: PrincipledMaterial { }
        }
    }
}
\endqml
\li
\qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Cube"
            scale: Qt.vector3d(2, 2, 2)
            eulerRotation.x: 30
            materials: CustomMaterial { }
        }
    }
}
\endqml
\endtable

These both lead to the exact same result, because a \l CustomMaterial is effectively a \l
PrincipledMaterial when no vertex or fragment shader code is added to it.

\image quick3d-custom-cube1.jpg {White cube with default material}

\note Properties such as \l{PrincipledMaterial::baseColor}{baseColor},
\l{PrincipledMaterial::metalness}{metalness},
\l{PrincipledMaterial::baseColorMap}{baseColorMap}, and many others, have no equivalent
properties in the \l CustomMaterial QML type. This is by design: customizing the material
is done via shader code, not by merely providing a few fixed values.

\section2 Our first vertex shader

Let's add a custom vertex shader snippet. This is done by referencing a file in the
\l{CustomMaterial::vertexShader}{vertexShader} property. The approach will be the same for
fragment shaders. These references work like \l{Image::source}{Image.source} or
\l{ShaderEffect::vertexShader}{ShaderEffect.vertexShader}: they are local or \c qrc URLs,
and a relative path is treated as relative to the \c{.qml} file's location. The common
approach is therefore to place the \c{.vert} and \c{.frag} files into the Qt resource
system (\c qt_add_resources when using CMake) and reference them using a relative path.

As of Qt 6.0, inline shader strings are no longer supported, neither in Qt Quick nor in Qt
Quick 3D; note that these properties are URLs, not strings. However, due
to their intrinsically dynamic nature, custom materials and post-processing effects in Qt
Quick 3D still provide shader snippets in source form in the referenced files. This is a
difference from \l ShaderEffect, where the shaders are complete on their own, with no further
amending by the engine, and so are expected to be provided as pre-conditioned \c{.qsb}
shader packs.

\note In Qt Quick 3D URLs can only refer to local resources. Schemes for remote content
are not supported.

\note The shading language used is Vulkan-compatible GLSL. The \c{.vert} and \c{.frag}
files are not complete shaders on their own, hence they are often called \c snippets. That is
why there are no uniform blocks, input and output variables, or sampler uniforms provided
directly by these snippets. Rather, the Qt Quick 3D engine will amend them as appropriate.

\table
\header
\li Change in main.qml, material.vert
\li Result
\row
\li \qml
materials: CustomMaterial {
    vertexShader: "material.vert"
}
\endqml
\badcode
void MAIN()
{
}
\endcode
\li \image quick3d-custom-cube1-small.jpg {White cube with default material}
\endtable

A custom vertex or fragment shader snippet is expected to provide one or more functions
with pre-defined names, such as \c MAIN, \c DIRECTIONAL_LIGHT, \c POINT_LIGHT, \c
SPOT_LIGHT, \c AMBIENT_LIGHT, and \c SPECULAR_LIGHT. For now let's focus on \c MAIN.

As shown here, the end result with an empty MAIN() is exactly the same as before.

Before making it more interesting, let's look at an overview of the most commonly used
special keywords in custom vertex shader snippets. This is not the full list. For a full
reference, check the \l CustomMaterial page.

\table
\header
\li Keyword
\li Type
\li Description
\row
\li MAIN
\li
\li void MAIN() is the entry point. This function must always be present in a custom
vertex shader snippet; there is no point in providing one otherwise.
\row
\li VERTEX
\li vec3
\li The vertex position the shader receives as input. A common use case for vertex shaders
in custom materials is to change (displace) the x, y, or z values of this vector, by simply
assigning a value to the whole vector, or to some of its components.
\row
\li NORMAL
\li vec3
\li The vertex normal from the input mesh data, or all zeroes if there were no normals provided.
As with VERTEX, the shader is free to alter the value as it sees fit. The altered value is then
used by the rest of the pipeline, including the lighting calculations in the fragment stage.
\row
\li UV0
\li vec2
\li The first set of texture coordinates from the input mesh data, or all zeroes if there
were no UV values provided. As with VERTEX and NORMAL, the value can be altered.
\row
\li MODELVIEWPROJECTION_MATRIX
\li mat4
\li The model-view-projection matrix. To unify the behavior regardless of which graphics API
rendering happens with, all vertex data and transformation matrices follow OpenGL conventions
on this level (Y axis pointing up, OpenGL-compatible projection matrix). Read only.
\row
\li MODEL_MATRIX
\li mat4
\li The model (world) matrix. Read only.
\row
\li NORMAL_MATRIX
\li mat3
\li The transposed inverse of the top-left 3x3 slice of the model matrix. Read only.
\row
\li CAMERA_POSITION
\li vec3
\li The camera position in world space. In the examples on this page this is \c{(0, 0, 600)}. Read only.
\row
\li CAMERA_DIRECTION
\li vec3
\li The camera direction vector. In the examples on this page this is \c{(0, 0, -1)}. Read only.
\row
\li CAMERA_PROPERTIES
\li vec2
\li The near and far clip values of the camera. In the examples on this page this is \c{(10, 10000)}. Read only.
\row
\li POINT_SIZE
\li float
\li Relevant only when rendering with a topology of points, for example because the
\l{QQuick3DGeometry}{custom geometry} provides such a geometry for the mesh. Writing to
this value is equivalent to setting \l{PrincipledMaterial::pointSize}{pointSize on a
PrincipledMaterial}.
\row
\li POSITION
\li vec4
\li Like \c gl_Position. When not present, a default assignment statement is generated
automatically using \c MODELVIEWPROJECTION_MATRIX and \c VERTEX. This is why an empty
MAIN() is functional, and in most cases there will be no need to assign a custom value to
it.
\endtable
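
Based on the table above, the implicit default behavior can be written out explicitly. The
following snippet is a sketch that simply reproduces what the engine generates
automatically when MAIN() contains no assignment to \c POSITION:

\badcode
void MAIN()
{
    // Equivalent to the default generated assignment
    POSITION = MODELVIEWPROJECTION_MATRIX * vec4(VERTEX, 1.0);
}
\endcode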

Let's make a custom material that displaces the vertices according to some pattern. To
make it more interesting, have some animated QML properties, the values of which end up
being exposed as uniforms in the shader code. (To be precise, most properties are going to
be mapped to members in a uniform block, backed by a uniform buffer at run time, but Qt
Quick 3D conveniently makes such details transparent to the custom material author.)

\table
\header
\li Change in main.qml, material.vert
\li Result
\row
\li \qml
materials: CustomMaterial {
    vertexShader: "material.vert"
    property real uAmplitude: 0
    NumberAnimation on uAmplitude {
        from: 0; to: 100; duration: 5000; loops: -1
    }
    property real uTime: 0
    NumberAnimation on uTime {
        from: 0; to: 100; duration: 10000; loops: -1
    }
}
\endqml
\badcode
void MAIN()
{
    VERTEX.x += sin(uTime + VERTEX.y) * uAmplitude;
}
\endcode
\li \image quick3d-custom-cube2-anim.gif
{Cube with ground plane showing vertex displacement animation}
\endtable

\section2 Uniforms from QML properties

Custom properties in the CustomMaterial object get mapped to uniforms. In the above
example this includes \c uAmplitude and \c uTime. Any time the values change, the updated
value will become visible in the shader. This concept may already be familiar from \l
ShaderEffect.

The name of the QML property and the GLSL variable must match. There is no separate
declaration in the shader code for the individual uniforms. Rather, the QML property name
can be used as-is. This is why the example above can just reference \c uTime and \c
uAmplitude in the vertex shader snippet without any previous declaration for them.

The following table lists how the types are mapped:

\table
\header
\li QML Type
\li Shader Type
\li Notes
\row
\li real, int, bool
\li float, int, bool
\li
\row
\li color
\li vec4
\li sRGB to linear conversion is performed implicitly
\row
\li vector2d
\li vec2
\li
\row
\li vector3d
\li vec3
\li
\row
\li vector4d
\li vec4
\li
\row
\li matrix4x4
\li mat4
\li
\row
\li quaternion
\li vec4
\li scalar value is \c w
\row
\li rect
\li vec4
\li
\row
\li point, size
\li vec2
\li
\row
\li TextureInput
\li sampler2D
\li
\endtable
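
To illustrate the mapping, a material could expose properties of several of these types.
The property names here (\c uTint, \c uOffset, \c uEnabled) are made up for this sketch;
any names can be used as long as the QML and GLSL sides agree:

\qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
    property color uTint: "red"                        // vec4 in the shader, linearized
    property vector2d uOffset: Qt.vector2d(0.1, 0.2)   // vec2 in the shader
    property bool uEnabled: true                       // bool in the shader
}
\endqml

The fragment snippet can then reference these directly, again without declaring them:

\badcode
void MAIN()
{
    if (uEnabled)
        BASE_COLOR = vec4(uTint.rgb + vec3(uOffset, 0.0), 1.0);
}
\endcode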

\section2 Improving the example

Before moving further, let's make the example somewhat better looking. By adding a rotated
rectangle mesh and making the \l DirectionalLight cast shadows, we can verify that the
alteration to the cube's vertices is correctly reflected in all rendering passes,
including shadow maps. To get a visible shadow, the light is now placed a bit higher on
the Y axis, and a rotation is applied to have it pointing partly downwards. (This being a
\c directional light, the rotation matters.)

\table
\header
\li main.qml, material.vert
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
        PerspectiveCamera { z: 600 }
        DirectionalLight {
            y: 200
            eulerRotation.x: -45
            castsShadow: true
        }
        Model {
            source: "#Rectangle"
            y: -250
            scale: Qt.vector3d(5, 5, 5)
            eulerRotation.x: -45
            materials: PrincipledMaterial { baseColor: "lightBlue" }
        }
        Model {
            source: "#Cube"
            scale: Qt.vector3d(2, 2, 2)
            eulerRotation.x: 30
            materials: CustomMaterial {
                vertexShader: "material.vert"
                property real uAmplitude: 0
                NumberAnimation on uAmplitude {
                    from: 0; to: 100; duration: 5000; loops: -1
                }
                property real uTime: 0
                NumberAnimation on uTime {
                    from: 0; to: 100; duration: 10000; loops: -1
                }
            }
        }
    }
}
\endqml
\badcode
void MAIN()
{
    VERTEX.x += sin(uTime + VERTEX.y) * uAmplitude;
}
\endcode
\li \image quick3d-custom-cube3-anim.gif
{Cube with ground plane showing vertex displacement animation}
\endtable

\section2 Adding a fragment shader

Many custom materials will want to have a fragment shader as well. In fact, many will want
only a fragment shader. If there is no extra data to be passed from the vertex to the
fragment stage, and the default vertex transformation is sufficient, setting the \c
vertexShader property of the \l CustomMaterial can be left out.

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
}
\endqml
\badcode
void MAIN()
{
}
\endcode
\li \image quick3d-custom-cube4.jpg {White cube with empty fragment shader}
\endtable

Our first fragment shader contains an empty MAIN() function. This is no different from not
specifying a fragment shader snippet at all: what we get looks like what we get with a
default PrincipledMaterial.

Let's look at some of the commonly used keywords in fragment shaders. This is not the full
list; refer to the \l CustomMaterial documentation for a complete reference. Many of these
are read-write, meaning they have a default value, but the shader can, and often will want
to, assign a different value to them.

As the names suggest, many of these map to similarly named \l PrincipledMaterial
properties, with the same meaning and semantics, following the
\l{https://github.com/KhronosGroup/glTF/tree/master/specification/2.0#metallic-roughness-material}{metallic-roughness
material model}. It is up to the custom material implementation to decide how these values
are calculated: for example, a value for BASE_COLOR can be hard coded in the shader, can
be based on sampling a texture, or can be calculated based on QML properties exposed as
uniforms or on interpolated data passed along from the vertex shader.

\table
\header
\li Keyword
\li Type
\li Description
\row
\li BASE_COLOR
\li vec4
\li The base color and alpha value. Corresponds to \l{PrincipledMaterial::baseColor}. The
final alpha value of the fragment is the model opacity multiplied by the base color
alpha. The default value is \c{(1.0, 1.0, 1.0, 1.0)}.
\row
\li EMISSIVE_COLOR
\li vec3
\li The color of self-illumination. Corresponds to
\l{PrincipledMaterial::emissiveFactor}. The default value is \c{(0.0, 0.0, 0.0)}.
\row
\li METALNESS
\li float
\li \l{PrincipledMaterial::metalness}{Metalness} value in range 0-1. Defaults to 0, which
means the material is dielectric (non-metallic).
\row
\li ROUGHNESS
\li float
\li \l{PrincipledMaterial::roughness}{Roughness} value in range 0-1. The default value is
0. Larger values soften specular highlights and blur reflections.
\row
\li SPECULAR_AMOUNT
\li float
\li \l{PrincipledMaterial::specularAmount}{The strength of specularity} in range 0-1. The
default value is \c 0.5. For metallic objects with \c metalness set to \c 1 this value
will have no effect. When both \c SPECULAR_AMOUNT and \c METALNESS have values larger than
0 but smaller than 1, the result is a blend between the two material models.
\row
\li NORMAL
\li vec3
\li The interpolated normal in world space, adjusted for double-sidedness when face culling is disabled. Read only.
\row
\li UV0
\li vec2
\li The interpolated texture coordinates. Read only.
\row
\li VAR_WORLD_POSITION
\li vec3
\li Interpolated vertex position in world space. Read only.
\endtable

Let's make the cube's base color red:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(1.0, 0.0, 0.0, 1.0);
}
\endcode
\li \image quick3d-custom-cube5.jpg {Red cube with custom base color}
\endtable

Now strengthen the level of self-illumination a bit:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(1.0, 0.0, 0.0, 1.0);
    EMISSIVE_COLOR = vec3(0.4);
}
\endcode
\li \image quick3d-custom-cube6.jpg {Bright red cube with emissive color}
\endtable

Instead of having values hardcoded in the shader, we could also use QML properties exposed
as uniforms, even animated ones:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
    property color baseColor: "black"
    ColorAnimation on baseColor {
        from: "black"; to: "purple"; duration: 5000; loops: -1
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(baseColor.rgb, 1.0);
    EMISSIVE_COLOR = vec3(0.4);
}
\endcode
\li \image quick3d-custom-cube7-anim.gif
{Cube with animated color transition from black to purple}
\endtable

Let's do something less trivial, something that is not implementable with a
PrincipledMaterial and its standard, built-in properties. The following material
visualizes the texture UV coordinates of the cube mesh. U runs from 0 to 1, so from black
to red, while V also runs from 0 to 1, black to green.

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(UV0, 0.0, 1.0);
}
\endcode
\li \image quick3d-custom-cube8.jpg
{Cube showing UV coordinates as red and green color gradient}
\endtable

While we are at it, why not visualize normals as well, this time on a sphere. As with
UVs, if a custom vertex shader snippet were to alter the value of NORMAL, the interpolated
per-fragment value in the fragment shader, also exposed under the name NORMAL, would
reflect those adjustments.

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
Model {
    source: "#Sphere"
    scale: Qt.vector3d(2, 2, 2)
    materials: CustomMaterial {
        fragmentShader: "material.frag"
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(NORMAL, 1.0);
}
\endcode
\li \image quick3d-custom-cube9.jpg
{Sphere visualizing surface normals as RGB colors}
\endtable

\section2 Colors

Let's switch over to a teapot model for a moment, make the material a blend of metallic
and dielectric, and try to set a green base color for it. The \c green QColor value maps
to \c{(0, 128, 0)}, based on which our first attempt could be:

\table
\header
\li main.qml, material.frag
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "teapot.mesh"
            scale: Qt.vector3d(60, 60, 60)
            eulerRotation.x: 30
            materials: CustomMaterial {
                fragmentShader: "material.frag"
            }
        }
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(0.0, 0.5, 0.0, 1.0);
    METALNESS = 0.6;
    SPECULAR_AMOUNT = 0.4;
    ROUGHNESS = 0.4;
}
\endcode
\endtable

\image quick3d-custom-color1.jpg {Green teapot with direct RGB color values}

This does not look entirely right. Compare with the second approach:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
    property color uColor: "green"
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(uColor.rgb, 1.0);
    METALNESS = 0.6;
    SPECULAR_AMOUNT = 0.4;
    ROUGHNESS = 0.4;
}
\endcode
\li \image quick3d-custom-color2.jpg
{Green teapot with QML color property and sRGB conversion}
\endtable

Switching to a PrincipledMaterial, we can confirm that by setting
\l{PrincipledMaterial::baseColor} to "green" and using the same metalness and other
property values, the result is identical to our second approach:

\table
\header
\li Change in main.qml
\li Result
\row \li \qml
materials: PrincipledMaterial {
    baseColor: "green"
    metalness: 0.6
    specularAmount: 0.4
    roughness: 0.4
}
\endqml
\li \image quick3d-custom-color3.jpg
{Green teapot with standard material showing correct color}
\endtable

If the type of the \c uColor property were changed to \c vector4d, or any type other than
\c color, the results would suddenly change and become identical to our first approach.

Why is this?

The answer lies in the sRGB to linear conversion that is performed implicitly for color
properties of DefaultMaterial, PrincipledMaterial, and also for custom properties with a
\c color type in a CustomMaterial. Such conversion is not performed for any other value,
so if the shader hardcodes a color value, or bases it on a QML property with a type
different from \c color, it will be up to the shader to perform linearization in case the
source value was in the sRGB color space. Converting to linear is important since Qt Quick 3D
performs \l{SceneEnvironment::tonemapMode}{tonemapping} on the results of fragment
shading, and that process assumes values in linear space as its input.

The built-in QColor constants, such as \c{"green"}, are all given in the sRGB color
space. Therefore, just assigning \c{vec4(0.0, 0.5, 0.0, 1.0)} to BASE_COLOR in the first
attempt is insufficient if we wanted a result that matches the RGB value \c{(0, 128, 0)} in
the sRGB space. See the \c BASE_COLOR documentation in \l CustomMaterial for a formula for
linearizing such color values. The same applies to color values retrieved by sampling
textures: if the source image data is in the sRGB color space, a conversion to linear is needed
(unless \l{SceneEnvironment::tonemapMode}{tonemapping} is disabled).
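
As a sketch of such a linearization, the common approximation using a 2.2 gamma exponent
could be applied to the hardcoded value from the first attempt (refer to the \c BASE_COLOR
documentation in \l CustomMaterial for the exact formula):

\badcode
vec3 linearize(vec3 srgb)
{
    // Approximate sRGB-to-linear conversion
    return pow(srgb, vec3(2.2));
}

void MAIN()
{
    // (0, 128, 0) in sRGB, converted to linear before assignment
    BASE_COLOR = vec4(linearize(vec3(0.0, 0.5, 0.0)), 1.0);
}
\endcode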

\section2 Blending

Just writing a value less than \c 1.0 to \c{BASE_COLOR.a} is not sufficient if the
expectation is to get alpha blending. Such materials will very often change the values of
the \l{CustomMaterial::sourceBlend}{sourceBlend} and
\l{CustomMaterial::destinationBlend}{destinationBlend} properties to get the desired
results.

Also keep in mind that the combined alpha value is the \l{Node::opacity}{Node opacity}
multiplied by the material alpha.

To visualize this, let's use a shader that assigns red with alpha \c 0.5 to \c BASE_COLOR:

\table
\header
\li main.qml, material.frag
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "white"
        }
        PerspectiveCamera {
            id: camera
            z: 600
        }
        DirectionalLight { }
        Model {
            source: "#Cube"
            x: -150
            eulerRotation.x: 60
            eulerRotation.y: 20
            materials: CustomMaterial {
                fragmentShader: "material.frag"
            }
        }
        Model {
            source: "#Cube"
            eulerRotation.x: 60
            eulerRotation.y: 20
            materials: CustomMaterial {
                sourceBlend: CustomMaterial.SrcAlpha
                destinationBlend: CustomMaterial.OneMinusSrcAlpha
                fragmentShader: "material.frag"
            }
        }
        Model {
            source: "#Cube"
            x: 150
            eulerRotation.x: 60
            eulerRotation.y: 20
            materials: CustomMaterial {
                sourceBlend: CustomMaterial.SrcAlpha
                destinationBlend: CustomMaterial.OneMinusSrcAlpha
                fragmentShader: "material.frag"
            }
            opacity: 0.5
        }
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(1.0, 0.0, 0.0, 0.5);
}
\endcode
\li \image quick3d-custom-blend.jpg
{Three cubes showing blend mode opacity variations}
\endtable

The first cube writes 0.5 to the alpha value of the color, but this brings no visible
results since alpha blending is not enabled. The second cube enables simple alpha
blending via the CustomMaterial properties. The third one also assigns an opacity of 0.5
to the Model, which means that the effective opacity is 0.25.
857
858
\section2 Passing data between the vertex and fragment shader
859
860
Calculating a value per vertex (for example, assuming a single triangle, for the 3 corners
861
of the triangle), and then passing it on to the fragment stage, where for each fragment
862
(for example, every fragment covered by the rasterized triangle) an interpolated value is
863
made accessible. In custom material shader snippets this is made possible by the \c
864
VARYING keyword. This provides a syntax similar to GLSL 120 and GLSL ES 100, but will work
865
regardless of the graphics API used at run time. The engine will take care of rewriting
866
the varying declaration as appropriate.
867
868
Let's see how the classic texture sampling with UV coordinates would look like. Textures
869
are going to be covered in an upcoming section, for now let's focus on how we get the UV
870
coordinates that can be passed to the \c{texture()} function in the shader.
871
872
\table
873
\header
874
\li main.qml, material.vert, material.frag
875
\row \li \qml
876
import QtQuick
877
import QtQuick3D
878
Item {
879
View3D {
880
anchors.fill: parent
881
environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
882
PerspectiveCamera { z: 600 }
883
DirectionalLight { }
884
Model {
885
source: "#Sphere"
886
scale: Qt.vector3d(4, 4, 4)
887
eulerRotation.x: 30
888
materials: CustomMaterial {
889
vertexShader: "material.vert"
890
fragmentShader: "material.frag"
891
property TextureInput someTextureMap: TextureInput {
892
texture: Texture {
893
source: "qt_logo_rect.png"
894
}
895
}
896
}
897
}
898
}
899
}
900
\endqml
901
\badcode
902
VARYING vec2 uv;
903
void MAIN()
904
{
905
uv = UV0;
906
}
907
\endcode
908
\badcode
909
VARYING vec2 uv;
910
void MAIN()
911
{
912
BASE_COLOR = texture(someTextureMap, uv);
913
}
914
\endcode
915
\endtable
916
917
\table
918
\header
919
\li qt_logo_rect.png
920
\li Result
921
\row \li \image quick3d-custom-varying-map.png
922
{Qt logo texture used as source data}
923
\li \image quick3d-custom-varying1.jpg
924
{Cube with Qt logo texture applied to faces}
925
\endtable
926
927
Note that \c VARYING declarations. The name and type must match, \c uv in the fragment
928
shader will expose the interpolated UV coordinate for the current fragment.

Any other type of data can be passed on to the fragment stage in a similar manner. It is
worth noting that in many cases setting up the material's own varyings is not necessary,
because built-in variables cover many typical needs, including the (interpolated)
normals, UVs, the world position (\c VAR_WORLD_POSITION), and the vector pointing
towards the camera (\c VIEW_VECTOR).

The above example can in fact be simplified to the following, as \c UV0 is automatically
available in the fragment stage as well:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
    property TextureInput someTextureMap: TextureInput {
        texture: Texture {
            source: "qt_logo_rect.png"
        }
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = texture(someTextureMap, UV0);
}
\endcode
\li \image quick3d-custom-varying1.jpg
{Cube with Qt logo texture applied to faces}
\endtable

To disable interpolation for a variable, use the \c flat keyword in both the
vertex and fragment shader snippet. For example:
\badcode
VARYING flat vec2 v;
\endcode

\section2 Textures

A \l CustomMaterial has no built-in texture maps, meaning there is no equivalent of, for
example, \l{PrincipledMaterial::baseColorMap}. This is because implementing the same is
often trivial, while giving a lot more flexibility than what DefaultMaterial and
PrincipledMaterial have built in. Besides simply sampling a texture, custom fragment shader
snippets are free to combine and blend data from various sources when calculating the
values they assign to \c BASE_COLOR, \c EMISSIVE_COLOR, \c ROUGHNESS, etc. They can base
these calculations on data provided via QML properties, interpolated data sent on from the
vertex stage, values retrieved from sampling textures, and on hardcoded values.

As the previous example shows, exposing a texture to the vertex, fragment, or both shaders
is very similar to scalar and vector uniform values: a QML property with the type \l
TextureInput will automatically get associated with a \c sampler2D in the shader code. As
always, there is no need to declare this sampler in the shader code.
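
Scalar and vector uniforms work the same way: a plain QML property on the \l
CustomMaterial becomes a uniform of the same name in the shader snippets. As a sketch
(the \c tintStrength property is an illustrative name, not a built-in):

\qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
    // tintStrength is exposed to the shader as a float uniform of the same name
    property real tintStrength: 0.5
    property TextureInput someTextureMap: TextureInput {
        texture: Texture { source: "qt_logo_rect.png" }
    }
}
\endqml
\badcode
void MAIN()
{
    vec4 c = texture(someTextureMap, UV0);
    // blend towards red by the amount given in the QML property
    BASE_COLOR = mix(c, vec4(1.0, 0.0, 0.0, 1.0), tintStrength);
}
\endcode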

A \l TextureInput references a \l Texture, with an additional
\l{TextureInput::enabled}{enabled} property. A \l Texture can source its data in three
ways: \l{Texture::source}{from an image file}, \l{Texture::sourceItem}{from a texture with
live Qt Quick content}, or \l{Texture::textureData}{from C++} via
QQuick3DTextureData.

\note When it comes to \l Texture properties, only the source, tiling, and filtering
related ones are taken into account implicitly with custom materials; the rest (such as
UV transformations) is up to the custom shaders to implement as they see fit.
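
For example, a manual UV scale and offset, something a \l Texture would otherwise apply
automatically for a PrincipledMaterial, could be implemented along these lines (\c
uvScale and \c uvOffset are illustrative names for vector properties declared on the
CustomMaterial):

\badcode
VARYING vec2 uv;
void MAIN()
{
    // uvScale and uvOffset are assumed to be declared as QML properties on the material
    BASE_COLOR = texture(someTextureMap, uv * uvScale + uvOffset);
}
\endcode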

Let's see an example where a model, a sphere in this case, is textured using live Qt Quick
content:

\table
\header
\li main.qml, material.frag
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Sphere"
            scale: Qt.vector3d(4, 4, 4)
            eulerRotation.x: 30
            materials: CustomMaterial {
                fragmentShader: "material.frag"
                property TextureInput someTextureMap: TextureInput {
                    texture: Texture {
                        sourceItem: Rectangle {
                            width: 512; height: 512
                            color: "red"
                            Rectangle {
                                width: 32; height: 32
                                anchors.horizontalCenter: parent.horizontalCenter
                                y: 150
                                color: "gray"
                                NumberAnimation on rotation { from: 0; to: 360; duration: 3000; loops: -1 }
                            }
                            Text {
                                anchors.centerIn: parent
                                text: "Texture Map"
                                font.pointSize: 16
                            }
                        }
                    }
                }
            }
        }
    }
}
\endqml
\badcode
void MAIN()
{
    vec2 uv = vec2(UV0.x, 1.0 - UV0.y);
    vec4 c = texture(someTextureMap, uv);
    BASE_COLOR = c;
}
\endcode
\endtable

\image quick3d-custmat-tex1-anim.gif {Red sphere with animated texture map}

Here the 2D subtree (a Rectangle with two children: another Rectangle and a Text) is
rendered into a 512x512 2D texture every time this mini-scene changes. The texture is
then exposed to the custom material under the name \c someTextureMap.

Note the flipping of the V coordinate in the shader. As noted above, custom materials,
where there is full programmability on the shader level, do not offer the "fixed"
features of \l Texture and \l PrincipledMaterial. This means that any transformations to
the UV coordinates need to be applied by the shader. Here we know that the texture is
generated via \l{Texture::sourceItem}, and so V needs to be flipped to get something that
matches the UV set of the mesh we are using.

What this example shows is also possible with a \l PrincipledMaterial. Let's make it
more interesting by adding a simple emboss effect:

\table
\header
\li material.frag
\li Result
\row \li \badcode
void MAIN()
{
    vec2 uv = vec2(UV0.x, 1.0 - UV0.y);
    vec2 size = vec2(textureSize(someTextureMap, 0));
    vec2 d = vec2(1.0 / size.x, 1.0 / size.y);
    vec4 diff = texture(someTextureMap, uv + d) - texture(someTextureMap, uv - d);
    float c = (diff.x + diff.y + diff.z) + 0.5;
    BASE_COLOR = vec4(c, c, c, 1.0);
}
\endcode
\li \image quick3d-custmat-tex2-anim.gif
{Sphere with animated procedural texture}
\endtable

The features covered so far open up a wide range of possibilities for creating materials
that shade meshes in visually impressive ways. To finish the basic tour, let's look at
an example that applies height and normal maps to a plane mesh. (A dedicated \c{.mesh}
file is used here because the built-in \c{#Rectangle} does not have enough
subdivisions.) For better lighting results, we will use image-based lighting with a 360
degree HDR image. The image is also set as the skybox to make it clearer what is
happening.

First let's start with an empty CustomMaterial:

\table
\header
\li main.qml
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.SkyBox
            lightProbe: Texture {
                source: "00489_OpenfootageNET_snowfield_low.hdr"
            }
        }
        PerspectiveCamera {
            z: 600
        }
        Model {
            source: "plane.mesh"
            scale: Qt.vector3d(400, 400, 400)
            z: 400
            y: -50
            eulerRotation.x: -90
            materials: CustomMaterial { }
        }
    }
}
\endqml
\li \image quick3d-custom-tex3.jpg
{Plane with winter landscape from custom texture data}
\endtable

Now let's make some shaders that apply a height and normal map to the mesh:

\table
\header
\li Height map
\li Normal map
\row
\li \image quick3d-custom-heightmap.png {Grayscale height map texture}
\li \image quick3d-custom-normalmap.jpg
{Plane with height map applied as displacement}
\endtable

\table
\header
\li material.vert, material.frag
\row
\li \badcode
float getHeight(vec2 pos)
{
    return texture(heightMap, pos).r;
}

void MAIN()
{
    const float offset = 0.004;
    VERTEX.y += getHeight(UV0);
    TANGENT = normalize(vec3(0.0, getHeight(UV0 + vec2(0.0, offset)) - getHeight(UV0 + vec2(0.0, -offset)), offset * 2.0));
    BINORMAL = normalize(vec3(offset * 2.0, getHeight(UV0 + vec2(offset, 0.0)) - getHeight(UV0 + vec2(-offset, 0.0)), 0.0));
    NORMAL = cross(TANGENT, BINORMAL);
}
\endcode
\badcode
void MAIN()
{
    vec3 normalValue = texture(normalMap, UV0).rgb;
    normalValue.xy = normalValue.xy * 2.0 - 1.0;
    normalValue.z = sqrt(max(0.0, 1.0 - dot(normalValue.xy, normalValue.xy)));
    NORMAL = normalize(mix(NORMAL, TANGENT * normalValue.x + BINORMAL * normalValue.y + NORMAL * normalValue.z, 1.0));
}
\endcode
\endtable

\table
\header
\li Change in main.qml
\li Result
\row
\li \qml
materials: CustomMaterial {
    vertexShader: "material.vert"
    fragmentShader: "material.frag"
    property TextureInput normalMap: TextureInput {
        texture: Texture { source: "normalmap.jpg" }
    }
    property TextureInput heightMap: TextureInput {
        texture: Texture { source: "heightmap.png" }
    }
}
\endqml
\li \image quick3d-custom-tex4.jpg
{Plane with normal map showing surface detail}
\endtable

\note The \l WasdController object can be immensely helpful during development and
troubleshooting as it allows navigating and looking around in the scene with the keyboard
and mouse in a familiar manner. Having a camera controlled by the WasdController is as
simple as:

\qml
import QtQuick3D.Helpers
View3D {
    PerspectiveCamera {
        id: camera
    }
    // ...
}
WasdController {
    controlledObject: camera
}
\endqml

\section2 Depth and screen textures

When a custom shader snippet uses the \c DEPTH_TEXTURE or \c SCREEN_TEXTURE keywords, it
opts in to generating the corresponding textures in a separate render pass, which is not
necessarily a cheap operation, but allows implementing a variety of techniques, such as
refraction for glass-like materials.

\c DEPTH_TEXTURE is a \c sampler2D that allows sampling a texture with the contents of the
depth buffer with all the \c opaque objects in the scene rendered. Similarly, \c
SCREEN_TEXTURE is a \c sampler2D that allows sampling a texture containing the contents of
the scene excluding any transparent materials or any materials also using the
SCREEN_TEXTURE. The texture can be used for materials that require the contents of the
framebuffer they are being rendered to. The SCREEN_TEXTURE texture uses the same clear mode
as the View3D. The size of these textures matches the size of the View3D in pixels.

Let's have a simple demonstration by visualizing the depth buffer contents via \c
DEPTH_TEXTURE. The camera's \l{PerspectiveCamera::clipFar}{far clip value} is reduced
here from the default 10000 to 2000, in order to have a smaller range and so make the
differences in the visualized depth values more obvious. The result is a rectangle that
happens to visualize the depth buffer for the scene over its surface.

\table
\header
\li main.qml, material.frag
\li Result
\row
\li \qml
import QtQuick
import QtQuick3D
import QtQuick3D.Helpers
Rectangle {
    width: 400
    height: 400
    color: "black"
    View3D {
        anchors.fill: parent
        PerspectiveCamera {
            id: camera
            z: 600
            clipNear: 1
            clipFar: 2000
        }
        DirectionalLight { }
        Model {
            source: "#Cube"
            scale: Qt.vector3d(2, 2, 2)
            position: Qt.vector3d(150, 200, -1000)
            eulerRotation.x: 60
            eulerRotation.y: 20
            materials: PrincipledMaterial { }
        }
        Model {
            source: "#Cylinder"
            scale: Qt.vector3d(2, 2, 2)
            position: Qt.vector3d(400, 200, -1000)
            materials: PrincipledMaterial { }
            opacity: 0.3
        }
        Model {
            source: "#Sphere"
            scale: Qt.vector3d(2, 2, 2)
            position: Qt.vector3d(-150, 200, -600)
            materials: PrincipledMaterial { }
        }
        Model {
            source: "#Cone"
            scale: Qt.vector3d(2, 2, 2)
            position: Qt.vector3d(0, 400, -1200)
            materials: PrincipledMaterial { }
        }
        Model {
            source: "#Rectangle"
            scale: Qt.vector3d(3, 3, 3)
            y: -150
            materials: CustomMaterial {
                fragmentShader: "material.frag"
            }
        }
    }
    WasdController {
        controlledObject: camera
    }
}
\endqml
\badcode
void MAIN()
{
    float zNear = CAMERA_PROPERTIES.x;
    float zFar = CAMERA_PROPERTIES.y;
    float zRange = zFar - zNear;
    vec4 depthSample = texture(DEPTH_TEXTURE, vec2(UV0.x, 1.0 - UV0.y));
    float zn = 2.0 * depthSample.r - 1.0;
    float d = 2.0 * zNear * zFar / (zFar + zNear - zn * zRange);
    d /= zFar;
    BASE_COLOR = vec4(d, d, d, 1.0);
}
\endcode
\li \image quick3d-custom-depth-anim.gif
{Animation showing depth buffer values as grayscale}
\endtable

Note how the cylinder is not present in \c DEPTH_TEXTURE due to its reliance on
semi-transparency, which puts it into a different category than the other objects, which
are all opaque. Semi-transparent objects do not write into the depth buffer, although
they do test against the depth values written by opaque objects, and they rely on being
rendered in back-to-front order. Hence they are not present in \c DEPTH_TEXTURE either.

What happens if we switch the shader to sample \c SCREEN_TEXTURE instead?

\table
\header
\li material.frag
\li Result
\row \li \badcode
void MAIN()
{
    vec4 c = texture(SCREEN_TEXTURE, vec2(UV0.x, 1.0 - UV0.y));
    if (c.a == 0.0)
        c.rgb = vec3(0.2, 0.1, 0.3);
    BASE_COLOR = c;
}
\endcode
\li \image quick3d-custom-screen.jpg
{Scene showing screen texture with distortion effect}
\endtable

Here the rectangle is textured with \c SCREEN_TEXTURE, while replacing transparent pixels
with purple.

\section2 Light processor functions

An advanced feature of \l CustomMaterial is the ability to define functions in the
fragment shader that reimplement the lighting equations used to calculate the fragment
color. A light processor function, when present, is called once per light in the scene,
for each fragment. There is a dedicated function for each light type, as well as for the
ambient and specular contributions. When no corresponding light processor function is
present, the standard calculations are used, just like a PrincipledMaterial would do.
When a light processor is present, but the function body is empty, it means there will
be no contribution from the given type of lights in the scene.

Refer to the \l CustomMaterial documentation for details on functions such as \c
DIRECTIONAL_LIGHT, \c POINT_LIGHT, \c SPOT_LIGHT, \c AMBIENT_LIGHT, and \c SPECULAR_LIGHT.
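
As a sketch of what such a function can look like, a \c DIRECTIONAL_LIGHT implementation
computing only a simple Lambertian diffuse term might be written as follows; see the \l
CustomMaterial documentation for the exact set of keywords available inside light
processor functions:

\badcode
void DIRECTIONAL_LIGHT()
{
    // Accumulate a basic diffuse contribution for this light;
    // leaving the body empty instead would silence directional lights entirely.
    DIFFUSE += ALBEDO * LIGHT_COLOR * SHADOW_CONTRIB * vec3(max(0.0, dot(normalize(NORMAL), TO_LIGHT_DIR)));
}
\endcode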

\section2 Unshaded custom materials

There is another type of \l CustomMaterial: \c unshaded custom materials. All the
examples so far used \c shaded custom materials, with the
\l{CustomMaterial::shadingMode}{shadingMode} property left at its default
CustomMaterial.Shaded value.

What happens if we switch this property to CustomMaterial.Unshaded?

First of all, keywords like \c BASE_COLOR, \c EMISSIVE_COLOR, \c METALNESS, etc. no longer
have the desired effect. This is because an unshaded material, as the name suggests, does
not automatically get amended with much of the standard shading code, thus ignoring
lights, image-based lighting, shadows, and ambient occlusion in the scene. Rather, an
unshaded material gives full control to the shader via the \c FRAGCOLOR keyword. This is
similar to gl_FragColor: the color assigned to \c FRAGCOLOR is the final color of the
fragment, without any further adjustments by Qt Quick 3D.

\table
\header
\li main.qml, material.frag, material2.frag
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Cylinder"
            x: -100
            eulerRotation.x: 30
            materials: CustomMaterial {
                fragmentShader: "material.frag"
            }
        }
        Model {
            source: "#Cylinder"
            x: 100
            eulerRotation.x: 30
            materials: CustomMaterial {
                shadingMode: CustomMaterial.Unshaded
                fragmentShader: "material2.frag"
            }
        }
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(1.0);
}
\endcode
\badcode
void MAIN()
{
    FRAGCOLOR = vec4(1.0);
}
\endcode
\li \image quick3d-custom-unshaded1.jpg
{Cube and cylinder with unshaded custom material}
\endtable

Notice how the right cylinder ignores the DirectionalLight in the scene. Its shading
knows nothing about scene lighting; the final fragment color is all white.

The vertex shader in an unshaded material still has the typical inputs available: \c
VERTEX, \c NORMAL, \c MODELVIEWPROJECTION_MATRIX, etc., and can write to \c POSITION. The
fragment shader no longer has the same conveniences available, however: \c NORMAL, \c
UV0, and \c VAR_WORLD_POSITION are not available in an unshaded material's fragment
shader. Rather, it is now up to the shader code to calculate and pass on, using \c
VARYING, everything it needs to determine the final fragment color.

Let's look at an example that has both a vertex and a fragment shader. The altered vertex
position is passed on to the fragment shader, with an interpolated value made available
to every fragment.

\table
\header
\li main.qml, material.vert, material.frag
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }
        PerspectiveCamera { z: 600 }
        Model {
            source: "#Sphere"
            scale: Qt.vector3d(3, 3, 3)
            materials: CustomMaterial {
                property real time: 0.0
                NumberAnimation on time { from: 0; to: 100; duration: 20000; loops: -1 }
                property real amplitude: 10.0
                shadingMode: CustomMaterial.Unshaded
                vertexShader: "material.vert"
                fragmentShader: "material.frag"
            }
        }
    }
}
\endqml
\badcode
VARYING vec3 pos;
void MAIN()
{
    pos = VERTEX;
    pos.x += sin(time * 4.0 + pos.y) * amplitude;
    POSITION = MODELVIEWPROJECTION_MATRIX * vec4(pos, 1.0);
}
\endcode
\badcode
VARYING vec3 pos;
void MAIN()
{
    FRAGCOLOR = vec4(vec3(pos.x * 0.02, pos.y * 0.02, pos.z * 0.02), 1.0);
}
\endcode
\endtable

\image quick3d-custom-unshaded-anim.gif
{Animation showing lighting toggle between shaded and unshaded}

Unshaded materials are useful when interacting with scene lighting is not necessary or
desired, and the material needs full control over the final fragment color. Notice how
the example above has neither a DirectionalLight nor any other lights, but the sphere
with the custom material shows up as expected.

\note An unshaded material that only has a vertex shader snippet, but does not specify
the fragmentShader property, will still be functional, but the results are as if the
shadingMode was set to Shaded. Therefore it makes little sense to switch shadingMode for
materials that only have a vertex shader.

\section1 Programmability for Effects

Post-processing effects apply one or more fragment shaders to the result of a \l
View3D. The output from these fragment shaders is then displayed instead of the original
rendering results. This is conceptually very similar to Qt Quick's \l ShaderEffect and \l
ShaderEffectSource.

\note Post-processing effects are only available when the
\l{View3D::renderMode}{renderMode} for the View3D is set to View3D.Offscreen.

Custom vertex shader snippets can also be specified for an effect, but they have limited
usefulness and are therefore expected to be used relatively rarely. The vertex input for
a post-processing effect is a quad (either two triangles or a triangle strip), and
transforming or displacing its vertices is often not helpful. It can, however, make
sense to have a vertex shader in order to calculate and pass on data to the fragment
shader using the \c VARYING keyword. As usual, the fragment shader will then receive an
interpolated value based on the current fragment coordinate.

The syntax of the shader snippets associated with an \l Effect is identical to the shaders
for an unshaded \l CustomMaterial. When it comes to the built-in special keywords, \c
VARYING, \c MAIN, \c FRAGCOLOR (fragment shader only), \c POSITION (vertex shader only), \c
VERTEX (vertex shader only), and \c MODELVIEWPROJECTION_MATRIX work identically to \l
CustomMaterial.

The most important special keywords for \l Effect fragment shaders are the following:

\table
\header
\li Name
\li Type
\li Description
\row
\li INPUT
\li sampler2D or sampler2DArray
\li The sampler for the input texture. An effect will typically sample this using \c INPUT_UV.
\row
\li INPUT_UV
\li vec2
\li UV coordinates for sampling \c INPUT.
\row
\li INPUT_SIZE
\li vec2
\li The size of the \c INPUT texture, in pixels. This is a convenient alternative to calling textureSize().
\row
\li OUTPUT_SIZE
\li vec2
\li The size of the output texture, in pixels. Equal to \c INPUT_SIZE in many cases, but a multi-pass effect
may have passes that output to intermediate textures with different sizes.
\row
\li DEPTH_TEXTURE
\li sampler2D
\li Depth texture with the depth buffer contents with the opaque objects in the scene. Like with CustomMaterial,
the presence of this keyword in the shader triggers generating the depth texture automatically.
\endtable

\note When multiview rendering is enabled, the input texture is a 2D texture
array. GLSL functions such as texture() and textureSize() then take/return a
vec3/ivec3, respectively. Use \c VIEW_INDEX for the layer. In VR/AR
applications that wish to function both with and without multiview rendering, the
portable approach is to write the shader code like this:
\badcode
#if QSHADER_VIEW_COUNT >= 2
vec4 c = texture(INPUT, vec3(INPUT_UV, VIEW_INDEX));
#else
vec4 c = texture(INPUT, INPUT_UV);
#endif
\endcode

\section2 A post-processing effect

Let's start with a simple scene, this time using a few more objects, including a textured
rectangle that uses a checkerboard texture as its base color map.

\table
\header
\li main.qml
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }

        PerspectiveCamera { z: 400 }

        DirectionalLight { }

        Texture {
            id: checkerboard
            source: "checkerboard.png"
            scaleU: 20
            scaleV: 20
            tilingModeHorizontal: Texture.Repeat
            tilingModeVertical: Texture.Repeat
        }

        Model {
            source: "#Rectangle"
            scale: Qt.vector3d(10, 10, 1)
            eulerRotation.x: -45
            materials: PrincipledMaterial {
                baseColorMap: checkerboard
            }
        }

        Model {
            source: "#Cone"
            position: Qt.vector3d(100, -50, 100)
            materials: PrincipledMaterial { }
        }

        Model {
            source: "#Cube"
            position.y: 100
            eulerRotation.y: 20
            materials: PrincipledMaterial { }
        }

        Model {
            source: "#Sphere"
            position: Qt.vector3d(-150, 200, -100)
            materials: PrincipledMaterial { }
        }
    }
}
\endqml
\li \image quick3d-custom-effect-section-scene.jpg
{Reference scene with sphere, cone, and cube}
\endtable

Now let's apply an effect to the entire scene, or, more precisely, to the View3D. When
there are multiple View3D items in the scene, each has its own SceneEnvironment and
therefore its own post-processing effect chain. In this example there is one single
View3D covering the entire window.

\table
\header
\li Change in main.qml
\li effect.frag
\row \li \qml
environment: SceneEnvironment {
    backgroundMode: SceneEnvironment.Color
    clearColor: "black"
    effects: redEffect
}

Effect {
    id: redEffect
    property real uRed: 1.0
    NumberAnimation on uRed { from: 1; to: 0; duration: 5000; loops: -1 }
    passes: Pass {
        shaders: Shader {
            stage: Shader.Fragment
            shader: "effect.frag"
        }
    }
}
\endqml
\li \badcode
void MAIN()
{
    vec4 c = texture(INPUT, INPUT_UV);
    c.r = uRed;
    FRAGCOLOR = c;
}
\endcode
\endtable

This simple effect alters the red color channel value. Exposing QML properties as uniforms
works the same way with effects as with custom materials. The shader starts with a line
that is going to be very common when writing fragment shaders for effects: sampling \c
INPUT at the UV coordinates \c INPUT_UV. It then performs the desired calculations, and
assigns the final fragment color to \c FRAGCOLOR.

\image quick3d-custom-first-effect-anim.gif
{Animation showing color inversion post-processing effect}

Many properties set in the example are in plural form (effects, passes, shaders). While
the list \c{[ ]} syntax can be omitted when there is only a single element, all these
properties are lists, and can hold more than one element. Why is this?

\list

\li \l{SceneEnvironment::effects}{effects} is a list, because View3D allows chaining
multiple effects together. The effects are applied in the order in which they are added to
the list. This allows easily applying two or more effects together to the View3D, and is
similar to what one can achieve in Qt Quick by nesting \l ShaderEffect items. The \c INPUT
texture of the next effect is always a texture that contains the previous effect's
output. The output of the last effect is what gets used as the final output of the View3D.

\li \l{Effect::passes}{passes} is a list, because unlike ShaderEffect, Effect has built-in
support for multiple passes. A multi-pass effect is more powerful than chaining together
multiple, independent effects in \l{SceneEnvironment::effects}{effects}: a pass can output
to a temporary, intermediate texture, which can then be used as input to subsequent
passes, in addition to the original input texture of the effect. This allows creating
complex effects that calculate, render, and blend together multiple textures in order to
get to the final fragment color. This advanced use case is not going to be covered
here. Refer to the \l Effect documentation page for details.

\li \l{Pass::shaders}{shaders} is a list, because an effect may have both a vertex and a
fragment shader associated with it.

\endlist
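
For instance, a pass that uses both stages would list two Shader objects; the file names
here are illustrative:

\qml
passes: Pass {
    shaders: [
        Shader { stage: Shader.Vertex; shader: "effect.vert" },
        Shader { stage: Shader.Fragment; shader: "effect.frag" }
    ]
}
\endqml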

\section2 Chaining multiple effects

Let's look at an example where the effect from the previous example gets complemented by
another effect similar to the built-in \l DistortionSpiral effect.

\table
\header
\li Change in main.qml
\li effect2.frag
\row \li \qml
environment: SceneEnvironment {
    backgroundMode: SceneEnvironment.Color
    clearColor: "black"
    effects: [redEffect, distortEffect]
}

Effect {
    id: redEffect
    property real uRed: 1.0
    NumberAnimation on uRed { from: 1; to: 0; duration: 5000; loops: -1 }
    passes: Pass {
        shaders: Shader {
            stage: Shader.Fragment
            shader: "effect.frag"
        }
    }
}

Effect {
    id: distortEffect
    property real uRadius: 0.1
    NumberAnimation on uRadius { from: 0.1; to: 1.0; duration: 5000; loops: -1 }
    passes: Pass {
        shaders: Shader {
            stage: Shader.Fragment
            shader: "effect2.frag"
        }
    }
}
\endqml
\li \badcode
void MAIN()
{
    vec2 center_vec = INPUT_UV - vec2(0.5, 0.5);
    center_vec.y *= INPUT_SIZE.y / INPUT_SIZE.x;
    float dist_to_center = length(center_vec) / uRadius;
    vec2 texcoord = INPUT_UV;
    if (dist_to_center <= 1.0) {
        float rotation_amount = (1.0 - dist_to_center) * (1.0 - dist_to_center);
        float r = radians(360.0) * rotation_amount / 4.0;
        float cos_r = cos(r);
        float sin_r = sin(r);
        mat2 rotation = mat2(cos_r, sin_r, -sin_r, cos_r);
        texcoord = vec2(0.5, 0.5) + rotation * (INPUT_UV - vec2(0.5, 0.5));
    }
    vec4 c = texture(INPUT, texcoord);
    FRAGCOLOR = c;
}
\endcode
\endtable

\image quick3d-custom-chained-effect-anim.gif
{Animation showing combined blur and color inversion effects}

Now the perhaps surprising question: why is this a bad example?

More precisely, it is not bad as such, but it shows a pattern that is often best avoided.

Chaining effects this way can be useful, but it is important to keep the performance
implications in mind: doing two render passes (one to generate a texture with the
adjusted red color channel, and then another one to calculate the distortion) is quite
wasteful when one would be enough. If the fragment shader snippets were combined, the
same result could be achieved with a single effect.
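
A combined, single-pass version of the two effects could look like this, merging the red
channel adjustment from the first effect into the distortion shader:

\badcode
void MAIN()
{
    vec2 center_vec = INPUT_UV - vec2(0.5, 0.5);
    center_vec.y *= INPUT_SIZE.y / INPUT_SIZE.x;
    float dist_to_center = length(center_vec) / uRadius;
    vec2 texcoord = INPUT_UV;
    if (dist_to_center <= 1.0) {
        float rotation_amount = (1.0 - dist_to_center) * (1.0 - dist_to_center);
        float r = radians(360.0) * rotation_amount / 4.0;
        mat2 rotation = mat2(cos(r), sin(r), -sin(r), cos(r));
        texcoord = vec2(0.5, 0.5) + rotation * (INPUT_UV - vec2(0.5, 0.5));
    }
    vec4 c = texture(INPUT, texcoord);
    c.r = uRed; // the red channel adjustment, previously a separate effect
    FRAGCOLOR = c;
}
\endcode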

\section1 Defining Mesh and Texture Data from C++

Procedurally generating mesh and texture image data both follow similar steps:

\list
\li Subclass \l QQuick3DGeometry or \l QQuick3DTextureData.
\li Set the desired vertex or image data upon construction by calling the protected member
functions from the base class.
\li If dynamic changes are needed later, set the new data and call update().
\li Once the implementation is done, register the class to make it visible in QML.
\li \l Model and \l Texture objects in QML can now use the custom vertex or image data
provider by setting the \l{Model::geometry} or \l{Texture::textureData} property.
\endlist
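
For example, making such a subclass available to QML can be done with the declarative
registration macros (the class name here is illustrative):

\badcode
class PointCloudGeometry : public QQuick3DGeometry
{
    Q_OBJECT
    // makes the type instantiable from QML as PointCloudGeometry
    QML_NAMED_ELEMENT(PointCloudGeometry)
public:
    PointCloudGeometry();
};
\endcode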

\section2 Custom vertex data

Vertex data refers to the sequence of (typically \c float) values that make up a
mesh. Instead of loading \c{.mesh} files, a custom geometry provider is responsible for
providing the same data. The vertex data consists of \c attributes, such as position,
texture (UV) coordinates, or normals. The specification of attributes describes what kind
of attributes are present, the component type (for example, a 3 component float vector for
a vertex position consisting of x, y, z values), the offset they start at in the provided
data, and the stride (the increment that needs to be added to the offset to point to the
next element for the same attribute).

This may seem familiar to anyone who has worked directly with graphics APIs, such as
OpenGL or Vulkan, because the way vertex input is specified with those APIs maps loosely
to what a \c{.mesh} file or a \l QQuick3DGeometry instance defines.

In addition, the mesh topology (primitive type) must be specified too. For indexed
drawing, the data for an index buffer must be provided as well.
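
As a sketch, providing index data from a QQuick3DGeometry subclass involves calling
setIndexData() in addition to the vertex data setup:

\badcode
QByteArray indexData;
// fill indexData with a sequence of quint32 index values here
setIndexData(indexData);
// tell the geometry that the index values are 32-bit unsigned integers
addAttribute(QQuick3DGeometry::Attribute::IndexSemantic, 0,
             QQuick3DGeometry::Attribute::U32Type);
\endcode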

There is one built-in custom geometry implementation: the QtQuick3D.Helpers module
includes a \l GridGeometry type. This allows rendering a grid in the scene with line
primitives, without having to implement a custom \l QQuick3DGeometry subclass.

Another common use case is rendering points. This is fairly simple to do, since the
attribute specification is going to be minimal: we provide three floats (x, y, z) for
each vertex, and nothing else. A QQuick3DGeometry subclass could implement a geometry
consisting of 2000 points similarly to the following:

\badcode
clear();
const int N = 2000;
const int stride = 3 * sizeof(float);
QByteArray v;
v.resize(N * stride);
float *p = reinterpret_cast<float *>(v.data());
QRandomGenerator *rg = QRandomGenerator::global();
for (int i = 0; i < N; ++i) {
    const float x = float(rg->bounded(200.0f) - 100.0f) / 20.0f;
    const float y = float(rg->bounded(200.0f) - 100.0f) / 20.0f;
    *p++ = x;
    *p++ = y;
    *p++ = 0.0f;
}
setVertexData(v);
setStride(stride);
setPrimitiveType(QQuick3DGeometry::PrimitiveType::Points);
addAttribute(QQuick3DGeometry::Attribute::PositionSemantic, 0, QQuick3DGeometry::Attribute::F32Type);
\endcode

Combined with a material of

\qml
DefaultMaterial {
    lighting: DefaultMaterial.NoLighting
    cullMode: DefaultMaterial.NoCulling
    diffuseColor: "yellow"
    pointSize: 4
}
\endqml

the end result is similar to this (here viewed from an altered camera angle, with the help
of \l WasdController):

\image quick3d-custom-points.jpg
{Point cloud geometry rendered as individual points}

\note Be aware that point sizes and line widths other than 1 may not be supported at run
time, depending on the underlying graphics API. This is not something Qt has control
over. Therefore, it can become necessary to implement alternative techniques instead of
relying on point and line drawing.

\section2 Custom texture data

With textures, the data that needs to be provided is structurally much simpler: it is the
raw pixel data, with a varying number of bytes per pixel depending on the texture
format. For example, an \c RGBA texture expects four bytes per pixel, whereas \c RGBA16F
is four half-floats per pixel. This is similar to what a \l QImage stores
internally. However, Qt Quick 3D textures can have formats whose data cannot be
represented by a QImage, for example floating-point HDR textures or compressed
textures. Therefore, the data for \l QQuick3DTextureData is always provided as a raw
sequence of bytes. This may seem familiar to anyone who has worked directly with graphics
APIs such as OpenGL or Vulkan.

For details, refer to the \l QQuick3DGeometry and \l QQuick3DTextureData documentation pages.

\sa CustomMaterial, Effect, QQuick3DGeometry, QQuick3DTextureData, {Qt Quick 3D - Custom
Effect Example}, {Qt Quick 3D - Custom Shaders Example}, {Qt Quick 3D - Custom Materials
Example}, {Qt Quick 3D - Custom Geometry Example}, {Qt Quick 3D - Procedural Texture
Example}

*/