qtquick3d-custom.qdoc
// Copyright (C) 2020 The Qt Company Ltd.
// SPDX-License-Identifier: LicenseRef-Qt-Commercial OR GFDL-1.3-no-invariants-only

/*!
\page qtquick3d-custom.html
\title Programmable Materials, Effects, Geometry, and Texture data
\brief Custom materials, effects, geometry and texture data providers in Qt Quick 3D

While the built-in materials of Qt Quick 3D, \l DefaultMaterial and \l PrincipledMaterial,
allow a wide degree of customization via their properties, they do not provide
programmability on the vertex and fragment shader level. To allow that, the \l
CustomMaterial type is provided.

\table
\header
\li A model with PrincipledMaterial
\li With a CustomMaterial transforming the vertices
\row
\li \image quick3d-custom-mat1.jpg
\li \image quick3d-custom-mat2.jpg
\endtable

Post-processing effects, where one or more passes of processing on the color buffer are
performed, optionally taking the depth buffer into account, before the View3D's output is
passed on to Qt Quick, also exist in two varieties:
\list
\li built-in post-processing steps that can be configured via \l ExtendedSceneEnvironment, such as
glow/bloom, depth of field, vignette, lens flare,
\li \c custom effects implemented by the application in the form of fragment shader code and a
specification of the processing passes in an \l Effect object.
\endlist

In practice there is a third category of post-processing effects: 2D effects
implemented via Qt Quick, operating on the output of the \l View3D item without
any involvement from the 3D renderer. For example, to apply a blur to a \l
View3D item, the simplest approach is to use Qt Quick's existing facilities,
such as \l MultiEffect. The 3D post-processing system becomes beneficial for
complex effects that involve 3D scene concepts such as the depth buffer or the
screen texture, that need to deal with HDR tonemapping, or that need multiple
passes with intermediate buffers. Simple 2D effects that do not require any
insight into the 3D scene and renderer can always be implemented with \l
ShaderEffect or \l MultiEffect instead.

\table
\header
\li Scene without effect
\li The same scene with a custom post-processing effect applied
\row
\li \image quick3d-custom-effect1.jpg
\li \image quick3d-custom-effect2.jpg
\endtable

In addition to programmable materials and post-processing, there are two types of data that
are normally provided in the form of files (\c{.mesh} files or images such as \c{.png}):

\list

\li vertex data, including the geometry for the mesh to be rendered, texture coordinates,
normals, colors, and other data,

\li the content for textures that are then used as texture maps for the rendered
objects, or used with skybox or image based lighting.

\endlist

If they so wish, applications can provide such data from C++ in the form of a QByteArray.
Such data can also be changed over time, making it possible to procedurally generate, and
later alter, the data for a \l Model or \l Texture.

\table
\header
\li A grid, rendered by specifying vertex data dynamically from C++
\li A cube textured with image data generated from C++
\row
\li \image quick3d-custom-geom.jpg
\li \image quick3d-custom-tex.jpg
\endtable

These four approaches to customizing materials, effects, geometry, and textures, or to
making them dynamic, enable programmable shading and procedural generation of the data the
shaders get as their input. The following sections provide an overview of these
features. The full reference is available in the documentation pages for the respective
types:

\table
\header
\li Feature
\li Reference Documentation
\li Relevant Examples
\row
\li Custom materials
\li \l CustomMaterial
\li \l {Qt Quick 3D - Custom Shaders Example}, \l {Qt Quick 3D - Custom Materials Example}
\row
\li Custom post-processing effects
\li \l Effect
\li \l {Qt Quick 3D - Custom Effect Example}
\row
\li Custom geometry
\li \l QQuick3DGeometry, \l{Model::geometry}
\li \l {Qt Quick 3D - Custom Geometry Example}
\row
\li Custom texture data
\li \l QQuick3DTextureData, \l{Texture::textureData}
\li \l {Qt Quick 3D - Procedural Texture Example}
\endtable

\section1 Programmability for Materials

Let's have a scene with a cube, and start with a default \l PrincipledMaterial and
\l CustomMaterial:

\table
\header
\li PrincipledMaterial
\li CustomMaterial
\row
\li
\qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Cube"
            scale: Qt.vector3d(2, 2, 2)
            eulerRotation.x: 30
            materials: PrincipledMaterial { }
        }
    }
}
\endqml
\li
\qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Cube"
            scale: Qt.vector3d(2, 2, 2)
            eulerRotation.x: 30
            materials: CustomMaterial { }
        }
    }
}
\endqml
\endtable

These both lead to the exact same result, because a \l CustomMaterial is effectively a \l
PrincipledMaterial when no vertex or fragment shader code is added to it.

\image quick3d-custom-cube1.jpg

\note Properties such as \l{PrincipledMaterial::baseColor}{baseColor},
\l{PrincipledMaterial::metalness}{metalness},
\l{PrincipledMaterial::baseColorMap}{baseColorMap}, and many others, have no equivalent
in the \l CustomMaterial QML type. This is by design: customizing the material
is done via shader code, not by merely providing a few fixed values.

\section2 Our first vertex shader

Let's add a custom vertex shader snippet. This is done by referencing a file in the
\l{CustomMaterial::vertexShader}{vertexShader} property. The approach will be the same for
fragment shaders. These references work like \l{Image::source}{Image.source} or
\l{ShaderEffect::vertexShader}{ShaderEffect.vertexShader}: they are local or \c qrc URLs,
and a relative path is treated as relative to the \c{.qml} file's location. The common
approach is therefore to place the \c{.vert} and \c{.frag} files into the Qt resource
system (\c qt_add_resources when using CMake) and reference them using a relative path.

As of Qt 6.0, inline shader strings are no longer supported, neither in Qt Quick nor in Qt
Quick 3D. (Note that these properties are URLs, not strings.) However, due to their
intrinsically dynamic nature, custom materials and post-processing effects in Qt Quick 3D
still provide shader snippets in source form in the referenced files. This is a difference
from \l ShaderEffect, where the shaders are complete on their own, are not amended further
by the engine, and are therefore expected to be provided as pre-conditioned \c{.qsb}
shader packs.

\note In Qt Quick 3D URLs can only refer to local resources. Schemes for remote content
are not supported.

\note The shading language used is Vulkan-compatible GLSL. The \c{.vert} and \c{.frag}
files are not complete shaders on their own, which is why they are often called \c
snippets. This is also why the snippets do not declare uniform blocks, input and output
variables, or sampler uniforms themselves. Rather, the Qt Quick 3D engine will amend them
as appropriate.

\table
\header
\li Change in main.qml, material.vert
\li Result
\row
\li \qml
materials: CustomMaterial {
    vertexShader: "material.vert"
}
\endqml
\badcode
void MAIN()
{
}
\endcode
\li \image quick3d-custom-cube1-small.jpg
\endtable

A custom vertex or fragment shader snippet is expected to provide one or more functions
with pre-defined names, such as \c MAIN, \c DIRECTIONAL_LIGHT, \c POINT_LIGHT, \c
SPOT_LIGHT, \c AMBIENT_LIGHT, \c SPECULAR_LIGHT. For now let's focus on \c MAIN.

As shown here, the end result with an empty MAIN() is exactly the same as before.

Before making it more interesting, let's look at an overview of the most commonly used
special keywords in custom vertex shader snippets. This is not the full list. For a full
reference, check the \l CustomMaterial page.

\table
\header
\li Keyword
\li Type
\li Description
\row
\li MAIN
\li
\li void MAIN() is the entry point. This function must always be present in a custom
vertex shader snippet; there is no point in providing a snippet without it.
\row
\li VERTEX
\li vec3
\li The vertex position the shader receives as input. A common use case for vertex shaders
in custom materials is to change (displace) the x, y, or z values of this vector, by simply
assigning a value to the whole vector, or some of its components.
\row
\li NORMAL
\li vec3
\li The vertex normal from the input mesh data, or all zeroes if there were no normals provided.
As with VERTEX, the shader is free to alter the value as it sees fit. The altered value is then
used by the rest of the pipeline, including the lighting calculations in the fragment stage.
\row
\li UV0
\li vec2
\li The first set of texture coordinates from the input mesh data, or all zeroes if there
were no UV values provided. As with VERTEX and NORMAL, the value can be altered.
\row
\li MODELVIEWPROJECTION_MATRIX
\li mat4
\li The model-view-projection matrix. To unify the behavior regardless of which graphics API
rendering happens with, all vertex data and transformation matrices follow OpenGL conventions
on this level (Y axis pointing up, OpenGL-compatible projection matrix). Read only.
\row
\li MODEL_MATRIX
\li mat4
\li The model (world) matrix. Read only.
\row
\li NORMAL_MATRIX
\li mat3
\li The transposed inverse of the top-left 3x3 slice of the model matrix. Read only.
\row
\li CAMERA_POSITION
\li vec3
\li The camera position in world space. In the examples on this page this is \c{(0, 0, 600)}. Read only.
\row
\li CAMERA_DIRECTION
\li vec3
\li The camera direction vector. In the examples on this page this is \c{(0, 0, -1)}. Read only.
\row
\li CAMERA_PROPERTIES
\li vec2
\li The near and far clip values of the camera. In the examples on this page this is \c{(10, 10000)}. Read only.
\row
\li POINT_SIZE
\li float
\li Relevant only when rendering with a topology of points, for example because the
\l{QQuick3DGeometry}{custom geometry} provides such a geometry for the mesh. Writing to
this value is equivalent to setting \l{PrincipledMaterial::pointSize}{pointSize on a
PrincipledMaterial}.
\row
\li POSITION
\li vec4
\li Like \c gl_Position. When not present, a default assignment statement is generated
automatically using \c MODELVIEWPROJECTION_MATRIX and \c VERTEX. This is why an empty
MAIN() is functional, and in most cases there will be no need to assign a custom value to
it.
\endtable
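
For illustration, when a snippet does want to assign \c POSITION explicitly, an assignment
equivalent to the generated default looks like this:

\badcode
void MAIN()
{
    // Same as the statement the engine generates when POSITION is not
    // assigned by the snippet.
    POSITION = MODELVIEWPROJECTION_MATRIX * vec4(VERTEX, 1.0);
}
\endcode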

Let's make a custom material that displaces the vertices according to some pattern. To
make it more interesting, let's add some animated QML properties, the values of which end
up being exposed as uniforms in the shader code. (To be precise, most properties are going
to be mapped to members in a uniform block, backed by a uniform buffer at run time, but Qt
Quick 3D conveniently makes such details transparent to the custom material author.)

\table
\header
\li Change in main.qml, material.vert
\li Result
\row
\li \qml
materials: CustomMaterial {
    vertexShader: "material.vert"
    property real uAmplitude: 0
    NumberAnimation on uAmplitude {
        from: 0; to: 100; duration: 5000; loops: -1
    }
    property real uTime: 0
    NumberAnimation on uTime {
        from: 0; to: 100; duration: 10000; loops: -1
    }
}
\endqml
\badcode
void MAIN()
{
    VERTEX.x += sin(uTime + VERTEX.y) * uAmplitude;
}
\endcode
\li \image quick3d-custom-cube2-anim.gif
\endtable

\section2 Uniforms from QML properties

Custom properties in the CustomMaterial object get mapped to uniforms. In the above
example this includes \c uAmplitude and \c uTime. Any time the values change, the updated
value will become visible in the shader. This concept may already be familiar from \l
ShaderEffect.

The name of the QML property and the GLSL variable must match. There is no separate
declaration in the shader code for the individual uniforms. Rather, the QML property name
can be used as-is. This is why the example above can just reference \c uTime and \c
uAmplitude in the vertex shader snippet without any prior declaration for them.

The following table lists how the types are mapped:

\table
\header
\li QML Type
\li Shader Type
\li Notes
\row
\li real, int, bool
\li float, int, bool
\li
\row
\li color
\li vec4
\li sRGB to linear conversion is performed implicitly
\row
\li vector2d
\li vec2
\li
\row
\li vector3d
\li vec3
\li
\row
\li vector4d
\li vec4
\li
\row
\li matrix4x4
\li mat4
\li
\row
\li quaternion
\li vec4
\li scalar value is \c w
\row
\li rect
\li vec4
\li
\row
\li point, size
\li vec2
\li
\row
\li TextureInput
\li sampler2D
\li
\endtable
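
As a small additional sketch, and with \c uTint and \c uMixFactor being made-up property
names rather than anything built in, a material declaring \c{property color uTint} and
\c{property real uMixFactor} could use the mapped uniforms directly in its fragment
snippet:

\badcode
void MAIN()
{
    // uTint maps to a vec4 (with the sRGB to linear conversion already applied),
    // uMixFactor maps to a float; neither needs a declaration in the snippet.
    BASE_COLOR = vec4(mix(vec3(1.0), uTint.rgb, uMixFactor), 1.0);
}
\endcode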

\section2 Improving the example

Before moving further, let's make the example somewhat better looking. By adding a rotated
rectangle mesh and making the \l DirectionalLight cast shadows, we can verify that the
alteration to the cube's vertices is correctly reflected in all rendering passes,
including shadow maps. To get a visible shadow, the light is now placed a bit higher on
the Y axis, and a rotation is applied to have it pointing partly downwards. (This being a
\c directional light, it is the rotation that matters.)

\table
\header
\li main.qml, material.vert
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
        PerspectiveCamera { z: 600 }
        DirectionalLight {
            y: 200
            eulerRotation.x: -45
            castsShadow: true
        }
        Model {
            source: "#Rectangle"
            y: -250
            scale: Qt.vector3d(5, 5, 5)
            eulerRotation.x: -45
            materials: PrincipledMaterial { baseColor: "lightBlue" }
        }
        Model {
            source: "#Cube"
            scale: Qt.vector3d(2, 2, 2)
            eulerRotation.x: 30
            materials: CustomMaterial {
                vertexShader: "material.vert"
                property real uAmplitude: 0
                NumberAnimation on uAmplitude {
                    from: 0; to: 100; duration: 5000; loops: -1
                }
                property real uTime: 0
                NumberAnimation on uTime {
                    from: 0; to: 100; duration: 10000; loops: -1
                }
            }
        }
    }
}
\endqml
\badcode
void MAIN()
{
    VERTEX.x += sin(uTime + VERTEX.y) * uAmplitude;
}
\endcode
\li \image quick3d-custom-cube3-anim.gif
\endtable

\section2 Adding a fragment shader

Many custom materials will want to have a fragment shader as well. In fact, many will want
only a fragment shader. If there is no extra data to be passed from the vertex to the
fragment stage, and the default vertex transformation is sufficient, the \c vertexShader
property can be left out of the \l CustomMaterial altogether.

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
}
\endqml
\badcode
void MAIN()
{
}
\endcode
\li \image quick3d-custom-cube4.jpg
\endtable

Our first fragment shader contains an empty MAIN() function. This is no different from not
specifying a fragment shader snippet at all: what we get looks the same as what we get
with a default PrincipledMaterial.

Let's look at some of the commonly used keywords in fragment shaders. This is not the full
list; refer to the \l CustomMaterial documentation for a complete reference. Many of these
are read-write, meaning they have a default value, but the shader can, and often will want
to, assign a different value to them.

As the names suggest, many of these map to similarly named \l PrincipledMaterial
properties, with the same meaning and semantics, following the
\l{https://github.com/KhronosGroup/glTF/tree/master/specification/2.0#metallic-roughness-material}{metallic-roughness
material model}. It is up to the custom material implementation to decide how these values
are calculated: for example, a value for BASE_COLOR can be hard coded in the shader, can
be based on sampling a texture, or can be calculated based on QML properties exposed as
uniforms or on interpolated data passed along from the vertex shader.

\table
\header
\li Keyword
\li Type
\li Description
\row
\li BASE_COLOR
\li vec4
\li The base color and alpha value. Corresponds to \l{PrincipledMaterial::baseColor}. The
final alpha value of the fragment is the model opacity multiplied by the base color
alpha. The default value is \c{(1.0, 1.0, 1.0, 1.0)}.
\row
\li EMISSIVE_COLOR
\li vec3
\li The color of self-illumination. Corresponds to
\l{PrincipledMaterial::emissiveFactor}. The default value is \c{(0.0, 0.0, 0.0)}.
\row
\li METALNESS
\li float
\li \l{PrincipledMaterial::metalness}{Metalness} value in range 0-1. Defaults to 0, which
means the material is dielectric (non-metallic).
\row
\li ROUGHNESS
\li float
\li \l{PrincipledMaterial::roughness}{Roughness} value in range 0-1. The default value is
0. Larger values soften specular highlights and blur reflections.
\row
\li SPECULAR_AMOUNT
\li float
\li \l{PrincipledMaterial::specularAmount}{The strength of specularity} in range 0-1. The
default value is \c 0.5. For metallic objects with \c metalness set to \c 1 this value
will have no effect. When both \c SPECULAR_AMOUNT and \c METALNESS have values larger than
0 but smaller than 1, the result is a blend between the two material models.
\row
\li NORMAL
\li vec3
\li The interpolated normal in world space, adjusted for double-sidedness when face culling is disabled. Read only.
\row
\li UV0
\li vec2
\li The interpolated texture coordinates. Read only.
\row
\li VAR_WORLD_POSITION
\li vec3
\li Interpolated vertex position in world space. Read only.
\endtable

Let's make the cube's base color red:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(1.0, 0.0, 0.0, 1.0);
}
\endcode
\li \image quick3d-custom-cube5.jpg
\endtable

Now strengthen the level of self-illumination a bit:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(1.0, 0.0, 0.0, 1.0);
    EMISSIVE_COLOR = vec3(0.4);
}
\endcode
\li \image quick3d-custom-cube6.jpg
\endtable

Instead of having values hardcoded in the shader, we could also use QML properties exposed
as uniforms, even animated ones:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
    property color baseColor: "black"
    ColorAnimation on baseColor {
        from: "black"; to: "purple"; duration: 5000; loops: -1
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(baseColor.rgb, 1.0);
    EMISSIVE_COLOR = vec3(0.4);
}
\endcode
\li \image quick3d-custom-cube7-anim.gif
\endtable

Let's do something less trivial, something that is not implementable with a
PrincipledMaterial and its standard, built-in properties. The following material
visualizes the texture UV coordinates of the cube mesh. U runs from 0 to 1, shown as black
to red, while V also runs from 0 to 1, shown as black to green.

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(UV0, 0.0, 1.0);
}
\endcode
\li \image quick3d-custom-cube8.jpg
\endtable

While we are at it, why not visualize normals as well, this time on a sphere. Like with
UVs, if a custom vertex shader snippet were to alter the value of NORMAL, the interpolated
per-fragment value in the fragment shader, also exposed under the name NORMAL, would
reflect those adjustments.

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
Model {
    source: "#Sphere"
    scale: Qt.vector3d(2, 2, 2)
    materials: CustomMaterial {
        fragmentShader: "material.frag"
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(NORMAL, 1.0);
}
\endcode
\li \image quick3d-custom-cube9.jpg
\endtable

\section2 Colors

Let's switch over to a teapot model for a moment, make the material a blend of metallic
and dielectric, and try to set a green base color for it. The \c green QColor value maps
to \c{(0, 128, 0)}, based on which our first attempt could be:

\table
\header
\li main.qml, material.frag
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "teapot.mesh"
            scale: Qt.vector3d(60, 60, 60)
            eulerRotation.x: 30
            materials: CustomMaterial {
                fragmentShader: "material.frag"
            }
        }
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(0.0, 0.5, 0.0, 1.0);
    METALNESS = 0.6;
    SPECULAR_AMOUNT = 0.4;
    ROUGHNESS = 0.4;
}
\endcode
\endtable

\image quick3d-custom-color1.jpg

This does not look entirely right. Compare with the second approach:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
    property color uColor: "green"
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(uColor.rgb, 1.0);
    METALNESS = 0.6;
    SPECULAR_AMOUNT = 0.4;
    ROUGHNESS = 0.4;
}
\endcode
\li \image quick3d-custom-color2.jpg
\endtable

Switching to a PrincipledMaterial, we can confirm that setting the
\l{PrincipledMaterial::baseColor} to "green", with the metalness and other properties
matching, gives a result identical to our second approach:

\table
\header
\li Change in main.qml
\li Result
\row \li \qml
materials: PrincipledMaterial {
    baseColor: "green"
    metalness: 0.6
    specularAmount: 0.4
    roughness: 0.4
}
\endqml
\li \image quick3d-custom-color3.jpg
\endtable

If the type of the \c uColor property were changed to \c vector4d, or any type other than
\c color, the results would suddenly change and become identical to our first approach.

Why is this?

The answer lies in the sRGB to linear conversion that is performed implicitly for color
properties of DefaultMaterial, PrincipledMaterial, and also for custom properties with a
\c color type in a CustomMaterial. No such conversion is performed for any other value,
so if the shader hardcodes a color value, or bases it on a QML property with a type
different from \c color, it is up to the shader to perform linearization in case the
source value is in the sRGB color space. Converting to linear is important because Qt
Quick 3D performs \l{SceneEnvironment::tonemapMode}{tonemapping} on the results of
fragment shading, and that process assumes values in linear color space as its input.

The built-in QColor constants, such as \c{"green"}, are all given in sRGB
space. Therefore, just assigning \c{vec4(0.0, 0.5, 0.0, 1.0)} to BASE_COLOR in the first
attempt is insufficient if we want a result that matches an RGB value of \c{(0, 128, 0)}
in the sRGB space. See the \c BASE_COLOR documentation in \l CustomMaterial for a formula
for linearizing such color values. The same applies to color values retrieved by sampling
textures: if the source image data is in the sRGB color space, a conversion to linear is
needed there as well (unless \l{SceneEnvironment::tonemapMode}{tonemapping} is disabled).
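
As an illustration only, and not the exact formula from the \l CustomMaterial reference, a
simple gamma 2.2 approximation of the sRGB to linear conversion for the hardcoded green
value could look like this:

\badcode
void MAIN()
{
    // Approximate sRGB -> linear with a 2.2 gamma curve; "green" is
    // (0, 128, 0) in sRGB, i.e. (0.0, 0.5, 0.0) normalized.
    vec3 srgb = vec3(0.0, 0.5, 0.0);
    BASE_COLOR = vec4(pow(srgb, vec3(2.2)), 1.0);
}
\endcode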

\section2 Blending

Just writing a value less than \c 1.0 to \c{BASE_COLOR.a} is not sufficient if the
expectation is to get alpha blending. Such materials will very often change the values of
the \l{CustomMaterial::sourceBlend}{sourceBlend} and
\l{CustomMaterial::destinationBlend}{destinationBlend} properties to get the desired
results.

Also keep in mind that the combined alpha value is the \l{Node::opacity}{Node opacity}
multiplied by the material alpha.

To visualize, let's use a shader that assigns red with alpha \c 0.5 to \c BASE_COLOR:

\table
\header
\li main.qml, material.frag
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "white"
        }
        PerspectiveCamera {
            id: camera
            z: 600
        }
        DirectionalLight { }
        Model {
            source: "#Cube"
            x: -150
            eulerRotation.x: 60
            eulerRotation.y: 20
            materials: CustomMaterial {
                fragmentShader: "material.frag"
            }
        }
        Model {
            source: "#Cube"
            eulerRotation.x: 60
            eulerRotation.y: 20
            materials: CustomMaterial {
                sourceBlend: CustomMaterial.SrcAlpha
                destinationBlend: CustomMaterial.OneMinusSrcAlpha
                fragmentShader: "material.frag"
            }
        }
        Model {
            source: "#Cube"
            x: 150
            eulerRotation.x: 60
            eulerRotation.y: 20
            materials: CustomMaterial {
                sourceBlend: CustomMaterial.SrcAlpha
                destinationBlend: CustomMaterial.OneMinusSrcAlpha
                fragmentShader: "material.frag"
            }
            opacity: 0.5
        }
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(1.0, 0.0, 0.0, 0.5);
}
\endcode
\li \image quick3d-custom-blend.jpg
\endtable

The first cube writes 0.5 to the alpha value of the color, but this has no visible effect
since alpha blending is not enabled. The second cube enables simple alpha blending via the
CustomMaterial properties. The third one also assigns an opacity of 0.5 to the Model,
which means that the effective opacity is 0.25.

\section2 Passing data between the vertex and fragment shader

A common need is to calculate a value per vertex (for example, for the three corners of a
triangle) and then pass it on to the fragment stage, where an interpolated value is made
accessible to each fragment (for example, to every fragment covered by the rasterized
triangle). In custom material shader snippets this is made possible by the \c VARYING
keyword. This provides a syntax similar to GLSL 120 and GLSL ES 100, but will work
regardless of the graphics API used at run time. The engine will take care of rewriting
the varying declaration as appropriate.

Let's see what the classic texture sampling with UV coordinates looks like. Textures are
going to be covered in an upcoming section; for now let's focus on how we get the UV
coordinates that can be passed to the \c{texture()} function in the shader.

\table
\header
\li main.qml, material.vert, material.frag
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Sphere"
            scale: Qt.vector3d(4, 4, 4)
            eulerRotation.x: 30
            materials: CustomMaterial {
                vertexShader: "material.vert"
                fragmentShader: "material.frag"
                property TextureInput someTextureMap: TextureInput {
                    texture: Texture {
                        source: "qt_logo_rect.png"
                    }
                }
            }
        }
    }
}
\endqml
\badcode
VARYING vec2 uv;
void MAIN()
{
    uv = UV0;
}
\endcode
\badcode
VARYING vec2 uv;
void MAIN()
{
    BASE_COLOR = texture(someTextureMap, uv);
}
\endcode
\endtable

\table
\header
\li qt_logo_rect.png
\li Result
\row \li \image quick3d-custom-varying-map.png
\li \image quick3d-custom-varying1.jpg
\endtable

Note the \c VARYING declarations in both snippets. The name and type must match; \c uv in
the fragment shader exposes the interpolated UV coordinate for the current fragment.

Any other type of data can be passed on to the fragment stage in a similar manner. It is
worth noting that in many cases setting up the material's own varyings is not necessary,
because there are built-ins provided that cover many typical needs. These include the
(interpolated) normals, UVs, the world position (\c VAR_WORLD_POSITION), and the vector
pointing towards the camera (\c VIEW_VECTOR).

The above example can in fact be simplified to the following, as \c UV0 is automatically
available in the fragment stage as well:

\table
\header
\li Change in main.qml, material.frag
\li Result
\row \li \qml
materials: CustomMaterial {
    fragmentShader: "material.frag"
    property TextureInput someTextureMap: TextureInput {
        texture: Texture {
            source: "qt_logo_rect.png"
        }
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = texture(someTextureMap, UV0);
}
\endcode
\li \image quick3d-custom-varying1.jpg
\endtable

To disable interpolation for a variable, use the \c flat keyword in both the
vertex and fragment shader snippet. For example:
\badcode
VARYING flat vec2 v;
\endcode

\section2 Textures

A \l CustomMaterial has no built-in texture maps, meaning there is no equivalent of, for
example, \l{PrincipledMaterial::baseColorMap}. This is because implementing the same
functionality is often trivial, while giving a lot more flexibility than what
DefaultMaterial and PrincipledMaterial have built in. Besides simply sampling a texture,
custom fragment shader snippets are free to combine and blend data from various sources
when calculating the values they assign to \c BASE_COLOR, \c EMISSIVE_COLOR, \c ROUGHNESS,
etc. They can base these calculations on data provided via QML properties, interpolated
data sent on from the vertex stage, values retrieved from sampling textures, and
hardcoded values.

As the previous example shows, exposing a texture to the vertex shader, the fragment
shader, or both is very similar to scalar and vector uniform values: a QML property with
the type \l TextureInput will automatically get associated with a \c sampler2D in the
shader code. As always, there is no need to declare this sampler in the shader code.

A \l TextureInput references a \l Texture, with an additional
\l{TextureInput::enabled}{enabled} property. A \l Texture can source its data in three
ways: \l{Texture::source}{from an image file}, \l{Texture::sourceItem}{from a texture with
live Qt Quick content}, or \l{Texture::textureData}{from C++} via QQuick3DTextureData.

\note When it comes to \l Texture properties, the source, tiling, and filtering related
ones are the only ones that are taken into account implicitly with custom materials; the
rest (such as UV transformations) is up to the custom shaders to implement as they see
fit.

Let's see an example where a model, a sphere in this case, is textured using live Qt Quick
content:

\table
\header
\li main.qml, material.frag
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Sphere"
            scale: Qt.vector3d(4, 4, 4)
            eulerRotation.x: 30
            materials: CustomMaterial {
                fragmentShader: "material.frag"
                property TextureInput someTextureMap: TextureInput {
                    texture: Texture {
                        sourceItem: Rectangle {
                            width: 512; height: 512
                            color: "red"
                            Rectangle {
                                width: 32; height: 32
                                anchors.horizontalCenter: parent.horizontalCenter
                                y: 150
                                color: "gray"
                                NumberAnimation on rotation { from: 0; to: 360; duration: 3000; loops: -1 }
                            }
                            Text {
                                anchors.centerIn: parent
                                text: "Texture Map"
                                font.pointSize: 16
                            }
                        }
                    }
                }
            }
        }
    }
}
\endqml
\badcode
void MAIN()
{
    vec2 uv = vec2(UV0.x, 1.0 - UV0.y);
    vec4 c = texture(someTextureMap, uv);
    BASE_COLOR = c;
}
\endcode
\endtable

\image quick3d-custmat-tex1-anim.gif

Here the 2D subtree (a Rectangle with two children: another Rectangle and the Text) is
rendered into a 512x512 2D texture every time this mini-scene changes. The texture is
then exposed to the custom material under the name \c someTextureMap.

Note the flipping of the V coordinate in the shader. As noted above, custom materials,
where there is full programmability on the shader level, do not offer the "fixed" features
of \l Texture and \l PrincipledMaterial. This means that any transformations to the UV
coordinates will need to be applied by the shader. Here we know that the texture is
generated via \l{Texture::sourceItem}, and so V needs to be flipped to get something that
matches the UV set of the mesh we are using.

What this example shows could also be done with a \l PrincipledMaterial. Let's make it
more interesting by adding a simple emboss effect:

\table
\header
\li material.frag
\li Result
\row \li \badcode
void MAIN()
{
    vec2 uv = vec2(UV0.x, 1.0 - UV0.y);
    vec2 size = vec2(textureSize(someTextureMap, 0));
    vec2 d = vec2(1.0 / size.x, 1.0 / size.y);
    vec4 diff = texture(someTextureMap, uv + d) - texture(someTextureMap, uv - d);
    float c = (diff.x + diff.y + diff.z) + 0.5;
    BASE_COLOR = vec4(c, c, c, 1.0);
}
\endcode
\li \image quick3d-custmat-tex2-anim.gif
\endtable

With the features covered so far, a wide range of possibilities is open for creating
materials that shade the meshes in visually impressive ways. To finish the basic tour,
let's look at an example that applies height and normal maps to a plane mesh. (A dedicated
\c{.mesh} file is used here because the built-in \c{#Rectangle} does not have enough
subdivisions.) For better lighting results, we will use image based lighting with a 360
degree HDR image. The image is also set as the skybox to make it clearer what is
happening.

First let's start with an empty CustomMaterial:

\table
\header
\li main.qml
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.SkyBox
            lightProbe: Texture {
                source: "00489_OpenfootageNET_snowfield_low.hdr"
            }
        }
        PerspectiveCamera {
            z: 600
        }
        Model {
            source: "plane.mesh"
            scale: Qt.vector3d(400, 400, 400)
            z: 400
            y: -50
            eulerRotation.x: -90
            materials: CustomMaterial { }
        }
    }
}
\endqml
\li \image quick3d-custom-tex3.jpg
\endtable

Now let's make some shaders that apply a height and normal map to the mesh:

\table
\header
\li Height map
\li Normal map
\row
\li \image quick3d-custom-heightmap.png
\li \image quick3d-custom-normalmap.jpg
\endtable

\table
\header
\li material.vert, material.frag
\row
\li \badcode
float getHeight(vec2 pos)
{
    return texture(heightMap, pos).r;
}

void MAIN()
{
    const float offset = 0.004;
    VERTEX.y += getHeight(UV0);
    TANGENT = normalize(vec3(0.0, getHeight(UV0 + vec2(0.0, offset)) - getHeight(UV0 + vec2(0.0, -offset)), offset * 2.0));
    BINORMAL = normalize(vec3(offset * 2.0, getHeight(UV0 + vec2(offset, 0.0)) - getHeight(UV0 + vec2(-offset, 0.0)), 0.0));
    NORMAL = cross(TANGENT, BINORMAL);
}
\endcode
\badcode
void MAIN()
{
    vec3 normalValue = texture(normalMap, UV0).rgb;
    normalValue.xy = normalValue.xy * 2.0 - 1.0;
    normalValue.z = sqrt(max(0.0, 1.0 - dot(normalValue.xy, normalValue.xy)));
    NORMAL = normalize(mix(NORMAL, TANGENT * normalValue.x + BINORMAL * normalValue.y + NORMAL * normalValue.z, 1.0));
}
\endcode
\endtable

\table
\header
\li Change in main.qml
\li Result
\row
\li \qml
materials: CustomMaterial {
    vertexShader: "material.vert"
    fragmentShader: "material.frag"
    property TextureInput normalMap: TextureInput {
        texture: Texture { source: "normalmap.jpg" }
    }
    property TextureInput heightMap: TextureInput {
        texture: Texture { source: "heightmap.png" }
    }
}
\endqml
\li \image quick3d-custom-tex4.jpg
\endtable

\note The \l WasdController object can be immensely helpful during development and
troubleshooting as it allows navigating and looking around in the scene with the keyboard
and mouse in a familiar manner. Having a camera controlled by the WasdController is as
simple as:

\qml
import QtQuick3D.Helpers
View3D {
    PerspectiveCamera {
        id: camera
    }
    // ...
}
WasdController {
    controlledObject: camera
}
\endqml

\section2 Depth and screen textures

When a custom shader snippet uses the \c DEPTH_TEXTURE or \c SCREEN_TEXTURE keywords, it
opts in to generating the corresponding textures in a separate render pass, which is not
necessarily a cheap operation, but allows implementing a variety of techniques, such as
refraction for glass-like materials.

\c DEPTH_TEXTURE is a \c sampler2D that allows sampling a texture with the contents of the
depth buffer with all the \c opaque objects in the scene rendered. Similarly, \c
SCREEN_TEXTURE is a \c sampler2D that allows sampling a texture containing the contents of
the scene excluding any transparent materials or any materials also using the
SCREEN_TEXTURE. The texture can be used for materials that require the contents of the
framebuffer they are being rendered to. The SCREEN_TEXTURE texture uses the same clear mode
as the View3D. The size of these textures matches the size of the View3D in pixels.

Let's have a simple demonstration by visualizing the depth buffer contents via \c
DEPTH_TEXTURE. The camera's \l{PerspectiveCamera::clipFar}{far clip value} is reduced here
from the default 10000 to 2000, in order to have a smaller range, so that differences in
the visualized depth values are more obvious. The result is a rectangle that happens to
visualize the depth buffer for the scene over its surface.

\table
\header
\li main.qml, material.frag
\li Result
\row
\li \qml
import QtQuick
import QtQuick3D
import QtQuick3D.Helpers
Rectangle {
    width: 400
    height: 400
    color: "black"
    View3D {
        anchors.fill: parent
        PerspectiveCamera {
            id: camera
            z: 600
            clipNear: 1
            clipFar: 2000
        }
        DirectionalLight { }
        Model {
            source: "#Cube"
            scale: Qt.vector3d(2, 2, 2)
            position: Qt.vector3d(150, 200, -1000)
            eulerRotation.x: 60
            eulerRotation.y: 20
            materials: PrincipledMaterial { }
        }
        Model {
            source: "#Cylinder"
            scale: Qt.vector3d(2, 2, 2)
            position: Qt.vector3d(400, 200, -1000)
            materials: PrincipledMaterial { }
            opacity: 0.3
        }
        Model {
            source: "#Sphere"
            scale: Qt.vector3d(2, 2, 2)
            position: Qt.vector3d(-150, 200, -600)
            materials: PrincipledMaterial { }
        }
        Model {
            source: "#Cone"
            scale: Qt.vector3d(2, 2, 2)
            position: Qt.vector3d(0, 400, -1200)
            materials: PrincipledMaterial { }
        }
        Model {
            source: "#Rectangle"
            scale: Qt.vector3d(3, 3, 3)
            y: -150
            materials: CustomMaterial {
                fragmentShader: "material.frag"
            }
        }
    }
    WasdController {
        controlledObject: camera
    }
}
\endqml
\badcode
void MAIN()
{
    float zNear = CAMERA_PROPERTIES.x;
    float zFar = CAMERA_PROPERTIES.y;
    float zRange = zFar - zNear;
    vec4 depthSample = texture(DEPTH_TEXTURE, vec2(UV0.x, 1.0 - UV0.y));
    float zn = 2.0 * depthSample.r - 1.0;
    float d = 2.0 * zNear * zFar / (zFar + zNear - zn * zRange);
    d /= zFar;
    BASE_COLOR = vec4(d, d, d, 1.0);
}
\endcode
\li \image quick3d-custom-depth-anim.gif
\endtable

Note how the cylinder is not present in \c DEPTH_TEXTURE due to its reliance on
semi-transparency, which puts it into a different category than the other objects, which
are all opaque. Semi-transparent objects do not write into the depth buffer, although they
do test against the depth values written by opaque objects, and rely on being rendered in
back to front order. Hence such objects do not show up in \c DEPTH_TEXTURE.

What happens if we switch the shader to sample \c SCREEN_TEXTURE instead?

\table
\header
\li material.frag
\li Result
\row \li \badcode
void MAIN()
{
    vec4 c = texture(SCREEN_TEXTURE, vec2(UV0.x, 1.0 - UV0.y));
    if (c.a == 0.0)
        c.rgb = vec3(0.2, 0.1, 0.3);
    BASE_COLOR = c;
}
\endcode
\li \image quick3d-custom-screen.jpg
\endtable

Here the rectangle is textured with \c SCREEN_TEXTURE, while transparent pixels are
replaced with purple.

\section2 Light processor functions

An advanced feature of \l CustomMaterial is the ability to define functions in the
fragment shader that reimplement the lighting equations used to calculate the fragment
color. A light processor function, when present, is called once for each light in the
scene, for each fragment. There is a dedicated function for each light type, as well as
for the ambient and specular contributions. When no corresponding light processor function
is present, the standard calculations are used, just like a PrincipledMaterial would
do. When a light processor is present but the function body is empty, there will be no
contribution from that type of light in the scene.

Refer to the \l CustomMaterial documentation for details on functions such as \c
DIRECTIONAL_LIGHT, \c POINT_LIGHT, \c SPOT_LIGHT, \c AMBIENT_LIGHT, and \c SPECULAR_LIGHT.
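
As a rough sketch only, a diffuse-only directional light processor could look something
like the following; the exact set of keywords available inside these functions (such as \c
DIFFUSE, \c LIGHT_COLOR, \c SHADOW_CONTRIB, and \c TO_LIGHT_DIR) is listed in the \l
CustomMaterial reference:

\badcode
void DIRECTIONAL_LIGHT()
{
    // Accumulate a simple Lambertian diffuse term; the specular contribution
    // would be handled separately in SPECULAR_LIGHT.
    DIFFUSE += LIGHT_COLOR * SHADOW_CONTRIB * vec3(max(0.0, dot(normalize(NORMAL), TO_LIGHT_DIR)));
}
\endcode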

\section2 Unshaded custom materials

There is another type of \l CustomMaterial: \c unshaded custom materials. All the examples
so far used \c shaded custom materials, with the
\l{CustomMaterial::shadingMode}{shadingMode} property left at its default
CustomMaterial.Shaded value.

What happens if we switch this property to CustomMaterial.Unshaded?

First of all, keywords like \c BASE_COLOR, \c EMISSIVE_COLOR, \c METALNESS, etc. no longer
have the desired effect. This is because an unshaded material, as the name suggests, does
not automatically get amended with much of the standard shading code, thus ignoring
lights, image based lighting, shadows, and ambient occlusion in the scene. Rather, an
unshaded material gives full control to the shader via the \c FRAGCOLOR keyword. This is
similar to gl_FragColor: the color assigned to \c FRAGCOLOR is the final color of the
fragment, without any further adjustments by Qt Quick 3D.

\table
\header
\li main.qml, material.frag, material2.frag
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }
        PerspectiveCamera { z: 600 }
        DirectionalLight { }
        Model {
            source: "#Cylinder"
            x: -100
            eulerRotation.x: 30
            materials: CustomMaterial {
                fragmentShader: "material.frag"
            }
        }
        Model {
            source: "#Cylinder"
            x: 100
            eulerRotation.x: 30
            materials: CustomMaterial {
                shadingMode: CustomMaterial.Unshaded
                fragmentShader: "material2.frag"
            }
        }
    }
}
\endqml
\badcode
void MAIN()
{
    BASE_COLOR = vec4(1.0);
}
\endcode
\badcode
void MAIN()
{
    FRAGCOLOR = vec4(1.0);
}
\endcode
\li \image quick3d-custom-unshaded1.jpg
\endtable

Notice how the right cylinder ignores the DirectionalLight in the scene. Its shading knows
nothing about scene lighting; the final fragment color is all white.

The vertex shader in an unshaded material still has the typical inputs available: \c
VERTEX, \c NORMAL, \c MODELVIEWPROJECTION_MATRIX, etc., and can write to \c POSITION. The
fragment shader no longer has similar conveniences available, however: \c NORMAL, \c
UV0, and \c VAR_WORLD_POSITION are not available in an unshaded material's fragment
shader. Rather, it is now up to the shader code to calculate, and pass on using \c
VARYING, everything it needs to determine the final fragment color.

Let's look at an example that has both a vertex and a fragment shader. The altered vertex
position is passed on to the fragment shader, with an interpolated value made available to
every fragment.

\table
\header
\li main.qml, material.vert, material.frag
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }
        PerspectiveCamera { z: 600 }
        Model {
            source: "#Sphere"
            scale: Qt.vector3d(3, 3, 3)
            materials: CustomMaterial {
                property real time: 0.0
                NumberAnimation on time { from: 0; to: 100; duration: 20000; loops: -1 }
                property real amplitude: 10.0
                shadingMode: CustomMaterial.Unshaded
                vertexShader: "material.vert"
                fragmentShader: "material.frag"
            }
        }
    }
}
\endqml
\badcode
VARYING vec3 pos;
void MAIN()
{
    pos = VERTEX;
    pos.x += sin(time * 4.0 + pos.y) * amplitude;
    POSITION = MODELVIEWPROJECTION_MATRIX * vec4(pos, 1.0);
}
\endcode
\badcode
VARYING vec3 pos;
void MAIN()
{
    FRAGCOLOR = vec4(vec3(pos.x * 0.02, pos.y * 0.02, pos.z * 0.02), 1.0);
}
\endcode
\endtable

\image quick3d-custom-unshaded-anim.gif

Unshaded materials are useful when interacting with scene lighting is not necessary or
desired, and the material needs full control over the final fragment color. Notice how the
example above has neither a DirectionalLight nor any other lights, but the sphere with the
custom material shows up as expected.

\note An unshaded material that only has a vertex shader snippet, but does not specify the
fragmentShader property, will still be functional, but the results are as if the
shadingMode were set to Shaded. Therefore it makes little sense to switch shadingMode for
materials that only have a vertex shader.

\section1 Programmability for Effects

Post-processing effects apply one or more fragment shaders to the result of a \l
View3D. The output from these fragment shaders is then displayed instead of the original
rendering results. This is conceptually very similar to Qt Quick's \l ShaderEffect and \l
ShaderEffectSource.

\note Post-processing effects are only available when the
\l{View3D::renderMode}{renderMode} for the View3D is set to View3D.Offscreen.

Custom vertex shader snippets can also be specified for an effect, but they have limited
usefulness and are therefore expected to be used relatively rarely. The vertex input for a
post-processing effect is a quad (either two triangles or a triangle strip); transforming
or displacing its vertices is rarely helpful. It can, however, make sense to have a vertex
shader in order to calculate and pass on data to the fragment shader using the \c VARYING
keyword. As usual, the fragment shader will then receive an interpolated value based on
the current fragment coordinate.

The syntax of the shader snippets associated with an \l Effect is identical to the shaders
for an unshaded \l CustomMaterial. When it comes to the built-in special keywords, \c
VARYING, \c MAIN, \c FRAGCOLOR (fragment shader only), \c POSITION (vertex shader only), \c
VERTEX (vertex shader only), and \c MODELVIEWPROJECTION_MATRIX work identically to \l
CustomMaterial.

The most important special keywords for \l Effect fragment shaders are the following:

\table
\header
\li Name
\li Type
\li Description
\row
\li INPUT
\li sampler2D or sampler2DArray
\li The sampler for the input texture. An effect will typically sample this using \c INPUT_UV.
\row
\li INPUT_UV
\li vec2
\li UV coordinates for sampling \c INPUT.
\row
\li INPUT_SIZE
\li vec2
\li The size of the \c INPUT texture, in pixels. This is a convenient alternative to calling textureSize().
\row
\li OUTPUT_SIZE
\li vec2
\li The size of the output texture, in pixels. Equal to \c INPUT_SIZE in many cases, but a multi-pass effect
may have passes that output to intermediate textures with different sizes.
\row
\li DEPTH_TEXTURE
\li sampler2D
\li Depth texture with the depth buffer contents, with the opaque objects in the scene rendered. Like with CustomMaterial,
the presence of this keyword in the shader triggers generating the depth texture automatically.
\endtable

\note When multiview rendering is enabled, the input texture is a 2D texture
array. GLSL functions such as texture() and textureSize() then take or return a
vec3 or ivec3, respectively. Use \c VIEW_INDEX for the layer. In VR/AR
applications that wish to function both with and without multiview rendering, the
portable approach is to write the shader code like this:
\badcode
#if QSHADER_VIEW_COUNT >= 2
vec4 c = texture(INPUT, vec3(INPUT_UV, VIEW_INDEX));
#else
vec4 c = texture(INPUT, INPUT_UV);
#endif
\endcode

\section2 A post-processing effect

Let's start with a simple scene, this time using a few more objects, including a textured
rectangle that uses a checkerboard texture as its base color map.

\table
\header
\li main.qml
\li Result
\row \li \qml
import QtQuick
import QtQuick3D
Item {
    View3D {
        anchors.fill: parent
        environment: SceneEnvironment {
            backgroundMode: SceneEnvironment.Color
            clearColor: "black"
        }

        PerspectiveCamera { z: 400 }

        DirectionalLight { }

        Texture {
            id: checkerboard
            source: "checkerboard.png"
            scaleU: 20
            scaleV: 20
            tilingModeHorizontal: Texture.Repeat
            tilingModeVertical: Texture.Repeat
        }

        Model {
            source: "#Rectangle"
            scale: Qt.vector3d(10, 10, 1)
            eulerRotation.x: -45
            materials: PrincipledMaterial {
                baseColorMap: checkerboard
            }
        }

        Model {
            source: "#Cone"
            position: Qt.vector3d(100, -50, 100)
            materials: PrincipledMaterial { }
        }

        Model {
            source: "#Cube"
            position.y: 100
            eulerRotation.y: 20
            materials: PrincipledMaterial { }
        }

        Model {
            source: "#Sphere"
            position: Qt.vector3d(-150, 200, -100)
            materials: PrincipledMaterial { }
        }
    }
}
\endqml
\li \image quick3d-custom-effect-section-scene.jpg
\endtable

Now let's apply an effect to the entire scene. More precisely, to the View3D. When there
are multiple View3D items in the scene, each has its own SceneEnvironment and therefore
its own post-processing effect chain. In the example there is one single View3D
covering the entire window.

\table
\header
\li Change in main.qml
\li effect.frag
\row \li \qml
environment: SceneEnvironment {
    backgroundMode: SceneEnvironment.Color
    clearColor: "black"
    effects: redEffect
}

Effect {
    id: redEffect
    property real uRed: 1.0
    NumberAnimation on uRed { from: 1; to: 0; duration: 5000; loops: -1 }
    passes: Pass {
        shaders: Shader {
            stage: Shader.Fragment
            shader: "effect.frag"
        }
    }
}
\endqml
\li \badcode
void MAIN()
{
    vec4 c = texture(INPUT, INPUT_UV);
    c.r = uRed;
    FRAGCOLOR = c;
}
\endcode
\endtable

This simple effect alters the red color channel value. Exposing QML properties as uniforms
works the same way with effects as with custom materials. The shader starts with a line
that is going to be very common when writing fragment shaders for effects: sampling \c
INPUT at the UV coordinates \c INPUT_UV. It then performs its desired calculations, and
assigns the final fragment color to \c FRAGCOLOR.

\image quick3d-custom-first-effect-anim.gif

Many properties set in the example are plural (effects, passes, shaders). While the
list \c{[ ]} syntax can be omitted when there is only a single element, all these
properties are lists, and can hold more than one element. Why is this?
1657
1658
\list

\li \l{SceneEnvironment::effects}{effects} is a list, because a View3D allows chaining
multiple effects together. The effects are applied in the order in which they appear in
the list. This makes it easy to apply two or more effects together to the View3D, and is
similar to what one can achieve in Qt Quick by nesting \l ShaderEffect items. The \c INPUT
texture of the next effect is always a texture that contains the previous effect's
output. The output of the last effect is what gets used as the final output of the View3D.

\li \l{Effect::passes}{passes} is a list, because unlike ShaderEffect, Effect has built-in
support for multiple passes. A multi-pass effect is more powerful than chaining together
multiple, independent effects in \l{SceneEnvironment::effects}{effects}: a pass can output
to a temporary, intermediate texture, which can then be used as input to subsequent
passes, in addition to the original input texture of the effect. This allows creating
complex effects that calculate, render, and blend together multiple textures in order to
arrive at the final fragment color. This advanced use case is not covered in depth here;
a minimal sketch of the pass structure is shown after this list, and the \l Effect
documentation page has the full details.

\li \l{Pass::shaders}{shaders} is a list, because an effect may have both a vertex and a
fragment shader associated with it.

\endlist

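To give an idea of the structure, here is a minimal, hypothetical sketch of a two-pass
effect: the first pass renders into an intermediate texture described by a \l Buffer, and
the second pass samples that texture via \l BufferInput, in addition to the usual \c
INPUT. The shader file names and the \c pass1Result sampler name are placeholders chosen
for this sketch, not part of any existing example.

\qml
Effect {
    // Intermediate texture; never shown on screen directly.
    Buffer {
        id: intermediateBuffer
        name: "intermediateBuffer"
        format: Buffer.RGBA8
    }
    passes: [
        Pass {
            shaders: Shader { stage: Shader.Fragment; shader: "pass1.frag" }
            output: intermediateBuffer
        },
        Pass {
            shaders: Shader { stage: Shader.Fragment; shader: "pass2.frag" }
            commands: BufferInput {
                buffer: intermediateBuffer
                sampler: "pass1Result" // exposed to pass2.frag under this name
            }
        }
    ]
}
\endqml
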
\section2 Chaining multiple effects

Let's look at an example where the effect from the previous example gets complemented by
another effect similar to the built-in \l DistortionSpiral effect.

\table
\header
\li Change in main.qml
\li effect2.frag
\row \li \qml
environment: SceneEnvironment {
    backgroundMode: SceneEnvironment.Color
    clearColor: "black"
    effects: [redEffect, distortEffect]
}

Effect {
    id: redEffect
    property real uRed: 1.0
    NumberAnimation on uRed { from: 1; to: 0; duration: 5000; loops: -1 }
    passes: Pass {
        shaders: Shader {
            stage: Shader.Fragment
            shader: "effect.frag"
        }
    }
}

Effect {
    id: distortEffect
    property real uRadius: 0.1
    NumberAnimation on uRadius { from: 0.1; to: 1.0; duration: 5000; loops: -1 }
    passes: Pass {
        shaders: Shader {
            stage: Shader.Fragment
            shader: "effect2.frag"
        }
    }
}
\endqml
\li \badcode
void MAIN()
{
    vec2 center_vec = INPUT_UV - vec2(0.5, 0.5);
    center_vec.y *= INPUT_SIZE.y / INPUT_SIZE.x;
    float dist_to_center = length(center_vec) / uRadius;
    vec2 texcoord = INPUT_UV;
    if (dist_to_center <= 1.0) {
        float rotation_amount = (1.0 - dist_to_center) * (1.0 - dist_to_center);
        float r = radians(360.0) * rotation_amount / 4.0;
        float cos_r = cos(r);
        float sin_r = sin(r);
        mat2 rotation = mat2(cos_r, sin_r, -sin_r, cos_r);
        texcoord = vec2(0.5, 0.5) + rotation * (INPUT_UV - vec2(0.5, 0.5));
    }
    vec4 c = texture(INPUT, texcoord);
    FRAGCOLOR = c;
}
\endcode
\endtable

\image quick3d-custom-chained-effect-anim.gif

Now for the perhaps surprising question: why is this a bad example?

More precisely, it is not bad as such, but it demonstrates a pattern that is often best
avoided.

Chaining effects this way can be useful, but it is important to keep the performance
implications in mind: performing two render passes (one to generate a texture with the
adjusted red color channel, and another to calculate the distortion) is wasteful when one
pass would be enough. If the two fragment shader snippets were combined, the same result
could be achieved with a single effect.

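As an illustration, a single effect could produce the same result with one combined
fragment shader along these lines. This is only a sketch that merges the two snippets
shown above; \c uRed and \c uRadius would then both be properties of that one Effect
object.

\badcode
void MAIN()
{
    // Distortion first: rotate the UV coordinates around the center.
    vec2 center_vec = INPUT_UV - vec2(0.5, 0.5);
    center_vec.y *= INPUT_SIZE.y / INPUT_SIZE.x;
    float dist_to_center = length(center_vec) / uRadius;
    vec2 texcoord = INPUT_UV;
    if (dist_to_center <= 1.0) {
        float rotation_amount = (1.0 - dist_to_center) * (1.0 - dist_to_center);
        float r = radians(360.0) * rotation_amount / 4.0;
        mat2 rotation = mat2(cos(r), sin(r), -sin(r), cos(r));
        texcoord = vec2(0.5, 0.5) + rotation * (INPUT_UV - vec2(0.5, 0.5));
    }
    // Then adjust the red channel of the sampled color, all in one pass.
    vec4 c = texture(INPUT, texcoord);
    c.r = uRed;
    FRAGCOLOR = c;
}
\endcode
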
\section1 Defining Mesh and Texture Data from C++

Procedurally generating mesh data and texture image data follows similar steps; a minimal
sketch for a geometry subclass is shown after the list:

\list
\li Subclass \l QQuick3DGeometry or \l QQuick3DTextureData
\li Set the desired vertex or image data upon construction by calling the protected member
functions of the base class
\li If the data needs to change dynamically later on, set the new data and call update()
\li Once the implementation is done, register the class to make it visible in QML
\li \l Model and \l Texture objects in QML can then use the custom vertex or image data
provider by setting the \l{Model::geometry} or \l{Texture::textureData} property
\endlist

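To make the steps concrete, here is a minimal, hypothetical sketch of a geometry
subclass. The class name, the file name, and the single-triangle data are made up for
illustration only; the QQuick3DGeometry calls are the same ones used in the point cloud
snippet further down on this page.

\badcode
// exampletrianglegeometry.h - hypothetical file and class name
#include <QQuick3DGeometry>
#include <QtQml/qqmlregistration.h>
#include <QVector3D>

class ExampleTriangleGeometry : public QQuick3DGeometry
{
    Q_OBJECT
    QML_NAMED_ELEMENT(ExampleTriangleGeometry)
public:
    ExampleTriangleGeometry()
    {
        // A single triangle, position (x, y, z) only.
        const float vertices[] = {
            -100.0f, -100.0f, 0.0f,
             100.0f, -100.0f, 0.0f,
               0.0f,  100.0f, 0.0f
        };
        setVertexData(QByteArray(reinterpret_cast<const char *>(vertices), sizeof(vertices)));
        setStride(3 * sizeof(float));
        setPrimitiveType(QQuick3DGeometry::PrimitiveType::Triangles);
        addAttribute(QQuick3DGeometry::Attribute::PositionSemantic,
                     0,
                     QQuick3DGeometry::Attribute::F32Type);
        // Helps the renderer with culling and picking.
        setBounds(QVector3D(-100, -100, 0), QVector3D(100, 100, 0));
        // update() is only needed when the data changes later on.
    }
};
\endcode

Assuming the class is made part of a QML module (for example via \c{qt_add_qml_module} in
CMake), a \l Model can then reference it simply with \c{geometry: ExampleTriangleGeometry { }}.
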
\section2 Custom vertex data

Vertex data refers to the sequence of (typically \c float) values that make up a
mesh. Instead of loading \c{.mesh} files, a custom geometry provider is responsible for
providing the same data. The vertex data consists of \c attributes, such as positions,
texture (UV) coordinates, or normals. The specification of the attributes describes which
attributes are present, their component type (for example, a 3-component float vector for
the vertex position, consisting of x, y, and z values), the offset at which they start in
the provided data, and the stride, which is the increment that needs to be added to the
offset to point to the next element of the same attribute.

This may seem familiar to anyone who has worked directly with graphics APIs such as OpenGL
or Vulkan, because the way vertex input is specified with those APIs maps loosely to what
a \c{.mesh} file or a \l QQuick3DGeometry instance defines.

In addition, the mesh topology (primitive type) must be specified too. For indexed
drawing, the data for an index buffer must be provided as well.

There is one built-in custom geometry implementation: the QtQuick3D.Helpers module
includes a \l GridGeometry type. This allows rendering a grid in the scene with line
primitives, without having to implement a custom \l QQuick3DGeometry subclass.

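For instance, a grid could be added to a scene without writing any C++, roughly along
these lines. This is only a sketch: the property values are arbitrary, and the
\c{QtQuick3D.Helpers} import is assumed to be present.

\qml
Model {
    geometry: GridGeometry {
        horizontalLines: 20
        verticalLines: 20
        horizontalStep: 25
        verticalStep: 25
    }
    materials: DefaultMaterial {
        lighting: DefaultMaterial.NoLighting
        diffuseColor: "white"
    }
}
\endqml
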
Another common use case is rendering points. This is fairly simple to do, since the
attribute specification is minimal: we provide three floats (x, y, z) for each vertex,
and nothing else. A QQuick3DGeometry subclass could implement a geometry consisting of
2000 points similarly to the following:

\badcode
clear();
const int N = 2000;
const int stride = 3 * sizeof(float); // x, y, z per vertex, nothing else
QByteArray v;
v.resize(N * stride);
float *p = reinterpret_cast<float *>(v.data());
QRandomGenerator *rg = QRandomGenerator::global();
for (int i = 0; i < N; ++i) {
    const float x = float(rg->bounded(200.0f) - 100.0f) / 20.0f;
    const float y = float(rg->bounded(200.0f) - 100.0f) / 20.0f;
    *p++ = x;
    *p++ = y;
    *p++ = 0.0f;
}
setVertexData(v);
setStride(stride);
setPrimitiveType(QQuick3DGeometry::PrimitiveType::Points);
addAttribute(QQuick3DGeometry::Attribute::PositionSemantic, 0, QQuick3DGeometry::Attribute::F32Type);
\endcode

Combined with a material of

\qml
DefaultMaterial {
    lighting: DefaultMaterial.NoLighting
    cullMode: DefaultMaterial.NoCulling
    diffuseColor: "yellow"
    pointSize: 4
}
\endqml

the end result is similar to this (here viewed from an altered camera angle, with the help
of \l WasdController):

\image quick3d-custom-points.jpg

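For completeness, assuming the points geometry subclass is registered under the
hypothetical name \c ExamplePointGeometry, wiring it up in QML is only a matter of setting
the \l{Model::geometry} property:

\qml
Model {
    geometry: ExamplePointGeometry { }
    materials: DefaultMaterial {
        lighting: DefaultMaterial.NoLighting
        cullMode: DefaultMaterial.NoCulling
        diffuseColor: "yellow"
        pointSize: 4
    }
}
\endqml
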
\note Be aware that point sizes and line widths other than 1 may not be supported at run
time, depending on the underlying graphics API. This is not something Qt has control
over. Therefore, it can become necessary to implement alternative techniques instead of
relying on point and line drawing.

\section2 Custom texture data

With textures, the data that needs to be provided is structurally a lot simpler: it is the
raw pixel data, with a varying number of bytes per pixel, depending on the texture
format. For example, an \c RGBA texture expects four bytes per pixel, whereas \c RGBA16F
expects four half-floats per pixel. This is similar to what a \l QImage stores
internally. However, Qt Quick 3D textures can have formats whose data cannot be
represented by a QImage, for example floating-point HDR textures or compressed
textures. Therefore, the data for \l QQuick3DTextureData is always provided as a raw
sequence of bytes. This may seem familiar to anyone who has worked directly with graphics
APIs such as OpenGL or Vulkan.

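As with geometry, a small sketch helps make this concrete. The following hypothetical
subclass fills a 256x256 \c RGBA8 texture with a simple gradient; the class name and the
pattern are made up for illustration, while the QQuick3DTextureData calls are the
protected setters mentioned above.

\badcode
// gradienttexture.h - hypothetical file and class name
#include <QQuick3DTextureData>
#include <QtQml/qqmlregistration.h>
#include <QSize>

class GradientTexture : public QQuick3DTextureData
{
    Q_OBJECT
    QML_NAMED_ELEMENT(GradientTexture)
public:
    GradientTexture()
    {
        const int w = 256, h = 256;
        setSize(QSize(w, h));
        setFormat(QQuick3DTextureData::RGBA8); // four bytes per pixel
        setHasTransparency(false);

        QByteArray data;
        data.resize(w * h * 4);
        uchar *p = reinterpret_cast<uchar *>(data.data());
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                *p++ = uchar(x);   // red increases from left to right
                *p++ = uchar(y);   // green increases from top to bottom
                *p++ = 0;          // blue
                *p++ = 255;        // alpha, fully opaque
            }
        }
        setTextureData(data);
    }
};
\endcode

A \l Texture in QML can then reference it via the \l{Texture::textureData} property, for
example \c{textureData: GradientTexture { }}, once the class is registered in a QML module.
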
For details, refer to the \l QQuick3DGeometry and \l QQuick3DTextureData documentation pages.

\sa CustomMaterial, Effect, QQuick3DGeometry, QQuick3DTextureData, {Qt Quick 3D - Custom
Effect Example}, {Qt Quick 3D - Custom Shaders Example}, {Qt Quick 3D - Custom Materials
Example}, {Qt Quick 3D - Custom Geometry Example}, {Qt Quick 3D - Procedural Texture
Example}

*/