Qt
Internal/Contributor docs for the Qt SDK. Note: These are NOT official API docs; those are found at https://doc.qt.io/
qquick3deffect.cpp
1// Copyright (C) 2020 The Qt Company Ltd.
2// SPDX-License-Identifier: LicenseRef-Qt-Commercial OR GPL-3.0-only
3
4#include "qquick3deffect_p.h"
5
6#include <ssg/qssgrendercontextcore.h>
7#include <QtQuick3DRuntimeRender/private/qssgrendereffect_p.h>
8#include <QtQuick3DRuntimeRender/private/qssgshadermaterialadapter_p.h>
9#include <QtQuick3DUtils/private/qssgutils_p.h>
10#include <QtQuick/qquickwindow.h>
11#include <QtQuick3D/private/qquick3dobject_p.h>
12#include <QtQuick3D/private/qquick3dscenemanager_p.h>
13#include <QtCore/qfile.h>
14#include <QtCore/qurl.h>
15
16
17QT_BEGIN_NAMESPACE
18
19/*!
20 \qmltype Effect
21 \inherits Object3D
22 \inqmlmodule QtQuick3D
23 \nativetype QQuick3DEffect
24 \brief Base component for creating a post-processing effect.
25
26 The Effect type allows the user to implement their own post-processing
27 effects for QtQuick3D.
28
29 \section1 Post-processing effects
30
31 A post-processing effect is conceptually very similar to Qt Quick's \l
32 ShaderEffect item. When an effect is present, the scene is rendered into a
33 separate texture first. The effect is then applied by drawing a textured
34 quad to the main render target, depending on the
35 \l{View3D::renderMode}{render mode} of the View3D. The effect can provide a
36 vertex shader, a fragment shader, or both. Effects are always applied on the
37 entire scene, per View3D.
38
39 Effects are associated with the \l SceneEnvironment in the
40 \l{SceneEnvironment::effects} property. The property is a list: effects can
41 be chained together; they are applied in the order they are in the list,
42 using the previous step's output as the input to the next one, with the last
43 effect's output defining the contents of the View3D.
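
 For example, assuming two hypothetical Effect objects with the ids
 \c blurEffect and \c vignetteEffect have been defined elsewhere, a sketch of
 chaining them could look like the following, with blurEffect applied first:

 \qml
 environment: SceneEnvironment {
     effects: [ blurEffect, vignetteEffect ]
 }
 \endqml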
44
45 \note \l SceneEnvironment and \l ExtendedSceneEnvironment provide a set of
46 built-in effects, such as depth of field, glow/bloom, lens flare, color
47 grading, and vignette. Always consider first if these are sufficient for
48 the application's needs, and prefer using the built-in facilities instead
49 of implementing a custom post-processing effect.
50
51 Effects are similar to \l{CustomMaterial}{custom materials} in many
52 ways. However, a custom material is associated with a model and is
53 responsible for the shading of that given mesh, whereas an effect's vertex
54 shader always gets a quad (in practice, two triangles) as its input, while
55 its fragment shader samples a texture with the scene's content.
56
57 Unlike custom materials, effects support multiple passes. For many effects
58 this is not necessary, and when there is a need to apply multiple effects,
59 identical results can often be achieved by chaining together multiple
60 effects in \l{SceneEnvironment::effects}{the SceneEnvironment}. This is
61 demonstrated by the \l{Qt Quick 3D - Custom Effect Example}{Custom Effect
62 example} as well. However, passes have the possibility to request additional
63 color buffers (textures), and specify which of these additional buffers they
64 output to. This allows implementing more complex image processing techniques
65 since subsequent passes can then use one or more of these additional
66 buffers, plus the original scene's content, as their input. If necessary,
67 these additional buffers can have an extended lifetime, meaning their
68 content is preserved between frames, which allows implementing effects that
69 rely on accumulating content from multiple frames, such as motion blur.
70
71 When compared to Qt Quick's 2D ShaderEffect, the 3D post-processing effects
72 have the advantage of being able to work with depth buffer data, as well as
73 the ability to implement multiple passes with intermediate buffers. In
74 addition, the texture-related capabilities are extended: Qt Quick 3D allows
75 more fine-grained control over filtering modes, and allows effects to work
76 with texture formats other than RGBA8, for example, floating point formats.
77
78 \note Post-processing effects are currently available when the View3D
79 has its \l{View3D::renderMode}{renderMode} set to \c Offscreen,
80 \c Underlay or \c Overlay. Effects will not be rendered for \c Inline mode.
81
82 \note When using post-processing effects, the application-provided shaders
83 should expect linear color data without tonemapping applied. The
84 tonemapping that is performed during the main render pass (or during skybox
85 rendering, if there is a skybox) when
86 \l{SceneEnvironment::tonemapMode}{tonemapMode} is set to a value other than
87 \c SceneEnvironment.TonemapModeNone, is automatically disabled when there
88 is at least one post-processing effect specified in the SceneEnvironment.
89 The last effect in the chain (more precisely, the last pass of the last
90 effect in the chain) will automatically get its fragment shader amended to
91 perform the same tonemapping the main render pass would.
92
93 \note Effects that perform their own tonemapping should be used in a
94 SceneEnvironment that has the built-in tonemapping disabled by setting
95 \l{SceneEnvironment::tonemapMode}{tonemapMode} to \c
96 SceneEnvironment.TonemapModeNone.
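
 As a minimal sketch, assuming a hypothetical effect with the id
 \c myTonemappingEffect that performs its own tonemapping, the environment
 could be set up like this:

 \qml
 environment: SceneEnvironment {
     tonemapMode: SceneEnvironment.TonemapModeNone
     effects: [ myTonemappingEffect ]
 }
 \endqml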
97
98 \note By default the texture used as the effects' input is created with a
99 floating point texture format, such as 16-bit floating point RGBA. The
100 output texture's format is the same since by default it follows the input
101 format. This can be overridden using \l Buffer and an empty name. The
102 default RGBA16F is useful because it allows working with non-tonemapped
103 linear data without having the color values outside the 0-1 range clamped.
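
 As a sketch, a pass that wants to output to a non-floating point texture
 could reference a Buffer with an empty name purely to specify the format;
 the id and shader file name here are illustrative only:

 \qml
 Buffer {
     id: rgba8Output
     name: "" // an empty name only overrides the output format
     format: Buffer.RGBA8
 }
 // ...
 passes: Pass {
     shaders: Shader { stage: Shader.Fragment; shader: "effect.frag" }
     output: rgba8Output
 }
 \endqml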
104
105 \section1 Exposing data to the shaders
106
107 Like with CustomMaterial or ShaderEffect, the dynamic properties of an
108 Effect object can be changed and animated using the usual QML and Qt Quick
109 facilities, and the values are exposed to the shaders automatically. The
110 following list shows how properties are mapped:
111
112 \list
113 \li bool, int, real -> bool, int, float
114 \li QColor, \l{QtQml::Qt::rgba()}{color} -> vec4, and the color gets
115 converted to linear, assuming sRGB space for the color value specified in
116 QML. The built-in Qt colors, such as \c{"green"} are in sRGB color space as
117 well, and the same conversion is performed for all color properties of
118 DefaultMaterial and PrincipledMaterial, so this behavior of Effect
119 matches those.
120 \li QRect, QRectF, \l{QtQml::Qt::rect()}{rect} -> vec4
121 \li QPoint, QPointF, \l{QtQml::Qt::point()}{point}, QSize, QSizeF, \l{QtQml::Qt::size()}{size} -> vec2
122 \li QVector2D, \l{QtQml::Qt::vector2d()}{vector2d} -> vec2
123 \li QVector3D, \l{QtQml::Qt::vector3d()}{vector3d} -> vec3
124 \li QVector4D, \l{QtQml::Qt::vector4d()}{vector4d} -> vec4
125 \li QMatrix4x4, \l{QtQml::Qt::matrix4x4()}{matrix4x4} -> mat4
126 \li QQuaternion, \l{QtQml::Qt::quaternion()}{quaternion} -> vec4, scalar value is \c w
127
128 \li TextureInput -> sampler2D or samplerCube, depending on whether \l
129 Texture or \l CubeMapTexture is used in the texture property of the
130 TextureInput. Setting the \l{TextureInput::enabled}{enabled} property to
131 false leads to exposing a dummy texture to the shader, meaning the shaders
132 are still functional but will sample a texture with opaque black image
133 content. Pay attention to the fact that properties for samplers must always
134 reference a \l TextureInput object, not a \l Texture directly. When it
135 comes to the \l Texture properties, the source, tiling, and filtering
136 related ones are the only ones that are taken into account implicitly with
137 effects, as the rest (such as UV transformations) is up to the custom
138 shaders to implement as they see fit.
139
140 \endlist
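
 As a sketch, an Effect with the following hypothetical properties would
 expose the uniforms indicated in the comments to its shaders:

 \qml
 Effect {
     property real strength: 0.5                     // float strength
     property color tintColor: "green"               // vec4 tintColor, converted to linear
     property vector2d offset: Qt.vector2d(0.1, 0.2) // vec2 offset
     property matrix4x4 warp                         // mat4 warp
     property TextureInput mask: TextureInput {
         texture: Texture { source: "mask.png" }     // sampler2D mask
     }
     passes: Pass {
         shaders: Shader { stage: Shader.Fragment; shader: "tint.frag" }
     }
 }
 \endqml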
141
142 \note When a uniform referenced in the shader code does not have a
143 corresponding property, it will cause a shader compilation error when
144 processing the effect at run time. There are some exceptions to this,
145 such as sampler uniforms, which get a dummy texture bound when no
146 corresponding QML property is present, but as a general rule, all uniforms
147 and samplers must have a corresponding property declared in the
148 Effect object.
149
150 \section1 Getting started with user-defined effects
151
152 A custom post-processing effect involves at minimum an Effect object and a
153 fragment shader snippet. Some effects will also want a customized vertex
154 shader.
155
156 As a simple example, let's create an effect that combines the scene's
157 content with an image, while further altering the red channel's value in an
158 animated manner:
159
160 \table 70%
161 \row
162 \li \qml
163 Effect {
164 id: simpleEffect
165 property TextureInput tex: TextureInput {
166 texture: Texture { source: "image.png" }
167 }
168 property real redLevel
169 NumberAnimation on redLevel { from: 0; to: 1; duration: 5000; loops: -1 }
170 passes: Pass {
171 shaders: Shader {
172 stage: Shader.Fragment
173 shader: "effect.frag"
174 }
175 }
176 }
177 \endqml
178 \li \badcode
179 void MAIN()
180 {
181 vec4 c = texture(tex, TEXTURE_UV);
182 c.r *= redLevel;
183 FRAGCOLOR = c * texture(INPUT, INPUT_UV);
184 }
185 \endcode
186 \endtable
187
188 Here the texture with the image \c{image.png} is exposed to the shader under
189 the name \c tex. The value of redLevel is available in the shader in a \c
190 float uniform with the same name.
191
192 The fragment shader must contain a function called \c MAIN. The final
193 fragment color is determined by \c FRAGCOLOR. The main input texture, with
194 the contents of the View3D's scene, is accessible under a \c sampler2D with
195 the name \c INPUT. The UV coordinates from the quad are in \c
196 INPUT_UV. These UV values are always suitable for sampling \c INPUT,
197 regardless of the underlying graphics API at run time (and so regardless of
198 the Y axis direction in images since the necessary adjustments are applied
199 automatically by Qt Quick 3D). Sampling the texture with our external image
200 is done using \c TEXTURE_UV. \c INPUT_UV is not suitable in cross-platform
201 applications since V needs to be flipped to cater for the coordinate system
202 differences mentioned before, using logic that is different for textures
203 based on images and textures used as render targets. Fortunately this is all
204 taken care of by the engine so the shader needs no further logic for this.
205
206 Once simpleEffect is available, it can be associated with the effects list
207 of the View3D's SceneEnvironment:
208
209 \qml
210 environment: SceneEnvironment {
211 effects: [ simpleEffect ]
212 }
213 \endqml
214
215 The results would look something like the following, with the original scene
216 on the left and with the effect applied on the right:
217
218 \table 70%
219 \row
220 \li \image effect_intro_1.png
221 \li \image effect_intro_2.png
222 \endtable
223
224 \note The \c shader property value in Shader is a URL, as is customary in
225 QML and Qt Quick, referencing the file containing the shader snippet, and
226 works very similarly to ShaderEffect or
227 \l{Image::source}{Image.source}. Only the \c file and \c qrc schemes are
228 supported. It is also possible to omit the \c file scheme, allowing a
229 relative path to be specified in a convenient way. Such a path is resolved
230 relative to the component's (the \c{.qml} file's) location.
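
 For instance, all of the following are valid ways of referencing a fragment
 shader snippet, assuming the files exist at these illustrative locations:

 \qml
 Shader { stage: Shader.Fragment; shader: "effect.frag" } // relative to the .qml file
 Shader { stage: Shader.Fragment; shader: "shaders/effect.frag" } // relative, in a subdirectory
 Shader { stage: Shader.Fragment; shader: "qrc:/shaders/effect.frag" } // from the resource system
 \endqml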
231
232 \note Shader code is always provided using Vulkan-style GLSL, regardless of
233 the graphics API used by Qt at run time.
234
235 \note The vertex and fragment shader code provided by the effect are not
236 full, complete GLSL shaders on their own. Rather, they provide a \c MAIN
237 function, and optionally a set of \c VARYING declarations, which are then
238 amended with further shader code by the engine.
239
240 \note The above example is not compatible with the optional multiview rendering mode that is used in some VR/AR applications.
241 To make it function both with and without multiview mode, change MAIN() like this:
242 \badcode
243 void MAIN()
244 {
245 vec4 c = texture(tex, TEXTURE_UV);
246 c.r *= redLevel;
247 #if QSHADER_VIEW_COUNT >= 2
248 FRAGCOLOR = c * texture(INPUT, vec3(INPUT_UV, VIEW_INDEX));
249 #else
250 FRAGCOLOR = c * texture(INPUT, INPUT_UV);
251 #endif
252 }
253 \endcode
254
255 \section1 Effects with vertex shaders
256
257 A vertex shader, when present, must provide a function called \c MAIN. In
258 the vast majority of cases the custom vertex shader will not want to provide
259 its own calculation of the homogeneous vertex position, but it is possible
260 using \c POSITION, \c VERTEX, and \c MODELVIEWPROJECTION_MATRIX. When
261 \c POSITION is not present in the custom shader code, a statement equivalent to
262 \c{POSITION = MODELVIEWPROJECTION_MATRIX * vec4(VERTEX, 1.0);} will be
263 injected automatically by Qt Quick 3D.
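
 For example, a sketch of a custom vertex shader that merely spells out the
 default position calculation explicitly could look like this:

 \badcode
 void MAIN()
 {
     // Equivalent to the statement injected automatically when POSITION is
     // not referenced in the custom shader code.
     POSITION = MODELVIEWPROJECTION_MATRIX * vec4(VERTEX, 1.0);
 }
 \endcode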
264
265 To pass data between the vertex and fragment shaders, use the VARYING
266 keyword. Internally this will then be transformed into the appropriate
267 vertex output or fragment input declaration. The fragment shader can use the
269 same declaration, which then allows reading the interpolated value for the
269 current fragment.
270
271 Let's look at an example that is very similar to the built-in
272 DistortionSpiral effect:
273
274 \table 70%
275 \row
276 \li \badcode
277 VARYING vec2 center_vec;
278 void MAIN()
279 {
280 center_vec = INPUT_UV - vec2(0.5, 0.5);
281 center_vec.y *= INPUT_SIZE.y / INPUT_SIZE.x;
282 }
283 \endcode
284 \li \badcode
285 VARYING vec2 center_vec;
286 void MAIN()
287 {
288 float radius = 0.25;
289 float dist_to_center = length(center_vec) / radius;
290 vec2 texcoord = INPUT_UV;
291 if (dist_to_center <= 1.0) {
292 float rotation_amount = (1.0 - dist_to_center) * (1.0 - dist_to_center);
293 float r = radians(360.0) * rotation_amount / 4.0;
294 mat2 rotation = mat2(cos(r), sin(r), -sin(r), cos(r));
295 texcoord = vec2(0.5, 0.5) + rotation * (INPUT_UV - vec2(0.5, 0.5));
296 }
297 FRAGCOLOR = texture(INPUT, texcoord);
298 }
299 \endcode
300 \endtable
301
302 The Effect object's \c passes list should now specify both the vertex and
303 fragment snippets:
304
305 \qml
306 passes: Pass {
307 shaders: [
308 Shader {
309 stage: Shader.Vertex
310 shader: "effect.vert"
311 },
312 Shader {
313 stage: Shader.Fragment
314 shader: "effect.frag"
315 }
316 ]
317 }
318 \endqml
319
320 The end result looks like the following:
321
322 \table 70%
323 \row
324 \li \image effect_intro_1.png
325 \li \image effect_intro_3.png
326 \endtable
327
328 \section1 Special keywords in effect shaders
329
330 \list
331
332 \li \c VARYING - Declares a vertex output or fragment input, depending on the type of the current shader.
333 \li \c MAIN - This function must always be present in an effect shader.
334 \li \c FRAGCOLOR - \c vec4 - The final fragment color; the output of the fragment shader. (fragment shader only)
335 \li \c POSITION - \c vec4 - The homogeneous position calculated in the vertex shader. (vertex shader only)
336 \li \c MODELVIEWPROJECTION_MATRIX - \c mat4 - The transformation matrix for the screen quad.
337 \li \c VERTEX - \c vec3 - The vertices of the quad; the input to the vertex shader. (vertex shader only)
338
339 \li \c INPUT - \c sampler2D or \c sampler2DArray - The sampler for the input
340 texture with the scene rendered into it, unless a pass redirects its input
341 via a BufferInput object, in which case \c INPUT refers to the additional
342 color buffer's texture referenced by the BufferInput. With \l{Multiview
343 Rendering}{multiview rendering} enabled, which can be relevant for VR/AR
344 applications, this is a sampler2DArray, while the input texture becomes a 2D
345 texture array.
346
347 \li \c INPUT_UV - \c vec2 - UV coordinates for sampling \c INPUT.
348
349 \li \c TEXTURE_UV - \c vec2 - UV coordinates suitable for sampling a Texture
350 with contents loaded from an image file.
351
352 \li \c INPUT_SIZE - \c vec2 - The size of the \c INPUT texture, in pixels.
353
354 \li \c OUTPUT_SIZE - \c vec2 - The size of the output buffer, in
355 pixels. Often the same as \c INPUT_SIZE, unless the pass outputs to an extra
356 Buffer with a size multiplier on it.
357
358 \li \c FRAME - \c float - A frame counter, incremented after each frame in the View3D.
359
360 \li \c DEPTH_TEXTURE - \c sampler2D or \c sampler2DArray - A depth texture
361 with the depth buffer contents of the opaque objects in the scene. Like
362 with CustomMaterial, the presence of this keyword in the shader triggers
363 generating the depth texture automatically. (See the sketch after this list.)
364
365 \li \c NORMAL_ROUGHNESS_TEXTURE - \c sampler2D - A texture with the
366 world-space normals and material roughness of the opaque objects in the
367 currently visible portion of the scene. Like with CustomMaterial, the
368 presence of this keyword in the shader implies an additional render pass to
369 generate the normal texture.
370
371 \li \c VIEW_INDEX - \c uint - With \l{Multiview Rendering}{multiview
372 rendering} enabled, this is the current view index, available in both vertex
373 and fragment shaders. Always 0 when multiview rendering is not used.
374
375 \li \c PROJECTION_MATRIX - \c mat4, the projection matrix. Note that with
376 \l{Multiview Rendering}{multiview rendering}, this is an array of matrices.
377
378 \li \c INVERSE_PROJECTION_MATRIX - \c mat4, the inverse projection matrix.
379 Note that with \l{Multiview Rendering}{multiview rendering}, this is an array
380 of matrices.
381
382 \li \c VIEW_MATRIX - \c mat4, the view (camera) matrix.
383 Note that with \l{Multiview Rendering}{multiview rendering}, this is an array
384 of matrices.
385
386 \li float \c NDC_Y_UP - The value is \c 1 when the Y axis points up in
387 normalized device coordinate space, and \c{-1} when the Y axis points down.
388 Y pointing down is the case when rendering happens with Vulkan.
389
390 \li float \c FRAMEBUFFER_Y_UP - The value is \c 1 when the Y axis points up
391 in the coordinate system for framebuffers (textures), meaning \c{(0, 0)} is
392 the bottom-left corner. The value is \c{-1} when the Y axis points down,
393 \c{(0, 0)} being the top-left corner.
394
395 \li float \c NEAR_CLIP_VALUE - The value is \c{-1} when the clipping plane
396 range starts at \c{-1} and goes to \c 1. This is true when using OpenGL for
397 rendering. For other rendering backends the value of this property will be
398 \c 0, meaning the clipping plane range is \c 0 to \c 1.
399
400 \endlist
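
 As a simple sketch of using some of these keywords, the following fragment
 shader visualizes the (non-linear) depth buffer contents instead of the
 scene color. This form assumes multiview rendering is not in use; with
 multiview, \c DEPTH_TEXTURE is a \c sampler2DArray and the lookup needs a
 \c vec3 coordinate including \c VIEW_INDEX:

 \badcode
 void MAIN()
 {
     float d = texture(DEPTH_TEXTURE, INPUT_UV).r;
     FRAGCOLOR = vec4(vec3(d), 1.0);
 }
 \endcode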
401
402 \section1 Building multi-pass effects
403
404 A multi-pass effect often uses more than one set of shaders, and makes use
405 of the \l{Pass::output}{output} and \l{Pass::commands}{commands}
406 properties. Each entry in the passes list translates to a render pass drawing a
407 quad into the pass's output texture, while sampling the effect's input texture
408 and optionally other textures as well.
409
410 The typical outline of a multi-pass Effect can look like the following:
411
412 \qml
413 passes: [
414 Pass {
415 shaders: [
416 Shader {
417 stage: Shader.Vertex
418 shader: "pass1.vert"
419 },
420 Shader {
421 stage: Shader.Fragment
422 shader: "pass1.frag"
423 }
424 ]
425 // This pass outputs to the intermediate texture described
426 // by the Buffer object.
427 output: intermediateColorBuffer
428 },
429 Pass {
430 shaders: [
431 Shader {
432 stage: Shader.Vertex
433 shader: "pass2.vert"
434 },
435 Shader {
436 stage: Shader.Fragment
437 shader: "pass2.frag"
438 }
439 ]
440 // The output of the last pass needs no redirection, it is
441 // the final result of the effect.
442 commands: [
443 // This pass reads from the intermediate texture, meaning
444 // INPUT in the shader will refer to the texture associated
445 // with the Buffer.
446 BufferInput {
447 buffer: intermediateColorBuffer
448 }
449 ]
450 }
451 ]
452 \endqml
453
454 What is \c intermediateColorBuffer?
455
456 \qml
457 Buffer {
458 id: intermediateColorBuffer
459 name: "tempBuffer"
460 // format: Buffer.RGBA8
461 // textureFilterOperation: Buffer.Linear
462 // textureCoordOperation: Buffer.ClampToEdge
463 }
464 \endqml
465
466 The commented properties are not necessary if the desired values match the
467 defaults.
468
469 Internally the presence of this Buffer object and referencing it from the \c
470 output property of a Pass leads to creating a texture with a size matching
471 the View3D, and so the size of the implicit input and output textures. When
472 this is not desired, the \l{Buffer::sizeMultiplier}{sizeMultiplier} property
473 can be used to get an intermediate texture with a different size. This can
474 lead to the \c INPUT_SIZE and \c OUTPUT_SIZE uniforms in the shader having
475 different values.
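
 For example, a sketch of an intermediate buffer that is only a quarter of
 the View3D's size in each dimension (the name is illustrative):

 \qml
 Buffer {
     id: downsampledBuffer
     name: "downsampledBuffer"
     sizeMultiplier: 0.25
 }
 \endqml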
476
477 By default the Effect cannot count on textures preserving their contents
478 between frames. When a new intermediate texture is created, it is cleared to
479 \c{vec4(0.0)}. Afterwards, the same texture can be reused for another
480 purpose. Therefore, effect passes should always write to the entire texture,
481 without making assumptions about their content at the start of the pass.
482 There is an exception to this: Buffer objects with
483 \l{Buffer::bufferFlags}{bufferFlags} set to Buffer.SceneLifetime. This
484 indicates that the texture is permanently associated with a pass of the
485 effect and it will not be reused for other purposes. The contents of such
486 color buffers are preserved between frames. This is typically used in a
487 ping-pong fashion in effects like motion blur: the first pass takes the
488 persistent buffer as its input, in addition to the effect's main input
489 texture, outputting to another intermediate buffer, while the second pass
490 outputs to the persistent buffer. This way in the first frame the first pass
491 samples an empty (transparent) texture, whereas in subsequent frames it
492 samples the output of the second pass from the previous frame. A third pass
493 can then blend the effect's input and the second pass' output together.
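
 A sketch of such a ping-pong arrangement is shown below; the buffer ids,
 names, and shader file names are illustrative only:

 \qml
 Buffer {
     id: historyBuffer
     name: "historyBuffer"
     bufferFlags: Buffer.SceneLifetime // contents preserved between frames
 }
 Buffer {
     id: scratchBuffer
     name: "scratchBuffer"
 }
 passes: [
     Pass {
         // Reads the persistent history (plus the implicit INPUT) and
         // writes the accumulated result to the scratch buffer.
         shaders: Shader { stage: Shader.Fragment; shader: "accumulate.frag" }
         commands: BufferInput { buffer: historyBuffer; sampler: "history" }
         output: scratchBuffer
     },
     Pass {
         // Copies the scratch buffer into the persistent buffer so that it
         // is available again in the next frame.
         shaders: Shader { stage: Shader.Fragment; shader: "copy.frag" }
         commands: BufferInput { buffer: scratchBuffer }
         output: historyBuffer
     },
     Pass {
         // Blends the effect's input with the accumulated history.
         shaders: Shader { stage: Shader.Fragment; shader: "compose.frag" }
         commands: BufferInput { buffer: historyBuffer; sampler: "history" }
     }
 ]
 \endqml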
494
495 The BufferInput command type is used to expose custom texture buffers to the
496 render pass.
497
498 For instance, to access \c someBuffer in the render pass shaders under
499 the name \c mySampler, the following can be added to its command list:
500 \qml
501 BufferInput { buffer: someBuffer; sampler: "mySampler" }
502 \endqml
503
504 If the \c sampler name is not specified, \c INPUT will be used as default.
505
506 Buffers can be useful to share intermediate results between render passes.
507
508 To expose preloaded textures to the effect, TextureInput should be used instead.
509 These can be defined as properties of the Effect itself, and will automatically
510 be accessible to the shaders by their property names.
511 \qml
512 property TextureInput tex: TextureInput {
513 texture: Texture { source: "image.png" }
514 }
515 \endqml
516
517 Here \c tex is a valid sampler in all shaders of all the passes of the
518 effect.
519
520 When it comes to uniform values from properties, all passes in the Effect
521 read the same values in their shaders. If necessary it is possible to
522 override the value of a uniform just for a given pass. This is achieved by
523 adding the \l SetUniformValue command to the list of commands for the pass.
524
525 \note The \l{SetUniformValue::target}{target} of the pass-specific uniform
526 value setter can only refer to a name that is the name of a property of the
527 effect. It can override the value for a property's corresponding uniform,
528 but it cannot introduce new uniforms.
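
 For example, assuming the Effect has a \c real property called \c redLevel,
 a sketch of overriding its value for one particular pass could look like
 this:

 \qml
 Pass {
     shaders: Shader { stage: Shader.Fragment; shader: "pass2.frag" }
     commands: [
         SetUniformValue {
             target: "redLevel"
             value: 0.0
         }
     ]
 }
 \endqml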
529
530 \section1 Performance considerations
531
532 Be aware of the increased resource usage and potentially reduced performance
533 when using post-processing effects. Just like with Qt Quick layers and
534 ShaderEffect, rendering the scene into a texture and then using that to
535 texture a quad is not a cheap operation, especially on low-end hardware with
536 limited fragment processing power. The amount of additional graphics memory
537 needed, as well as the increase in GPU load both depend on the size of the
538 View3D (which, on embedded devices without a windowing system, may often be
539 as big as the screen resolution). Multi-pass effects, as well as applying
540 multiple effects increase the resource and performance requirements further.
541
542 Therefore, it is highly advisable to ensure early on in the development
543 lifecycle that the targeted device and graphics stack is able to cope with
544 the effects included in the design of the 3D scene at the final product's
545 screen resolution.
546
547 While unavoidable with techniques that need it, \c DEPTH_TEXTURE implies an
548 additional rendering pass to generate the contents of that texture, which
549 can also present a hit on less capable hardware. Therefore, use \c
550 DEPTH_TEXTURE in the effect's shaders only when essential.
551
552 The complexity of the operations in the shaders is also important. Just like
553 with CustomMaterial, a sub-optimal fragment shader can easily lead to
554 reduced rendering performance.
555
556 Be cautious with \l{Buffer::sizeMultiplier}{sizeMultiplier in Buffer} when
557 values larger than 1 are involved. For example, a multiplier of 4 means
558 creating and then rendering to a texture that is 4 times the size of the
559 View3D. Just like with shadow maps and multi- or supersampling, the
560 increased resource and performance costs can quickly outweigh the benefits
561 from better quality on systems with limited GPU power.
562
563 \section1 VR/AR considerations
564
565 When developing applications for virtual or augmented reality by using Qt
566 Quick 3D XR, post-processing effects are functional and available to use.
567 However, designers and developers should take special care to understand
568 which kinds of effects make sense in a virtual reality environment.
569 Some effects, including some of the built-in ones in
570 ExtendedSceneEnvironment or the deprecated Effects module, do not lead to a
571 good visual experience in a VR environment, and may even affect the user
572 physically (causing, for example, motion sickness or dizziness).
573
574 When the more efficient \l{Multiview Rendering}{multiview rendering mode} is
575 enabled in a VR/AR application, there is no separate render pass for the
576 left and right eye contents. Instead, it all happens in one pass, using a 2D
577 texture array with two layers instead of two independent 2D textures. This
578 also means that many intermediate buffers, meaning color or depth textures,
579 will need to become texture arrays in this mode. This then has implications
580 for custom materials and postprocessing effects. Textures such as the input
581 texture (\c INPUT) and the depth texture (\c DEPTH_TEXTURE) become 2D texture
582 arrays, exposed in the shader as a \c sampler2DArray instead of \c
583 sampler2D. This has implications for GLSL functions such as texture(),
584 textureLod(), or textureSize(). The UV coordinate is then a vec3, not a
585 vec2, and textureSize() returns a vec3, not a vec2. Effects intended to
586 function regardless of the rendering mode can be written with an
587 appropriate ifdef:
588 \badcode
589 #if QSHADER_VIEW_COUNT >= 2
590 vec4 c = texture(INPUT, vec3(INPUT_UV, VIEW_INDEX));
591 #else
592 vec4 c = texture(INPUT, INPUT_UV);
593 #endif
594 \endcode
595
596 It can also be useful to define macros that handle both cases. For example:
597 \badcode
598 #if QSHADER_VIEW_COUNT >= 2
599 #define SAMPLE_INPUT(uv) texture(INPUT, vec3(uv, VIEW_INDEX))
600 #define SAMPLE_DEPTH(uv) texture(DEPTH_TEXTURE, vec3(uv, VIEW_INDEX)).r
601 #define PROJECTION PROJECTION_MATRIX[VIEW_INDEX]
602 #define INVERSE_PROJECTION INVERSE_PROJECTION_MATRIX[VIEW_INDEX]
603 #else
604 #define SAMPLE_INPUT(uv) texture(INPUT, uv)
605 #define SAMPLE_DEPTH(uv) texture(DEPTH_TEXTURE, uv).r
606 #define PROJECTION PROJECTION_MATRIX
607 #define INVERSE_PROJECTION INVERSE_PROJECTION_MATRIX
608 #endif
609 \endcode
610
611 This does not apply to \c NORMAL_ROUGHNESS_TEXTURE, which is always a 2D texture,
612 even when multiview rendering is active:
613 \badcode
614 #define SAMPLE_NORMAL(uv) normalize(texture(NORMAL_ROUGHNESS_TEXTURE, uv).rgb)
615 \endcode
616
617 \note The presence of keywords such as \c DEPTH_TEXTURE triggers additional
618 render passes, and uniforms such as \c INVERSE_PROJECTION_MATRIX are
619 calculated and set whenever the keyword is present anywhere in the shader
620 snippet. This is more expensive, both in terms of performance and
621 resource usage. Hence it is recommended to only add such #defines when the
622 textures and matrices will really be used in the effect.
623
624 \sa Shader, Pass, Buffer, BufferInput, {Qt Quick 3D - Custom Effect Example}
625*/
626
627/*!
628 \qmlproperty list Effect::passes
629 Contains a list of render \l {Pass}{passes} implemented by the effect.
630*/
631
632QQuick3DEffect::QQuick3DEffect(QQuick3DObject *parent)
633 : QQuick3DObject(*(new QQuick3DObjectPrivate(QQuick3DObjectPrivate::Type::Effect)), parent)
634{
635}
636
637QQmlListProperty<QQuick3DShaderUtilsRenderPass> QQuick3DEffect::passes()
638{
639 return QQmlListProperty<QQuick3DShaderUtilsRenderPass>(this,
640 nullptr,
641 QQuick3DEffect::qmlAppendPass,
642 QQuick3DEffect::qmlPassCount,
643 QQuick3DEffect::qmlPassAt,
644 QQuick3DEffect::qmlPassClear);
645}
646
647// Default vertex and fragment shader code that is used when no corresponding
648// Shader is present in the Effect. These go through the usual processing so
649// should use the user-facing builtins.
650
651static const char *default_effect_vertex_shader =
652 "void MAIN()\n"
653 "{\n"
654 "}\n";
655
656static const char *default_effect_fragment_shader =
657 "void MAIN()\n"
658 "{\n"
659 "#if QSHADER_VIEW_COUNT >= 2\n"
660 " FRAGCOLOR = texture(INPUT, vec3(INPUT_UV, VIEW_INDEX));\n"
661 "#else\n"
662 " FRAGCOLOR = texture(INPUT, INPUT_UV);\n"
663 "#endif\n"
664 "}\n";
665
666static inline void insertVertexMainArgs(QByteArray &snippet)
667{
668 static const char *argKey = "/*%QT_ARGS_MAIN%*/";
669 const int argKeyLen = int(strlen(argKey));
670 const int argKeyPos = snippet.indexOf(argKey);
671 if (argKeyPos >= 0)
672 snippet = snippet.left(argKeyPos) + QByteArrayLiteral("inout vec3 VERTEX") + snippet.mid(argKeyPos + argKeyLen);
673}
674
675static inline void resetShaderDependentEffectFlags(QSSGRenderEffect *effectNode)
676{
677 effectNode->setFlag(QSSGRenderEffect::Flags::UsesDepthTexture, false);
678 effectNode->setFlag(QSSGRenderEffect::Flags::UsesProjectionMatrix, false);
679 effectNode->setFlag(QSSGRenderEffect::Flags::UsesInverseProjectionMatrix, false);
680 effectNode->setFlag(QSSGRenderEffect::Flags::UsesViewMatrix, false);
681 effectNode->setFlag(QSSGRenderEffect::Flags::UsesNormalTexture, false);
682}
683
684static inline void accumulateEffectFlagsFromShader(QSSGRenderEffect *effectNode, const QSSGCustomShaderMetaData &meta)
685{
686 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesDepthTexture))
687 effectNode->setFlag(QSSGRenderEffect::Flags::UsesDepthTexture);
688 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesProjectionMatrix))
689 effectNode->setFlag(QSSGRenderEffect::Flags::UsesProjectionMatrix);
690 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesInverseProjectionMatrix))
691 effectNode->setFlag(QSSGRenderEffect::Flags::UsesInverseProjectionMatrix);
692 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesViewMatrix))
693 effectNode->setFlag(QSSGRenderEffect::Flags::UsesViewMatrix);
694 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesNormalTexture))
695 effectNode->setFlag(QSSGRenderEffect::Flags::UsesNormalTexture);
696}
697
698QSSGRenderGraphObject *QQuick3DEffect::updateSpatialNode(QSSGRenderGraphObject *node)
699{
700 using namespace QSSGShaderUtils;
701
702 const auto &renderContext = QQuick3DObjectPrivate::get(this)->sceneManager->wattached->rci();
703 if (!renderContext) {
704 qWarning("QQuick3DEffect: No render context interface?");
705 return nullptr;
706 }
707
708 QSSGRenderEffect *effectNode = static_cast<QSSGRenderEffect *>(node);
709 bool newBackendNode = false;
710 if (!effectNode) {
711 effectNode = new QSSGRenderEffect;
712 newBackendNode = true;
713 }
714
715 bool shadersOrBuffersMayChange = false;
716 if (m_dirtyAttributes & Dirty::EffectChainDirty)
717 shadersOrBuffersMayChange = true;
718
719 const bool fullUpdate = newBackendNode || effectNode->incompleteBuildTimeObject || (m_dirtyAttributes & Dirty::TextureDirty);
720
721 if (fullUpdate || shadersOrBuffersMayChange) {
722 markAllDirty();
723
724 // Need to clear the old list with properties and textures first.
725 effectNode->properties.clear();
726 effectNode->textureProperties.clear();
727
728 QMetaMethod propertyDirtyMethod;
729 const int idx = metaObject()->indexOfSlot("onPropertyDirty()");
730 if (idx != -1)
731 propertyDirtyMethod = metaObject()->method(idx);
732
733 // Properties -> uniforms
734 QSSGShaderCustomMaterialAdapter::StringPairList uniforms;
735 QSSGShaderCustomMaterialAdapter::StringPairList multiViewDependentSamplers;
736 const int propCount = metaObject()->propertyCount();
737 int propOffset = metaObject()->propertyOffset();
738
739 // Effect can have multilayered inheritance structure, so find the actual propOffset
740 const QMetaObject *superClass = metaObject()->superClass();
741 while (superClass && qstrcmp(superClass->className(), "QQuick3DEffect") != 0) {
742 propOffset = superClass->propertyOffset();
743 superClass = superClass->superClass();
744 }
745
746 using TextureInputProperty = QPair<QQuick3DShaderUtilsTextureInput *, const char *>;
747
748 QVector<TextureInputProperty> textureProperties; // We'll deal with these later
749 for (int i = propOffset; i != propCount; ++i) {
750 const QMetaProperty property = metaObject()->property(i);
751 if (Q_UNLIKELY(!property.isValid()))
752 continue;
753
754 const auto name = property.name();
755 QMetaType propType = property.metaType();
756 QVariant propValue = property.read(this);
757 if (propType == QMetaType(QMetaType::QVariant))
758 propType = propValue.metaType();
759
760 if (propType.id() >= QMetaType::User) {
761 if (propType.id() == qMetaTypeId<QQuick3DShaderUtilsTextureInput *>()) {
762 if (QQuick3DShaderUtilsTextureInput *texture = property.read(this).value<QQuick3DShaderUtilsTextureInput *>())
763 textureProperties.push_back({texture, name});
764 }
765 } else if (propType == QMetaType(QMetaType::QObjectStar)) {
766 if (QQuick3DShaderUtilsTextureInput *texture = qobject_cast<QQuick3DShaderUtilsTextureInput *>(propValue.value<QObject *>()))
767 textureProperties.push_back({texture, name});
768 } else {
769 const auto type = uniformType(propType);
770 if (type != QSSGRenderShaderValue::Unknown) {
771 uniforms.append({ uniformTypeName(propType), name });
772 effectNode->properties.push_back({ name, uniformTypeName(propType),
773 propValue, uniformType(propType), i});
774 // Track the property changes
775 if (fullUpdate) {
776 if (property.hasNotifySignal() && propertyDirtyMethod.isValid())
777 connect(this, property.notifySignal(), this, propertyDirtyMethod);
778 } // else already connected
779 } else {
780 // ### figure out how _not_ to warn when there are no dynamic
781 // properties defined (because warnings like Blah blah objectName etc. are not helpful)
782 //qWarning("No known uniform conversion found for effect property %s. Skipping", property.name());
783 }
784 }
785 }
786
787 const auto processTextureProperty = [&](QQuick3DShaderUtilsTextureInput &texture, const QByteArray &name) {
788 QSSGRenderEffect::TextureProperty texProp;
789 QQuick3DTexture *tex = texture.texture(); // may be null if the TextureInput has no 'texture' set
790 if (fullUpdate) {
791 connect(&texture, &QQuick3DShaderUtilsTextureInput::enabledChanged, this, &QQuick3DEffect::onTextureDirty);
792 connect(&texture, &QQuick3DShaderUtilsTextureInput::textureChanged, this, &QQuick3DEffect::onTextureDirty);
793 } // else already connected
794 texProp.name = name;
795 if (texture.enabled && tex)
796 texProp.texImage = tex->getRenderImage();
797
798 texProp.shaderDataType = QSSGRenderShaderValue::Texture;
799
800 if (tex) {
801 texProp.minFilterType = tex->minFilter() == QQuick3DTexture::Nearest ? QSSGRenderTextureFilterOp::Nearest
802 : QSSGRenderTextureFilterOp::Linear;
803 texProp.magFilterType = tex->magFilter() == QQuick3DTexture::Nearest ? QSSGRenderTextureFilterOp::Nearest
804 : QSSGRenderTextureFilterOp::Linear;
805 texProp.mipFilterType = tex->generateMipmaps() ? (tex->mipFilter() == QQuick3DTexture::Nearest ? QSSGRenderTextureFilterOp::Nearest
806 : QSSGRenderTextureFilterOp::Linear)
807 : QSSGRenderTextureFilterOp::None;
808 texProp.horizontalClampType = tex->horizontalTiling() == QQuick3DTexture::Repeat ? QSSGRenderTextureCoordOp::Repeat
809 : (tex->horizontalTiling() == QQuick3DTexture::ClampToEdge ? QSSGRenderTextureCoordOp::ClampToEdge
810 : QSSGRenderTextureCoordOp::MirroredRepeat);
811 texProp.verticalClampType = tex->verticalTiling() == QQuick3DTexture::Repeat ? QSSGRenderTextureCoordOp::Repeat
812 : (tex->verticalTiling() == QQuick3DTexture::ClampToEdge ? QSSGRenderTextureCoordOp::ClampToEdge
813 : QSSGRenderTextureCoordOp::MirroredRepeat);
814 texProp.zClampType = tex->depthTiling() == QQuick3DTexture::Repeat ? QSSGRenderTextureCoordOp::Repeat
815 : (tex->depthTiling() == QQuick3DTexture::ClampToEdge) ? QSSGRenderTextureCoordOp::ClampToEdge
816 : QSSGRenderTextureCoordOp::MirroredRepeat;
817 }
818
819 // Knowing upfront that a sampler2D needs to be a sampler2DArray in
820 // the multiview-compatible version of the shader is not trivial.
821 // Consider: we know the list of TextureInputs, without any
822 // knowledge about the usage of those textures. Intermediate buffers
823 // (textures) also have a default constructed (no source, no source
824 // item, no texture data) Texture set. What indicates that these are
825 // used as intermediate buffers, is the 'output' property of a Pass,
826 // referencing a Buffer object (which objects we otherwise do not
827 // track), the 'name' of which matches the TextureInput property name.
828 // The list of passes may vary dynamically, and some Passes may not
829 // be listed at any point in time if the effect has an
830 // ubershader-ish design. Thus one can have TextureInputs that are
831 // not associated with a Buffer (when scanning through the Passes),
832 // and so we cannot just check the 'output'-referenced Buffers to
833 // decide if a TextureInput's Texture needs to be treated specially
834 // in the generated shader code. (and the type must be correct even
835 // for, from our perspective, "unused" samplers since they are still
836 // in the shader code, and will get a dummy texture bound)
837 //
838 // Therefore, in the absence of more sophisticated options, we just
839 // look at the TextureInput's texture, and if it is something along
840 // the lines of
841 // property TextureInput intermediateColorBuffer1: TextureInput { texture: Texture { } }
842 then it is added to the special list, indicating that the type is
843 // sampler2D or sampler2DArray, depending on the rendering mode the
844 // shader is targeting.
845
846 if (tex && !tex->hasSourceData()) {
847 multiViewDependentSamplers.append({ QByteArrayLiteral("sampler2D"), name }); // the type may get adjusted later
848 } else {
849 if (tex && QQuick3DObjectPrivate::get(tex)->type == QQuick3DObjectPrivate::Type::ImageCube)
850 uniforms.append({ QByteArrayLiteral("samplerCube"), name });
851 else if (tex && tex->textureData() && tex->textureData()->depth() > 0)
852 uniforms.append({ QByteArrayLiteral("sampler3D"), name });
853 else
854 uniforms.append({ QByteArrayLiteral("sampler2D"), name });
855 }
856
857 effectNode->textureProperties.push_back(texProp);
858 };
859
860 // Textures
861 for (const auto &property : std::as_const(textureProperties))
862 processTextureProperty(*property.first, property.second);
863
864 if (effectNode->incompleteBuildTimeObject) { // This object came from the shadergen tool
865 const auto names = dynamicPropertyNames();
866 for (const auto &name : names) {
867 QVariant propValue = property(name.constData());
868 QMetaType propType = propValue.metaType();
869 if (propType == QMetaType(QMetaType::QVariant))
870 propType = propValue.metaType();
871
872 if (propType.id() >= QMetaType::User) {
873 if (propType.id() == qMetaTypeId<QQuick3DShaderUtilsTextureInput *>()) {
874 if (QQuick3DShaderUtilsTextureInput *texture = propValue.value<QQuick3DShaderUtilsTextureInput *>())
875 textureProperties.push_back({texture, name});
876 }
877 } else if (propType.id() == QMetaType::QObjectStar) {
878 if (QQuick3DShaderUtilsTextureInput *texture = qobject_cast<QQuick3DShaderUtilsTextureInput *>(propValue.value<QObject *>()))
879 textureProperties.push_back({texture, name});
880 } else {
881 const auto type = uniformType(propType);
882 if (type != QSSGRenderShaderValue::Unknown) {
883 uniforms.append({ uniformTypeName(propType), name });
884 effectNode->properties.push_back({ name, uniformTypeName(propType),
885 propValue, uniformType(propType), -1 /* aka. dynamic property */});
886 // We don't need to track property changes
887 } else {
888 // ### figure out how _not_ to warn when there are no dynamic
889 // properties defined (because warnings like Blah blah objectName etc. are not helpful)
890 qWarning("No known uniform conversion found for effect property %s. Skipping", name.constData());
891 }
892 }
893 }
894
895 for (const auto &property : std::as_const(textureProperties))
896 processTextureProperty(*property.first, property.second);
897 }
898
899 // built-ins
900 uniforms.append({ "mat4", "qt_modelViewProjection" });
901 uniforms.append({ "vec2", "qt_inputSize" });
902 uniforms.append({ "vec2", "qt_outputSize" });
903 uniforms.append({ "float", "qt_frame_num" });
904 uniforms.append({ "float", "qt_fps" });
905 uniforms.append({ "vec2", "qt_cameraProperties" });
906 uniforms.append({ "float", "qt_normalAdjustViewportFactor" });
907 uniforms.append({ "float", "qt_nearClipValue" });
908 uniforms.append({ "vec4", "qt_rhi_properties" });
909
910 // qt_inputTexture is not listed in uniforms, will be added by prepareCustomShader()
911 // since the name and type varies between non-multiview and multiview mode
912
913 QSSGShaderCustomMaterialAdapter::StringPairList builtinVertexInputs;
914 builtinVertexInputs.append({ "vec3", "attr_pos" });
915 builtinVertexInputs.append({ "vec2", "attr_uv" });
916
917 QSSGShaderCustomMaterialAdapter::StringPairList builtinVertexOutputs;
918 builtinVertexOutputs.append({ "vec2", "qt_inputUV" });
919 builtinVertexOutputs.append({ "vec2", "qt_textureUV" });
920 builtinVertexOutputs.append({ "flat uint", "qt_viewIndex" });
921
922 // fragOutput is added automatically by the program generator
923
924 resetShaderDependentEffectFlags(effectNode);
925
926 if (!m_passes.isEmpty()) {
927 const QQmlContext *context = qmlContext(this);
928 effectNode->resetCommands();
929 for (QQuick3DShaderUtilsRenderPass *pass : std::as_const(m_passes)) {
930 // Have a key composed more or less of the vertex and fragment filenames.
931 // The shaderLibraryManager uses stage+shaderPathKey as the key.
932 // Thus shaderPathKey is then sufficient to look up both the vertex and fragment shaders later on.
933 // Note that this key is not suitable as a unique key for the graphics resources because the same
934 // set of shader files can be used in multiple different passes, or in multiple active effects.
935 // But that's the effect system's problem.
936 QByteArray shaderPathKey("effect pipeline--");
937 QSSGRenderEffect::ShaderPrepPassData passData;
938 for (QQuick3DShaderUtilsShader::Stage stage : { QQuick3DShaderUtilsShader::Stage::Vertex, QQuick3DShaderUtilsShader::Stage::Fragment }) {
939 QQuick3DShaderUtilsShader *shader = nullptr;
940 for (QQuick3DShaderUtilsShader *s : pass->m_shaders) {
941 if (s->stage == stage) {
942 shader = s;
943 break;
944 }
945 }
946
947 // just how many enums does one need for the exact same thing...
948 QSSGShaderCache::ShaderType type = QSSGShaderCache::ShaderType::Vertex;
949 if (stage == QQuick3DShaderUtilsShader::Stage::Fragment)
950 type = QSSGShaderCache::ShaderType::Fragment;
951
952 // Will just use the custom material infrastructure. Some
953 // substitutions are common between custom materials and effects.
954 //
955 // Substitutions relevant to us here:
956 // MAIN -> qt_customMain
957 // FRAGCOLOR -> fragOutput
958 // POSITION -> gl_Position
959 // MODELVIEWPROJECTION_MATRIX -> qt_modelViewProjection
960 // DEPTH_TEXTURE -> qt_depthTexture
961 // ... other things shared with custom material
962 //
963 // INPUT -> qt_inputTexture
964 // INPUT_UV -> qt_inputUV
965 // ... other effect specifics
966 //
967 // Built-in uniforms, inputs and outputs will be baked into
968 // metadata comment blocks in the resulting source code.
969 // Same goes for inputs/outputs declared with VARYING.
970
971 QByteArray code;
972 if (shader) {
973 code = QSSGShaderUtils::resolveShader(shader->shader, context, shaderPathKey); // appends to shaderPathKey
974 } else {
975 if (!shaderPathKey.isEmpty())
976 shaderPathKey.append('>');
977 shaderPathKey += "DEFAULT";
978 if (type == QSSGShaderCache::ShaderType::Vertex)
979 code = default_effect_vertex_shader;
980 else
981 code = default_effect_fragment_shader;
982 }
983
984 for (auto pathKeyIndex : { QSSGRenderCustomMaterial::RegularShaderPathKeyIndex, QSSGRenderCustomMaterial::MultiViewShaderPathKeyIndex }) {
985 QSSGShaderCustomMaterialAdapter::ShaderCodeAndMetaData result;
986 QSSGShaderCustomMaterialAdapter::CustomShaderPrepWorkData scratch;
987
988 QSSGShaderCustomMaterialAdapter::beginPrepareCustomShader(
989 &scratch,
990 &result,
991 code,
992 type,
993 pathKeyIndex == QSSGRenderCustomMaterial::RegularShaderPathKeyIndex ? false : true);
994
995 QSSGShaderCustomMaterialAdapter::StringPairList multiViewDependentUniforms;
996 if (result.second.flags.testFlag(QSSGCustomShaderMetaData::UsesProjectionMatrix)
997 || result.second.flags.testFlag(QSSGCustomShaderMetaData::UsesInverseProjectionMatrix))
998 {
999 multiViewDependentUniforms.append({ "mat4", "qt_projectionMatrix" });
1000 multiViewDependentUniforms.append({ "mat4", "qt_inverseProjectionMatrix" });
1001 }
1002
1003 multiViewDependentUniforms.append({ "mat4", "qt_viewMatrix" });
1004
1005 accumulateEffectFlagsFromShader(effectNode, result.second);
1006
1007 QSSGShaderCustomMaterialAdapter::finishPrepareCustomShader(
1008 &result.first, // effectively appends the QQ3D_SHADER_META block
1009 scratch,
1010 result,
1011 type,
1012 pathKeyIndex == QSSGRenderCustomMaterial::RegularShaderPathKeyIndex ? false : true,
1013 uniforms,
1014 type == QSSGShaderCache::ShaderType::Vertex ? builtinVertexInputs : builtinVertexOutputs,
1015 type == QSSGShaderCache::ShaderType::Vertex ? builtinVertexOutputs : QSSGShaderCustomMaterialAdapter::StringPairList(),
1016 multiViewDependentSamplers,
1017 multiViewDependentUniforms);
1018
1019 if (type == QSSGShaderCache::ShaderType::Vertex) {
1020 // qt_customMain() has an argument list which gets injected here
1021 insertVertexMainArgs(result.first);
1022 passData.vertexShaderCode[pathKeyIndex] = result.first;
1023 passData.vertexMetaData[pathKeyIndex] = result.second;
1024 } else {
1025 passData.fragmentShaderCode[pathKeyIndex] = result.first;
1026 passData.fragmentMetaData[pathKeyIndex] = result.second;
1027 }
1028 }
1029 }
1030
1031 effectNode->commands.push_back(nullptr); // will be changed to QSSGBindShader in finalizeShaders
1032 passData.bindShaderCmdIndex = effectNode->commands.size() - 1;
1033
1034 // finalizing the shader code happens in a separate step later on by the backend node
1035 passData.shaderPathKeyPrefix = shaderPathKey;
1036 effectNode->shaderPrepData.passes.append(passData);
1037 effectNode->shaderPrepData.valid = true; // trigger reprocessing the shader code later on
1038
1039 effectNode->commands.push_back(new QSSGApplyInstanceValue);
1040
1041 // Buffers
1042 QQuick3DShaderUtilsBuffer *outputBuffer = pass->outputBuffer;
1043 if (outputBuffer) {
1044 const QByteArray &outBufferName = outputBuffer->name;
1045 if (outBufferName.isEmpty()) {
1046 // default output buffer (with settings)
1047 auto outputFormat = QQuick3DShaderUtilsBuffer::mapTextureFormat(outputBuffer->format());
1048 effectNode->commands.push_back(new QSSGBindTarget(outputFormat));
1049 effectNode->outputFormat = outputFormat;
1050 } else {
1051 // Allocate buffer command
1052 effectNode->commands.push_back(outputBuffer->cloneCommand());
1053 connect(outputBuffer, &QQuick3DShaderUtilsBuffer::changed, this, &QQuick3DEffect::onPassDirty, Qt::UniqueConnection);
1054 // bind buffer
1055 effectNode->commands.push_back(new QSSGBindBuffer(outBufferName));
1056 }
1057 } else {
1058 // Use the default output buffer, same format as the source buffer
1059 effectNode->commands.push_back(new QSSGBindTarget(QSSGRenderTextureFormat::Unknown));
1060 effectNode->outputFormat = QSSGRenderTextureFormat::Unknown;
1061 }
1062
1063 // Other commands (BufferInput, Blending ... )
1064 const auto &extraCommands = pass->m_commands;
1065 for (const auto &command : extraCommands) {
1066 const int bufferCount = command->bufferCount();
1067 for (int i = 0; i != bufferCount; ++i) {
1068 effectNode->commands.push_back(command->bufferAt(i)->cloneCommand());
1069 connect(command->bufferAt(i), &QQuick3DShaderUtilsBuffer::changed, this, &QQuick3DEffect::onPassDirty, Qt::UniqueConnection);
1070 }
1071 effectNode->commands.push_back(command->cloneCommand());
1072 }
1073
1074 effectNode->commands.push_back(new QSSGRender);
1075 }
1076 }
1077 }
1078
1079 if (m_dirtyAttributes & Dirty::PropertyDirty) {
1080 for (const auto &prop : std::as_const(effectNode->properties)) {
1081 auto p = metaObject()->property(prop.pid);
1082 if (Q_LIKELY(p.isValid()))
1083 prop.value = p.read(this);
1084 }
1085 }
1086
1087 m_dirtyAttributes = 0;
1088
1089 DebugViewHelpers::ensureDebugObjectName(effectNode, this);
1090
1091 return effectNode;
1092}
1093
1094void QQuick3DEffect::onPropertyDirty()
1095{
1096 markDirty(Dirty::PropertyDirty);
1097}
1098
1099void QQuick3DEffect::onTextureDirty()
1100{
1101 markDirty(Dirty::TextureDirty);
1102}
1103
1104void QQuick3DEffect::onPassDirty()
1105{
1106 // changed() signals not only from Passes but also from Buffers are hooked up to this.
1107 // Any property change should lead to re-evaluating the whole effect in sync.
1108 markDirty(Dirty::EffectChainDirty);
1109}
1110
1111void QQuick3DEffect::effectChainDirty()
1112{
1113 markDirty(Dirty::EffectChainDirty);
1114}
1115
1116void QQuick3DEffect::markDirty(QQuick3DEffect::Dirty type)
1117{
1118 if (!(m_dirtyAttributes & quint32(type))) {
1119 m_dirtyAttributes |= quint32(type);
1120 update();
1121 }
1122}
1123
1124void QQuick3DEffect::updateSceneManager(QQuick3DSceneManager *sceneManager)
1125{
1126 if (sceneManager) {
1127 for (const auto &it : std::as_const(m_dynamicTextureMaps)) {
1128 if (auto tex = it->texture())
1129 QQuick3DObjectPrivate::refSceneManager(tex, *sceneManager);
1130 }
1131 } else {
1132 for (const auto &it : std::as_const(m_dynamicTextureMaps)) {
1133 if (auto tex = it->texture())
1134 QQuick3DObjectPrivate::derefSceneManager(tex);
1135 }
1136 }
1137}
1138
1139void QQuick3DEffect::itemChange(QQuick3DObject::ItemChange change, const QQuick3DObject::ItemChangeData &value)
1140{
1141 if (change == QQuick3DObject::ItemSceneChange)
1142 updateSceneManager(value.sceneManager);
1143}
1144
1145void QQuick3DEffect::qmlAppendPass(QQmlListProperty<QQuick3DShaderUtilsRenderPass> *list, QQuick3DShaderUtilsRenderPass *pass)
1146{
1147 if (!pass)
1148 return;
1149
1150 QQuick3DEffect *that = qobject_cast<QQuick3DEffect *>(list->object);
1151 that->m_passes.push_back(pass);
1152
1153 connect(pass, &QQuick3DShaderUtilsRenderPass::changed, that, &QQuick3DEffect::onPassDirty);
1154 that->effectChainDirty();
1155}
1156
1157QQuick3DShaderUtilsRenderPass *QQuick3DEffect::qmlPassAt(QQmlListProperty<QQuick3DShaderUtilsRenderPass> *list, qsizetype index)
1158{
1159 QQuick3DEffect *that = qobject_cast<QQuick3DEffect *>(list->object);
1160 return that->m_passes.at(index);
1161}
1162
1163qsizetype QQuick3DEffect::qmlPassCount(QQmlListProperty<QQuick3DShaderUtilsRenderPass> *list)
1164{
1165 QQuick3DEffect *that = qobject_cast<QQuick3DEffect *>(list->object);
1166 return that->m_passes.size();
1167}
1168
1169void QQuick3DEffect::qmlPassClear(QQmlListProperty<QQuick3DShaderUtilsRenderPass> *list)
1170{
1171 QQuick3DEffect *that = qobject_cast<QQuick3DEffect *>(list->object);
1172
1173 for (QQuick3DShaderUtilsRenderPass *pass : that->m_passes)
1174 pass->disconnect(that);
1175
1176 that->m_passes.clear();
1177 that->effectChainDirty();
1178}
1179
1180void QQuick3DEffect::setDynamicTextureMap(QQuick3DShaderUtilsTextureInput *textureMap)
1181{
1182 // There can only be one texture input per property, as the texture input is a combination
1183 // of the texture used and the uniform name!
1184 auto it = m_dynamicTextureMaps.constFind(textureMap);
1185
1186 if (it == m_dynamicTextureMaps.constEnd()) {
1187 // Track the object, if it's destroyed we need to remove it from our table.
1188 connect(textureMap, &QQuick3DShaderUtilsTextureInput::destroyed, this, [this, textureMap]() {
1189 auto it = m_dynamicTextureMaps.constFind(textureMap);
1190 if (it != m_dynamicTextureMaps.constEnd())
1191 m_dynamicTextureMaps.erase(it);
1192 });
1193 m_dynamicTextureMaps.insert(textureMap);
1194
1195 update();
1196 }
1197}
1198
1199QT_END_NAMESPACE