Qt
Internal/Contributor docs for the Qt SDK. Note: These are NOT official API docs; those are found at https://doc.qt.io/
qquick3deffect.cpp
// Copyright (C) 2020 The Qt Company Ltd.
// SPDX-License-Identifier: LicenseRef-Qt-Commercial OR GPL-3.0-only
// Qt-Security score:significant reason:default

#include "qquick3deffect_p.h"

#include <ssg/qssgrendercontextcore.h>
#include <QtQuick3DRuntimeRender/private/qssgrendereffect_p.h>
#include <QtQuick3DRuntimeRender/private/qssgshadermaterialadapter_p.h>
#include <QtQuick3DUtils/private/qssgutils_p.h>
#include <QtQuick/qquickwindow.h>
#include <QtQuick3D/private/qquick3dobject_p.h>
#include <QtQuick3D/private/qquick3dscenemanager_p.h>
#include <QtCore/qfile.h>
#include <QtCore/qurl.h>


QT_BEGIN_NAMESPACE

/*!
    \qmltype Effect
    \inherits Object3D
    \inqmlmodule QtQuick3D
    \nativetype QQuick3DEffect
    \brief Base component for creating a post-processing effect.

    The Effect type allows the user to implement their own post-processing
    effects for QtQuick3D.

    \section1 Post-processing effects

    A post-processing effect is conceptually very similar to Qt Quick's \l
    ShaderEffect item. When an effect is present, the scene is rendered into a
    separate texture first. The effect is then applied by drawing a textured
    quad to the main render target, depending on the
    \l{View3D::renderMode}{render mode} of the View3D. The effect can provide a
    vertex shader, a fragment shader, or both. Effects are always applied to the
    entire scene, per View3D.

    Effects are associated with the \l SceneEnvironment in the
    \l{SceneEnvironment::effects} property. The property is a list: effects can
    be chained together; they are applied in the order they appear in the list,
    using the previous step's output as the input to the next one, with the last
    effect's output defining the contents of the View3D.

    \note \l SceneEnvironment and \l ExtendedSceneEnvironment provide a set of
    built-in effects, such as depth of field, glow/bloom, lens flare, color
    grading, and vignette. Always consider first if these are sufficient for
    the application's needs, and prefer using the built-in facilities instead
    of implementing a custom post-processing effect.

    Effects are similar to \l{CustomMaterial}{custom materials} in many
    ways. However, a custom material is associated with a model and is
    responsible for the shading of that given mesh. An effect's vertex shader,
    on the other hand, always gets a quad (that is, two triangles) as its
    input, while its fragment shader samples the texture with the scene's
    content.

    Unlike custom materials, effects support multiple passes. For many effects
    this is not necessary, and when there is a need to apply multiple effects,
    identical results can often be achieved by chaining together multiple
    effects in \l{SceneEnvironment::effects}{the SceneEnvironment}. This is
    demonstrated by the \l{Qt Quick 3D - Custom Effect Example}{Custom Effect
    example} as well. However, passes have the possibility to request additional
    color buffers (textures), and to specify which of these additional buffers
    they output to. This allows implementing more complex image processing
    techniques, since subsequent passes can then use one or more of these
    additional buffers, plus the original scene's content, as their input. If
    necessary, these additional buffers can have an extended lifetime, meaning
    their content is preserved between frames, which allows implementing
    effects that rely on accumulating content from multiple frames, such as
    motion blur.

    When compared to Qt Quick's 2D ShaderEffect, the 3D post-processing effects
    have the advantage of being able to work with depth buffer data, as well as
    the ability to implement multiple passes with intermediate buffers. In
    addition, the texture-related capabilities are extended: Qt Quick 3D allows
    more fine-grained control over filtering modes, and allows effects to work
    with texture formats other than RGBA8, for example, floating point formats.

    \note Post-processing effects are currently available when the View3D
    has its \l{View3D::renderMode}{renderMode} set to \c Offscreen,
    \c Underlay or \c Overlay. Effects will not be rendered for \c Inline mode.

    \note When using post-processing effects, the application-provided shaders
    should expect linear color data without tonemapping applied. The
    tonemapping that is performed during the main render pass (or during skybox
    rendering, if there is a skybox) when
    \l{SceneEnvironment::tonemapMode}{tonemapMode} is set to a value other than
    \c SceneEnvironment.TonemapModeNone, is automatically disabled when there
    is at least one post-processing effect specified in the SceneEnvironment.
    The last effect in the chain (more precisely, the last pass of the last
    effect in the chain) will automatically get its fragment shader amended to
    perform the same tonemapping the main render pass would.

    \note Effects that perform their own tonemapping should be used in a
    SceneEnvironment that has the built-in tonemapping disabled by setting
    \l{SceneEnvironment::tonemapMode}{tonemapMode} to \c
    SceneEnvironment.TonemapModeNone.

    \note By default the texture used as the effects' input is created with a
    floating point texture format, such as 16-bit floating point RGBA. The
    output texture's format is the same since by default it follows the input
    format. This can be overridden using \l Buffer and an empty name. The
    default RGBA16F is useful because it allows working with non-tonemapped
    linear data without having the color values outside the 0-1 range clamped.

    \section1 Exposing data to the shaders

    Like with CustomMaterial or ShaderEffect, the dynamic properties of an
    Effect object can be changed and animated using the usual QML and Qt Quick
    facilities, and the values are exposed to the shaders automatically. The
    following list shows how properties are mapped:

    \list
    \li bool, int, real -> bool, int, float
    \li QColor, \l{QtQml::Qt::rgba()}{color} -> vec4, and the color gets
    converted to linear, assuming sRGB space for the color value specified in
    QML. The built-in Qt colors, such as \c{"green"} are in sRGB color space as
    well, and the same conversion is performed for all color properties of
    DefaultMaterial and PrincipledMaterial, so this behavior of Effect
    matches those.
    \li QRect, QRectF, \l{QtQml::Qt::rect()}{rect} -> vec4
    \li QPoint, QPointF, \l{QtQml::Qt::point()}{point}, QSize, QSizeF, \l{QtQml::Qt::size()}{size} -> vec2
    \li QVector2D, \l{QtQml::Qt::vector2d()}{vector2d} -> vec2
    \li QVector3D, \l{QtQml::Qt::vector3d()}{vector3d} -> vec3
    \li QVector4D, \l{QtQml::Qt::vector4d()}{vector4d} -> vec4
    \li QMatrix4x4, \l{QtQml::Qt::matrix4x4()}{matrix4x4} -> mat4
    \li QQuaternion, \l{QtQml::Qt::quaternion()}{quaternion} -> vec4, scalar value is \c w

    \li TextureInput -> sampler2D or samplerCube, depending on whether \l
    Texture or \l CubeMapTexture is used in the texture property of the
    TextureInput. Setting the \l{TextureInput::enabled}{enabled} property to
    false leads to exposing a dummy texture to the shader, meaning the shaders
    are still functional but will sample a texture with opaque black image
    content. Pay attention to the fact that properties for samplers must always
    reference a \l TextureInput object, not a \l Texture directly. When it
    comes to the \l Texture properties, the source, tiling, and filtering
    related ones are the only ones that are taken into account implicitly with
    effects, as the rest (such as UV transformations) is up to the custom
    shaders to implement as they see fit.

    \endlist

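    As an illustration of these mappings, a hypothetical Effect could declare
    the following properties (the names here are made up for this example); the
    shader code can then rely on correspondingly typed uniforms being available
    under the same names:

    \qml
    // float uniform "strength" in the shaders
    property real strength: 0.5
    // vec4 uniform "tintColor", converted from sRGB to linear
    property color tintColor: "green"
    // vec2 uniform "offset"
    property vector2d offset: Qt.vector2d(0, 0)
    \endqml
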
    \note When a uniform referenced in the shader code does not have a
    corresponding property, it will cause a shader compilation error when
    processing the effect at run time. There are some exceptions to this,
    such as sampler uniforms, which get a dummy texture bound when no
    corresponding QML property is present, but as a general rule, all uniforms
    and samplers must have a corresponding property declared in the
    Effect object.

    \section1 Getting started with user-defined effects

    A custom post-processing effect involves at minimum an Effect object and a
    fragment shader snippet. Some effects will also want a customized vertex
    shader.

    As a simple example, let's create an effect that combines the scene's
    content with an image, while further altering the red channel's value in an
    animated manner:

    \table 70%
    \row
    \li \qml
    Effect {
        id: simpleEffect
        property TextureInput tex: TextureInput {
            texture: Texture { source: "image.png" }
        }
        property real redLevel
        NumberAnimation on redLevel { from: 0; to: 1; duration: 5000; loops: -1 }
        passes: Pass {
            shaders: Shader {
                stage: Shader.Fragment
                shader: "effect.frag"
            }
        }
    }
    \endqml
    \li \badcode
    void MAIN()
    {
        vec4 c = texture(tex, TEXTURE_UV);
        c.r *= redLevel;
        FRAGCOLOR = c * texture(INPUT, INPUT_UV);
    }
    \endcode
    \endtable

    Here the texture with the image \c{image.png} is exposed to the shader under
    the name \c tex. The value of redLevel is available in the shader in a \c
    float uniform with the same name.

    The fragment shader must contain a function called \c MAIN. The final
    fragment color is determined by \c FRAGCOLOR. The main input texture, with
    the contents of the View3D's scene, is accessible under a \c sampler2D with
    the name \c INPUT. The UV coordinates from the quad are in \c
    INPUT_UV. These UV values are always suitable for sampling \c INPUT,
    regardless of the underlying graphics API at run time (and so regardless of
    the Y axis direction in images, since the necessary adjustments are applied
    automatically by Qt Quick 3D). Sampling the texture with our external image
    is done using \c TEXTURE_UV. \c INPUT_UV is not suitable in cross-platform
    applications, since V needs to be flipped to cater for the coordinate system
    differences mentioned before, using a logic that is different for textures
    based on images and textures used as render targets. Fortunately this is all
    taken care of by the engine, so the shader needs no further logic for this.

    Once simpleEffect is available, it can be associated with the effects list
    of the View3D's SceneEnvironment:

    \qml
    environment: SceneEnvironment {
        effects: [ simpleEffect ]
    }
    \endqml

    The results would look something like the following, with the original scene
    on the left and with the effect applied on the right:

    \table 70%
    \row
    \li \image effect_intro_1.png
    {Three different 3D objects}
    \li \image effect_intro_2.png
    {Three different 3D objects with transparent Qt logo overlaid as
    fullscreen effect}
    \endtable

    \note The \c shader property value in Shader is a URL, as is customary in
    QML and Qt Quick, referencing the file containing the shader snippet, and
    works very similarly to ShaderEffect or
    \l{Image::source}{Image.source}. Only the \c file and \c qrc schemes are
    supported. It is also possible to omit the \c file scheme, which
    conveniently allows specifying a relative path. Such a path is resolved
    relative to the component's (the \c{.qml} file's) location.
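
    To illustrate, the following are all valid ways of referencing a shader
    snippet file; the paths here are only examples:

    \qml
    Shader {
        stage: Shader.Fragment
        // Relative to the .qml file's location:
        shader: "shaders/effect.frag"
        // Equivalent alternatives would be "file:shaders/effect.frag" or,
        // when using the resource system, "qrc:/shaders/effect.frag".
    }
    \endqml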

    \note Shader code is always provided using Vulkan-style GLSL, regardless of
    the graphics API used by Qt at run time.

    \note The vertex and fragment shader code provided by the effect are not
    full, complete GLSL shaders on their own. Rather, they provide a \c MAIN
    function, and optionally a set of \c VARYING declarations, which are then
    amended with further shader code by the engine.

    \note The above example is not compatible with the optional multiview
    rendering mode that is used in some VR/AR applications. To make it function
    both with and without multiview mode, change MAIN() like this:
    \badcode
    void MAIN()
    {
        vec4 c = texture(tex, TEXTURE_UV);
        c.r *= redLevel;
    #if QSHADER_VIEW_COUNT >= 2
        FRAGCOLOR = c * texture(INPUT, vec3(INPUT_UV, VIEW_INDEX));
    #else
        FRAGCOLOR = c * texture(INPUT, INPUT_UV);
    #endif
    }
    \endcode

    \section1 Effects with vertex shaders

    A vertex shader, when present, must provide a function called \c MAIN. In
    the vast majority of cases the custom vertex shader will not want to provide
    its own calculation of the homogeneous vertex position, but it is possible
    using \c POSITION, \c VERTEX, and \c MODELVIEWPROJECTION_MATRIX. When
    \c POSITION is not present in the custom shader code, a statement equivalent to
    \c{POSITION = MODELVIEWPROJECTION_MATRIX * vec4(VERTEX, 1.0);} will be
    injected automatically by Qt Quick 3D.
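
    In other words, omitting \c POSITION is equivalent to writing the following
    vertex shader snippet explicitly:

    \badcode
    void MAIN()
    {
        POSITION = MODELVIEWPROJECTION_MATRIX * vec4(VERTEX, 1.0);
    }
    \endcode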

    To pass data between the vertex and fragment shaders, use the VARYING
    keyword. Internally this will then be transformed into the appropriate
    vertex output or fragment input declaration. The fragment shader can use the
    same declaration, which then allows reading the interpolated value for the
    current fragment.

    Let's look at an example that is very similar to the built-in
    DistortionSpiral effect:

    \table 70%
    \row
    \li \badcode
    VARYING vec2 center_vec;
    void MAIN()
    {
        center_vec = INPUT_UV - vec2(0.5, 0.5);
        center_vec.y *= INPUT_SIZE.y / INPUT_SIZE.x;
    }
    \endcode
    \li \badcode
    VARYING vec2 center_vec;
    void MAIN()
    {
        float radius = 0.25;
        float dist_to_center = length(center_vec) / radius;
        vec2 texcoord = INPUT_UV;
        if (dist_to_center <= 1.0) {
            float rotation_amount = (1.0 - dist_to_center) * (1.0 - dist_to_center);
            float r = radians(360.0) * rotation_amount / 4.0;
            mat2 rotation = mat2(cos(r), sin(r), -sin(r), cos(r));
            texcoord = vec2(0.5, 0.5) + rotation * (INPUT_UV - vec2(0.5, 0.5));
        }
        FRAGCOLOR = texture(INPUT, texcoord);
    }
    \endcode
    \endtable

    The Effect object's \c passes list should now specify both the vertex and
    fragment snippets:

    \qml
    passes: Pass {
        shaders: [
            Shader {
                stage: Shader.Vertex
                shader: "effect.vert"
            },
            Shader {
                stage: Shader.Fragment
                shader: "effect.frag"
            }
        ]
    }
    \endqml

    The end result looks like the following:

    \table 70%
    \row
    \li \image effect_intro_1.png
    {Three different 3D objects}
    \li \image effect_intro_3.png
    {Warped view of three different 3D objects showing vertex shader
    effect}
    \endtable

    \section1 Special keywords in effect shaders

    \list

    \li \c VARYING - Declares a vertex output or fragment input, depending on the type of the current shader.
    \li \c MAIN - This function must always be present in an effect shader.
    \li \c FRAGCOLOR - \c vec4 - The final fragment color; the output of the fragment shader. (fragment shader only)
    \li \c POSITION - \c vec4 - The homogeneous position calculated in the vertex shader. (vertex shader only)
    \li \c MODELVIEWPROJECTION_MATRIX - \c mat4 - The transformation matrix for the screen quad.
    \li \c VERTEX - \c vec3 - The vertices of the quad; the input to the vertex shader. (vertex shader only)

    \li \c INPUT - \c sampler2D or \c sampler2DArray - The sampler for the input
    texture with the scene rendered into it, unless a pass redirects its input
    via a BufferInput object, in which case \c INPUT refers to the additional
    color buffer's texture referenced by the BufferInput. With \l{Multiview
    Rendering}{multiview rendering} enabled, which can be relevant for VR/AR
    applications, this is a sampler2DArray, while the input texture becomes a 2D
    texture array.

    \li \c INPUT_UV - \c vec2 - UV coordinates for sampling \c INPUT.

    \li \c TEXTURE_UV - \c vec2 - UV coordinates suitable for sampling a Texture
    with contents loaded from an image file.

    \li \c INPUT_SIZE - \c vec2 - The size of the \c INPUT texture, in pixels.

    \li \c OUTPUT_SIZE - \c vec2 - The size of the output buffer, in
    pixels. Often the same as \c INPUT_SIZE, unless the pass outputs to an extra
    Buffer with a size multiplier on it.

    \li \c FRAME - \c float - A frame counter, incremented after each frame in the View3D.

    \li \c DEPTH_TEXTURE - \c sampler2D or \c sampler2DArray - A depth texture
    containing the depth buffer contents for the opaque objects in the scene.
    Like with CustomMaterial, the presence of this keyword in the shader
    triggers generating the depth texture automatically.

    \li \c NORMAL_ROUGHNESS_TEXTURE - \c sampler2D - A texture with the
    world-space normals and material roughness of the opaque objects in the
    currently visible portion of the scene. Like with CustomMaterial, the
    presence of this keyword in the shader implies an additional render pass to
    generate the normal texture.

    \li \c VIEW_INDEX - \c uint - With \l{Multiview Rendering}{multiview
    rendering} enabled, this is the current view index, available in both vertex
    and fragment shaders. Always 0 when multiview rendering is not used.

    \li \c PROJECTION_MATRIX - \c mat4 - The projection matrix. Note that with
    \l{Multiview Rendering}{multiview rendering}, this is an array of matrices.

    \li \c INVERSE_PROJECTION_MATRIX - \c mat4 - The inverse projection matrix.
    Note that with \l{Multiview Rendering}{multiview rendering}, this is an array
    of matrices.

    \li \c VIEW_MATRIX - \c mat4 - The view (camera) matrix.
    Note that with \l{Multiview Rendering}{multiview rendering}, this is an array
    of matrices.

    \li \c NDC_Y_UP - \c float - The value is \c 1 when the Y axis points up in
    normalized device coordinate space, and \c{-1} when the Y axis points down.
    Y pointing down is the case when rendering happens with Vulkan.

    \li \c FRAMEBUFFER_Y_UP - \c float - The value is \c 1 when the Y axis
    points up in the coordinate system for framebuffers (textures), meaning
    \c{(0, 0)} is the bottom-left corner. The value is \c{-1} when the Y axis
    points down, \c{(0, 0)} being the top-left corner.

    \li \c NEAR_CLIP_VALUE - \c float - The value is \c -1 when the clipping
    plane range starts at \c -1 and goes to \c 1. This is true when using
    OpenGL for rendering. For other rendering backends the value of this
    property is \c 0, meaning the clipping plane range is \c 0 to \c 1.

    \endlist

    \section1 Building multi-pass effects

    A multi-pass effect often uses more than one set of shaders, and makes use
    of the \l{Pass::output}{output} and \l{Pass::commands}{commands}
    properties. Each entry in the passes list translates to a render pass
    drawing a quad into the pass's output texture, while sampling the effect's
    input texture and optionally other textures as well.

    The typical outline of a multi-pass Effect can look like the following:

    \qml
    passes: [
        Pass {
            shaders: [
                Shader {
                    stage: Shader.Vertex
                    shader: "pass1.vert"
                },
                Shader {
                    stage: Shader.Fragment
                    shader: "pass1.frag"
                }
            ],
            // This pass outputs to the intermediate texture described
            // by the Buffer object.
            output: intermediateColorBuffer
        },
        Pass {
            shaders: [
                Shader {
                    stage: Shader.Vertex
                    shader: "pass2.vert"
                },
                Shader {
                    stage: Shader.Fragment
                    shader: "pass2.frag"
                }
                // The output of the last pass needs no redirection; it is
                // the final result of the effect.
            ],
            commands: [
                // This pass reads from the intermediate texture, meaning
                // INPUT in the shader will refer to the texture associated
                // with the Buffer.
                BufferInput {
                    buffer: intermediateColorBuffer
                }
            ]
        }
    ]
    \endqml

    What is \c intermediateColorBuffer?

    \qml
    Buffer {
        id: intermediateColorBuffer
        name: "tempBuffer"
        // format: Buffer.RGBA8
        // textureFilterOperation: Buffer.Linear
        // textureCoordOperation: Buffer.ClampToEdge
    }
    \endqml

    The commented properties are not necessary if the desired values match the
    defaults.

    Internally, the presence of this Buffer object, and referencing it from the
    \c output property of a Pass, leads to creating a texture with a size
    matching the View3D, and so the size of the implicit input and output
    textures. When this is not desired, the
    \l{Buffer::sizeMultiplier}{sizeMultiplier} property can be used to get an
    intermediate texture with a different size. This can lead to the \c
    INPUT_SIZE and \c OUTPUT_SIZE uniforms in the shader having different
    values.

    By default the Effect cannot count on textures preserving their contents
    between frames. When a new intermediate texture is created, it is cleared to
    \c{vec4(0.0)}. Afterwards, the same texture can be reused for another
    purpose. Therefore, effect passes should always write to the entire texture,
    without making assumptions about their content at the start of the pass.
    There is an exception to this: Buffer objects with
    \l{Buffer::bufferFlags}{bufferFlags} set to Buffer.SceneLifetime. This
    indicates that the texture is permanently associated with a pass of the
    effect and will not be reused for other purposes. The contents of such
    color buffers are preserved between frames. This is typically used in a
    ping-pong fashion in effects like motion blur: the first pass takes the
    persistent buffer as its input, in addition to the effect's main input
    texture, outputting to another intermediate buffer, while the second pass
    outputs to the persistent buffer. This way in the first frame the first pass
    samples an empty (transparent) texture, whereas in subsequent frames it
    samples the output of the second pass from the previous frame. A third pass
    can then blend the effect's input and the second pass's output together.
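
    A persistent buffer for such an accumulation scheme can be declared as
    follows; the id and name are only examples:

    \qml
    Buffer {
        id: persistentBuffer
        name: "persistentBuffer"
        // Contents are preserved between frames, and the texture is not
        // reused for other purposes by the engine.
        bufferFlags: Buffer.SceneLifetime
    }
    \endqml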

    The BufferInput command type is used to expose custom texture buffers to the
    render pass.

    For instance, to access \c someBuffer in the render pass shaders under
    the name \c mySampler, the following can be added to its command list:
    \qml
    BufferInput { buffer: someBuffer; sampler: "mySampler" }
    \endqml

    If the \c sampler name is not specified, \c INPUT is used by default.

    Buffers can be useful to share intermediate results between render passes.

    To expose preloaded textures to the effect, TextureInput should be used instead.
    These can be defined as properties of the Effect itself, and will automatically
    be accessible to the shaders by their property names.
    \qml
    property TextureInput tex: TextureInput {
        texture: Texture { source: "image.png" }
    }
    \endqml

    Here \c tex is a valid sampler in all shaders of all the passes of the
    effect.

    When it comes to uniform values from properties, all passes in the Effect
    read the same values in their shaders. If necessary, it is possible to
    override the value of a uniform just for a given pass. This is achieved by
    adding the \l SetUniformValue command to the list of commands for the pass.

    \note The \l{SetUniformValue::target}{target} of the pass-specific uniform
    value setter can only refer to a name that is the name of a property of the
    effect. It can override the value for a property's corresponding uniform,
    but it cannot introduce new uniforms.
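
    For example, assuming the effect has a \c redLevel property, as in the
    introductory example, a pass could override the uniform's value just for
    its own shaders:

    \qml
    Pass {
        shaders: Shader {
            stage: Shader.Fragment
            shader: "effect.frag"
        }
        commands: [
            // Only this pass sees 0.25; other passes read the current value
            // of the QML property as usual.
            SetUniformValue {
                target: "redLevel"
                value: 0.25
            }
        ]
    }
    \endqml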

    \section1 Performance considerations

    Be aware of the increased resource usage and potentially reduced performance
    when using post-processing effects. Just like with Qt Quick layers and
    ShaderEffect, rendering the scene into a texture and then using that to
    texture a quad is not a cheap operation, especially on low-end hardware with
    limited fragment processing power. The amount of additional graphics memory
    needed, as well as the increase in GPU load, both depend on the size of the
    View3D (which, on embedded devices without a windowing system, may often be
    as big as the screen resolution). Multi-pass effects, as well as applying
    multiple effects, increase the resource and performance requirements further.

    Therefore, it is highly advisable to ensure early on in the development
    lifecycle that the targeted device and graphics stack are able to cope with
    the effects included in the design of the 3D scene at the final product's
    screen resolution.

    While unavoidable with techniques that need it, \c DEPTH_TEXTURE implies an
    additional rendering pass to generate the contents of that texture, which
    can be costly on less capable hardware. Therefore, use \c DEPTH_TEXTURE in
    the effect's shaders only when essential.

    The complexity of the operations in the shaders is also important. Just like
    with CustomMaterial, a sub-optimal fragment shader can easily lead to
    reduced rendering performance.

    Be cautious with \l{Buffer::sizeMultiplier}{sizeMultiplier in Buffer} when
    values larger than 1 are involved. For example, a multiplier of 4 means
    creating and then rendering to a texture that is 4 times the size of the
    View3D. Just like with shadow maps and multi- or supersampling, the
    increased resource and performance costs can quickly outweigh the benefits
    from better quality on systems with limited GPU power.

    \section1 VR/AR considerations

    When developing applications for virtual or augmented reality by using Qt
    Quick 3D XR, post-processing effects are functional and available to use.
    However, designers and developers should take special care to understand
    which and what kind of effects make sense in a virtual reality environment.
    Some effects, including some of the built-in ones in
    ExtendedSceneEnvironment or the deprecated Effects module, do not lead to a
    good visual experience in a VR environment, and may even affect the user
    physically (causing, for example, motion sickness or dizziness).

    When the more efficient \l{Multiview Rendering}{multiview rendering mode} is
    enabled in a VR/AR application, there is no separate render pass for the
    left and right eye contents. Instead, it all happens in one pass, using a 2D
    texture array with two layers instead of two independent 2D textures. This
    also means that many intermediate buffers, meaning color or depth textures,
    will need to become texture arrays in this mode. This has implications for
    custom materials and post-processing effects. Textures such as the input
    texture (\c INPUT) and the depth texture (\c DEPTH_TEXTURE) become 2D
    texture arrays, exposed in the shader as a \c sampler2DArray instead of a
    \c sampler2D. This affects GLSL functions such as texture(), textureLod(),
    or textureSize(): the UV coordinate is then a vec3, not a vec2, and
    textureSize() returns a vec3, not a vec2. Effects intended to function
    regardless of the rendering mode can be written with an appropriate ifdef:
    \badcode
    #if QSHADER_VIEW_COUNT >= 2
    vec4 c = texture(INPUT, vec3(INPUT_UV, VIEW_INDEX));
    #else
    vec4 c = texture(INPUT, INPUT_UV);
    #endif
    \endcode

    It can also be useful to define macros that handle both cases. For example:
    \badcode
    #if QSHADER_VIEW_COUNT >= 2
    #define SAMPLE_INPUT(uv) texture(INPUT, vec3(uv, VIEW_INDEX))
    #define SAMPLE_DEPTH(uv) texture(DEPTH_TEXTURE, vec3(uv, VIEW_INDEX)).r
    #define PROJECTION PROJECTION_MATRIX[VIEW_INDEX]
    #define INVERSE_PROJECTION INVERSE_PROJECTION_MATRIX[VIEW_INDEX]
    #else
    #define SAMPLE_INPUT(uv) texture(INPUT, uv)
    #define SAMPLE_DEPTH(uv) texture(DEPTH_TEXTURE, uv).r
    #define PROJECTION PROJECTION_MATRIX
    #define INVERSE_PROJECTION INVERSE_PROJECTION_MATRIX
    #endif
    \endcode

    This does not apply to \c NORMAL_ROUGHNESS_TEXTURE, which is always a 2D
    texture, even when multiview rendering is active:
    \badcode
    #define SAMPLE_NORMAL(uv) normalize(texture(NORMAL_ROUGHNESS_TEXTURE, uv).rgb)
    \endcode

    \note The presence of keywords such as \c DEPTH_TEXTURE triggers additional
    render passes, and uniforms such as \c INVERSE_PROJECTION_MATRIX are
    calculated and set whenever the keyword is present anywhere in the shader
    snippet. This costs both performance and resources. Hence it is recommended
    to only add such #defines when the textures and matrices will really be
    used in the effect.

    \sa Shader, Pass, Buffer, BufferInput, {Qt Quick 3D - Custom Effect Example}
*/

/*!
    \qmlproperty list Effect::passes
    Contains a list of render \l {Pass}{passes} implemented by the effect.
*/

QQuick3DEffect::QQuick3DEffect(QQuick3DObject *parent)
    : QQuick3DObject(*(new QQuick3DObjectPrivate(QQuick3DObjectPrivate::Type::Effect)), parent)
{
}

QQmlListProperty<QQuick3DShaderUtilsRenderPass> QQuick3DEffect::passes()
{
    return QQmlListProperty<QQuick3DShaderUtilsRenderPass>(this,
                                                           nullptr,
                                                           QQuick3DEffect::qmlAppendPass,
                                                           QQuick3DEffect::qmlPassCount,
                                                           QQuick3DEffect::qmlPassAt,
                                                           QQuick3DEffect::qmlPassClear);
}

// Default vertex and fragment shader code that is used when no corresponding
// Shader is present in the Effect. These go through the usual processing so
// should use the user-facing builtins.

static const char *default_effect_vertex_shader =
        "void MAIN()\n"
        "{\n"
        "}\n";

static const char *default_effect_fragment_shader =
        "void MAIN()\n"
        "{\n"
        "#if QSHADER_VIEW_COUNT >= 2\n"
        "    FRAGCOLOR = texture(INPUT, vec3(INPUT_UV, VIEW_INDEX));\n"
        "#else\n"
        "    FRAGCOLOR = texture(INPUT, INPUT_UV);\n"
        "#endif\n"
        "}\n";
673
674static inline void insertVertexMainArgs(QByteArray &snippet)
675{
676 static const char *argKey = "/*%QT_ARGS_MAIN%*/";
677 const int argKeyLen = int(strlen(argKey));
678 const int argKeyPos = snippet.indexOf(argKey);
679 if (argKeyPos >= 0)
680 snippet = snippet.left(argKeyPos) + QByteArrayLiteral("inout vec3 VERTEX") + snippet.mid(argKeyPos + argKeyLen);
681}
682
683static inline void resetShaderDependentEffectFlags(QSSGRenderEffect *effectNode)
684{
685 effectNode->setFlag(QSSGRenderEffect::Flags::UsesDepthTexture, false);
686 effectNode->setFlag(QSSGRenderEffect::Flags::UsesProjectionMatrix, false);
687 effectNode->setFlag(QSSGRenderEffect::Flags::UsesInverseProjectionMatrix, false);
688 effectNode->setFlag(QSSGRenderEffect::Flags::UsesViewMatrix, false);
689 effectNode->setFlag(QSSGRenderEffect::Flags::UsesNormalTexture, false);
690 effectNode->setFlag(QSSGRenderEffect::Flags::UsesMotionVectorTexture, false);
691}
692
693static inline void accumulateEffectFlagsFromShader(QSSGRenderEffect *effectNode, const QSSGCustomShaderMetaData &meta)
694{
695 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesDepthTexture))
696 effectNode->setFlag(QSSGRenderEffect::Flags::UsesDepthTexture);
697 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesProjectionMatrix))
698 effectNode->setFlag(QSSGRenderEffect::Flags::UsesProjectionMatrix);
699 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesInverseProjectionMatrix))
700 effectNode->setFlag(QSSGRenderEffect::Flags::UsesInverseProjectionMatrix);
701 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesViewMatrix))
702 effectNode->setFlag(QSSGRenderEffect::Flags::UsesViewMatrix);
703 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesNormalTexture))
704 effectNode->setFlag(QSSGRenderEffect::Flags::UsesNormalTexture);
705 if (meta.flags.testFlag(QSSGCustomShaderMetaData::UsesMotionVectorTexture))
706 effectNode->setFlag(QSSGRenderEffect::Flags::UsesMotionVectorTexture);
707}
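The reset/accumulate pair above implements a simple OR-accumulation: flags are cleared once, then every pass's shader metadata contributes its bits, so a feature used by any pass stays flagged on the effect node. A Qt-free sketch with plain bitmasks (all names here are illustrative), using a table instead of the repeated if/setFlag pairs:

```cpp
#include <cassert>
#include <cstdint>
#include <utility>

// Stand-ins for the shader-metadata and effect-node flag types.
enum MetaFlag : uint32_t { MetaDepth = 1, MetaProj = 2, MetaView = 4 };
enum EffectFlag : uint32_t { FxDepth = 1, FxProj = 2, FxView = 4 };

// OR each metadata bit's corresponding effect bit into the accumulated flags.
uint32_t accumulate(uint32_t effectFlags, uint32_t metaFlags)
{
    static constexpr std::pair<uint32_t, uint32_t> table[] = {
        { MetaDepth, FxDepth }, { MetaProj, FxProj }, { MetaView, FxView },
    };
    for (const auto &[meta, fx] : table)
        if (metaFlags & meta)
            effectFlags |= fx;
    return effectFlags;
}
```

Called once per pass with the previous result, bits only ever get added, matching the behavior of `accumulateEffectFlagsFromShader()` after `resetShaderDependentEffectFlags()`.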
708
709QSSGRenderGraphObject *QQuick3DEffect::updateSpatialNode(QSSGRenderGraphObject *node)
710{
711 using namespace QSSGShaderUtils;
712
713 const auto &renderContext = QQuick3DObjectPrivate::get(this)->sceneManager->wattached->rci();
714 if (!renderContext) {
715 qWarning("QQuick3DEffect: No render context interface?");
716 return nullptr;
717 }
718
719 QSSGRenderEffect *effectNode = static_cast<QSSGRenderEffect *>(node);
720 bool newBackendNode = false;
721 if (!effectNode) {
722 effectNode = new QSSGRenderEffect;
723 newBackendNode = true;
724 }
725
726 bool shadersOrBuffersMayChange = false;
727 if (m_dirtyAttributes & Dirty::EffectChainDirty)
728 shadersOrBuffersMayChange = true;
729
730 const bool fullUpdate = newBackendNode || effectNode->incompleteBuildTimeObject || (m_dirtyAttributes & Dirty::TextureDirty);
731
732 if (fullUpdate || shadersOrBuffersMayChange) {
733 markAllDirty();
734
735 // Need to clear the old list with properties and textures first.
736 effectNode->properties.clear();
737 effectNode->textureProperties.clear();
738
739 QMetaMethod propertyDirtyMethod;
740 const int idx = metaObject()->indexOfSlot("onPropertyDirty()");
741 if (idx != -1)
742 propertyDirtyMethod = metaObject()->method(idx);
743
744 // Properties -> uniforms
745 QSSGShaderCustomMaterialAdapter::StringPairList uniforms;
746 QSSGShaderCustomMaterialAdapter::StringPairList multiViewDependentSamplers;
747 const int propCount = metaObject()->propertyCount();
748 int propOffset = metaObject()->propertyOffset();
749
750 // An Effect can have a multilayered inheritance structure, so find the actual propOffset
751 const QMetaObject *superClass = metaObject()->superClass();
752 while (superClass && qstrcmp(superClass->className(), "QQuick3DEffect") != 0) {
753 propOffset = superClass->propertyOffset();
754 superClass = superClass->superClass();
755 }
756
757 using TextureInputProperty = QPair<QQuick3DShaderUtilsTextureInput *, const char *>;
758
759 QVector<TextureInputProperty> textureProperties; // We'll deal with these later
760 for (int i = propOffset; i != propCount; ++i) {
761 const QMetaProperty property = metaObject()->property(i);
762 if (Q_UNLIKELY(!property.isValid()))
763 continue;
764
765 const auto name = property.name();
766 QMetaType propType = property.metaType();
767 QVariant propValue = property.read(this);
768 if (propType == QMetaType(QMetaType::QVariant))
769 propType = propValue.metaType();
770
771 if (propType.id() >= QMetaType::User) {
772 if (propType.id() == qMetaTypeId<QQuick3DShaderUtilsTextureInput *>()) {
773 if (QQuick3DShaderUtilsTextureInput *texture = property.read(this).value<QQuick3DShaderUtilsTextureInput *>())
774 textureProperties.push_back({texture, name});
775 }
776 } else if (propType == QMetaType(QMetaType::QObjectStar)) {
777 if (QQuick3DShaderUtilsTextureInput *texture = qobject_cast<QQuick3DShaderUtilsTextureInput *>(propValue.value<QObject *>()))
778 textureProperties.push_back({texture, name});
779 } else {
780 const auto type = uniformType(propType);
781 if (type != QSSGRenderShaderValue::Unknown) {
782 uniforms.append({ uniformTypeName(propType), name });
783 effectNode->properties.push_back({ name, uniformTypeName(propType),
784 propValue, uniformType(propType), i});
785 // Track the property changes
786 if (fullUpdate) {
787 if (property.hasNotifySignal() && propertyDirtyMethod.isValid())
788 connect(this, property.notifySignal(), this, propertyDirtyMethod);
789 } // else already connected
790 } else {
791 // ### figure out how _not_ to warn when there are no dynamic
792 // properties defined (because warnings like Blah blah objectName etc. are not helpful)
793 //qWarning("No known uniform conversion found for effect property %s. Skipping", property.name());
794 }
795 }
796 }
797
798 const auto processTextureProperty = [&](QQuick3DShaderUtilsTextureInput &texture, const QByteArray &name) {
799 QSSGRenderEffect::TextureProperty texProp;
800 QQuick3DTexture *tex = texture.texture(); // may be null if the TextureInput has no 'texture' set
801 if (fullUpdate) {
802 connect(&texture, &QQuick3DShaderUtilsTextureInput::enabledChanged, this, &QQuick3DEffect::onTextureDirty);
803 connect(&texture, &QQuick3DShaderUtilsTextureInput::textureChanged, this, &QQuick3DEffect::onTextureDirty);
804 } // else already connected
805 texProp.name = name;
806 if (texture.enabled && tex)
807 texProp.texImage = tex->getRenderImage();
808
809 texProp.shaderDataType = QSSGRenderShaderValue::Texture;
810
811 if (tex) {
812 texProp.minFilterType = tex->minFilter() == QQuick3DTexture::Nearest ? QSSGRenderTextureFilterOp::Nearest
813 : QSSGRenderTextureFilterOp::Linear;
814 texProp.magFilterType = tex->magFilter() == QQuick3DTexture::Nearest ? QSSGRenderTextureFilterOp::Nearest
815 : QSSGRenderTextureFilterOp::Linear;
816 texProp.mipFilterType = tex->generateMipmaps() ? (tex->mipFilter() == QQuick3DTexture::Nearest ? QSSGRenderTextureFilterOp::Nearest
817 : QSSGRenderTextureFilterOp::Linear)
818 : QSSGRenderTextureFilterOp::None;
819 texProp.horizontalClampType = tex->horizontalTiling() == QQuick3DTexture::Repeat ? QSSGRenderTextureCoordOp::Repeat
820 : (tex->horizontalTiling() == QQuick3DTexture::ClampToEdge ? QSSGRenderTextureCoordOp::ClampToEdge
821 : QSSGRenderTextureCoordOp::MirroredRepeat);
822 texProp.verticalClampType = tex->verticalTiling() == QQuick3DTexture::Repeat ? QSSGRenderTextureCoordOp::Repeat
823 : (tex->verticalTiling() == QQuick3DTexture::ClampToEdge ? QSSGRenderTextureCoordOp::ClampToEdge
824 : QSSGRenderTextureCoordOp::MirroredRepeat);
825 texProp.zClampType = tex->depthTiling() == QQuick3DTexture::Repeat ? QSSGRenderTextureCoordOp::Repeat
826 : (tex->depthTiling() == QQuick3DTexture::ClampToEdge) ? QSSGRenderTextureCoordOp::ClampToEdge
827 : QSSGRenderTextureCoordOp::MirroredRepeat;
828 }
829
830 // Knowing upfront that a sampler2D needs to be a sampler2DArray in
831 // the multiview-compatible version of the shader is not trivial.
832 // Consider: we know the list of TextureInputs, without any
833 // knowledge about the usage of those textures. Intermediate buffers
834 // (textures) also have a default constructed (no source, no source
835 // item, no texture data) Texture set. What indicates that these are
836 // used as intermediate buffers is the 'output' property of a Pass,
837 // referencing a Buffer object (an object we otherwise do not
838 // track) whose 'name' matches the TextureInput property name.
839 // The list of passes may vary dynamically, and some Passes may not
840 // be listed at any point in time if the effect has an
841 // ubershader-ish design. Thus one can have TextureInputs that are
842 // not associated with a Buffer (when scanning through the Passes),
843 // and so we cannot just check the 'output'-referenced Buffers to
844 // decide if a TextureInput's Texture needs to be treated specially
845 // in the generated shader code. (and the type must be correct even
846 // for, from our perspective, "unused" samplers since they are still
847 // in the shader code, and will get a dummy texture bound)
848 //
849 // Therefore, in the absence of more sophisticated options, we just
850 // look at the TextureInput's texture, and if it is something along
851 // the lines of
852 // property TextureInput intermediateColorBuffer1: TextureInput { texture: Texture { } }
853 // then it is added to the special list, indicating that the type is
854 // sampler2D or sampler2DArray, depending on the rendering mode the
855 // shader is targeting.
856
857 if (tex && !tex->hasSourceData()) {
858 multiViewDependentSamplers.append({ QByteArrayLiteral("sampler2D"), name }); // the type may get adjusted later
859 } else {
860 if (tex && QQuick3DObjectPrivate::get(tex)->type == QQuick3DObjectPrivate::Type::ImageCube)
861 uniforms.append({ QByteArrayLiteral("samplerCube"), name });
862 else if (tex && tex->textureData() && tex->textureData()->depth() > 0)
863 uniforms.append({ QByteArrayLiteral("sampler3D"), name });
864 else
865 uniforms.append({ QByteArrayLiteral("sampler2D"), name });
866 }
867
868 effectNode->textureProperties.push_back(texProp);
869 };
870
871 // Textures
872 for (const auto &property : std::as_const(textureProperties))
873 processTextureProperty(*property.first, property.second);
874
875 if (effectNode->incompleteBuildTimeObject) { // This object came from the shadergen tool
876 const auto names = dynamicPropertyNames();
877 for (const auto &name : names) {
878 QVariant propValue = property(name.constData());
879 QMetaType propType = propValue.metaType();
880 if (propType == QMetaType(QMetaType::QVariant))
881 propType = propValue.metaType();
882
883 if (propType.id() >= QMetaType::User) {
884 if (propType.id() == qMetaTypeId<QQuick3DShaderUtilsTextureInput *>()) {
885 if (QQuick3DShaderUtilsTextureInput *texture = propValue.value<QQuick3DShaderUtilsTextureInput *>())
886 textureProperties.push_back({texture, name});
887 }
888 } else if (propType.id() == QMetaType::QObjectStar) {
889 if (QQuick3DShaderUtilsTextureInput *texture = qobject_cast<QQuick3DShaderUtilsTextureInput *>(propValue.value<QObject *>()))
890 textureProperties.push_back({texture, name});
891 } else {
892 const auto type = uniformType(propType);
893 if (type != QSSGRenderShaderValue::Unknown) {
894 uniforms.append({ uniformTypeName(propType), name });
895 effectNode->properties.push_back({ name, uniformTypeName(propType),
896 propValue, uniformType(propType), -1 /* aka. dynamic property */});
897 // We don't need to track property changes
898 } else {
899 // ### figure out how _not_ to warn when there are no dynamic
900 // properties defined (because warnings like Blah blah objectName etc. are not helpful)
901 qWarning("No known uniform conversion found for effect property %s. Skipping", name.constData());
902 }
903 }
904 }
905
906 for (const auto &property : std::as_const(textureProperties))
907 processTextureProperty(*property.first, property.second);
908 }
909
910 // built-ins
911 uniforms.append({ "mat4", "qt_modelViewProjection" });
912 uniforms.append({ "vec2", "qt_inputSize" });
913 uniforms.append({ "vec2", "qt_outputSize" });
914 uniforms.append({ "float", "qt_frame_num" });
915 uniforms.append({ "float", "qt_fps" });
916 uniforms.append({ "vec2", "qt_cameraProperties" });
917 uniforms.append({ "float", "qt_normalAdjustViewportFactor" });
918 uniforms.append({ "float", "qt_nearClipValue" });
919 uniforms.append({ "vec4", "qt_rhi_properties" });
920
921 // qt_inputTexture is not listed in uniforms, will be added by prepareCustomShader()
922 // since the name and type varies between non-multiview and multiview mode
923
924 QSSGShaderCustomMaterialAdapter::StringPairList builtinVertexInputs;
925 builtinVertexInputs.append({ "vec3", "attr_pos" });
926 builtinVertexInputs.append({ "vec2", "attr_uv" });
927
928 QSSGShaderCustomMaterialAdapter::StringPairList builtinVertexOutputs;
929 builtinVertexOutputs.append({ "vec2", "qt_inputUV" });
930 builtinVertexOutputs.append({ "vec2", "qt_textureUV" });
931 builtinVertexOutputs.append({ "flat uint", "qt_viewIndex" });
932
933 // fragOutput is added automatically by the program generator
934
935 resetShaderDependentEffectFlags(effectNode);
936
937 if (!m_passes.isEmpty()) {
938 const QQmlContext *context = qmlContext(this);
939 effectNode->resetCommands();
940 for (QQuick3DShaderUtilsRenderPass *pass : std::as_const(m_passes)) {
941 // Have a key composed more or less of the vertex and fragment filenames.
942 // The shaderLibraryManager uses stage+shaderPathKey as the key.
943 // Thus shaderPathKey is then sufficient to look up both the vertex and fragment shaders later on.
944 // Note that this key is not suitable as a unique key for the graphics resources because the same
945 // set of shader files can be used in multiple different passes, or in multiple active effects.
946 // But that's the effect system's problem.
947 QByteArray shaderPathKey("effect pipeline--");
948 QSSGRenderEffect::ShaderPrepPassData passData;
949 for (QQuick3DShaderUtilsShader::Stage stage : { QQuick3DShaderUtilsShader::Stage::Vertex, QQuick3DShaderUtilsShader::Stage::Fragment }) {
950 QQuick3DShaderUtilsShader *shader = nullptr;
951 for (QQuick3DShaderUtilsShader *s : pass->m_shaders) {
952 if (s->stage == stage) {
953 shader = s;
954 break;
955 }
956 }
957
958 // just how many enums does one need for the exact same thing...
959 QSSGShaderCache::ShaderType type = QSSGShaderCache::ShaderType::Vertex;
960 if (stage == QQuick3DShaderUtilsShader::Stage::Fragment)
961 type = QSSGShaderCache::ShaderType::Fragment;
962
963 // Will just use the custom material infrastructure. Some
964 // substitutions are common between custom materials and effects.
965 //
966 // Substitutions relevant to us here:
967 // MAIN -> qt_customMain
968 // FRAGCOLOR -> fragOutput
969 // POSITION -> gl_Position
970 // MODELVIEWPROJECTION_MATRIX -> qt_modelViewProjection
971 // DEPTH_TEXTURE -> qt_depthTexture
972 // ... other things shared with custom material
973 //
974 // INPUT -> qt_inputTexture
975 // INPUT_UV -> qt_inputUV
976 // ... other effect specifics
977 //
978 // Built-in uniforms, inputs and outputs will be baked into
979 // metadata comment blocks in the resulting source code.
980 // Same goes for inputs/outputs declared with VARYING.
981
982 QByteArray code;
983 if (shader) {
984 code = QSSGShaderUtils::resolveShader(shader->shader, context, shaderPathKey); // appends to shaderPathKey
985 } else {
986 if (!shaderPathKey.isEmpty())
987 shaderPathKey.append('>');
988 shaderPathKey += "DEFAULT";
989 if (type == QSSGShaderCache::ShaderType::Vertex)
990 code = default_effect_vertex_shader;
991 else
992 code = default_effect_fragment_shader;
993 }
994
995 for (auto pathKeyIndex : { QSSGRenderCustomMaterial::RegularShaderPathKeyIndex, QSSGRenderCustomMaterial::MultiViewShaderPathKeyIndex }) {
996 QSSGShaderCustomMaterialAdapter::ShaderCodeAndMetaData result;
997 QSSGShaderCustomMaterialAdapter::CustomShaderPrepWorkData scratch;
998
999 QSSGShaderCustomMaterialAdapter::beginPrepareCustomShader(
1000 &scratch,
1001 &result,
1002 code,
1003 type,
1004 pathKeyIndex == QSSGRenderCustomMaterial::RegularShaderPathKeyIndex ? false : true);
1005
1006 QSSGShaderCustomMaterialAdapter::StringPairList multiViewDependentUniforms;
1007 if (result.second.flags.testFlag(QSSGCustomShaderMetaData::UsesProjectionMatrix)
1008 || result.second.flags.testFlag(QSSGCustomShaderMetaData::UsesInverseProjectionMatrix))
1009 {
1010 multiViewDependentUniforms.append({ "mat4", "qt_projectionMatrix" });
1011 multiViewDependentUniforms.append({ "mat4", "qt_inverseProjectionMatrix" });
1012 }
1013
1014 multiViewDependentUniforms.append({ "mat4", "qt_viewMatrix" });
1015
1016 accumulateEffectFlagsFromShader(effectNode, result.second);
1017
1018 QSSGShaderCustomMaterialAdapter::finishPrepareCustomShader(
1019 &result.first, // effectively appends the QQ3D_SHADER_META block
1020 scratch,
1021 result,
1022 type,
1023 pathKeyIndex == QSSGRenderCustomMaterial::RegularShaderPathKeyIndex ? false : true,
1024 uniforms,
1025 type == QSSGShaderCache::ShaderType::Vertex ? builtinVertexInputs : builtinVertexOutputs,
1026 type == QSSGShaderCache::ShaderType::Vertex ? builtinVertexOutputs : QSSGShaderCustomMaterialAdapter::StringPairList(),
1027 multiViewDependentSamplers,
1028 multiViewDependentUniforms);
1029
1030 if (type == QSSGShaderCache::ShaderType::Vertex) {
1031 // qt_customMain() has an argument list which gets injected here
1032 insertVertexMainArgs(result.first);
1033 passData.vertexShaderCode[pathKeyIndex] = result.first;
1034 passData.vertexMetaData[pathKeyIndex] = result.second;
1035 } else {
1036 passData.fragmentShaderCode[pathKeyIndex] = result.first;
1037 passData.fragmentMetaData[pathKeyIndex] = result.second;
1038 }
1039 }
1040 }
1041
1042 effectNode->commands.push_back(nullptr); // will be changed to QSSGBindShader in finalizeShaders
1043 passData.bindShaderCmdIndex = effectNode->commands.size() - 1;
1044
1045 // finalizing the shader code happens in a separate step later on by the backend node
1046 passData.shaderPathKeyPrefix = shaderPathKey;
1047 effectNode->shaderPrepData.passes.append(passData);
1048 effectNode->shaderPrepData.valid = true; // trigger reprocessing the shader code later on
1049
1050 effectNode->commands.push_back(new QSSGApplyInstanceValue);
1051
1052 // Buffers
1053 QQuick3DShaderUtilsBuffer *outputBuffer = pass->outputBuffer;
1054 if (outputBuffer) {
1055 const QByteArray &outBufferName = outputBuffer->name;
1056 if (outBufferName.isEmpty()) {
1057 // default output buffer (with settings)
1058 auto outputFormat = QQuick3DShaderUtilsBuffer::mapTextureFormat(outputBuffer->format());
1059 effectNode->commands.push_back(new QSSGBindTarget(outputFormat));
1060 effectNode->outputFormat = outputFormat;
1061 } else {
1062 // Allocate buffer command
1063 effectNode->commands.push_back(outputBuffer->cloneCommand());
1064 connect(outputBuffer, &QQuick3DShaderUtilsBuffer::changed, this, &QQuick3DEffect::onPassDirty, Qt::UniqueConnection);
1065 // bind buffer
1066 effectNode->commands.push_back(new QSSGBindBuffer(outBufferName));
1067 }
1068 } else {
1069 // Use the default output buffer, same format as the source buffer
1070 effectNode->commands.push_back(new QSSGBindTarget(QSSGRenderTextureFormat::Unknown));
1071 effectNode->outputFormat = QSSGRenderTextureFormat::Unknown;
1072 }
1073
1074 // Other commands (BufferInput, Blending ... )
1075 const auto &extraCommands = pass->m_commands;
1076 for (const auto &command : extraCommands) {
1077 const int bufferCount = command->bufferCount();
1078 for (int i = 0; i != bufferCount; ++i) {
1079 effectNode->commands.push_back(command->bufferAt(i)->cloneCommand());
1080 connect(command->bufferAt(i), &QQuick3DShaderUtilsBuffer::changed, this, &QQuick3DEffect::onPassDirty, Qt::UniqueConnection);
1081 }
1082 effectNode->commands.push_back(command->cloneCommand());
1083 }
1084
1085 effectNode->commands.push_back(new QSSGRender);
1086 }
1087 }
1088 }
1089
1090 if (m_dirtyAttributes & Dirty::PropertyDirty) {
1091 for (const auto &prop : std::as_const(effectNode->properties)) {
1092 auto p = metaObject()->property(prop.pid);
1093 if (Q_LIKELY(p.isValid()))
1094 prop.value = p.read(this);
1095 }
1096 }
1097
1098 m_dirtyAttributes = 0;
1099
1100 DebugViewHelpers::ensureDebugObjectName(effectNode, this);
1101
1102 return effectNode;
1103}
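`updateSpatialNode()` opens with the scene graph's standard create-or-reuse idiom: the previous frame's backend node is handed back (or null), and the frontend either reuses it or allocates a fresh one, remembering which case it hit so signal connections are made only once. A minimal sketch of that idiom with illustrative types (not Qt API):

```cpp
#include <cassert>
#include <cstddef>

// Stand-in for the backend render object.
struct BackendNode { int generation = 0; };

// Reuse the node if one exists, otherwise allocate; report which case via isNew
// so one-time setup (e.g. signal connections) can be gated on it.
BackendNode *updateNode(BackendNode *node, bool *isNew)
{
    *isNew = (node == nullptr);
    if (!node)
        node = new BackendNode;
    ++node->generation; // per-frame update work goes here
    return node;
}
```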
1104
1105void QQuick3DEffect::onPropertyDirty()
1106{
1107 markDirty(Dirty::PropertyDirty);
1108}
1109
1110void QQuick3DEffect::onTextureDirty()
1111{
1112 markDirty(Dirty::TextureDirty);
1113}
1114
1115void QQuick3DEffect::onPassDirty()
1116{
1117 // changed() signals from not just Passes but also Buffers are hooked up to this.
1118 // Any property change should lead to re-evaluating the whole effect in sync.
1119 markDirty(Dirty::EffectChainDirty);
1120}
1121
1122void QQuick3DEffect::effectChainDirty()
1123{
1124 markDirty(Dirty::EffectChainDirty);
1125}
1126
1127void QQuick3DEffect::markDirty(QQuick3DEffect::Dirty type)
1128{
1129 if (!(m_dirtyAttributes & quint32(type))) {
1130 m_dirtyAttributes |= quint32(type);
1131 update();
1132 }
1133}
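`markDirty()` above is a dirty-bit coalescing pattern: each category sets one bit, and the update request is issued only when a bit transitions from clear to set, so repeated notifications for the same category schedule a single update. A self-contained sketch (names illustrative; `updateRequests` stands in for the `update()` call):

```cpp
#include <cassert>
#include <cstdint>

struct DirtyTracker {
    uint32_t dirty = 0;
    int updateRequests = 0; // counts how often an update would be scheduled

    // Request an update only on the clear-to-set transition of this bit.
    void markDirty(uint32_t bit) {
        if (!(dirty & bit)) {
            dirty |= bit;
            ++updateRequests;
        }
    }
};
```

Clearing `dirty` at the end of the synchronization step (as `m_dirtyAttributes = 0` does in `updateSpatialNode()`) re-arms the transitions for the next frame.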
1134
1135void QQuick3DEffect::updateSceneManager(QQuick3DSceneManager *sceneManager)
1136{
1137 if (sceneManager) {
1138 for (const auto &it : std::as_const(m_dynamicTextureMaps)) {
1139 if (auto tex = it->texture())
1140 QQuick3DObjectPrivate::refSceneManager(tex, *sceneManager);
1141 }
1142 } else {
1143 for (const auto &it : std::as_const(m_dynamicTextureMaps)) {
1144 if (auto tex = it->texture())
1145 QQuick3DObjectPrivate::derefSceneManager(tex);
1146 }
1147 }
1148}
1149
1150void QQuick3DEffect::itemChange(QQuick3DObject::ItemChange change, const QQuick3DObject::ItemChangeData &value)
1151{
1152 if (change == QQuick3DObject::ItemSceneChange)
1153 updateSceneManager(value.sceneManager);
1154}
1155
1156void QQuick3DEffect::qmlAppendPass(QQmlListProperty<QQuick3DShaderUtilsRenderPass> *list, QQuick3DShaderUtilsRenderPass *pass)
1157{
1158 if (!pass)
1159 return;
1160
1161 QQuick3DEffect *that = qobject_cast<QQuick3DEffect *>(list->object);
1162 that->m_passes.push_back(pass);
1163
1164 connect(pass, &QQuick3DShaderUtilsRenderPass::changed, that, &QQuick3DEffect::onPassDirty);
1165 that->effectChainDirty();
1166}
1167
1168QQuick3DShaderUtilsRenderPass *QQuick3DEffect::qmlPassAt(QQmlListProperty<QQuick3DShaderUtilsRenderPass> *list, qsizetype index)
1169{
1170 QQuick3DEffect *that = qobject_cast<QQuick3DEffect *>(list->object);
1171 return that->m_passes.at(index);
1172}
1173
1174qsizetype QQuick3DEffect::qmlPassCount(QQmlListProperty<QQuick3DShaderUtilsRenderPass> *list)
1175{
1176 QQuick3DEffect *that = qobject_cast<QQuick3DEffect *>(list->object);
1177 return that->m_passes.size();
1178}
1179
1180void QQuick3DEffect::qmlPassClear(QQmlListProperty<QQuick3DShaderUtilsRenderPass> *list)
1181{
1182 QQuick3DEffect *that = qobject_cast<QQuick3DEffect *>(list->object);
1183
1184 for (QQuick3DShaderUtilsRenderPass *pass : that->m_passes)
1185 pass->disconnect(that);
1186
1187 that->m_passes.clear();
1188 that->effectChainDirty();
1189}
1190
1191void QQuick3DEffect::setDynamicTextureMap(QQuick3DShaderUtilsTextureInput *textureMap)
1192{
1193 // There can only be one texture input per property, as the texture input is a combination
1194 // of the texture used and the uniform name!
1195 auto it = m_dynamicTextureMaps.constFind(textureMap);
1196
1197 if (it == m_dynamicTextureMaps.constEnd()) {
1198 // Track the object, if it's destroyed we need to remove it from our table.
1199 connect(textureMap, &QQuick3DShaderUtilsTextureInput::destroyed, this, [this, textureMap]() {
1200 auto it = m_dynamicTextureMaps.constFind(textureMap);
1201 if (it != m_dynamicTextureMaps.constEnd())
1202 m_dynamicTextureMaps.erase(it);
1203 });
1204 m_dynamicTextureMaps.insert(textureMap);
1205
1206 update();
1207 }
1208}
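`setDynamicTextureMap()` uses a self-cleanup pattern: tracked objects get a destroyed() connection that removes them from the owner's set, so the set never holds a dangling pointer. A Qt-free sketch where a destructor-invoked callback stands in for the destroyed() signal (all names illustrative):

```cpp
#include <cassert>
#include <functional>
#include <set>

// Stand-in for the tracked object; the callback mimics the destroyed() signal.
struct Tracked {
    std::function<void(Tracked *)> onDestroyed;
    ~Tracked() { if (onDestroyed) onDestroyed(this); }
};

struct Owner {
    std::set<Tracked *> tracked;

    // Insert once; on first insertion, arrange for removal at destruction.
    void track(Tracked *t) {
        if (tracked.insert(t).second)
            t->onDestroyed = [this](Tracked *dead) { tracked.erase(dead); };
    }
};
```

In the real code the lambda is connected to `QQuick3DShaderUtilsTextureInput::destroyed`, which gives the same lifetime guarantee without manual unregistration.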
1209
1210QT_END_NAMESPACE