qtquick3d-architecture.qdoc
Internal/Contributor docs for the Qt SDK. Note: These are NOT official API docs; those are found at https://doc.qt.io/
// Copyright (C) 2020 The Qt Company Ltd.
// SPDX-License-Identifier: LicenseRef-Qt-Commercial OR GFDL-1.3-no-invariants-only

/*!
\page qtquick3d-architecture.html
\title Qt Quick 3D Architecture
\brief An overview of the architecture of Qt Quick 3D
\ingroup explanations-2dand3dgraphics

Qt Quick 3D extends Qt Quick to support the rendering of 3D content. It adds
extensive functionality, including several new public QML imports, as well as
a new internal scene graph and renderer. This document describes the
architecture of Qt Quick 3D from the public API to the details of how the
rendering pipeline works.

\section1 Module Overview

Qt Quick 3D consists of several modules and plugins that expose the
additional 3D APIs as well as utilities for conditioning and importing
existing 3D assets.

\section2 QML Imports

\list
\li QtQuick3D - The main import, which contains all the core components of
Qt Quick 3D
\li \l{QtQuick3D.AssetUtils QML Types}{QtQuick3D.AssetUtils} - A library for importing 3D assets at runtime
\li \l{Qt Quick 3D Helpers QML Types}{QtQuick3D.Helpers} - A library of additional components that help with
designing and debugging 3D scenes.
\endlist

\section2 C++ Libraries

\list
\li \l{Qt Quick 3D C++ Classes}{QtQuick3D} - The only public C++ module.
Contains the definitions of all types exposed to the QtQuick3D QML import,
as well as a few C++ APIs
\list
\li QQuick3DGeometry - Subclass to create procedural mesh data
\li QQuick3DTextureData - Subclass to create procedural texture data
\li QQuick3D::idealSurfaceFormat - used to get the ideal surface format
\endlist
\li \c QtQuick3DAssetImport - An internal and private library to aid in
importing assets and converting them to QML.
\li \c QtQuick3DRuntimeRender - An internal and private library that
contains the spatial scene graph nodes and renderer.
\li \c QtQuick3DUtils - An internal and private library used as a common
utility library by all of the other C++ modules.
\endlist
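
As a quick illustration of the public C++ surface, QQuick3D::idealSurfaceFormat
can be applied before creating any window so the graphics context is configured
appropriately. A minimal sketch (the sample count and QML file path are
illustrative only):

\code
#include <QGuiApplication>
#include <QQuickView>
#include <QQuick3D>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Ask Qt Quick 3D for a suitable surface format (here with a
    // hypothetical 4x multisampling) and make it the default.
    QSurfaceFormat::setDefaultFormat(QQuick3D::idealSurfaceFormat(4));

    QQuickView view;
    view.setSource(QUrl(QStringLiteral("qrc:/main.qml"))); // illustrative path
    view.show();
    return app.exec();
}
\endcode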

\section2 AssetImporters Plugins
The asset import tooling is implemented using a plugin-based architecture. The
plugins shipped with Qt Quick 3D extend the functionality of the asset importer
library and tool, \l{Balsam Asset Import Tool}{Balsam}.
\list
\li Assimp - This plugin uses the third-party library Assimp to convert
3D assets in 3D interchange formats to Qt Quick 3D QML components.
\endlist

\section1 How does Qt Quick 3D fit into the Qt Graphics Stack

\image quick3d-graphics-stack.drawio.svg

The above diagram illustrates how Qt Quick 3D fits into the larger Qt
graphics stack. Qt Quick 3D works as an extension to the 2D Qt Quick API, and
when using 3D scene items in conjunction with View3D the scene will be
rendered via the Qt Rendering Hardware Interface (RHI). The RHI will
translate API calls into the correct native rendering hardware API calls for
a given platform. The diagram above shows the options available for
each platform. If no native backend is explicitly defined, then Qt Quick will
default to a sensible native backend for rendering on each platform.

The integration between the Qt Quick 3D components of the stack and the Qt Quick
stack is described in the next sections.

\section1 3D in 2D Integration

Displaying 3D content in 2D is the primary purpose of the Qt Quick 3D API. The
primary interface for integrating 3D content into 2D is the View3D component.

The View3D component works like any other QQuickItem derived class with
content and implements the virtual function QQuickItem::updatePaintNode. Qt
Quick calls updatePaintNode for all "dirty" items in the Qt Quick scenegraph
during the synchronization phase. This includes the 3D items managed by a
View3D, which also undergo their synchronization phase as a result of the
updatePaintNode call.

The updatePaintNode method of View3D performs the following actions:
\list
\li Set up a renderer and render target if one doesn't exist already
\li Synchronize items in the 3D scene via SceneManager
\li Update any "dynamic" textures that were rendered by Qt Quick (\l {Texture Path}{2D in 3D Texture path} below)
\endlist

The rendering of the 3D scene, however, does not occur in the View3D
updatePaintNode method. Instead, updatePaintNode returns a QSGNode subclass
containing the renderer for Qt Quick 3D, which will render the 3D scene during
the preprocess phase of the Qt Quick render process.

The plumbing for how Qt Quick 3D will render depends on which
View3D::renderMode is used:

\section2 Offscreen

The default mode for View3D is \l {View3D::renderMode}{Offscreen}. When using offscreen mode
View3D becomes a texture provider by creating an offscreen surface and
rendering to it. This surface can be mapped as a texture in Qt Quick and
rendered with a QSGSimpleTextureNode.

This pattern is very close to how QSGLayerNode already works in Qt Quick.

\section2 Underlay

When using the \l {View3D::renderMode}{Underlay} mode the 3D scene is rendered directly to the
QQuickWindow containing the View3D. Rendering occurs in response to the signal
QQuickWindow::beforeRenderPassRecording(), which means that everything else in
Qt Quick will be rendered on top of the 3D content.

\section2 Overlay

When using the \l {View3D::renderMode}{Overlay} mode the 3D scene is rendered directly to the
QQuickWindow containing the View3D. Rendering occurs in response to the signal
QQuickWindow::afterRenderPassRecording(), which means that the 3D content will
be rendered on top of all other Qt Quick content.

\section2 Inline

The \l {View3D::renderMode}{Inline} render mode uses QSGRenderNode, which enables direct
rendering to Qt Quick's render target without using an offscreen surface. It
does this by injecting the render commands inline during the 2D rendering of
the Qt Quick scene.

This mode can be problematic because it uses the same depth buffer as the
Qt Quick renderer, and z values mean completely different things in Qt Quick
versus Qt Quick 3D.
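
Selecting between these modes is a single property assignment on the View3D.
A minimal sketch (the scene content and colors are illustrative only):

\code
View3D {
    anchors.fill: parent
    // Offscreen is the default; Underlay skips the offscreen texture
    // and draws the 3D scene behind all 2D Qt Quick content.
    renderMode: View3D.Underlay

    PerspectiveCamera { z: 300 }
    DirectionalLight { }
    Model {
        source: "#Cube"
        materials: PrincipledMaterial { baseColor: "steelblue" }
    }
}
\endcode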

\section1 2D in 3D Integration

When rendering a 3D scene, there are many scenarios where there is a need to
embed 2D elements into 3D. There are two different ways to integrate 2D
content inside of 3D scenes, each of which has its own path to get to the
screen.

\section2 Direct Path

The direct path is used to render 2D Qt Quick content as if it existed as a
flat item in the 3D scene. For example, consider the following scene
definition:

\code
Node {
    Text {
        text: "Hello world!"
    }
}
\endcode

What happens here is: when a QQuickItem-derived child component is set on
a spatial node, it is first wrapped by a
QQuick3DItem2D, which is just a container that adds 3D coordinates to a 2D item.
This sets the base 3D transformation for how all further 2D children are
rendered so that they appear correctly in the 3D scene.

When the time comes to render the scene, these 2D items' QSGNodes are passed to
the Qt Quick renderer to generate the appropriate render commands. Because the
commands are done inline and take the current 3D transformation into
consideration, they are rendered exactly the same as in the 2D renderer, but
show up as if they were rendered in 3D.

The drawback of this approach is that no lighting information of the 3D scene
can be used to shade the 2D content, because the Qt Quick 2D renderer has no
concept of lighting.

\section2 Texture Path

The texture path uses a 2D Qt Quick scene to create dynamic texture
content. Consider the following Texture definition:

\code
Texture {
    sourceItem: Item {
        width: 256
        height: 256
        Text {
            anchors.centerIn: parent
            text: "Hello World!"
        }
    }
}
\endcode

This approach works in the same way that Layer items work in Qt Quick, in that
everything is rendered to an offscreen surface the size of the top-level Item,
and that offscreen surface is then usable as a texture that can be reused
elsewhere.

This Texture can then be used by materials in the scene to render Qt Quick
content on items.
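
For instance, such a Texture can be assigned to a material map so the live 2D
content appears on the surface of a model. A hedged sketch (the choice of
built-in mesh and material is illustrative only):

\code
Model {
    source: "#Rectangle"
    materials: PrincipledMaterial {
        baseColorMap: Texture {
            sourceItem: Item {
                width: 256; height: 256
                Text { anchors.centerIn: parent; text: "Hello World!" }
            }
        }
    }
}
\endcode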

\section1 Scene Synchronization

\section2 Scene Manager

The scene manager in Qt Quick 3D is responsible for keeping track of spatial
items in a 3D scene, and for making sure that items are updating their
corresponding scene graph nodes during the synchronize phase. In Qt Quick,
this role is performed by QQuickWindow for the 2D case. The scene manager is
the primary interface between the frontend nodes and the backend scene graph
objects.

Each View3D item will have at least one scene manager, as one is created and
associated with the built-in scene root on construction. When spatial nodes
are added as children of the View3D, they are registered with the View3D's
scene manager. When using an imported scene, a second SceneManager is created
(or referenced if one exists already) to manage the nodes that are not direct
children of the View3D. This is needed because, unlike the View3D, an
imported scene doesn't exist on a QQuickWindow until it is referenced. The
additional SceneManager makes sure that assets belonging to the imported
scene are created at least once per QQuickWindow they are referenced in.

While the scene manager is an internal API, it is important to know that the
scene manager is responsible for calling updateSpatialNode on all objects that
have been marked dirty via a call to the update() method.

\section2 Frontend/Backend Synchronization

The objective of synchronization is to make sure that the states set on the
frontend (Qt Quick) match what is set on the backend (Qt Quick Spatial Scene
Graph Renderer). By default the frontend and backend live in separate threads:
the frontend in the Qt main thread, and the backend in Qt Quick's render thread. The
synchronization phase is where the main thread and render thread can safely
exchange data. During this phase, the scene manager will call updateSpatialNode for each dirty
node in the scene. This will either create a new backend node or update an
existing one for use by the renderer.

\section2 Qt Quick Spatial Scene Graph

Qt Quick 3D is designed to use the same frontend/backend separation pattern
as Qt Quick: frontend objects are controlled by the Qt Quick engine, while
backend objects contain state data for rendering the scene. Frontend objects
inherit from QObject and are exposed to the Qt Quick engine. Items in QML
source files map directly to frontend objects.

As the properties of these frontend objects are updated, one or more backend nodes
are created and placed into a scenegraph. Because rendering 3D scenes
involves a lot more state than rendering 2D, there is a separate set of specialized scene
graph nodes for representing the state of the 3D scene objects.
This scene graph is known as the Qt Quick Spatial Scene Graph.

Both the frontend objects and backend nodes can be categorized into two classes.
The first are spatial, in the sense that they exist somewhere in 3D space.
What this means in practice is that each of these types contains a transform
matrix. For spatial items the parent-child relationship is significant because
each child item inherits the transform of its parents.

The other class of items are resources. Resource items do not have a position
in 3D space, but rather are just state that is used by other items. There can
be a parent-child relationship between these items, but it has no other meaning
than ownership.

Unlike the 2D scene graph in Qt Quick, the spatial scene graph exposes resource
nodes to the user directly. For example, in Qt Quick, while QSGTexture is
public API, there is no QQuickItem that exposes this object directly. Instead
the user must either use an Image item, which describes both where the texture
comes from as well as how to render it, or write C++ code to operate on the
QSGTexture itself. In Qt Quick 3D these resources are exposed directly in the
QML API. This is necessary because resources are an important part of the scene
state. These resources can be referenced by many objects in the scene: for
example, many Materials could use the same Texture. It is also possible to
set properties of a Texture at runtime that directly change how the texture
is sampled.
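
To illustrate the point, a single Texture resource can be declared once and
referenced from several materials. A sketch (the image file name and meshes
are hypothetical):

\code
Texture {
    id: sharedTexture
    source: "wood.png" // hypothetical image file
}
Model {
    source: "#Cube"
    materials: PrincipledMaterial { baseColorMap: sharedTexture }
}
Model {
    source: "#Sphere"
    materials: DefaultMaterial { diffuseMap: sharedTexture }
}
\endcode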

\section3 Spatial Objects

All spatial objects are subclasses of the Node component, which contains the
properties defining the position, rotation, and scale in 3D space.

\list
\li \l [QtQuick3D QML] {Node}{Node}
\li \l [QtQuick3D QML] {Light}{Light}
\list
\li DirectionalLight
\li PointLight
\li SpotLight
\endlist
\li \l [QtQuick3D QML] {Camera}{Camera}
\list
\li PerspectiveCamera
\li OrthographicCamera
\li FrustumCamera
\li CustomCamera
\endlist
\li \l [QtQuick3D QML] {Model}{Model}
\li Loader3D
\li Repeater3D
\li \l [QtQuick3D QML] {Skeleton}{Skeleton}
\li \l [QtQuick3D QML] {Joint}{Joint}
\endlist

\section3 Resource Objects

Resource objects are subclasses of the Object3D component. Object3D is just a
QObject subclass with some special helpers for use with the scene manager.
Resource objects do have parent/child associations, but these are mostly useful
for resource ownership.

\list
\li \l [QtQuick3D QML] {Texture}{Texture}
\li \l [QtQuick3D QML] {TextureData}{TextureData}
\li \l [QtQuick3D QML] {Geometry}{Geometry}
\li \l [QtQuick3D QML] {Material}{Material}
\list
\li DefaultMaterial
\li PrincipledMaterial
\li CustomMaterial
\endlist
\li \l [QtQuick3D QML] {Effect}{Effect}
\li SceneEnvironment
\endlist

\section3 View3D and Render Layers

With regard to the frontend/backend separation, View3D is the separation
point from the user perspective because a View3D is what defines what scene
content to render. In the Qt Quick Spatial Scene Graph, the root node for a
scene that will be rendered is a Layer node. Layer nodes are created by the
View3D using a combination of the View3D's properties and the properties
of the SceneEnvironment. When rendering a scene for a View3D, it is this Layer
node that is passed to the renderer to render a scene.

\section1 Scene Rendering

\image qtquick3d-rendergraph.drawio.svg

\section2 Set up Render Target

The first step in the rendering process is to determine and set up the scene
render target. Depending on which properties are set in the SceneEnvironment,
the actual render target will vary. The first decision is whether content is
being rendered directly to a window surface, or to an offscreen texture.
By default, View3D will render to an offscreen texture. When using
post-processing effects, rendering to an offscreen texture is mandatory.

Once a scene render target is determined, some global state is set:
\list
\li window size - if rendering to a window
\li viewport - the size of the scene area being rendered
\li scissor rect - the subset of a window that the viewport should be
clipped to
\li clear color - what color to clear the render target with, if any.
\endlist
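
The clear color and background behavior, for instance, come from the
SceneEnvironment. A minimal sketch:

\code
View3D {
    environment: SceneEnvironment {
        // Clear the render target with a solid color before drawing.
        backgroundMode: SceneEnvironment.Color
        clearColor: "skyblue"
    }
}
\endcode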

\section2 Prepare for Render

The next stage of rendering is the prepare stage, where the renderer does the
housekeeping needed to figure out what should be rendered for a given frame,
and to make sure that all necessary resources are available and up to date.

The prepare stage itself has two phases: the high-level preparation of
determining what is to be rendered and what resources are needed; and the
low-level preparation that uses RHI to actually set up rendering pipelines and
buffers, as well as setting up the rendering dependencies of the main scene pass.

\section3 High level render preparation

The purpose of this phase is to extract the state of the spatial scene graph
into something that can be used to create render commands. The overview here is
that the renderer is creating lists of geometry and material combinations to
render from the perspective of a single camera with a set of lighting states.

The first thing that is done is to determine the global common state for all
content. If the SceneEnvironment defines a \l {SceneEnvironment::lightProbe}{lightProbe}, then it checks if the
environment map associated with that light probe texture is loaded, and if it's
not, a new environment map is loaded or generated. The generation of
an environment map will itself be a set of passes to convolve the source texture
into a cube map. This cube map will contain both specular reflection information
as well as irradiance, which is used for material shading.

Next, the renderer needs to determine which camera in the
scene to use. If an active camera is not explicitly defined by a View3D, the
first camera available in the scene is used. If there are no cameras
in the scene, then no content is rendered and the renderer bails out.

With a camera determined, it is possible to calculate the projection matrix
for this frame. The calculation is done at this point because each renderable
needs to know how to be projected. This also means that it is now possible to
calculate which renderable items should be rendered. Starting with the list of
all renderable items, we remove all items that are not visible because they
are either disabled or completely transparent. Then, if frustum culling is
enabled on the active camera, each renderable item is checked to see if it is
completely outside of the view of the camera's frustum, and if so it is
removed from the renderable list.

In addition to the camera projection, the camera direction is also calculated,
as this is necessary for lighting calculations in the shading code.

If there are light nodes in the scene, these are then gathered into a list
capped at the maximum number of lights the renderer supports. If more light
nodes exist in the scene than the renderer supports, any additional
light nodes over that limit are ignored and don't contribute to the lighting of
the scene. It is possible to specify the scope of light nodes, but note that
even when setting a scope the lighting state of each light is still sent to
every material which has lighting; for lights not in scope the brightness
will be set to 0, so in practice those lights will not contribute to the
lighting of those materials.
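
Light scoping is set through the Light's scope property. A sketch (the node
ids and content are illustrative only):

\code
Node {
    id: scopedSubtree
    Model { source: "#Sphere"; materials: PrincipledMaterial { } }
}
PointLight {
    // Only illuminates scopedSubtree and its children; the light state
    // still reaches other materials, but with brightness forced to 0.
    scope: scopedSubtree
}
\endcode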

Now, with a hopefully shorter list of renderables, each of these items needs to
be updated to reflect the current state of the scene. For each renderable we
check that a suitable material is loaded, and if not, a new one is created.
A material is a combination of shaders and a rendering pipeline, and it is needed
for creating a draw call. In addition, the renderer makes sure that any
resources needed to render a renderable are loaded, for example geometry and
textures set on the Model. Resources that are not loaded already are
loaded here.

The renderables list is then sorted into three lists:
\list
\li Opaque items: these are sorted from front to back, or in other words
from items that are closest to the camera to items that are furthest from the
camera. This is done to take advantage of hardware occlusion culling or
early z detection in the fragment shader.
\li 2D items: these are Qt Quick items that are rendered by the Qt Quick
renderer.
\li Transparent items: these are sorted from back to front, or in other
words from items that are farthest from the camera to items that are nearest
to the camera. This is done because transparent items need to be blended
with all items that are behind them.
\endlist

\section3 Low Level render preparation

Now that everything that needs to be considered for this frame has been
determined, the plumbing and dependencies for the main render pass can be
addressed. The first thing that is done in this phase is to render any
pre-passes that are required for the main pass.

\list
\li Render DepthPass - Certain features like Screen Space Ambient Occlusion
and Shadowing require a depth pre-pass. This pass consists of all opaque
items being rendered to a depth texture.

\li Render SSAOPass - The objective of the Screen Space Ambient Occlusion
pass is to generate an ambient occlusion texture. This texture is used
later by materials to darken certain areas when shading.

\li Render ShadowPasses - Each light in the scene that has shadows enabled
contributes an additional shadow pass. There are two different shadowing
techniques employed by the renderer, so depending on the light types there
will be different passes. When rendering shadows from a directional light,
the scene is rendered to a 2D occlusion texture from a combination of the
directional light's direction and the size of the camera frustum. When
rendering shadows from a point or spot light, the light's occlusion texture is
a cube map representing the occlusion contribution relative to each face
direction of the light.

\li Render ScreenTexture - This pass will only occur when using a
CustomMaterial that requires a screen texture, which can be used for
rendering techniques such as refraction. This pass works like a depth pass,
but instead renders all opaque items to a color texture.
\endlist

After the dependency renders are done, the rest of the passes are prepared but
not rendered. This preparation involves taking the state gathered in the
high-level prep stage and translating it to graphics primitives: creating and
updating uniform buffer values, associating samplers with dependency
textures, setting up shader resource bindings, and everything else involved in
creating a pipeline state necessary for performing a draw call.

\section2 Scene Rendering

Now that the hard work of preparation is done, the easy part is running the
commands that contribute to the main scene's content. That rendering works
in this order:

\list
\li Clear Pass - This isn't really a pass, but depending on what
backgroundMode is set on SceneEnvironment, different things can happen here.
If the background mode is either transparent or color, then the color buffer
will be cleared with either transparency or the color specified. If, however,
the background mode is set to SkyBox, then a pass will be run that renders
the SkyBox from the perspective of the camera, which will also fill the buffer
with initial data.

\li Opaque Pass - Next, all opaque items will be drawn. This just involves
setting the pipeline state and running the draw command for each item in
list order, since the items are already sorted at this point.

\li 2D Pass - If there are any 2D items in the scene, then the Qt Quick
renderer is invoked to generate the render commands necessary to render
those items.

\li Transparent Pass - Finally, the transparent items in the scene are
rendered one by one in the same manner as the opaque items.
\endlist

This concludes the rendering of the scene.

\section2 Post-Processing

If any post-processing functionality is enabled, then it can be assumed that the
result of the scene renderer was a texture that is an input for the
post-processing phase. All post-processing methods are additional passes that
operate on this scene input texture.

All steps of the post-processing phase are optional, and if no built-in
features and no user-defined effects are enabled, the output of the scene
render is what is used by the final render target. Note, however, that
\l{ExtendedSceneEnvironment::tonemapMode}{tonemapping} is enabled by default.

\image qtquick3d-postprocess-graph.drawio.svg

\section3 Built-in Post-Processing

\l ExtendedSceneEnvironment and its parent type \l SceneEnvironment offer the
most common effects used in 3D scenes, as well as tonemapping that is used to
map the high dynamic range color values generated by the renderer to the 0-1
LDR range. The effects include depth of field, glow/bloom, lens flare,
vignette, color adjustment and grading, fog, and ambient occlusion.

\section3 Post-Processing Effects

Applications can specify their own custom post-processing effects as an ordered
list in the SceneEnvironment::effects property. When this list is non-empty,
the effects in it are applied \e before the built-in effects provided by \l
ExtendedSceneEnvironment. Each post-processing effect is part of a chain such
that the output of the previous effect is the input for the next. The first
effect in this chain gets its input directly from the output of the scene
renderer step. It is also possible for effects to access the depth texture
output of the scene renderer.

Each effect in this process can consist of multiple sub-passes, which means it
is possible to render content into intermediate buffers. The final pass of a
multi-pass effect is expected to output a single texture containing the color
data to be used by the next steps of the post-processing phase.
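
Chaining custom effects amounts to listing them in order on the
SceneEnvironment. A sketch (the effect ids are hypothetical):

\code
environment: SceneEnvironment {
    // Applied in order: the output of desaturateEffect becomes the
    // input of vignetteEffect, before any built-in effects run.
    effects: [ desaturateEffect, vignetteEffect ]
}
\endcode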

\section3 Temporal and Progressive Antialiasing

The Temporal and Progressive antialiasing steps are optionally enabled by
setting properties in the SceneEnvironment. While not strictly a part of the
post-processing phase, the actual results of Temporal and Progressive
antialiasing are realized during the post-processing phase.

Temporal antialiasing is performed when a scene is being actively updated.
When enabled, the active camera makes very small adjustments to the camera
direction for each frame while drawing the scene. The current frame is then
blended with the previously rendered frame to smooth out what was rendered.

Progressive antialiasing is only performed when a scene is not being updated.
When enabled, an update is forced and the current state of the scene is
rendered with very small adjustments to the active camera's direction. Up to 8
frames are accumulated and blended together with pre-defined weights. This has
the effect of smoothing out a non-animating scene, but comes at a
performance cost because several extra frames will be rendered for each update.
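
Both techniques are enabled from the SceneEnvironment. A minimal sketch:

\code
environment: SceneEnvironment {
    // Temporal AA while the scene is animating...
    temporalAAEnabled: true
    // ...and progressive AA once the scene settles.
    antialiasingMode: SceneEnvironment.ProgressiveAA
    antialiasingQuality: SceneEnvironment.High
}
\endcode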

\section3 Super Sampling Antialiasing (SSAA)

Super Sampling Antialiasing is a brute-force way of smoothing out a scene. It
works by rendering to a texture that is a multiple of the requested size of
the scene, and then downsampling it to the target size. For example,
if 2X SSAA is requested, then the scene would be rendered to a texture
that is 2 times the intended size and then downsampled as part of this
phase. This can have a huge impact on performance and resource usage, so it
should be avoided if possible. It's also possible for the View3D size to be
too large to use this method, since the texture needed for this method may be
larger than what is supported by the rendering hardware.

*/