// Copyright (C) 2020 The Qt Company Ltd.
// SPDX-License-Identifier: LicenseRef-Qt-Commercial OR GFDL-1.3-no-invariants-only

/*!
\page qtquick3d-architecture.html
\title Qt Quick 3D Architecture
\brief An overview of the architecture of Qt Quick 3D
\ingroup explanations-2dand3dgraphics

Qt Quick 3D extends Qt Quick to support the rendering of 3D content. It adds
extensive functionality, including several new public QML imports, as well as
a new internal scene graph and renderer. This document describes the
architecture of Qt Quick 3D from the public API to the details of how the
rendering pipeline works.

\section1 Module Overview

Qt Quick 3D consists of several modules and plugins that expose the
additional 3D APIs as well as utilities for conditioning and importing
existing 3D assets.

\section2 QML Imports

\list
\li QtQuick3D - The main import, which contains all the core components of
    Qt Quick 3D.
\li \l{QtQuick3D.AssetUtils QML Types}{QtQuick3D.AssetUtils} - A library for importing 3D assets at runtime.
\li \l{Qt Quick 3D Helpers QML Types}{QtQuick3D.Helpers} - A library of additional components that can be
    used to help design and debug 3D scenes.
\endlist
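
For orientation, a QML document using these modules starts with the
corresponding import statements (a minimal sketch; the types named in the
comments are examples of what each module provides):

\code
import QtQuick
import QtQuick3D            // View3D, Model, materials, lights, cameras
import QtQuick3D.AssetUtils // for example, the RuntimeLoader type
import QtQuick3D.Helpers    // for example, AxisHelper and WasdController
\endcode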

\section2 C++ Libraries

\list
\li \l{Qt Quick 3D C++ Classes}{QtQuick3D} - The only public C++ module.
    It contains the definitions of all types exposed to the QtQuick3D QML import,
    as well as a few C++ APIs:
    \list
    \li QQuick3DGeometry - Subclass to create procedural mesh data
    \li QQuick3DTextureData - Subclass to create procedural texture data
    \li QQuick3D::idealSurfaceFormat - used to get the ideal surface format
    \endlist
\li \c QtQuick3DAssetImport - An internal and private library that aids in
    importing assets and converting them to QML.
\li \c QtQuick3DRuntimeRender - An internal and private library that
    contains the spatial scene graph nodes and the renderer.
\li \c QtQuick3DUtils - An internal and private library used as a common
    utility library by all of the other C++ modules.
\endlist

\section2 AssetImporters Plugins

The asset import tooling is implemented using a plugin-based architecture. The
plugins shipped with Qt Quick 3D extend the functionality of the asset importer
library and tool, \l{Balsam Asset Import Tool}{Balsam}.

\list
\li Assimp - This plugin uses the third-party library libAssimp to convert
    3D assets in 3D interchange formats to Qt Quick 3D QML components.
\endlist

\section1 How does Qt Quick 3D fit into the Qt Graphics Stack

\image quick3d-graphics-stack.drawio.svg
{Qt Quick 3D graphics stack architecture diagram}

The above diagram illustrates how Qt Quick 3D fits into the larger Qt
graphics stack. Qt Quick 3D works as an extension to the 2D Qt Quick API, and
when using 3D scene items in conjunction with View3D the scene will be
rendered via the Qt Rendering Hardware Interface (RHI). The RHI will
translate API calls into the correct native rendering hardware API calls for
a given platform. The diagram above shows the options available for
each platform. If no native backend is explicitly defined, then Qt Quick will
default to a sensible native backend for rendering on each platform.

The integration between the Qt Quick 3D components of the stack and the Qt Quick
stack is described in the sections below.

\section1 3D in 2D Integration

Displaying 3D content in 2D is the primary purpose of the Qt Quick 3D API. The
primary interface for integrating 3D content into 2D is the View3D component.
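
A minimal View3D looks like the following sketch (the scene content, a
built-in cube with a light and a camera, is purely illustrative):

\code
import QtQuick
import QtQuick3D

Item {
    View3D {
        anchors.fill: parent

        PerspectiveCamera { z: 300 }
        DirectionalLight { }

        Model {
            source: "#Cube"
            materials: PrincipledMaterial { baseColor: "green" }
        }
    }
}
\endcode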

The View3D component works like any other QQuickItem-derived class with
content and implements the virtual function QQuickItem::updatePaintNode. Qt
Quick calls updatePaintNode for all "dirty" items in the Qt Quick scenegraph
during the synchronization phase. This includes the 3D items managed by a
View3D, which also undergo their synchronization phase as a result of the
updatePaintNode call.

The updatePaintNode method of View3D performs the following actions:
\list
\li Sets up a renderer and render target if one doesn't exist already
\li Synchronizes items in the 3D scene via the SceneManager
\li Updates any "dynamic" textures that were rendered by Qt Quick (see \l {Texture Path}{2D in 3D Texture path} below)
\endlist

The rendering of the 3D scene, however, does not occur in the View3D
updatePaintNode method. Instead, updatePaintNode returns a QSGNode subclass
containing the renderer for Qt Quick 3D, which will render the 3D scene during
the preprocess phase of the Qt Quick render process.

The plumbing for how Qt Quick 3D renders depends on which
View3D::renderMode is used:

\section2 Offscreen

The default mode for View3D is \l {View3D::renderMode}{Offscreen}. When using offscreen mode,
View3D becomes a texture provider by creating an offscreen surface and
rendering to it. This surface can be mapped as a texture in Qt Quick and
rendered with a QSGSimpleTextureNode.

This pattern is very close to how QSGLayerNodes already work in Qt Quick.

\section2 Underlay

When using the \l {View3D::renderMode}{Underlay} mode, the 3D scene is rendered directly to the
QQuickWindow containing the View3D. Rendering occurs as a result of the signal
QQuickWindow::beforeRenderPassRecording(), which means that everything else in
Qt Quick will be rendered on top of the 3D content.

\section2 Overlay

When using the \l {View3D::renderMode}{Overlay} mode, the 3D scene is rendered directly to the
QQuickWindow containing the View3D. Rendering occurs as a result of the signal
QQuickWindow::afterRenderPassRecording(), which means that the 3D content will
be rendered on top of all other Qt Quick content.

\section2 Inline

The \l {View3D::renderMode}{Inline} render mode uses QSGRenderNode, which enables direct
rendering to Qt Quick's render target without using an offscreen surface. It
does this by injecting the render commands inline during the 2D rendering of
the Qt Quick scene.

This mode can be problematic because it uses the same depth buffer as the
Qt Quick renderer, and z values mean completely different things in Qt Quick
than in Qt Quick 3D.
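
The render mode is selected directly on the View3D. A small sketch (Offscreen
is the default and is shown here only for illustration):

\code
View3D {
    anchors.fill: parent
    renderMode: View3D.Offscreen // or View3D.Underlay, View3D.Overlay, View3D.Inline
    // ... scene content ...
}
\endcode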

\section1 2D in 3D Integration

When rendering a 3D scene, there are many scenarios where there is a need to
embed 2D elements into 3D. There are two different ways to integrate 2D
content inside of 3D scenes, each of which has its own path to get to the
screen.

\section2 Direct Path

The direct path is used to render 2D Qt Quick content as if it existed as a
flat item in the 3D scene. For example, consider the following scene
definition:

\code
Node {
    Text {
        text: "Hello world!"
    }
}
\endcode

What happens here is that when a child of type QQuickItem is set on
a spatial node, it is first wrapped by a
QQuick3DItem2D, which is just a container that adds 3D coordinates to a 2D item.
This sets the base 3D transformation for how all further 2D children are
rendered so that they appear correctly in the 3D scene.

When the time comes to render the scene, these 2D items' QSGNodes are passed to
the Qt Quick renderer to generate the appropriate render commands. Because the
commands are recorded inline and take the current 3D transformation into
consideration, the items are rendered exactly the same as in the 2D renderer, but
show up as if they were rendered in 3D.

The drawback of this approach is that no lighting information of the 3D scene
can be used to shade the 2D content, because the Qt Quick 2D renderer has no
concept of lighting.

\section2 Texture Path

The texture path uses a 2D Qt Quick scene to create dynamic texture
content. Consider the following Texture definition:

\code
Texture {
    sourceItem: Item {
        width: 256
        height: 256
        Text {
            anchors.centerIn: parent
            text: "Hello World!"
        }
    }
}
\endcode

This approach works in the same way that Layer items work in Qt Quick, in that
everything is rendered to an offscreen surface the size of the top-level Item,
and that offscreen surface is then usable as a texture that can be reused
elsewhere.

This Texture can then be used by materials in the scene to render Qt Quick
content on items.
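
For example, such a texture can be assigned to the base color map of a
material (a brief sketch using PrincipledMaterial::baseColorMap):

\code
Model {
    source: "#Rectangle"
    materials: PrincipledMaterial {
        baseColorMap: Texture {
            sourceItem: Rectangle {
                width: 256; height: 256
                color: "white"
                Text { anchors.centerIn: parent; text: "Hello World!" }
            }
        }
    }
}
\endcode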

\section1 Scene Synchronization

\section2 Scene Manager

The scene manager in Qt Quick 3D is responsible for keeping track of spatial
items in a 3D scene, and for making sure that items update their
corresponding scene graph nodes during the synchronize phase. In Qt Quick,
this role is performed by QQuickWindow for the 2D case. The scene manager is
the primary interface between the frontend nodes and the backend scene graph
objects.

Each View3D item has at least one scene manager, as one is created and
associated with the built-in scene root on construction. When spatial nodes
are added as children of the View3D, they are registered with the View3D's
scene manager. When using an imported scene, a second SceneManager is created
(or referenced if one exists already) to manage the nodes that are not direct
children of the View3D. This is needed because, unlike the View3D, an
imported scene doesn't exist on a QQuickWindow until it is referenced. The
additional SceneManager makes sure that assets belonging to the imported
scene are created at least once per QQuickWindow they are referenced in.

While the scene manager is an internal API, it is important to know that the
scene manager is responsible for calling updateSpatialNode on all objects that
have been marked dirty by a call to their update() method.

\section2 Frontend/Backend Synchronization

The objective of synchronization is to make sure that the state set on the
frontend (Qt Quick) matches what is set on the backend (Qt Quick Spatial Scene
Graph Renderer). By default the frontend and backend live in separate threads:
the frontend in the Qt main thread, and the backend in Qt Quick's render thread. The
synchronization phase is where the main thread and render thread can safely
exchange data. During this phase, the scene manager calls updateSpatialNode for each dirty
node in the scene. This will either create a new backend node or update an
existing one for use by the renderer.

\section2 Qt Quick Spatial Scene Graph

Qt Quick 3D is designed to use the same frontend/backend separation pattern
as Qt Quick: frontend objects are controlled by the Qt Quick engine, while
backend objects contain the state data used for rendering the scene. Frontend objects
inherit from QObject and are exposed to the Qt Quick engine. Items in QML
source files map directly to frontend objects.

As the properties of these frontend objects are updated, one or more backend nodes
are created and placed into a scenegraph. Because rendering 3D scenes
involves a lot more state than rendering 2D, there is a separate set of specialized scene
graph nodes for representing the state of the 3D scene objects.
This scene graph is known as the Qt Quick Spatial Scene Graph.

Both the frontend objects and backend nodes can be categorized into two classes.
The first is spatial, in the sense that these items exist somewhere in 3D space.
What this means in practice is that each of these types contains a transform
matrix. For spatial items the parent-child relationship is significant, because
each child item inherits the transform of its parents.

The other class of items is resources. Resource items do not have a position
in 3D space, but rather are just state that is used by other items. There can
be a parent-child relationship between these items, but it has no meaning other
than ownership.

Unlike the 2D scene graph in Qt Quick, the spatial scene graph exposes resource
nodes to the user directly. For example, in Qt Quick, while QSGTexture is
public API, there is no QQuickItem that exposes this object directly. Instead
the user must either use an Image item, which describes both where the texture
comes from as well as how to render it, or write C++ code to operate on the
QSGTexture itself. In Qt Quick 3D these resources are exposed directly in the
QML API. This is necessary because resources are an important part of the scene
state. These resources can be referenced by many objects in the scene: for
example, many Materials could use the same Texture. It is also possible to
set properties of a Texture at runtime that directly change how the texture
is sampled.
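
For instance, a single Texture resource can be declared once and shared by
several materials (a minimal sketch; the id and the image path are
illustrative):

\code
Texture {
    id: sharedTexture
    source: "wood.png" // illustrative path
}

Model {
    source: "#Cube"
    materials: PrincipledMaterial { baseColorMap: sharedTexture }
}

Model {
    source: "#Sphere"
    materials: PrincipledMaterial { baseColorMap: sharedTexture }
}
\endcode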

\section3 Spatial Objects

All spatial objects are subclasses of the Node component, which contains the
properties defining the position, rotation, and scale in 3D space.

\list
\li \l [QtQuick3D QML] {Node}{Node}
\li \l [QtQuick3D QML] {Light}{Light}
    \list
    \li DirectionalLight
    \li PointLight
    \li SpotLight
    \endlist
\li \l [QtQuick3D QML] {Camera}{Camera}
    \list
    \li PerspectiveCamera
    \li OrthographicCamera
    \li FrustumCamera
    \li CustomCamera
    \endlist
\li \l [QtQuick3D QML] {Model}{Model}
\li Loader3D
\li Repeater3D
\li \l [QtQuick3D QML] {Skeleton}{Skeleton}
\li \l [QtQuick3D QML] {Joint}{Joint}
\endlist
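
Because child nodes inherit the transform of their parents, a group of spatial
objects can be moved or rotated together by transforming a common parent Node
(a small sketch):

\code
Node {
    id: group
    position: Qt.vector3d(0, 100, 0)
    eulerRotation.y: 45

    // Both children inherit the group's translation and rotation.
    Model { source: "#Cube"; materials: PrincipledMaterial { } }
    PointLight { y: 200 }
}
\endcode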

\section3 Resource Objects

Resource objects are subclasses of the Object3D component. Object3D is just a
QObject subclass with some special helpers for use with the scene manager.
Resource objects do have parent/child associations, but these are mostly useful
for resource ownership.

\list
\li \l [QtQuick3D QML] {Texture}{Texture}
\li \l [QtQuick3D QML] {TextureData}{TextureData}
\li \l [QtQuick3D QML] {Geometry}{Geometry}
\li \l [QtQuick3D QML] {Material}{Material}
    \list
    \li DefaultMaterial
    \li PrincipledMaterial
    \li CustomMaterial
    \endlist
\li \l [QtQuick3D QML] {Effect}{Effect}
\li SceneEnvironment
\endlist

\section3 View3D and Render Layers

With regard to the frontend/backend separation, View3D is the separation
point from the user's perspective, because a View3D is what defines which scene
content to render. In the Qt Quick Spatial Scene Graph, the root node for a
scene that will be rendered is a Layer node. Layer nodes are created by the
View3D using a combination of the View3D's properties and the properties
of the SceneEnvironment. When rendering a scene for a View3D, it is this Layer
node that is passed to the renderer.

\section1 Scene Rendering

\image qtquick3d-rendergraph.drawio.svg {Qt Quick 3D render graph flow diagram}

\section2 Set up Render Target

The first step in the rendering process is to determine and set up the scene
render target. Depending on which properties are set in the SceneEnvironment,
the actual render target will vary. The first decision is whether content is
being rendered directly to a window surface, or to an offscreen texture.
By default, View3D will render to an offscreen texture. When using
post-processing effects, rendering to an offscreen texture is mandatory.

Once a scene render target is determined, some global states are set:
\list
\li window size - if rendering to a window
\li viewport - the size of the scene area being rendered
\li scissor rect - the subset of a window that the viewport should be
    clipped to
\li clear color - what color to clear the render target with, if any
\endlist
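
The clear color, for example, comes from the SceneEnvironment set on the
View3D (a small sketch):

\code
View3D {
    anchors.fill: parent
    environment: SceneEnvironment {
        clearColor: "black"
        backgroundMode: SceneEnvironment.Color
    }
}
\endcode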

\section2 Prepare for Render

The next stage of rendering is the prepare stage, where the renderer does the
house-keeping needed to figure out what needs to be rendered for a given frame,
and makes sure that all necessary resources are available and up to date.

The prepare stage itself has two phases: the high-level preparation of
determining what is to be rendered and what resources are needed; and the
low-level preparation that uses RHI to actually set up rendering pipelines and
buffers, as well as setting up the rendering dependencies of the main scene pass.

\section3 High level render preparation

The purpose of this phase is to extract the state of the spatial scene graph
into something that can be used to create render commands. The overview here is
that the renderer creates lists of geometry and material combinations to
render from the perspective of a single camera with a set of lighting states.

The first thing that is done is to determine the global common state for all
content. If the SceneEnvironment defines a \l {SceneEnvironment::lightProbe}{lightProbe}, the renderer checks whether the
environment map associated with that light probe texture is loaded, and if it is
not, a new environment map is loaded or generated. The generation of
an environment map is itself a set of passes that convolve the source texture
into a cube map. This cube map contains both specular reflection information
and irradiance, which is used for material shading.
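
Image-based lighting is enabled by assigning a texture to the light probe, for
example (a small sketch; the HDR file name is illustrative):

\code
environment: SceneEnvironment {
    backgroundMode: SceneEnvironment.SkyBox
    lightProbe: Texture {
        source: "environment.hdr" // illustrative file name
    }
}
\endcode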

Next, the renderer needs to determine which camera in the
scene to use. If an active camera is not explicitly defined by a View3D, the
first camera available in the scene is used. If there are no cameras
in the scene, then no content is rendered and the renderer bails out.

With a camera determined, it is possible to calculate the projection matrix
for this frame. The calculation is done at this point because each renderable
needs to know how to be projected. This also means that it is now possible to
calculate which renderable items should be rendered. Starting with the list of
all renderable items, we remove all items that are not visible because they
are either disabled or completely transparent. Then, if frustum culling is
enabled on the active camera, each renderable item is checked to see if it is
completely outside of the view of the camera's frustum, and if so it is
removed from the renderable list.
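
Frustum culling is opt-in and controlled per camera; a small sketch, assuming
the \c frustumCullingEnabled property of the Camera types:

\code
PerspectiveCamera {
    z: 500
    frustumCullingEnabled: true // drop renderables that are fully outside the view frustum
}
\endcode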

In addition to the camera projection, the camera direction is also calculated,
as this is necessary for the lighting calculations in the shading code.

If there are light nodes in the scene, these are then gathered into a list that
is capped at the maximum number of lights the renderer supports. If more light
nodes exist in the scene than that maximum, any additional
light nodes over the limit are ignored and don't contribute to the lighting of
the scene. It is possible to specify the scope of light nodes, but note that
even when setting a scope, the lighting state of each light is still sent to
every material that has lighting; for lights not in scope the brightness
is set to 0, so in practice those lights will not contribute to the
lighting of those materials.
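
A light's scope is set with its \c scope property, for example (a small
sketch; only \c litGroup and its descendants receive this light's
contribution):

\code
Node {
    id: litGroup
    Model { source: "#Sphere"; materials: PrincipledMaterial { } }
}

PointLight {
    scope: litGroup // restrict this light to litGroup and its descendants
    y: 200
}
\endcode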

Now, with a hopefully shorter list of renderables, each of these items needs to
be updated to reflect the current state of the scene. For each renderable we
check that a suitable material is loaded, and if not, a new one is created.
A material is a combination of shaders and a rendering pipeline, and it is needed
for creating a draw call. In addition, the renderer makes sure that any
resources needed to render a renderable are loaded, for example the geometry and
textures set on a Model. Resources that are not loaded already are
loaded here.

See \l{quick3d-shadercache}{Disk-based caching of shaders} for a discussion
of the disk-based caching of shaders for materials and effects.

The renderables list is then sorted into three lists:
\list
\li Opaque items: these are sorted from front to back, or in other words
    from items that are closest to the camera to items that are furthest from the
    camera. This is done to take advantage of hardware occlusion culling or
    early z detection in the fragment shader.
\li 2D items: these are Qt Quick items that are rendered by the Qt Quick
    renderer.
\li Transparent items: these are sorted from back to front, or in other
    words from items that are farthest from the camera to items that are nearest
    to the camera. This is done because transparent items need to be blended
    with all items that are behind them.
\endlist

\section3 Low level render preparation

Now that everything that needs to be considered for this frame has been
determined, the plumbing and dependencies for the main render pass can be
addressed. The first thing that is done in this phase is to render any
pre-passes that are required for the main pass.

\list
\li Render DepthPass - Certain features like Screen Space Ambient Occlusion
    and shadowing require a depth pre-pass. This pass consists of all opaque
    items being rendered to a depth texture.

\li Render SSAOPass - The objective of the Screen Space Ambient Occlusion
    pass is to generate an ambient occlusion texture. This texture is used
    later by materials to darken certain areas when shading.

\li Render ShadowPasses - Each light in the scene that has shadows enabled
    contributes an additional shadow pass. There are two different shadowing
    techniques employed by the renderer, so depending on the light types there
    will be different passes. When rendering shadows from a directional light,
    the scene is rendered to a 2D occlusion texture from a combination of the
    directional light's direction and the size of the camera frustum. When
    rendering shadows from a point or spot light, the light's occlusion texture is
    a cube map representing the occlusion contribution relative to each face
    direction of the light. Shadows are enabled per light, as shown in the
    sketch after this list.

\li Render ScreenTexture - This pass will only occur when using a
    CustomMaterial that requires a screen texture, which can be used for
    rendering techniques such as refraction. This pass works like a depth pass,
    but instead renders all opaque items to a color texture.
\endlist
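
Shadow rendering is requested on the individual lights; a minimal sketch:

\code
DirectionalLight {
    castsShadow: true // rendered into a 2D shadow map
}

PointLight {
    castsShadow: true // rendered into a cube map
}
\endcode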

After the dependency renders are done, the rest of the passes are prepared but
not rendered. This preparation involves taking the state gathered in the
high-level prep stage and translating it to graphics primitives: creating and
updating uniform buffer values, associating samplers with dependency
textures, setting up shader resource bindings, and everything else involved in
creating the pipeline state necessary for performing a draw call.

\section2 Scene Rendering

Now that the hard work of preparation is done, the easy part is running the
commands that contribute to the main scene's content. That rendering works
in this order:

\list
\li Clear Pass - This isn't really a pass, but depending on what
    backgroundMode is set on the SceneEnvironment, different things can happen here.
    If the background mode is either transparent or color, then the color buffer
    will be cleared with either transparency or the color specified. If, however,
    the background mode is set to SkyBox, then a pass will be run that renders
    the SkyBox from the perspective of the camera, which will also fill the buffer
    with initial data.

\li Opaque Pass - Next, all opaque items will be drawn. This just involves
    setting the pipeline state and running the draw command for each item in
    the order in the list, since they are already sorted at this point.

\li 2D Pass - If there are any 2D items in the scene, then the Qt Quick
    renderer is invoked to generate the render commands necessary to render
    those items.

\li Transparent Pass - Finally, the transparent items in the scene are
    rendered one by one in the same manner as the opaque items.
\endlist

This concludes the rendering of the scene.

\section2 Post-Processing

If any post-processing functionality is enabled, then it can be assumed that the
result of the scene renderer was a texture that is an input for the
post-processing phase. All post-processing methods are additional passes that
operate on this scene input texture.

All steps of the post-processing phase are optional, and if no built-in
features and no user-defined effects are enabled, the output of the scene
render is what is used by the final render target. Note however that
\l{ExtendedSceneEnvironment::tonemapMode}{tonemapping} is enabled by default.

\image qtquick3d-postprocess-graph.drawio.svg
{Post-processing render pass graph}

\section3 Built-in Post-Processing

\l ExtendedSceneEnvironment and its parent type \l SceneEnvironment offer the
most common effects used in 3D scenes, as well as tonemapping, which is used to
map the high dynamic range color values generated by the renderer to the 0-1
LDR range. The effects include depth of field, glow/bloom, lens flare,
vignette, color adjustment and grading, fog, and ambient occlusion.
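
These built-in effects are turned on by setting the corresponding properties
on the environment. A small sketch, assuming an \l ExtendedSceneEnvironment
from the QtQuick3D.Helpers import and a few of its toggles:

\code
environment: ExtendedSceneEnvironment {
    tonemapMode: SceneEnvironment.TonemapModeFilmic
    glowEnabled: true
    vignetteEnabled: true
}
\endcode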

\section3 Post-Processing Effects

Applications can specify their own custom post-processing effects as an ordered
list in the SceneEnvironment::effects property. When this list is non-empty,
the effects in it are applied \e before the built-in effects provided by \l
ExtendedSceneEnvironment. Each post-processing effect is part of a chain such
that the output of the previous effect is the input for the next. The first
effect in this chain gets its input directly from the output of the scene
renderer step. It is also possible for effects to access the depth texture
output of the scene renderer.

Each effect in this process can consist of multiple sub-passes, which means it
is possible to render content into intermediate buffers. The final pass of a
multi-pass effect is expected to output a single texture containing the color
data to be used by the next steps of the post-processing phase.
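
As an illustration, a custom effect is declared with the Effect type and added
to the environment's effects list (a minimal sketch; the fragment shader file
name is illustrative):

\code
Effect {
    id: tintEffect
    passes: Pass {
        shaders: Shader {
            stage: Shader.Fragment
            shader: "tint.frag" // illustrative shader snippet file
        }
    }
}

environment: SceneEnvironment {
    effects: [ tintEffect ]
}
\endcode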

\section3 Temporal and Progressive Antialiasing

The Temporal and Progressive antialiasing steps are optionally enabled by
setting properties in the SceneEnvironment. While not strictly a part of the
post-processing phase, the actual results of Temporal and Progressive
antialiasing are realized during the post-processing phase.

Temporal Antialiasing is performed when a scene is being actively updated.
When enabled, the active camera makes very small adjustments to the camera
direction for each frame while drawing the scene. The current frame is then
blended with the previously rendered frame to smooth out what was rendered.

Progressive Antialiasing is only performed when a scene is not being updated.
When enabled, an update is forced and the current state of the scene is
rendered with very small adjustments to the active camera's direction. Up to 8
frames are accumulated and blended together with pre-defined weights. This has
the effect of smoothing out a non-animating scene, but comes at a
performance cost because several extra frames will be rendered for each update.
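
Both techniques are controlled from the SceneEnvironment, for example (a small
sketch):

\code
environment: SceneEnvironment {
    antialiasingMode: SceneEnvironment.ProgressiveAA
    antialiasingQuality: SceneEnvironment.High
    temporalAAEnabled: true
    temporalAAStrength: 0.3
}
\endcode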

\section3 Super Sampling Antialiasing (SSAA)

Super Sampling Antialiasing is a brute-force way of smoothing out a scene. It
works by rendering to a texture that is a multiple of the requested size of
the scene, and then downsampling that texture to the target size. For
example, if 2X SSAA is requested, then the scene is rendered to a texture
that is 2 times the intended size, and then downsampled as part of this
phase. This can have a huge impact on performance and resource usage, so it
should be avoided if possible. It is also possible for the View3D size to be
too large to use this method, since the texture needed may be
larger than what is supported by the rendering hardware.
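
SSAA is also requested through the SceneEnvironment's antialiasing properties,
for example (a small sketch):

\code
environment: SceneEnvironment {
    antialiasingMode: SceneEnvironment.SSAA
    antialiasingQuality: SceneEnvironment.VeryHigh
}
\endcode
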
*/