The best examples for UI Composer users can be found where you installed Studio. A typical install places these files in: C:\Program Files\NVIDIA Corporation\UI Composer %version%\Studio\Content\
Tip: You may find it useful to add a Storage Palette entry to the Samples folder, for easy access to the sample assets and behaviors.
COLLADA is the primary 3D import format for Studio. Installers for the COLLADA exporters we recommend ship along with UI Composer.
For DCC packages that do not have COLLADA exporters, Studio also supports native import of the .3ds file format. You can translate your 3D assets into the 3ds format prior to importing into Studio.
Artists have a number of options when synchronizing external assets with their UI project:
For basic placement, our system for visualizing pivot points also lets you see your cameras and lights. Make sure that Pivot Points is checked under the View menu (Hotkey: Ctrl-Alt-P), then select the object you wish to see. The blue line of the visible pivot point represents the direction of the camera/light (i.e. which way it is pointing), while the green line represents the local 'up'.
Tip: You can attach a cone object to the camera or light, with the material for that cone set to wireframe mode. While the angle of that cone does not automatically match the camera or light field of view, it can help provide a stronger visual of item placement.
By using the COLLADA format, animation can be imported directly from 3ds Max or Maya into Studio.
When you import resources into your scene, they are stored in the Library. If you remove these items from the scene, they are not automatically removed from the Library. This provides an easy way to include resources in your presentation that may not be present in the scene originally, but are still there if you want to add them later.
Removing this extra "bloat" if you don't need it is easy. In Studio, right-click anywhere in the background area of the Library palette and choose Delete Unused Resources. All items that are not instanced in your scene will be purged.
Tip: If you see some items still in the Library that you don't think are in the scene, select the item and press Ctrl-I (for "instance"); Studio will highlight the first item in the scene that uses that Library item. Successive presses of Ctrl-I will highlight successive instances.
Z-sorting is difficult, especially when multiple renderers are involved. The render engine in Studio differs enough from the render engine in Viewer that there may be subtle differences:
Studio's renderer does perform z-sorting on transparent objects, but it does so on a per-model basis. This means that two intersecting, partially-transparent objects will always have one drawn over the other. There is no per-polygon or per-pixel z-sorting.
NVIDIA UI Composer brings the most value as a solution for creating the user interface for your application. However, the capabilities of NVIDIA UI Composer are sufficiently robust to prototype a wide variety of applications.
The focus of Studio's design has primarily been on authoring interactive 3D presentations. As such, it is (currently) lacking features common in 2D layout tools, such as guides, grids or automatic alignment. Such tools are high on our priority list for adding in a future release; however, similar functionality can be achieved to some degree right now:
Thus, the mouse wheel combined with the Shift key can be used to move objects along a grid of 10 units.
Rest assured that we do not consider the above workarounds to be ideal, or even long-term solutions. However, they are ways to work with content in Studio to achieve specific 2D layout goals.
Studio will be adding additional 2D layout features in future releases.
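The same 10-unit stepping can also be reproduced in a Lua Behavior. Here is a minimal sketch; the snapToGrid and snapPosition helpers are hypothetical illustrations, not Studio built-ins:

```lua
-- Hypothetical helper: snap a single coordinate to the nearest
-- multiple of gridSize (e.g. the 10-unit steps described above).
function snapToGrid( value, gridSize )
    return math.floor( value / gridSize + 0.5 ) * gridSize
end

-- Snapping a full position is just the helper applied per axis.
function snapPosition( x, y, z, gridSize )
    return snapToGrid( x, gridSize ),
           snapToGrid( y, gridSize ),
           snapToGrid( z, gridSize )
end
```

A Behavior's onUpdate handler could run a selected object's position through snapPosition to keep it aligned to the grid.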
There is no canvas in the sense you are thinking; everything in Studio is represented in 3D. There is no flat image to zoom in or out of. What you can do is:
To design your presentation against specific layout guidelines, you can create an image showing one (or more than one) of these features and bring it into your presentation. Place this image in a Layer behind (or in front of) all the others, with the camera for that Layer set to Orthographic. Simply hide the Layer prior to exporting your presentation to ensure that it is not included in the scene.
Remember that each Layer in Studio is a full 3D scene, with depth and 3D objects. It is not a 2D canvas.
Studio was designed primarily for 3D composition and authoring interactivity, not content creation. It imports 3D models rather than being a modeler; it imports images rather than being an editor. As such, Studio does not include tools for 'drawing' a 2D interface. In a pinch, however, models may be pulled out from the Basic Objects palette (such as a rectangle) and manipulated to achieve the desired shape.
UI Composer supports masking of textures applied to geometry by using the alpha channel of images as an opacity mask. By applying multiple images and animated UV coordinates, complex masking effects can be achieved on the surface of an object.
Studio's text has limited support for text "styles". Drop shadows are not supported at this time.
Certain items in Studio are not multi-selectable (in the current release). You can select multiple keyframes at once, either by 'rubber-band' selecting them or by shift-clicking the object in the timeline. Multiple keyframes may be moved, copy/pasted, have their interpolation changed, or deleted all at once.
It is also possible to select multiple files in Windows Explorer and drag them onto the library palette to import more than one file at a time.
A future version of Studio will include the ability to select and modify multiple objects at once.
Studio allows you to import PNG, PSD, JPG, TGA, TIF and BMP files. PNG, TGA, and PSD files support alpha channels. JPG files support lossy compression. Once imported, however, Studio uses the uncompressed raw buffer in its representation. It is this raw buffer of data which is exposed to the exporter, along with the path to the original file.
If your application supports JPG or PNG files, then you may use them (when appropriate) in Studio. Exporting the presentation to your application, your conditioner should use the path to the original file, and then load that file from disk directly.
If your application requires a file format not supported by Studio, then you may as well use PNG files for the visual representation. Your conditioner should convert the path to the original PNG file into a path to an image saved in your file format.
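As a sketch of that conversion step, here is a hypothetical helper (the function name and the ".tex" extension are illustrative, not part of the UI Composer API) that maps the original PNG path to a path in a custom format:

```lua
-- Hypothetical conditioner step: map the PNG path exported by Studio
-- to the equivalent path in a custom engine format (here ".tex").
function convertImagePath( pngPath, newExtension )
    -- Replace a trailing ".png" (case-insensitive) with the new extension.
    local converted, count = pngPath:gsub( "%.[pP][nN][gG]$", newExtension )
    if count == 0 then
        -- Not a PNG path; leave it untouched.
        return pngPath
    end
    return converted
end
```

Your conditioner would also, of course, perform the actual image conversion alongside the path rewrite.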
Studio's Layer objects provide a way to visually stack one scene on top of another. This feature should not, however, be used as the primary layering technique.
Each Layer object is actually a distinct 3D space with objects positioned inside it. To visually layer one item on top of another, bring that object closer to the camera within the space. By default, the camera is looking along the z-axis of the scene. With a default camera placement, decreasing an item's z position will bring the item closer to the camera.
With perspective rendering, bringing an item closer to the camera also makes it larger and (for items at the edge of the view) may cause one item to no longer appear on top of another. (Just like perspective in the real world.) Such problems may be avoided by setting the camera for the layer to Orthographic mode by checking the Orthographic checkbox at the top of the Inspector palette.
There are two answers to this question: "Lua" and "Whatever you want".
Mash your fists against the keyboard. It's really that easy, and something cool is BOUND to happen.
Well, OK, maybe it's not that easy. But it's not that hard, either.
To start with, drag a Behavior from the Basic Objects palette. Select the behavior in the Library and set the Script Type (in the Inspector Palette) to Lua. Double-click on the behavior to begin editing the script, and you're on your way.
UI Composer's integration of Lua is well documented. You can find this documentation on the Lua Bindings page of this manual.
If you prefer to start with working code (rather than a blank canvas), we recommend looking at some of the example behaviors that ship with UI Composer. These example behaviors are located at C:\Program Files\NVIDIA Corporation\UI Composer %version%\Studio\Content\Behavior Library\.
The Lua integration is high powered. You can gain access to any element in the scene, affect its properties, and control state-to-state flow, among many other things. Custom properties can be added to Behaviors (like any other object) through the Edit Custom Parameters dialog to expose script parameters in an artist-friendly way. Additionally, custom events can be exposed, and used by the Lua script or Studio's point-and-click Action system.
DebugView.exe by SysInternals displays standard debug message output using OutputDebugString (which is what Lua's output uses internally). The function you are looking for is:
output( 'Any old string can go here!' )
For more information on output(), see the Lua Binding Documentation included in this manual.
With the Lua framework included in UI Composer, there are five times when Lua code inside Behaviors can be run:
The script body itself is run once when each instance of the Behavior is created.
The onInitialize function is run once for each instance of the Behavior in the scene.
The onActivate function is run when a behavior instance becomes active.
The onUpdate function is run once for each frame, for each active instance of the Behavior in the scene.
The onDeactivate function is run when a behavior instance becomes inactive.
Let's look at an example. Assume that a Behavior has the following code:
1  output( tostring( self ) .. " is being created." )
2
3  function self:onInitialize( )
4      output( tostring( self.element ) .. " is being initialized." )
5      registerForEvent( 'onPoke', self.element, self.sayHey )
6  end
7
8  function self:onActivate( )
9      output( tostring( self.element ) .. " is being activated." )
10 end
11
12 function self:onUpdate( )
13     output( tostring( self.element ) .. " is being updated." )
14 end
15
16 function self:onDeactivate( )
17     output( tostring( self.element ) .. " is being deactivated." )
18 end
19
20 function self:sayHey( )
21     output( tostring( self ) .. " says hello." )
22 end
If this Behavior is just in the Library of a presentation, no code will ever be run. As soon as the scene moves to a slide where there is an instance of this Behavior active in the scene, the onInitialize function (if present) is invoked. In this case, the value of self passed to the function is the Lua table created above. Line 4 of the above Behavior will produce output like "userdata: 1337feeb is being initialized." for each instance. You will also see the output from the onActivate handler as the behavior is being activated.

After the initialization occurs, the onUpdate function (if present) is run every frame, for each instance of the Behavior. self is (again) set to the Lua table representing that instance. Each frame, every active instance will produce output like "userdata: 1337feeb is being updated.".

When the onPoke event is fired on an instance of the Behavior, the sayHey function on lines 20-22 will be run. The self passed into the function is again the Lua table representing this behavior.

onUpdate calls will run for all active Behaviors. Again, these are processed in timeline order.
There are several frequently asked questions about scripting the 'active' Attribute.
"I see that setting active to false makes the Element disappear, what else does it do?"
That's 95% of it. If active is false, the Element won't be drawn. It can be compared to a component with two slides, with an object that only exists on slide 2: when you're on slide 1, the object on slide 2 exists, but it is inactive. Also, events that are targeted at inactive Elements will not be handled.
"Will inactive Elements be destroyed or garbage collected?"
No. Elements are only destroyed upon destruction of their owning Presentation.
"Are inactive Elements still accessible from script?"
If you have a handle to an inactive Element, you can still use that handle in calls like setAttribute. Getting handles to inactive objects can be a bit tough, however; for example, getChildren() only returns a list of the 'active' children.
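One workaround is to grab and cache handles while the Elements are still active. A minimal sketch, assuming Studio's getElement binding (the stub fallback on the first line exists only so the snippet stands alone outside Studio):

```lua
-- Fall back to a stub so this sketch is self-contained; inside Studio
-- the real getElement binding is used instead.
local getElement = getElement or function( path ) return { path = path } end

-- Cache handles while the Elements are still reachable (i.e. active),
-- keyed by path, so they can be scripted after they go inactive.
local handleCache = { }

function cacheHandle( path )
    if handleCache[ path ] == nil then
        handleCache[ path ] = getElement( path )
    end
    return handleCache[ path ]
end
```

Calling cacheHandle from onInitialize, while everything on the slide is active, keeps the handles available for later setAttribute calls.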
"Will changes I make to the attributes of an inactive Element persist when it becomes active again?"
You can modify the attributes of an inactive Element, and the changes will persist. Unless, of course, something else changes them. (e.g, an animation track, or slide change)
Studio has a toolbar called the Preview Bar. The preview bar is connected to an external NAnt-based conditioning and preview pipeline that sits next to Studio in the Build Configurations directory. By editing the supplied NAnt files, or by creating your own, you can easily expose different previewing options to your artists using Studio.
You can add custom properties, events, or actions to any object in Studio using the Custom Parameter dialog. These custom properties will show up in the interface with artist-friendly combo boxes, sliders, checkboxes and so on. The values specified by the artist are then available in exported files and, ultimately, your application.
The amount of memory that a presentation takes up is largely dependent upon the assets used in that presentation. The UI Composer Runtime itself is quite lightweight. At the time of writing, a release build uses less than 150k of memory on a PC with no presentation loaded. Memory requirements will obviously increase as your UI's complexity increases.
The example exporter that ships with UI Composer includes the full information for each asset (such as raw image buffers and model vertices) as well as the source path to the original file for that asset. Every item in a presentation is represented by an Element in the core of the Runtime. Each element may have an arbitrary number of named attributes stored on it.
The above information combined means that elements in the Runtime may be used to cache "heavy" information about assets, or may be used simply as unique lightweight references to the nodes. If the latter approach is chosen, your own framework context may be used to load, cache, and unload heavy information for assets as appropriate.
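The lightweight-reference approach might be sketched like this in Lua (loadHeavyAsset and the function names are placeholders for whatever loader your framework uses, not UI Composer APIs):

```lua
-- Placeholder loader: in a real framework this would read image buffers,
-- vertices, etc. from disk using the source path stored on the element.
local function loadHeavyAsset( sourcePath )
    return { path = sourcePath }
end

-- Elements act as lightweight keys; heavy data lives in this cache.
local assetCache = { }

function acquireAsset( sourcePath )
    if assetCache[ sourcePath ] == nil then
        assetCache[ sourcePath ] = loadHeavyAsset( sourcePath )
    end
    return assetCache[ sourcePath ]
end

function releaseAsset( sourcePath )
    -- Drop the heavy data; the element reference itself stays valid.
    assetCache[ sourcePath ] = nil
end
```

The element keeps only the source path; your framework decides when the heavy data is loaded, shared, and unloaded.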
A single UI Composer instance can dynamically load and unload individual presentations, giving control to update and/or render only specific presentations at different times.
Anark Client was the core of Anark's web-enabled playback renderer and standalone playback application, Anark Player. After NVIDIA acquired Anark Studio and Anark Gameface, Anark Player was deprecated in favor of the new UI Composer Viewer.
Yes. You can set up custom events on a "UIContract" behavior that emulate various types of input from your application. Assets in your presentation can then be hooked up to respond to these events. Studio ships with an example file showing how to set this up.
Finding contracts is a simple matter of using the CContractManager, which is typically accessed from a presentation. Setting properties on an element is done with the CElementManager.
Assuming that xAxis and yAxis are custom properties exposed on your UIContract, you can do something like this:
IContractManager& theContractManager = somePresentation->GetContractManager( );
IElementManager& theManager = somePresentation->GetElementManager( );
SAttributeKey theAttributeKey;
UVariant theValue;
TEventList theEventList;

CElement* theContract = theContractManager.GetContractByIndex( 0, theEventList );

theAttributeKey.m_Hash = CHash::HashAttribute( "xAxis" );
theValue.m_FLOAT = 0.5f;
theManager.SetAttribute( theContract, theAttributeKey.m_Hash, theValue );

theAttributeKey.m_Hash = CHash::HashAttribute( "yAxis" );
theValue.m_FLOAT = -0.5f;
theManager.SetAttribute( theContract, theAttributeKey.m_Hash, theValue );
and then in Lua:
-- Assumes that the UIContract is attached to the scene
local theContract = getElement( "Scene.UIContract" )
output( getAttribute( theContract, "xAxis" ) )
output( getAttribute( theContract, "yAxis" ) )
Events have two typed parameters that can be passed along with them if they are fired from C++ or Lua. The default keyboard implementation uses this to send keycodes along with keyboard events, but arbitrary events can use the parameters for whatever you like.
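In Lua, the two parameters simply arrive as extra arguments to the handler. A standalone sketch of the pattern (the dispatcher below is illustrative, not the Studio implementation; inside Studio you would use registerForEvent instead):

```lua
-- Illustrative event dispatcher: handlers keyed by event name receive
-- the two optional parameters carried with the event.
local handlers = { }

function onEvent( eventName, handler )
    handlers[ eventName ] = handler
end

function fire( eventName, paramA, paramB )
    local handler = handlers[ eventName ]
    if handler then
        return handler( paramA, paramB )
    end
end
```

A keyboard-style event, for example, might carry a keycode and a modifier mask as its two parameters.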
Each Presentation has a member called m_EventCallbacks. The CEventsCallbacks class has a method called RegisterCallback which allows you to register a static callback with the event processing system. This allows for easy notification of certain events being fired on certain elements.
In many custom UI creation solutions, seeing the UI changes requires constant attention and occasionally implementation work by a dedicated UI programmer. A simple preview of new functionality can take hours or, in some cases, days. UI Composer offers two major improvements over this method.
Exporting from NVIDIA UI Composer directly into your application is fast enough that artists can use it as an integral piece of their daily workflow.
These two terms can be confusing for a newcomer to Studio. Here's a short summary of each:
Currently, there is no integrated connection with content management systems inside NVIDIA UI Composer. Studio project files are binary files that can then be placed in a source control or content management system like any other binary file.
When an asset is imported into Studio, the path to that asset is stored along with the asset in the Library palette. If the asset on disk is updated (e.g. a newer version is synced down) Studio will present a yellow "yield" icon letting you know that it needs to be refreshed in the presentation.
In the future, we plan to implement standard source control support inside NVIDIA UI Composer. This feature is not currently scheduled. If you have specific needs in this area, NVIDIA would love to hear from you.
Keywords: localization, internationalization, i18n, l10n. See the Localization page of this manual.
A dynamic UI, by its very nature, cannot be fully and explicitly designed by the artist. Or, more accurately, the more control an artist has over the appearance, the less flexibility and dynamism is possible. The point at which to trade artist control for programmatic variation is up to you, and helps dictate the implementation.
For example, assume that your UI needs to display a table of text (like a scoreboard). Here are various ways to accomplish this goal, along the continuum from no artist control to near-total control.
The two extreme examples above are non-ideal, but hopefully serve to illustrate the variety of ways that UI Composer can be used to create dynamic content.
The behaviors and components that UI Composer supplies fall more into the middle ground. We suggest providing the artist with a customizable component with a single Lua behavior inside it. Custom properties on the outside of the component provide a simple way to customize the basic functionality. Custom properties on the Behavior inside the component allow the artist more control. Editing assets inside the component allow the artist to control other visual aspects of the component.
We suggest that the "control" Behavior inside the component communicate with the application through one or more custom properties on a global item like our suggested "UIContract" behavior. The component should reflect information from the application and "suggest" changes to that information - the UI should not be the sole repository of application or interface state.
The "Data Table" sample component included with NVIDIA UI Composer is designed like the third item above. See the UI Composer Samples Reference for more information.
Short answer: the same as models, images, or behaviors. :)
Long answer: view the Working with Refreshable Components document in this manual.
One of the following two situations has occurred:
If it's the first case, you can just say "OK" to the dialog and the component will be refreshed. (In the future, you can use the Refresh File... command explicitly instead of attempting to reimport the component.)
If it's the second case, then you (or the author of the component) need to update the components to have unique IDs. For more information (including step-by-step instructions on how to do this) see the section "Creating Refreshable Components" in the Working with Refreshable Components page of this manual.
The Studio application targets several different runtimes. (i.e., playback engines) Not all of these runtimes have exactly the same capabilities. To account for the differences in the various runtimes, and specifically to avoid showing artists using Studio a bunch of options/features that aren't going to function in their runtime, Studio has a feature that allows you to show and hide certain properties in the Inspector palette.
The mechanism that shows and hides built-in properties in the Inspector palette is called "startup.lua" and is so named because you interact with it by editing the file "startup.lua" that sits next to Studio.exe (typically in Program Files).
It is possible to edit startup.lua to expose a number of other features not supported by the UI Composer runtime, but settings you make for those properties will be ignored at runtime, as those features are unsupported. However, it is of course possible to augment UI Composer's integration with a given runtime to support these additional features, at which time it may make sense to re-expose them in the Inspector palette using startup.lua.
UI Composer presentations are fully 3D in all their aspects, so it is trivial to create a resolution-independent interface. 3D objects in UI Composer applications are, by their very nature, scalable. NVIDIA UI Composer also uses high resolution images that can then be exported in whatever reduced size that your target engine requires.
The Project Settings dialog of Studio allows you to change the size of a presentation. For presentations using perspective cameras, different size presentations (with the same aspect ratio) are completely equivalent.
"But how," you ask, "do I create a single presentation that works in 4:3 and 16:9?"
The answer is Lua Behaviors. If Lua script has knowledge about the size of a presentation, it is relatively trivial to write a "Screen Position" Behavior. Such a Behavior, when attached to an object, can position the object in an appropriate location on the screen, such as top left, bottom middle, and so on. Using this Behavior, an artist can create a UI that automatically adapts to arbitrary resolutions, specifying where each object should go on the screen.
In summary: have your renderer provide information about screen size (or aspect ratio) to your script framework, and write a Behavior to distribute objects as desired.
UI Composer exposes variable aspect ratios and information about presentation size to Lua script. The "Stick to Screen" sample Behavior included with NVIDIA UI Composer demonstrates this technique.
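A minimal sketch of the positioning math such a Behavior might use (the anchor names, margins, and coordinate convention here are illustrative; consult the "Stick to Screen" sample for the actual bindings):

```lua
-- Map a named anchor plus pixel margins to a 2D position, given the
-- presentation size. (0,0) is taken to be the center of the screen,
-- matching a default orthographic camera looking down the z-axis.
function anchoredPosition( anchor, width, height, marginX, marginY )
    local x, y = 0, 0
    if anchor == "topLeft" then
        x = -width / 2 + marginX
        y =  height / 2 - marginY
    elseif anchor == "topRight" then
        x =  width / 2 - marginX
        y =  height / 2 - marginY
    elseif anchor == "bottomMiddle" then
        x = 0
        y = -height / 2 + marginY
    end
    return x, y
end
```

With the presentation size supplied by your renderer, the Behavior would call this each time the resolution changes and write the result to its object's position.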
The right answer obviously depends on your team's setup and the roles people play in the process. Here are two approaches that we've seen our users take:
UI Composer has a polished, proven system for quickly putting together complex interactive projects. With the Slide palette, you can rapidly set up the various states of menus or buttons complete with animation and layout. With the Actions and Events palette, you can easily trigger various states based off button events or other event triggers. The entire system is easy to learn and capable of handling very large scale projects.
After several years of professional use on large training and industrial projects involving hundreds of interactive elements, our current customers have repeatedly told us that they prefer Studio's method to other multimedia tools that rely on a flowchart or hyper graph based authoring system. Studio produces clean, easily comprehensible presentations. Hyper graphs often produce a spaghetti nest of crossing links that are hard to decipher in practice.

If you have suggestions about what you would like to see in terms of visualization, NVIDIA would love to talk to you. We are always looking at making our interactivity authoring easier and more comprehensible, so your input in this area is welcome, and valued.
An artist working in Studio edits a presentation both visually and informationally.
So, Studio provides a certain amount of live preview built-in. To see scripts and final rendering, you'll want to run the presentation with UI Composer integrated into your application.
The ideal answer is that if you want to see what your presentation will look like in your application, you should view it in your application. You can set up Studio so that pressing F12 (the shortcut key for File->Preview) will export your presentation using your own custom exporter and view it in the application of your choice. This allows artists to easily view the UI in the true final environment, iteratively designing and tweaking.
For those evaluating NVIDIA UI Composer, however, it would be unreasonable to require them to integrate UI Composer into their application before their artists can preview their work. NVIDIA UI Composer comes with a Viewer application that allows artists to preview presentations.
In summary: view your presentation while you're editing it, and (as necessary) preview it in the playback application of choice. Initially, that playback application will be Viewer. When you have an integration with your own application, you will want to use that for true WYSIWYG results.