
GUI Overview

Not sure where this should be linked in from, or if there's another page that would provide a better home for this information. If someone knows of a better place for this page, please feel free to move it, rename it, refactor it, whatever. In its present form it's basically just raw data.

What is the relationship between the Canvas, Interactions, GUI and HUD?

Canvas is the mid-level interface that handles rendering to the screen (that is, rendering things that are not part of the world). Anything that needs to be rendered directly to the screen (GUI, HUD, etc.) will render to the Canvas object.

Interactions are objects used solely for interacting with the player - they typically don't care what's going on in the world, and they are not bound to the GUI. The GUI is simply the most prominent use of the Interaction system.

Some interactions:

GUIController
it takes input from the player, passes that input to various GUI objects, and tells GUI objects when and how to render
Console
it takes input (in the form of text commands) from the player, and renders the results of those commands onscreen
StreamInteraction (UT2004)
responsible for sending/receiving messages with the low-level audio code to play MP3/OGG files, according to playlists configured by the user.

GUI objects are the graphical tools that the GUIController uses to interact with the player.

The HUD is used by the game to communicate game state information to the player. It is not an Interaction - it cannot process input from the player.

How do I make a new GUI?

The best way to understand how to make a new GUI is to read http://udn.epicgames.com/Two/GuiReference, browse the GUI Class Hierarchy and look at the UnrealScript source. You will probably find the best documentation is in the source code itself.

Generally speaking, every GUI "widget" is a subclass of GUIComponent. GUIComponents that consist of or contain other GUIComponents are subclasses of GUIMultiComponent. There is an interactive GUI design mode, which will dump the specified components and their properties out to a file; alternatively, you can code the GUI in UnrealScript from scratch.

Typically, you would define your own subclass of GUIPage, specify each subcomponent as an Automated Component in the default properties section, and set its properties there - including position, size, appearance, and whether it is scaled and positioned relative to its parent component.

If you do need to instantiate a GUIComponent directly, e.g. button = new class'GUIButton', you must set the properties on the object yourself and then add it to the parent component by calling AppendComponent.
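A combined sketch of both approaches - one automated component declared in the default properties, plus one created by hand in InitComponent. The names (MyMenu, OKButton) are illustrative, and the Controls() array name and the AppendComponent signature should be verified against your engine version's GUI sources:

```unrealscript
// Minimal GUIPage subclass: one automated button, one manual button.
class MyMenu extends GUIPage;

function InitComponent(GUIController MyController, GUIComponent MyOwner)
{
    local GUIButton b;

    Super.InitComponent(MyController, MyOwner);

    // Manually instantiated component: set properties, then attach it.
    b = new class'GUIButton';
    b.Caption   = "Cancel";
    b.WinLeft   = 0.65;
    b.WinTop    = 0.80;
    b.WinWidth  = 0.20;
    b.WinHeight = 0.06;
    AppendComponent(b);
}

function bool InternalOnClick(GUIComponent Sender)
{
    Controller.CloseMenu();   // dismiss this page when OK is clicked
    return true;
}

defaultproperties
{
    // Automated component: declared as a subobject, listed in Controls().
    Begin Object Class=GUIButton Name=OKButton
        Caption="OK"
        WinLeft=0.40
        WinTop=0.80
        WinWidth=0.20
        WinHeight=0.06
        bBoundToParent=true   // position relative to the parent component
        bScaleToParent=true   // scale with the parent component
        OnClick=InternalOnClick
    End Object
    Controls(0)=GUIButton'OKButton'
}
```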

How do I change the GUI look-and-feel?

Each GUIComponent has a styleName variable associated with a GUIStyles object that specifies the component's appearance. See http://udn.epicgames.com/Two/GUIStylesExample for details on how to construct one.

Note that GUIStyles affect some components differently; e.g. GUILabel has member variables that explicitly set transparency, background color, and so forth. Also, a GUIMultiComponent does not automatically propagate its styleName to its subcomponents - you must set those explicitly, case by case.

To use a GUIStyles object, it must be registered with the GUIController, e.g. in the InitComponent function of the enclosing GUIPage.
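A sketch of that registration step, assuming GUIController exposes a RegisterStyle function taking a GUIStyles class (check GUIController.uc for the exact name and signature); STY_MyStyle is a hypothetical GUIStyles subclass:

```unrealscript
// Register a custom style when the enclosing page initializes, so that
// components can look it up by the style's KeyName.
function InitComponent(GUIController MyController, GUIComponent MyOwner)
{
    Super.InitComponent(MyController, MyOwner);

    // The optional second argument is assumed here; see GUIController.uc.
    MyController.RegisterStyle(class'STY_MyStyle', true);
}
```

Components then select the style by setting their StyleName to the style's KeyName in their default properties.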

How do I use a new font?

The font used by each GUIComponent is specified in its GUIStyles object. A GUIStyles object has a 15-element string array called FontArrayNames, each entry identifying a GUIFont object. A GUIFont in turn has a dynamic array that references Font objects. A new Font can be loaded using the TrueTypeFontFactory Exec Directive. The new GUIFont must then be added to the GUIController's font stack, typically by subclassing GUIController (and specifying the new GUIController class in the game's .ini file).
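For illustration, a GUIFont subclass that imports a TrueType font at compile time. The #exec parameters and the FontNames array name are assumptions drawn from the description above; compare against an existing font class in the stock source before relying on them:

```unrealscript
class MyGUIFont extends GUIFont;

// Compile-time import of a TrueType font; parameter names are illustrative.
#exec new TrueTypeFontFactory Name=MyVerdana FontName="Verdana" Height=18 AntiAlias=1

defaultproperties
{
    KeyName="MyMenuFont"               // the name FontArrayNames entries refer to
    FontNames(0)="MyPackage.MyVerdana" // the imported Font object
}
```

The new class still has to reach the GUIController's font stack, typically via the GUIController subclass described above.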

How do I use the UT2004 GUI animation system?

GUI animation is fairly simplistic. Each GUIComponent stores keyframe data using an array of vectors for each type of animation supported, where the X and Y components are dependent on the type of animation, and the Z component represents the remaining interpolation time. GUIComponent.MotionFrame tracks position keyframes, whereas GUIComponent.SizeFrame tracks dimension keyframes.

Interpolation is strictly linear - each frame, UGUIComponent::PreDraw adjusts the component's position/dimension based on the time remaining for the first keyframe, and decrements that keyframe's counter. Once the counter for that keyframe reaches 0, it is removed from the list of keyframes, the component is notified via delegateOnArrival (see below), and the process continues using the next keyframe, if it exists.

Script-side, position keyframes are added by calling Animate(), and dimension keyframes are added by calling Resize(). For the most part, you'll want to use these functions to add keyframes, rather than modifying the keyframe arrays directly, so that all of the various notifications are called in the correct order. There are several events and delegates related to GUI animation:

event BeginAnimation()
indicates that this component has started an animation sequence. Called from Animate/Resize when there are no existing keyframes, and will be propagated up the MenuOwner chain.
event EndAnimation()
indicates that this component has ended an animation sequence. Called from UGUIComponent::PreDraw() after removing a keyframe when there are no additional keyframes, and is propagated up the MenuOwner chain.
function KillAnimation()
immediately removes all keyframes and ends all animation for the component. This function would be called by client code to force animation sequences to end.
delegate OnArrival()
indicates that this component has arrived at a keypoint. Called from UGUIComponent::PreDraw(), each time a keyframe is removed from the current sequence.
delegate OnEndAnimation()
indicates that this component has ended an animation sequence. Called from GUIComponent.EndAnimation. While the EndAnimation event is called on each component in the MenuOwner chain, OnEndAnimation is called only on the component that was animating.
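Putting the pieces together - a sketch that starts an animation and reacts when it finishes. The parameter order for Animate/Resize (target position or size, then interpolation time) is an assumption; check GUIComponent.uc:

```unrealscript
// Slide and grow a component over 1.5 seconds, then get notified.
function StartSlide(GUIComponent Comp)
{
    Comp.OnEndAnimation = MyEndAnimation;   // signature per the list above
    Comp.Animate(0.25, 0.10, 1.5);          // position keyframe: (0.25, 0.10)
    Comp.Resize(0.50, 0.30, 1.5);           // dimension keyframe: 0.50 x 0.30
}

function MyEndAnimation()
{
    // All keyframes consumed - the component has stopped animating.
}
```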

Is the GUI animation system related to drag-n-drop?

No, drag-n-drop is completely separate from the GUI animation code. Here is an overview of the drag-n-drop system. The following properties are related to drag-n-drop:

GUIController.DropSource
Pointer to the GUIComponent that is the source of the drag-n-drop, or NULL if no drag-n-drop is in progress. Generally, the DropSource has the first opportunity to handle input events, such as mouse clicks, releases, and movement.
GUIController.DropTarget
Pointer to the GUIComponent that will receive the drag-n-drop - generally the component that the mouse is currently hovering over (the GUIController's ActiveControl) - or NULL if no valid target exists (i.e. we're not performing a drag-n-drop, or the ActiveControl does not support drag-n-drop).
GUIComponent.bDropSource
Must be true in order for a component to be the source of a drag-n-drop operation.
GUIComponent.bDropTarget
Must be true in order for the component to be the target of a drag-n-drop. (Note: it is perfectly acceptable for a GUIComponent to be a valid drop-source, but not a drop-target, or vice-versa)
GUIComponent.DropState
The current drag-n-drop state of the component

The following events and delegates are related to drag-n-drop (all located in GUIComponent.uc):

event DropStateChange(eDropState NewState)
Called when the drag-n-drop state of the component changes, e.g. the component becomes the GUIController's DropSource (beginning a drag-n-drop), the component becomes the GUIController's DropTarget (moused over while a drag-n-drop operation is in progress), etc.
delegate bool OnBeginDrag(GUIComponent Sender)
Called once on a component when it becomes the DropSource. Return false to prevent the drag-n-drop operation from beginning.
delegate OnEndDrag(GUIComponent Sender, bool bAccepted)
Called on the DropSource when the user releases the mouse during a drag-n-drop operation. bAccepted indicates whether the drag-n-drop operation was successful. A false value indicates either that there was no valid DropTarget (in which case Sender == None), or that the DropTarget declined the drag-n-drop for some reason.
delegate bool OnDragDrop(GUIComponent Sender)
Called once on the DropTarget when the user releases the mouse. The return value of this delegate is used as the value for bAccepted in OnEndDrag.
delegate OnDragEnter(GUIComponent Sender)
Called once on a component that is a valid DropTarget (that is, bDropTarget is true), when the mouse enters that component's bounds.
delegate OnDragLeave(GUIComponent Sender)
Similar to the above, for leaving the component's Bounds.
delegate OnDragOver(GUIComponent Sender)
Called on the DropTarget once each cycle.

In most cases, the actual work of a drag-n-drop operation is performed from OnEndDrag (for the DropSource) and OnDragDrop (for the DropTarget); depending on exactly what you're doing, these may be the only delegates you need to hook up.

For a working example, see the GUIListBase class. It contains all the functionality necessary to enable drag-n-drop operations to and from lists. It is structured so that child classes of GUIListBase need only implement the two delegates mentioned above - OnEndDrag and OnDragDrop, which have been assigned in GUIListBase to InternalOnEndDrag and InternalOnDragDrop, respectively - to simplify overriding the default behavior in child classes.
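A sketch of that wiring inside a GUIPage subclass; SourceList and TargetList are hypothetical member components (e.g. GUIListBase subclasses declared elsewhere in the page):

```unrealscript
function InitComponent(GUIController MyController, GUIComponent MyOwner)
{
    Super.InitComponent(MyController, MyOwner);

    SourceList.bDropSource = true;   // may originate a drag-n-drop
    SourceList.OnEndDrag   = MyEndDrag;

    TargetList.bDropTarget = true;   // may receive a drag-n-drop
    TargetList.OnDragDrop  = MyDragDrop;
}

// Called on the DropTarget when the mouse is released; return true to accept.
function bool MyDragDrop(GUIComponent Sender)
{
    // ...copy the dragged data into TargetList here...
    return true;
}

// Called on the DropSource once the operation finishes.
function MyEndDrag(GUIComponent Sender, bool bAccepted)
{
    if (bAccepted)
    {
        // ...remove the dragged data from SourceList here...
    }
}
```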

How do I handle mouse click events?

GUI mouse clicks are handled by UGUIController::MousePressed and UGUIController::MouseReleased. For most input events (including mouse clicks), the general order in which components will receive notification of the input event is:

  1. GUIController.ActiveControl (which is updated natively each tick)
  2. GUIController.ActivePage (the last opened menu)
  3. each page in the GUIController's menustack, starting with the most recently opened menu

GUIComponents indicate that they've handled the input by returning true from the appropriate input function; once that happens, the input chain stops. For mouse input, these functions are UGUIComponent::MousePressed and UGUIComponent::MouseReleased. For example, if the ActiveControl returns true from its MousePressed function, the ActivePage and MenuStack are not notified of the mouse press.

When a component receives a call to MousePressed/MouseReleased, it notifies UnrealScript by calling delegates. To have UnrealScript do something when mouse input is received, assign these delegates and perform whatever logic you want from there. Here's an overview of the mouse-related delegates:

delegate OnMousePressed()
called from UGUIComponent::MousePressed, when the left mouse button is pressed or held.
delegate OnMouseReleased()
called from UGUIComponent::MouseReleased, when the left mouse button is released
delegate OnClick
This is the primary mouse input delegate; called immediately after OnMouseReleased. The return value from this delegate is used as the return value for the native MouseReleased().
delegate OnDblClick
when the user double-clicks on a component
delegate OnRightClick
when the right mouse button is released
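A sketch of hooking the click delegate from a GUIPage subclass; MyButton is a hypothetical member component:

```unrealscript
function InitComponent(GUIController MyController, GUIComponent MyOwner)
{
    Super.InitComponent(MyController, MyOwner);
    MyButton.OnClick = InternalOnClick;
}

function bool InternalOnClick(GUIComponent Sender)
{
    if (Sender == MyButton)
    {
        // React to the click, e.g. open another (hypothetical) menu page.
        Controller.OpenMenu("MyPackage.MyOtherPage");
        return true;    // input handled - stops the notification chain
    }
    return false;       // let other components see the event
}
```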

How do I handle mouseover events?

delegate OnHover
when the mouse is moved over a component

Related Topics

Category FAQ
