The track selection menu

Starting in this lesson, we will design a mouse and touch interface to let the player select tracks.

The player will be able to navigate the tracklist with a horizontal drag motion and press the “go” button to play the selected track, represented by a colored square.

There is quite a lot to cover, so we will split the menu's creation into three parts.

In this one, we will design the menu and add the ability to play a track by clicking a button. In the next one, we’ll generate the tracklist procedurally. And in the third one, we’ll implement a carousel to select different tracks by touching and dragging on the screen.

Designing the track selection menu

Let’s start by creating the menu, which will display the selected track at the top, its name, and a button to play the game.

Create a new scene with a Control node as its root named UITrackSelector. You can save it in the RhythmGame/UI/TrackSelector directory.

First, we’ll dedicate an area at the top to displaying the available audio tracks.

We will use 2D nodes for that as we will manipulate them in code. We want to smoothly animate them when the player drags their finger over a certain area.

You can mix Control and 2D nodes in Godot’s interface scenes without problems; which you use depends on your needs.

In Godot 3, animating UI nodes is not easy or convenient. It can be much easier to do so with 2D nodes.

The only limitation when you don’t use UI nodes inside an interface is that you lose their layout and anchoring features. So if we wanted our tracklist to resize with the game window, we would have to write some extra code.

Create a new Position2D node named TrackCarousel and center it horizontally near the top of the screen. I set its position to (960, 240).

As a child, create a new Node2D named TrackTiles. We will populate it with a row of track icons, and when dragging, we will move this node to have the tracks slide horizontally. We’ll see how in the next lesson.
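
If you’re curious where this is heading, here is a minimal sketch of that slide, assuming the script sits on UITrackSelector and uses the node names above. We’ll write the real version, with smoothing and snapping, in the carousel lesson.

```gdscript
extends Control

# Sketch only: shifts the row of tiles by the drag distance.
# Testing drags with a mouse requires enabling Emulate Touch From Mouse
# in the project settings; touch screens emit these events directly.
onready var track_tiles: Node2D = $TrackCarousel/TrackTiles


func _unhandled_input(event: InputEvent) -> void:
    var drag := event as InputEventScreenDrag
    if drag:
        track_tiles.position.x += drag.relative.x
```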

Then, as a child of TrackTiles, add an Area2D named TrackTile with a Sprite and CollisionShape2D as its children.

You can assign the icon_cephalopod.png file to the Sprite’s Texture.

Then, set the CollisionShape2D’s Shape to a new RectangleShape2D that encompasses the sprite. We will use this area to detect which track the player selects.
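
As an aside, here is a sketch of how the tile’s script could use this area to report clicks or taps later on; the track_selected signal is a placeholder name for illustration, not something from the final project.

```gdscript
extends Area2D

# Placeholder signal, only to illustrate how the area can report input.
signal track_selected


func _ready() -> void:
    # Area2D emits input_event for clicks and touches that land on its shape.
    connect("input_event", self, "_on_input_event")


func _on_input_event(_viewport: Node, event: InputEvent, _shape_idx: int) -> void:
    # With Emulate Mouse From Touch on (the default), taps arrive as mouse clicks.
    var click := event as InputEventMouseButton
    if click and click.pressed:
        emit_signal("track_selected")
```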

We will instantiate the track tile for each of the tracks on the screen, so right-click on TrackTile and select Save Branch As Scene. Once again, you can save it in the RhythmGame/UI/TrackSelector directory.
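
We’ll do the instancing itself in the next lesson, but as a rough preview, and assuming you saved the branch as TrackTile.tscn in that folder, it could look like this.

```gdscript
extends Control

# Assumed path: adjust it if you saved the scene somewhere else.
const TrackTileScene = preload("res://RhythmGame/UI/TrackSelector/TrackTile.tscn")

onready var track_tiles: Node2D = $TrackCarousel/TrackTiles


func add_track_tile(offset_x: float) -> void:
    # Create one tile and place it along the row.
    var tile = TrackTileScene.instance()
    tile.position.x = offset_x
    track_tiles.add_child(tile)
```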

That’s the upper area set up. Let’s now move to the label and button.

To display the track’s name, we instantiate the LabelCustom scene as a child of UITrackSelector and rename it to TrackName.

We can anchor the node to the center of the viewport using the Layout -> HCenter Wide option. You can give it the placeholder text “Cephalopod” to preview how it will look.

Finally, add a TextureButton named GoButton as a child of UITrackSelector.

We prepared two textures for its normal and hover states. Assign button_go_hover.png to the node’s Textures -> Hover property, and button_go_normal.png to its Textures -> Normal property.

And we anchor it to the bottom of the screen using the Layout -> Center Bottom option. You can move the node up so it isn’t stuck against the viewport’s bottom edge.

That’s it for the menu’s layout; it should roughly look like this.

Next, we move on to adding support for audio playback.

Audio playback

We want to give the player an audio preview of the selected track. To do so, we add two new nodes as children of UITrackSelector: an AudioStreamPlayer and an AnimationPlayer.

The AudioStreamPlayer should have a Volume Db of -80 dB by default so it isn’t audible. We will use an animation to fade the sound in.

Speaking of which, in the AnimationPlayer, we create two animations, fade_in_track and fade_out_track, which we’ll use to transition between soundtracks smoothly.

The fade_in_track animation should have one keyframe for the AudioStreamPlayer’s Volume Db at half a second, with a value of 0 dB.

Be sure to set the track’s update mode to Capture by clicking the leftmost of the icons at the right end of the animation track.

The capture mode automatically interpolates from the property’s current value to the first keyframe. It does so linearly, which isn’t ideal for audio volume, but it’ll do for now.

Why isn’t it ideal? We measure audio volume in decibels, which follow a logarithmic scale: an increase of about 6 dB doubles the audio signal’s amplitude, and about 3 dB doubles its power. That’s why audio editing software typically uses a different interpolation curve by default for fades.
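
If you want to see that relationship in numbers, GDScript has the built-in db2linear() and linear2db() functions. The snippet below only prints a few conversions to the Output panel.

```gdscript
extends Node


func _ready() -> void:
    print(db2linear(0.0))    # 1.0: full amplitude.
    print(db2linear(-6.0))   # ~0.5: roughly half the amplitude.
    print(db2linear(-80.0))  # 0.0001: effectively silent.
    print(linear2db(0.5))    # ~-6.02 dB.
```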

Your animation should look like this.

For the fade_out_track animation, you can do the same with one keyframe at 0.5 seconds, but set to a value of -80 dB. Once again, the track’s update mode should be set to Capture.
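
To see how the pieces fit together, here is a minimal sketch of how the two animations and the AudioStreamPlayer could combine to preview a track. The preview_track() function and its new_stream parameter are placeholder names for illustration, not the script we’ll write later in the series.

```gdscript
extends Control

onready var audio_player: AudioStreamPlayer = $AudioStreamPlayer
onready var animation_player: AnimationPlayer = $AnimationPlayer


func preview_track(new_stream: AudioStream):
    if audio_player.playing:
        # Fade the current preview out before swapping streams.
        animation_player.play("fade_out_track")
        yield(animation_player, "animation_finished")
        audio_player.stop()
    audio_player.stream = new_stream
    # The player sits at -80 dB, so the stream stays inaudible until the
    # capture track raises the volume back up to 0 dB.
    audio_player.play()
    animation_player.play("fade_in_track")
```

You would call a function like this whenever the selected tile changes, which is what the carousel lesson will take care of.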