In this lesson, we’ll get started coding the game.
The first element we’ll add is the backbone of any rhythm game: the `Synchronizer`.
For now, the `Synchronizer` will play the selected track and, given the song’s BPM (Beats Per Minute), map the current playback time to beats and half-beats. We’ll then use that to know whether the player hit a button in rhythm or not.
To calculate that, we’ll use functions found in the `AudioStreamPlayer` class.
Let’s get to it!
Create a new 2D Scene and name it RhythmGameDemo. This scene will hold all of our other scenes, including the `Synchronizer`.
Add a Node as a child and name it Synchronizer. Attach a script and save it as `res://RhythmGame/Synchronizer.gd`. Also, add an `AudioStreamPlayer`, which we’ll use to play soundtracks.
I saved the scene as `res://RhythmGameDemo.tscn`.
In `Synchronizer.gd`, we start by defining some properties.
```gdscript
extends Node

# The track's Beats Per Minute.
export var bpm := 124
# The duration of a beat and of a half-beat, in seconds. We'll use these
# to calculate how many half-beats have elapsed in the song.
var _bps := 60.0 / bpm
var _hbps := _bps * 0.5
# Stores the index of the last half-beat we passed.
var _last_half_beat := 0

onready var _stream := $AudioStreamPlayer
```
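To get a feel for these values, here is the arithmetic worked out for the default `bpm` of 124. This is just an illustration; you don't need to add it to the script.

```gdscript
# With bpm = 124:
# _bps  = 60.0 / 124  ≈ 0.484 — one beat lasts roughly half a second.
# _hbps = 0.484 * 0.5 ≈ 0.242 — one half-beat lasts roughly a quarter of a second.
```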
You can choose to play whatever track you like. The only information you need is the track’s BPM (beats per minute).
Make sure to check the license of the audio you use as well. The start project includes three royalty-free tracks by Kevin MacLeod, which we’ll work with for the remainder of the course.
Every time we play a song, we’ll update the `Synchronizer.bpm` property accordingly.
Note: Make sure to check the track’s BPM! The BPM of Disco Lounge differs from the one listed on the site, for example. You can use a free metronome app like Metronome on Android to tap along with a song and get an accurate value.
You can find these tracks in `res://RhythmGame/Tracks/` along with placeholder icons for each song.
Select the AudioStreamPlayer and drag your audio file to the Stream property on the right. Godot can import .wav and .ogg audio files, and .mp3 files as of Godot 3.2.4.
Now, let’s add a function to start music playback. We make sure to start the audio as the next mix occurs. The output latency is how long the sound takes to get from the game to the speakers.
```gdscript
func play_audio() -> void:
    var time_delay := AudioServer.get_time_to_next_mix() + AudioServer.get_output_latency()
    yield(get_tree().create_timer(time_delay), "timeout")
    _stream.play()
```
For now, we’ll play the audio when running the scene, using the `_ready()` function.
```gdscript
func _ready() -> void:
    play_audio()
```
When you run the game using F5, you’ll be prompted to choose the main scene.
Select RhythmGameDemo.tscn as the main scene, and you’ll hear the track play.
Now, we’re going to check whether a new half-beat is reached each frame in the `_process()` function.
As we’re making a rhythm game, we must know how far into a track we are at any given time.
We’ll use other functions found in the `AudioStreamPlayer` node to be as accurate as possible, so gameplay elements align with the music track.
Every frame, we start by calculating the last played half-beat.
```gdscript
func _process(_delta: float) -> void:
    var time: float = _stream.get_playback_position()
    # Calculate the index of the current half-beat from the
    # half-beat duration in seconds.
    var half_beat := int(time / _hbps)
    if half_beat > _last_half_beat:
        _last_half_beat = half_beat
```
This might seem adequate on the surface, but there’s a problem: although we call `AudioStreamPlayer.get_playback_position()` every frame, the returned value only updates whenever the audio is mixed.
In Godot, audio is mixed in small chunks before playback. To demonstrate the problem this can cause, here’s an explanation with extreme values.
Let’s assume that Godot mixes audio every 0.5 seconds (in reality, it mixes audio much more often than this).
Also, assume that our track is fast enough that half-beats occur in the following way:
If we run our code through the first second of the song, we can see the values `half_beat` would take:
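To make the issue concrete, here's a small sketch with made-up numbers: Godot mixes every 0.5 seconds, a half-beat lasts 0.2 seconds, and we sample once per frame, every 0.1 seconds. The mix interval, half-beat duration, and frame times below are all illustrative values, not real engine output.

```gdscript
# Illustration only: simulates what get_playback_position() would
# report if it only updated on a mix every 0.5 seconds.
func _demo_stale_playback_position() -> void:
    var mix_interval := 0.5
    var half_beat_duration := 0.2
    for frame in range(7):
        var frame_time := frame * 0.1
        # The reported position snaps down to the time of the last mix.
        var reported := floor(frame_time / mix_interval) * mix_interval
        var half_beat := int(reported / half_beat_duration)
        print("t=%.1f reported=%.1f half_beat=%d" % [frame_time, reported, half_beat])
```

In this sketch, `half_beat` stays at 0 for the first five frames, then jumps straight to 2 when the 0.5-second mix lands: the half-beat at 0.2 seconds is skipped entirely, and the one at 0.4 seconds registers a tenth of a second late.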
This is not ideal and will cause objects to spawn out of sync with the playing track!
We need to be more accurate when calculating the playback position, so we’ll alter our code to consider the amount of time elapsed since the last mix.
To be even more accurate, we’ll factor in the output latency, how long it takes for the mixed audio to reach the speakers.
We update the `time` calculation with the following:
```gdscript
func _process(_delta: float) -> void:
    var time: float = (
        _stream.get_playback_position()
        + AudioServer.get_time_since_last_mix()
        - AudioServer.get_output_latency()
    )
    #...
```
With this, we have a reliable way to calculate the half-beats of a song!
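As a quick sanity check, here is the corrected calculation with hypothetical numbers: suppose the last reported playback position is 1.0 second, 8 milliseconds have passed since that mix, and the output latency is 15 milliseconds. All three values below are made up for illustration.

```gdscript
# Hypothetical values for illustration only.
var playback_position := 1.0  # _stream.get_playback_position()
var since_last_mix := 0.008   # AudioServer.get_time_since_last_mix()
var output_latency := 0.015   # AudioServer.get_output_latency()

# 1.0 + 0.008 - 0.015 = 0.993: the position of the audio the player is
# hearing right now, rather than the position of the last mixed chunk.
var time := playback_position + since_last_mix - output_latency
```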
In the next lesson, we’ll set up the event bus design pattern to eventually connect the `Synchronizer` to the `HitSpawner`.
Synchronizer.gd
```gdscript
extends Node

export var bpm := 124

var _bps := 60.0 / bpm
var _hbps := _bps * 0.5
var _last_half_beat := 0

onready var _stream := $AudioStreamPlayer


func _ready() -> void:
    play_audio()


func play_audio() -> void:
    var time_delay := AudioServer.get_time_to_next_mix() + AudioServer.get_output_latency()
    yield(get_tree().create_timer(time_delay), "timeout")
    _stream.play()


func _process(_delta: float) -> void:
    var time: float = (
        _stream.get_playback_position()
        + AudioServer.get_time_since_last_mix()
        - AudioServer.get_output_latency()
    )
    var half_beat := int(time / _hbps)
    if half_beat > _last_half_beat:
        _last_half_beat = half_beat
```