Flax HTML5 Game Engine Development Diary Part 4
First off, I have to apologise. This post is a few days late, and I’m sure that all none of you who hang on my every word were beginning to froth at the mouth. Anyway, in this dev diary is an explanation of our audio system, because I worked on that this week. First, I’m going to explain how we handle files, because that’s pretty cool too.
We read files with AJAX requests. The problem, though, is that these calls are non-blocking. This is, naturally, one of the reasons AJAX is used in the first place: there's a degree of multi-tasking going on. (Browsers generally limit network connections to two at a time, so it's a small degree, but a degree regardless.)
Non-blocking methods mean that the code doesn’t stop running while the call completes. Here’s the rather big problem with that:
The client could ask the server to create a file asynchronously, then immediately try to write to it. But because the asynchronous call hasn't necessarily completed, the file the server's been asked to create might not actually exist yet, and your code breaks.
We get around this by using an event system. We were going to use events for other things anyway (like when loading certain parts of the engine, and for in-game things as well), so it was almost trivial to add an onFileLoaded event. I didn’t architect the events system, so I’ll shut up about them before I talk myself into a hole. Ciarán will probably talk about them in his next dev diary.
So, in summary, we read files like so:
- The client asks the server to read a file and starts watching for an event, onFileLoadedEvent (we have such awesome names for things).
- Once the server has read the file, it fires off an onFileLoadedEvent, which takes the content of the file with it.
- The client catches that event and continues doing whatever it was doing with the contents of the file.
It’s cheating a little because we’re sort of blocking the non-blocking methods, but it works.
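The pattern above can be sketched in plain JavaScript. To be clear, the names and the tiny event bus here are illustrative stand-ins, not Flax's actual API, and the "server" read completes immediately where the real engine would make a non-blocking AJAX request; the point is just that the client gets the file's contents via the event rather than a return value:

```javascript
// Tiny event bus standing in for Flax's event system (illustrative
// names; not the engine's actual API).
const listeners = {};

function on(eventName, handler) {
  (listeners[eventName] = listeners[eventName] || []).push(handler);
}

function fire(eventName, payload) {
  (listeners[eventName] || []).forEach(function (handler) {
    handler(payload);
  });
}

// Fake "server": in the real engine this would be a non-blocking
// AJAX request; here the read completes immediately, but the client
// still receives the contents via onFileLoadedEvent.
const serverFiles = { "levels/level1.json": '{"tiles": []}' };

function readFile(path) {
  fire("onFileLoadedEvent", { path: path, contents: serverFiles[path] });
}

// The client watches for the event instead of assuming the file is
// ready the moment readFile() is called.
let loaded = null;
on("onFileLoadedEvent", function (file) {
  loaded = file.contents;
});

readFile("levels/level1.json");
console.log(loaded); // '{"tiles": []}'
```

Because the handler is registered before the request is made, it doesn't matter how long the server takes; the client simply carries on when the event arrives.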
Now for the fun stuff.
Flax’s audio, in keeping with the “we don’t need no damn plugins” mentality, doesn’t need any damn plugins. Provided, of course, that you’re using a decent browser with support for the <audio> tag.
Really, what Flax’s audio system does is take our custom JSON audio objects, convert them into <audio> tags, and insert them into the page. JSNI methods are then used to play or pause the individual audio tags.
Audio, like many other parts of the engine, is based on JSON objects. Originally, we were going to use XML to do a straight conversion (XML to HTML is rather easy), but we switched to JSON when we got serialisation/de-serialisation working, for consistency’s sake. Besides, doing it this way means it could also conceivably be done programmatically (without using JSON files), though not without some work.
Really, it’s a rather simple system. We’ve got an Audio object (accessed statically), which contains play, pause, stop, and load JSNI methods. We’ve then also got AudioContainer objects, which are used to create and add the HTML <audio> tags to the page, and contain things like the tag name, sources, etc. The audio JSON contains a large number of AudioContainer objects, which, when constructed, add themselves to the page.
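Here's a rough sketch of that AudioContainer/Audio split in plain JavaScript. The real engine does this through GWT/JSNI, so the implementation below is an assumption: the field names (`name`, `sources`), the self-registration mechanism, and the flag-flipping in `play`/`pause` are all stand-ins for what would really be JSNI calls into the page's <audio> elements:

```javascript
// Static Audio object: looks up containers by name and drives
// playback. In the engine, play/pause are JSNI calls into the
// page's <audio> element; here we just flip a flag.
const Audio = {
  containers: {},
  register: function (container) {
    this.containers[container.name] = container;
  },
  play: function (name) {
    this.containers[name].playing = true;
  },
  pause: function (name) {
    this.containers[name].playing = false;
  }
};

// An AudioContainer wraps one JSON audio entry and, on construction,
// adds itself to the system (in the engine, it adds its tag to the page).
function AudioContainer(json) {
  this.name = json.name;
  this.sources = json.sources;
  this.playing = false;
  Audio.register(this);
}

// Build the <audio> markup this container would insert into the page.
AudioContainer.prototype.toTag = function () {
  const sources = this.sources
    .map(function (src) { return '<source src="' + src + '">'; })
    .join("");
  return '<audio id="' + this.name + '">' + sources + "</audio>";
};

// A JSON entry of the shape described above (field names assumed).
const jump = new AudioContainer({
  name: "jump",
  sources: ["audio/jump.ogg", "audio/jump.mp3"]
});

console.log(jump.toTag());
// <audio id="jump"><source src="audio/jump.ogg"><source src="audio/jump.mp3"></audio>
Audio.play("jump");
```

Listing multiple <source> children per tag is what lets the browser pick whichever format it supports, which is why each container carries a list of sources rather than a single file.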
It’s perhaps not the most straightforward system I’ve ever thought of. However, it’s the fourth time I’ve re-engineered the system, so I’m prepared to forgive myself. If you’ve got any bright ideas about how we should implement audio, I’d love to hear them (because I’m only like 60% happy with this solution).
Sorry for the somewhat rambling post today; college is starting again on Monday and I’m a little preoccupied with getting ready. Never fear, though, Flax development will keep going. A little thing like college won’t stop us.