Flax HTML5 Game Engine Development Diary Part 7
So, as Ciarán’s mentioned, we’re back at work on Flax. This is, to put it lightly, a good thing. Neither of us wants Flax to turn out like so many other student projects (that is, “we’re making a magical [insert project here], and it’s going to be made of unicorn tears and fairies, if we ever actually do it”). That’s why we armed ourselves with whiteboards and the rationalisation that anything we do outside college carries the same weight as, if not more than, whatever it is we’re told to do by the college. Initiative, right?
We started fairly slowly, rewriting the way the Flax Engine handles files (which only took two hours or so, something I’m proud of). Then we moved on to rewriting and/or fixing how it handles audio. That’s what I’ll be talking about today, because the way Flax has handled audio has been my pet hatred since the first time I thought about it.
It made sense at the time…
A disclaimer: The old way I did this was done over a weekend, after virtually no thought. I don’t condone doing it that way; the new method makes far more sense.
I documented the old way I did audio in this blog post (coincidentally, I also talked about file handling there, which, as I said above, we also rewrote. That says something about me, doesn’t it?), but I’ll quickly summarise it: you made an Audio JSON through Weave; that Audio JSON constructed an Audio object, which contained AudioContainers; and those AudioContainers played/paused/et cetera themselves when told to.
On top of the ridiculous complexity of that structure, I didn’t actually test any of it. Theoretically, it worked. Theoretically, I can consistently think of witty remarks revolving around things that work in theory.
Now with added less complexity!
Now, the audio is handled by the map. I know that, if you think about it in a certain way, that makes no sense. To put it in different terms: Why would an atlas also tell you what somewhere sounded like? To that I say: Wouldn’t atlases be far more awesome if they did?
On to the technical details. As you might know, our map file is actually a JSON file full of serialised objects, be they tiles, monsters, et cetera. Perhaps you want a tile to make a sound when you step on it, or a monster to shriek horribly when you kill it in front of its family. You essentially just add the path to the correct audio file at the right place in the code, and pow, it works. How easy is that? There’s no more worrying about setting up an AudioContainer object; now it’s just Audio.play("path/to/audio.mp3"). (Audio is a static service, in case you’re wondering.) It even (rather, it will even) handle encodings transparently, so it’ll work in browsers that don’t like the MP3 format (I’m looking at you, Firefox), as long as you also provide the other encoding. If you’ve got two or even three encodings, it will order the source elements inside the audio tag so that each browser gets a format it can actually play.
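For the curious, here’s a rough sketch of how a static audio service along those lines could work. To be clear, this is my own illustration, not Flax’s actual code: the names (FORMATS, orderSources), the two-format list, and the canPlayType-based probing are all assumptions about one reasonable way to do it.

```javascript
// Candidate encodings, in our order of preference. Hypothetical list:
// a real engine would register whatever encodings it ships with.
var FORMATS = [
  { ext: ".mp3", mime: "audio/mpeg" },
  { ext: ".ogg", mime: "audio/ogg" }
];

// Pure helper: given an audio path and a canPlay predicate (in the
// browser, a wrapper around HTMLMediaElement.canPlayType), return
// source descriptors ordered so the formats this browser supports
// come first. The caller is expected to have the sibling files
// (step.mp3, step.ogg, ...) actually sitting on disk.
function orderSources(path, canPlay) {
  var base = path.replace(/\.[^.]+$/, ""); // strip the extension
  return FORMATS
    .filter(function (f) { return canPlay(f.mime); })
    .map(function (f) { return { src: base + f.ext, type: f.mime }; });
}

// The static service itself: build an <audio> element with one
// <source> child per playable encoding, then play it.
var Audio = {
  play: function (path) {
    var probe = document.createElement("audio");
    var el = document.createElement("audio");
    orderSources(path, function (mime) {
      return probe.canPlayType(mime) !== "";
    }).forEach(function (s) {
      var source = document.createElement("source");
      source.src = s.src;
      source.type = s.type;
      el.appendChild(source);
    });
    el.play();
    return el;
  }
};

// Usage: a tile's on-step handler might just call
// Audio.play("sounds/step.mp3");
```

The nice property of keeping orderSources pure is that the fallback logic can be exercised without a browser at all; Audio.play is then just a thin DOM wrapper over it.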
How’s that for not being complex? Still kind of bad? If you’ve got any suggestions, I’d like to hear them, be they about the way we program Flax or about the particular coffee that I’m currently drinking (some anonymous beans from Indonesia from the last trip there, thanks for asking).