Topic: Audialis: Sphere 2.0 low-level audio API
  • Fat Cerberus
  • Global Moderator
  • Sphere Developer
Audialis: Sphere 2.0 low-level audio API
So this is an idea prompted by NEO's earlier posting of his YM2612 emulator.  I realized that using it required a lower-level way to access the audio hardware than Sphere currently provides, so I came up with this idea.  I call the new API "Audial".

With Audial, you create Streams and Mixers instead of Sound objects.  The Stream is fed with audio data and played by way of a Mixer object.

I haven't implemented anything for this yet in minisphere, but my basic idea is something like this:
Code: (javascript)
var mixer = new Mixer(2, 44100, 16);  // stereo, 44.1kHz 16-bit
var stream = new Stream(2, 44100, 16);
// maybe do some pre-buffering...
stream.play();

// in update script...
// get the audio data into a buffer/bytearray somehow
stream.feed(buffer);
mixer.update();  // or maybe the engine can even do this?


Re: Audial: Sphere 2.0 low-level audio API
Reply #1
This reminds me: I've wanted to expose the image and sound loading functions to script, so that instead of an entire Sound or Surface object you just get an ArrayBuffer with the contents of the decoded data, along with a POJS object giving basic information about it. You'd then be able to load that buffer into a Sound or Surface. That last part already works; it's how loading tilesets and spritesets in script stays fast in the Turbo map engine.

Perhaps implementing a system like that would make this easier and more symmetric. It would make the API slightly more complex, but it would also make things much more flexible.
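
As a rough sketch of the shape of that idea (LoadSoundData and the descriptor fields below are hypothetical names, nothing that exists yet, and the Stream constructor is just the one from the sketch in the first post):
Code: (javascript)
// hypothetical only: decode a sound file to raw data plus a descriptor
var decoded = LoadSoundData("explosion.ogg");
var info = decoded.info;  // POJS descriptor, e.g. { channels, frequency, bits }
var pcm = decoded.data;   // ArrayBuffer of raw decoded samples

// hand the raw samples to a Stream using the constructor proposed above
var stream = new Stream(info.channels, info.frequency, info.bits);
stream.feed(pcm);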

  • Fat Cerberus
  • Global Moderator
  • Sphere Developer
Re: Audial: Sphere 2.0 low-level audio API
Reply #2
Actually, I'm beginning to wonder whether the Mixers are really necessary or just bloat.  It might be better to have Streams mixed internally, like Sounds already are.  I can't really think of a use case where having more than one Mixer would be useful.

  • Fat Cerberus
  • Global Moderator
  • Sphere Developer
Re: Audial: Sphere 2.0 low-level audio API
Reply #3
Well, initial tests are promising!

Code: (javascript)
var stream = new Stream(44100, 16);
var ba = new ByteArray(44100 * 2 * 2);
for (var i = 0; i < ba.size; ++i)
    ba[i] = RNG.range(0, 255);  // fill the buffer with random bytes
stream.feed(ba);


The above produces about a second of choppy white noise. ;D
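
For a less grating test, the same setup can produce an actual tone.  A minimal sketch, assuming signed 16-bit little-endian samples interleaved as stereo (an assumption about the format, not anything settled):
Code: (javascript)
var stream = new Stream(44100, 16);
var ba = new ByteArray(44100 * 2 * 2);   // same size as the noise test
var numFrames = ba.size / 4;             // 2 bytes per sample x 2 channels
for (var i = 0; i < numFrames; ++i) {
    // 440 Hz sine wave scaled to the signed 16-bit range
    var sample = Math.round(Math.sin(2 * Math.PI * 440 * i / 44100) * 32767);
    var lo = sample & 0xFF, hi = (sample >> 8) & 0xFF;
    ba[i * 4]     = lo;  // left channel, low byte
    ba[i * 4 + 1] = hi;  // left channel, high byte
    ba[i * 4 + 2] = lo;  // right channel, low byte
    ba[i * 4 + 3] = hi;  // right channel, high byte
}
stream.feed(ba);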

Re: Audial: Sphere 2.0 low-level audio API
Reply #4
Only signed integer support, or will floating point samples be possible?

Also, Duktape could really use ArrayBuffers...

  • Fat Cerberus
  • Global Moderator
  • Sphere Developer
Re: Audial: Sphere 2.0 low-level audio API
Reply #5
I started out with just signed int for now, just to test it.  This is how I develop things so fast: I start with the bare minimum necessary to get results, and once I see that it works, I build on it.  It's much more productive than trying to set up a whole framework ahead of time, and more fun to boot. :)

As for ArrayBuffers, they will be in the next Duktape release.  And I get the impression that's not too far off either. :D

Re: Audial: Sphere 2.0 low-level audio API
Reply #6
Just checking that floating point is in the ether.

  • N E O
  • Administrator
  • Senior Administrator
Re: Audial: Sphere 2.0 low-level audio API
Reply #7
Personally, I like how Web Audio is set up as an API (I haven't yet seen the internals for it), especially the "ScriptProcessorNode"s. Quetzal, by emulator author Cydrak, is a great example (search for "onaudioprocess" to find the function that sets up the connections): it's a MIDI sequence player that uses an OPL2 emulator he originally wrote for desktop and later ported to JS, and I LOVE IT.
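
For anyone who hasn't played with it, the basic pattern looks roughly like this (a minimal noise-generating sketch of a ScriptProcessorNode; the buffer size and channel count are just illustrative):
Code: (javascript)
var ctx = new AudioContext();
// 4096-sample buffer, no inputs, one output channel
var node = ctx.createScriptProcessor(4096, 0, 1);
node.onaudioprocess = function (event) {
    var out = event.outputBuffer.getChannelData(0);
    for (var i = 0; i < out.length; ++i)
        out[i] = Math.random() * 2 - 1;  // float samples in the -1..1 range
};
node.connect(ctx.destination);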

  • Fat Cerberus
  • Global Moderator
  • Sphere Developer
Re: Audialis: Sphere 2.0 low-level audio API
Reply #8
I decided to rename Audial to Audialis.  It's a much catchier name.

As for that JS MIDI player, that's awesome.  Now it's driving me nuts trying to place MOON.MID.  I know the tune well... but I can't remember what it's from!

  • DaVince
  • Administrator
  • Used Sphere for, like, half my life
Re: Audialis: Sphere 2.0 low-level audio API
Reply #9
Sailor Moon! And wow, I'm impressed with this too.

  • Fat Cerberus
  • Global Moderator
  • Sphere Developer
Re: Audialis: Sphere 2.0 low-level audio API
Reply #10
For now I decided to just use 8-bit unsigned for all Audialis streams, as that's the easiest to manipulate in Sphere until Duktape has proper TypedArray support.
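
To illustrate what that looks like in practice, here's a small sketch; the constructor arguments are assumptions, and only the unsigned 8-bit sample format is the part that's actually decided:
Code: (javascript)
var stream = new Stream(22050, 8);  // assumed (frequency, bits) arguments
var ba = new ByteArray(22050);      // one second of unsigned 8-bit mono
for (var i = 0; i < ba.size; ++i) {
    // unsigned 8-bit: silence sits at 128, so alternate above and below it
    // every 50 samples for a ~220 Hz square wave
    ba[i] = (Math.floor(i / 50) % 2 == 0) ? 192 : 64;
}
stream.feed(ba);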

  • FBnil
Re: Audialis: Sphere 2.0 low-level audio API
Reply #11
Nice link, Neo.

  • Fat Cerberus
  • Global Moderator
  • Sphere Developer
Re: Audialis: Sphere 2.0 low-level audio API
Reply #12
So I ended up implementing Mixers in minisphere 1.5 after all, despite my initial misgivings about bloat.  I realized that they do have their uses: for instance, if you want to make a game where the sound and music volume can be adjusted separately, you can create a separate mixer for each and adjust its volume directly, rather than adjusting every sound manually.

Classic Sounds can take advantage of Mixers as well.  For backwards compatibility, however, a Sound doesn't need to have a mixer explicitly provided; a default one is used if none is given.  SoundStreams, on the other hand, being a 2.0 API, must have their mixer specified explicitly.  Of course, you can still use the default (CD quality) mixer via GetDefaultMixer().

So basically, Audialis is the audio equivalent to Galileo.  A Mixer is a Group, and a Sound or SoundStream is a Shape. :)
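
For a concrete picture of the separate-volume use case, a sketch (the Mixer and Sound constructor arguments and the volume property here are assumptions, not confirmed 1.5 signatures):
Code: (javascript)
// two mixers with assumed (frequency, bits, channels) arguments
var musicMixer = new Mixer(44100, 16, 2);
var soundMixer = new Mixer(44100, 16, 2);
// hypothetical filenames; each Sound is routed through one of the mixers
var bgm = new Sound("overworld.ogg", musicMixer);
var hit = new Sound("hit.wav", soundMixer);
// turning down musicMixer affects everything routed through it and nothing
// routed through soundMixer (the property name is an assumption)
musicMixer.volume = 0.5;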