Using GUI toolkits
Choosing a toolkit
Do you need a toolkit?
Processing
Languages with widgets
Csound
SuperCollider
Pure Data
Max/MSP
OSC interface
Mobile
interface.js
Previous knowledge
Language
Editors/Dev tools
Considerations
Platform
Capabilities
Licensing
Maintainability
Some options
Cocoa/Objective C
Windows APIs
Python or other scripting language?
many others
Cross-platform
Qt
qt-project.org
JUCE
rawmaterialsoftware.com
wxWidgets
wxwidgets.org
OpenGL
https://www.opengl.org/
Java
JavaScript
FLTK
fltk.org
etc.
General issues
GUI toolkits are generally event driven
This means that each graphical control triggers (in one way or another) a callback function
So events are asynchronous
They can then be stored in a lock free queue to be read by the engine
Or be passed directly to the audio class
If atomic operations are possible, the event can be processed straight away
(in most cases, even when atomic, these values might only have effect on the next callback)
If not, the event can be queued for the next callback
Always separate your audio engine from the GUI code
Better maintainability
Code reuse
Easier porting