MusiKraken app for iPhone and iPad


4.4 (8704 ratings)
Music, Productivity
Developer: Snarp
9.99 USD
Current version: 1.69, last update: 4 months ago
First release: 24 Nov 2020
App size: 93.09 MB

MusiKraken is an Experimental MIDI Controller Construction Kit.

Winner of the 2022 MIDI Innovation Awards in the category "Software Prototypes / Non-Commercial Products".

Make music using device sensors like Touch, Accelerometer, Microphone and Camera (face, hand, body joint and color tracking, plus depth sensor support), or connected devices like Game Controllers or an Apple Watch.

Choose from several types of modules in the editor and connect their ports to create your own MIDI controller setup. Route the MIDI signals through effect modules to control multiple instruments simultaneously or to generate creative combinations.

MusiKraken supports sending and receiving MIDI data over Wi-Fi, Bluetooth or Core MIDI (for example by using Inter-Device Audio and MIDI (IDAM)), and it can host Audio Unit Instruments.
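
MIDI over Wi-Fi builds on the network session that Core MIDI provides on iOS. As a rough, illustrative sketch (not MusiKraken's actual code), enabling that session looks like this:

```swift
import CoreMIDI

// Enable the system-wide Core MIDI network session used for MIDI over Wi-Fi.
let session = MIDINetworkSession.default()
session.isEnabled = true            // advertise this device on the local network
session.connectionPolicy = .anyone  // accept connections from any host

// The session exposes regular Core MIDI endpoints, so events sent to this
// destination travel over Wi-Fi to the connected peers.
let networkDestination: MIDIEndpointRef = session.destinationEndpoint()
```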

Important: Please note that some of the modules only work on iOS devices with specific hardware: ARKit Face Tracking and the TrueDepth module need a TrueDepth front camera, while Hand and Body Joint Tracking need at least iOS 14 and might be too slow on older devices.
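
For reference, a hypothetical capability check along these lines shows which platform features those requirements likely come from; the exact checks in the app may differ:

```swift
import ARKit
import Vision

func checkTrackingSupport() {
    // Face tracking needs the TrueDepth camera; ARKit exposes this directly.
    let faceTrackingAvailable = ARFaceTrackingConfiguration.isSupported
    print("TrueDepth face tracking available: \(faceTrackingAvailable)")

    if #available(iOS 14.0, *) {
        // Hand and body joint tracking rely on Vision pose requests (iOS 14+).
        let handRequest = VNDetectHumanHandPoseRequest()
        handRequest.maximumHandCount = 2   // one or two hands
        let bodyRequest = VNDetectHumanBodyPoseRequest()
        _ = (handRequest, bodyRequest)     // run these on camera frames via VNImageRequestHandler
    } else {
        print("Hand and Body Joint Tracking need iOS 14 or newer")
    }
}
```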

Input Modules:
-Keyboard: MPE support, scale highlighting, control multiple values by sliding and touch pressure or finger radius.
-Chords Pad: Play chords of a selected scale and their inversions.
-Touchpad: Slide multiple fingers on a touchpad to control up to 5 values simultaneously.
-ARKit Face Tracking: Uses the TrueDepth camera to generate notes or control parameters using your mouth, eyes, tongue or by moving your head.
-Mouth Tracking: Uses the normal camera to control parameters using your mouth.
-TrueDepth: Use the depth signal of your camera to generate MIDI data by moving your hand (or anything else) in front of the device.
-Hand Tracking: Uses the camera to convert your hand position and simple hand gestures to MIDI. You can use one or two hands and it can also use the TrueDepth sensor if available.
-Body Tracking: Uses the camera to track the hand, feet and head positions of up to two people.
-Color Tracking: Track objects by color.
-Motion Sensors: Combines the Accelerometer, Gyroscope and Magnetometer to detect the current rotation of the device (see the sketch after this list).
-Accelerometer: Measures the acceleration of the device.
-Apple Watch: Use the motion sensors, heart rate, touch and digital crown.
-Game Controller: Use buttons, thumbsticks and everything else.
-Microphone: Detects pitch and volume using the microphone.
-External Input Device: Receives MIDI events from connected input devices.
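
To give an idea of what an input module produces, here is an illustrative sketch (with a hypothetical `send` callback standing in for a value port) of how device rotation could be turned into a normalized 0...1 value:

```swift
import CoreMotion

let motionManager = CMMotionManager()

// Hypothetical mapping from device rotation to a normalized value, the kind of
// signal a green value port carries before it is converted to MIDI.
func startRollOutput(send: @escaping (Double) -> Void) {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Roll is roughly -π...π; normalize it to 0...1.
        let normalized = (attitude.roll / .pi + 1.0) / 2.0
        send(min(max(normalized, 0.0), 1.0))
    }
}
```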

Output Modules:
-External Output Device: Send MIDI events to any device accessible via Core MIDI.
-AudioUnit: Load any Music Device or MIDI Processor Audio Unit and generate sounds directly in the app (see the hosting sketch after this list).
-Core MIDI Network: Send MIDI via Wi-Fi.
-Snarp Network MIDI: My own implementation to send MIDI via Wi-Fi.
-SimpleSynth: Generate simple synthesizer sounds directly on the device.
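
Hosting Audio Unit instruments builds on the standard AUv3 APIs. The following is only a sketch of that mechanism, not the app's own hosting code:

```swift
import AVFoundation
import AudioToolbox

// Look for installed instrument ("Music Device") Audio Units.
let instrumentType = AudioComponentDescription(
    componentType: kAudioUnitType_MusicDevice,
    componentSubType: 0,            // 0 acts as a wildcard when searching
    componentManufacturer: 0,
    componentFlags: 0,
    componentFlagsMask: 0
)
let instruments = AVAudioUnitComponentManager.shared().components(matching: instrumentType)

// Instantiate one so it can be driven by incoming MIDI events.
if let component = instruments.first {
    AVAudioUnit.instantiate(with: component.audioComponentDescription, options: []) { audioUnit, error in
        // Attach `audioUnit` to an AVAudioEngine, connect it to the output,
        // and forward MIDI to it.
    }
}
```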

Effects:
-Chord Splitter: Splits incoming chords into separate notes and sends these to separate channels.
-Channel Switcher: Change the MIDI channel of events.
-Value to MIDI Converter: Converts numerical values (green ports) to MIDI events (orange ports); see the sketch after this list.
-Transposer: Transpose MIDI notes.
-Arpeggiator: Cycles through the notes of incoming chords in a specific pattern.
-Speed: Computes a new value based on the change speed of another value.
-Beat: Plays a beat when a value changes quickly.
-Threshold: Plays notes if a value goes over a specific threshold.
-LFO: Generates an LFO from incoming values.
-Envelope: Generates an ADSR output value based on note input.
-Latch: Keeps notes alive until the same note is played again.
-Filter: Filter events by channel, note, velocity or CC.
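
As a concrete (but hypothetical) example of what the Value to MIDI Converter does, the sketch below turns a normalized value into a Control Change event; the types are invented for illustration:

```swift
// A continuous value from a green value port (0.0...1.0) becomes a MIDI
// Control Change event for an orange MIDI port.
struct ControlChange {
    let channel: UInt8      // 0-15
    let controller: UInt8   // e.g. 1 = mod wheel
    let value: UInt8        // 0-127
}

func valueToCC(_ value: Double, controller: UInt8, channel: UInt8 = 0) -> ControlChange {
    let clamped = min(max(value, 0.0), 1.0)
    return ControlChange(channel: channel,
                         controller: controller,
                         value: UInt8(clamped * 127.0))
}

// Example: a mouth-open value of 0.5 from Face Tracking becomes CC#1 = 63.
let modWheel = valueToCC(0.5, controller: 1)
```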

HealthKit is used in the app to access Apple Watch data like heart rate or motion sensors. The app only uses this sensor data to generate MIDI events.
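
As a minimal sketch of the standard HealthKit flow this implies (assumed here, not taken from the app's code), heart rate access has to be authorized by the user before any samples can drive MIDI:

```swift
import HealthKit

let healthStore = HKHealthStore()

// Request read access to heart rate; nothing is written back to HealthKit.
if let heartRate = HKObjectType.quantityType(forIdentifier: .heartRate) {
    healthStore.requestAuthorization(toShare: nil, read: [heartRate]) { granted, _ in
        print("Heart rate access granted: \(granted)")
    }
}
```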