.SUSS


51327 bookmarks
Music Analysis API: Use Cyanite API to Analyze Music Data on Genre, Mood, Bpm & Key - Cyanite.ai
Genre: ambient, blues, classical, electronicDance, folkCountry, funkSoul, jazz, latin, metal, pop, rapHipHop, reggae, rnb, rock, singerSongwriter
Mood: aggressive, calm, chilled, dark, energetic, epic, happy, romantic, sad, scary, sexy, ethereal, uplifting
Sub-genre: bluesRock, folkRock, hardRock, indieAlternative, psychedelicProgressiveRock, punk, rockAndRoll, popSoftRock, abstractIDMLeftfield, breakbeatDnB, deepHouse, electro, house, minimal, synthPop, techHouse, techno, trance, contemporaryRnB, gangsta, jazzyHipHop, popRap, trap, blackMetal, deathMetal, doomMetal, heavyMetal, metalcore, nuMetal, disco, funk, gospel, neoSoul, soul, bigBandSwing, bebop, contemporaryJazz, easyListening, fusion, latinJazz, smoothJazz, country, folk
BPM and key: essentials for every library and DJ. We knew about the problem of double- or half-time BPM results, so we put in extra effort to prevent that issue and create one of the best BPM classifiers out there. Key is a no-brainer, and in addition to the dominant key, the API also lets you find the less dominant keys.
Voice: categorizes the audio as female singing voice, male singing voice, or instrumental (non-vocal).
Instrument: predicts the presence of the following instruments: percussion, synth, piano, acousticGuitar, electricGuitar, strings, bass, bassGuitar and brassWoodwinds.
Energy Level: indicates the intensity of an analyzed track, which can be variable, medium, high, or low.
Energy Dynamics: describes the progression of the Energy Level from low to high and variable.
Movement: describes the rhythmic structure of music at a high level: bouncy, driving, flowing, groovy, nonrhythmic, pulsing, robotic, running, steady, stomping.
Character: depicts an expressive quality of the music, oriented more toward its appearance than its mood. Provides the following labels: bold, cool, epic, ethereal, heroic, luxurious, magical, mysterious, playful, powerful, retro, sophisticated, sparkling, sparse, unpolished, warm.
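These classifier fields lend themselves to simple catalog filtering. A minimal sketch in Python, assuming a hypothetical flat dict per track whose keys mirror the fields listed above (the actual API response shape may differ):

```python
# Toy catalog with Cyanite-style analysis fields.
# The dict layout is an assumption for illustration only.
tracks = [
    {"title": "A", "genre": "electronicDance", "mood": "energetic",
     "bpm": 126, "key": "A minor", "energyLevel": "high"},
    {"title": "B", "genre": "ambient", "mood": "calm",
     "bpm": 70, "key": "C major", "energyLevel": "low"},
    {"title": "C", "genre": "electronicDance", "mood": "dark",
     "bpm": 128, "key": "F minor", "energyLevel": "high"},
]

def dj_pick(tracks, bpm_range=(120, 130), energy="high"):
    """Select track titles whose BPM and Energy Level fit a DJ set."""
    lo, hi = bpm_range
    return [t["title"] for t in tracks
            if lo <= t["bpm"] <= hi and t["energyLevel"] == energy]

print(dj_pick(tracks))  # -> ['A', 'C']
```

This is exactly the kind of query the double/half-time BPM correction matters for: a track misdetected at 63 instead of 126 BPM would silently fall out of the set.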
·cyanite.ai·
Analyzing Music Using Neural Network: 4 Essential Steps - Cyanite.ai
The 4 steps for analyzing music with neural networks:
1. Collecting data
2. Preprocessing audio data
3. Training the neural network
4. Testing and evaluating the network
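The four steps above can be sketched end to end with a toy stand-in. Below, a pure-Python nearest-centroid classifier replaces the real CNN, and the synthetic one-dimensional "features" stand in for audio data; both are assumptions made purely to keep the example self-contained and runnable:

```python
import random

random.seed(0)

# 1. Collect data: synthetic (feature, label) pairs standing in for
#    labeled audio clips. Class 0 clusters near 0.3, class 1 near 0.7.
data = ([(random.gauss(0.3, 0.05), 0) for _ in range(50)]
        + [(random.gauss(0.7, 0.05), 1) for _ in range(50)])
random.shuffle(data)

# 2. Preprocess: split into train/test sets. A real pipeline would
#    compute spectrograms or MFCCs here instead of using raw numbers.
train, test = data[:80], data[80:]

# 3. "Train": fit a nearest-centroid model (the stand-in for a CNN).
def fit(train):
    return {label: sum(x for x, y in train if y == label)
                   / sum(1 for _, y in train if y == label)
            for label in (0, 1)}

# 4. Test and evaluate: accuracy on the held-out examples.
def accuracy(centroids, test):
    correct = sum(
        1 for x, y in test
        if min(centroids, key=lambda c: abs(x - centroids[c])) == y
    )
    return correct / len(test)

model = fit(train)
print(f"test accuracy: {accuracy(model, test):.2f}")
```

The skeleton is the same regardless of model: only step 3 changes when a convolutional network replaces the centroid rule.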
Regarding our example of genre classification, recent research has shown that the accuracy of a CNN architecture (82%) can surpass human accuracy (70%), which is quite impressive. Depending on the specific task, accuracy can be even higher.
·cyanite.ai·
Translating Sonic Languages: Different Perspectives On Music Analysis - Cyanite.ai
Let’s say music is a code.
Music is systematically structured and categorised; it follows a strict “grammar”. It is not mysterious, but enigmatic. Music is auditory code: a code that needs to be deciphered and translated, and one we can process by technological means, like any other sign system. Unlike other codes, however, the code of music is not stable and predictable, but surprising and diverse.
One example is the digital scene “aurora” from the series Sound Data Sculpture Sketches. The creation process starts with a set of dots that move on a sphere. Over time their paths are traced to form tubes, which produces an organic appearance. A representation of the underlying song’s frequencies is texture-mapped onto the geometry of the tubes and used to generate colour gradients that react to the music. From this interpretative, digitally mediated translation of the original song, a dreamy audio-sculpture is created. By interpreting the musical parameters, this artwork goes further than a mere technical analysis. It thereby contemplates the poetry and beauty of the sonic language, seeking to visually formulate an accurate translation.
The whole theme of “translation” points to the fact that music is socially formalised and follows symbolic structures. Music is deeply connected to our human experience because it works like a language: it translates into emotion and bodily reactions. The notion that music is tangible and rests upon patterns we can calculate and process with digital technologies is not as weird or scary as it might seem. Music is a code – and that is a beautiful thing.
·cyanite.ai·
Introducing: Cyanite's Keyword Cleaning System for Music Libraries - Cyanite.ai
1. Objective catalog language (tagging): the entirety of keywords and tags, often described as a taxonomy or tag anthology (quantity, classes and wording). „Which tags do I use?“
2. Subjective catalog language (understanding of tagging): the understanding of tags and their connection to certain sound qualities. „When do I assign a certain tag?“
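The objective/subjective split can be made concrete in code: a fixed taxonomy answers “which tags do I use?”, while a normalization map encodes one answer to “when do I assign a certain tag?”. The tag names and synonym rules below are assumptions for illustration, not Cyanite's actual cleaning rules:

```python
# Objective catalog language: the fixed taxonomy ("which tags do I use?").
TAXONOMY = {"calm", "energetic", "dark", "uplifting"}

# Subjective catalog language: rules mapping messy, inconsistent
# keywords onto the taxonomy ("when do I assign a certain tag?").
SYNONYMS = {
    "chill": "calm", "relaxed": "calm",
    "high-energy": "energetic", "pumping": "energetic",
    "moody": "dark",
}

def clean_tags(raw_tags):
    """Normalize raw keywords to the taxonomy; drop anything unmappable."""
    cleaned = set()
    for tag in raw_tags:
        tag = tag.strip().lower()
        tag = SYNONYMS.get(tag, tag)
        if tag in TAXONOMY:
            cleaned.add(tag)
    return sorted(cleaned)

print(clean_tags(["Chill", "pumping", "banger", "dark"]))
# -> ['calm', 'dark', 'energetic']
```

Keeping the taxonomy and the synonym map as separate objects mirrors the article's point: the two kinds of catalog language drift independently and must be cleaned independently.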
·cyanite.ai·