iTunes Visualization -- What type of code is it written in and what does that code look like?
Being a web developer, I know how event-driven user interfaces are written, but I don't have insight into other families of code (embedded software like automotive software, automation software on assembly lines, drivers, or the crawling lower-thirds on CNN, etc.).
I was looking at the iTunes visualizer (example) and am curious:
What code is used to write the visualizer? Objective C?
Does it use Core Animation? What type of abstraction does that library offer?
What does the code look like? Is it a list of mathematical equations for producing the crazy graphics? Is it a list of key frames with tweening? Is there an array of images, fractals, wormholes, flowers, and sparkles, with some magic mixing them together? Or something totally different?
I am not looking for a tutorial, just an understanding of how something very different than web development works.
Oh yeah, I know iTunes is closed source, so all of this is conjecture.
As for the specific iTunes visualizer, I think it was created by Flight404 in Cinder. Google his site. I think he did it with Andrew Bell a few years ago.
Although the default iTunes visualizer is written in Objective-C/C++, you can also write iTunes visualizers using Quartz Composer, which is included with Xcode on the Mac. It is a node-based compositing environment for visual effects, and it includes a template for creating music visualizers.
In my opinion:
Probably C and/or Objective-C with OpenGL.
Possibly. Core Animation provides layers (images) that can be animated very easily and efficiently (fade-in, fade-out, translation, rotations, etc.). It probably uses the same hardware acceleration as OpenGL does. These layers may be used for transitions in the visualizer.
Your guess is as good as mine, but you're probably right. They may use a set of mathematical equations that takes a number of variables as input (such as the amplitude of the sound) and produces an image.
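To make that idea concrete, here's a minimal sketch (hypothetical, not Apple's actual code) of how per-frame equations might map an audio buffer plus a clock to drawing parameters; the parameter names and formulas are invented for illustration:

```python
import math

def frame_params(samples, t):
    """Map one buffer of audio samples and a time value t (seconds)
    to drawing parameters, as a visualizer's per-frame equations might.
    Hypothetical sketch -- the names and constants are made up."""
    # Loudness: root-mean-square amplitude of the buffer.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return {
        "radius": 50 + 200 * rms,                 # louder -> bigger shape
        "hue": (t * 0.1) % 1.0,                   # slow colour cycling over time
        "spin": 2 * math.pi * rms * math.sin(t),  # amplitude-driven rotation
    }

# A quiet buffer vs. a loud one produces visibly different parameters:
quiet = frame_params([0.01] * 512, t=1.0)
loud = frame_params([0.8] * 512, t=1.0)
```

A real visualizer would evaluate something like this 60 times a second and feed the results into its renderer (OpenGL, in the theory above).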
iTunes appears to use the G-Force visualizer (or at least, G-Force was licensed for use in iTunes 8.x):
http://en.wikipedia.org/wiki/Music_visualization
http://en.wikipedia.org/wiki/SoundSpectrum
In a more general sense, visualizations are typically combinations of geometric elements whose parameters are linked to certain sound measurements (volume, pitch, et cetera), waveforms, and spectrum graphs, with various visual transformations/filters layered on top of those source elements. That's why you tend to see a lot of squiggly lines in visualizations: they're a common way of representing waveforms and spectra.
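The spectrum-graph part can be sketched in a few lines: reduce an audio buffer to a handful of bar heights by taking frequency-bin magnitudes and grouping them into bands. This uses a naive DFT for clarity (real code would use an FFT library); the function and its band-grouping scheme are illustrative, not any particular visualizer's method:

```python
import cmath
import math

def spectrum_bars(samples, n_bars=8):
    """Reduce an audio buffer to n_bars bar heights, the way a
    spectrum-style visualizer element does. Naive DFT, O(n^2),
    for illustration only."""
    n = len(samples)
    # Magnitude of each frequency bin (first half only; the rest mirrors it).
    mags = []
    for k in range(n // 2):
        acc = sum(samples[i] * cmath.exp(-2j * math.pi * k * i / n)
                  for i in range(n))
        mags.append(abs(acc) / n)
    # Group bins into n_bars bands and take each band's peak as the bar height.
    band = len(mags) // n_bars
    return [max(mags[b * band:(b + 1) * band]) for b in range(n_bars)]

# A pure tone landing exactly on bin 8 lights up one band (bins 8-15),
# leaving the others near zero -- the classic single spiking bar.
n = 128
tone = [math.sin(2 * math.pi * 8 * i / n) for i in range(n)]
bars = spectrum_bars(tone)
```

Each bar height would then drive the size of some on-screen element, which is exactly the "parameters linked to sound measurements" idea above.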
Somewhat related: if you want to learn how to do "ol' skool" visualizations like iTunes or, for those who remember, Winamp, and do it in JavaScript (which you could then put through something like React Native or NativeScript to compile for iOS or Android), see this very good talk by Ruth John at the CascadiaJS 2018 conference in Seattle, WA:
https://www.youtube.com/watch?v=Dt4I-96C-pg