How To Make a Music Visualizer in iOS
Learn how to create your own music visualizer!
In the mid-seventies, Atari released the Atari Video Music, a device that connected a stereo to a television and produced abstract images in sync with the music. Consumers could manipulate the images by twisting knobs and pushing buttons on the device.
The device was a market failure but it was the first time that the world was exposed to music visualization. Now, music visualization is a common technology that can be found in almost every digital media player such as iTunes or Windows Media Player.
To see an example of music visualization in action, simply launch iTunes, start a good tune, then choose View/Show Visualizer and allow the psychedelics to free your mind! :]
In this tutorial, you’ll create your very own music visualizer. You’ll learn how to configure the project to play music as well as support background audio and to create particle effects using UIKit’s particle system. You’ll also learn how to make those particles dance to the beat of a song.
So cue up the music and break out the disco ball, things are about to get visual!
Note: You can try out most of the tutorial using the iPhone Simulator, but you will need to run the project on a device to select different songs and to play the music in the background.
Starter project
To start things off, download this starter project. The starter project has the following functionality:
- It provides a simple user interface for the application.
- The supported interface orientation is set to landscape.
- The MediaPlayer.framework has been added to the project.
- It contains a method which allows you to pick songs from your iPod library.
- An image named particleTexture.png was added to the project for use by the particle system.
- The MeterTable.h and MeterTable.cpp C++ files were also added to the project. These were taken from the Apple sample project avTouch, and will be explained later on in this tutorial.
First, extract the downloaded project, open it in Xcode, and build and run. You should see the following:
You can tap the play button to switch between play and pause modes, but you won't hear any music until after you've added some code. Tap on the black area in the middle to hide or show the navigation bar and toolbar.
If you’re running in the iPhone Simulator and tap the magnifying glass icon on the bottom left, you’ll see the following warning:
This is because the iPhone Simulator doesn’t support accessing the music library. But if you are running on a device, a tap on that icon will make the media picker appear, so that you can choose a song.
Once you are familiar with the user interface, let’s get started.
Let the Music Play
Using AVAudioPlayer is an easy way to play music on an iOS device. AVAudioPlayer can be found in the AVFoundation.framework, so you need to add this framework to your project.
Note: If you are interested in learning more about the AVAudioPlayer class and what it can do, take a look at our Audio 101 for iPhone Developers: Playing Audio Programatically tutorial.
Select iPodVisualizer in the Project Navigator and then select iPodVisualizer under TARGETS. Choose the Build Phases tab, expand the Link Binary With Libraries section, then click the + (plus) button.
Search for AVFoundation.framework in the pop-up list, select it, and click Add. The framework should now appear in your project.
It’s time to write some code. Open ViewController.m and make the following changes:
// Add to the #imports section at the top of the file
#import <AVFoundation/AVFoundation.h>

// Add a property for the player (the strong, nonatomic attributes are assumed)
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;
This imports the AVFoundation.h header file so you can access AVAudioPlayer, and then adds a property that will hold the AVAudioPlayer instance your app will use to play audio.
And now, it’s time to play a music file.
The starter project includes a music file named DemoSong.m4a in the Resources folder that you can use. Feel free to use a different audio file if you’d like. Just remember, only the following audio codecs are supported on iOS devices for playback:
- AAC (MPEG-4 Advanced Audio Coding)
- ALAC (Apple Lossless)
- HE-AAC (MPEG-4 High Efficiency AAC)
- iLBC (internet Low Bitrate Codec, a format for speech)
- IMA4 (IMA/ADPCM)
- Linear PCM (uncompressed, linear pulse-code modulation)
- MP3 (MPEG-1 audio layer 3)
- µ-law and a-law
Still in ViewController.m, add the following method:
- (void)configureAudioPlayer {
    // Build a URL for the bundled demo song
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:@"DemoSong" withExtension:@"m4a"];
    // Create the player with that URL and make it loop forever
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL error:nil];
    [_audioPlayer setNumberOfLoops:-1];
}
This method creates a reference to the music file and stores it as audioFileURL. It then creates a new AVAudioPlayer instance initialized with the audioFileURL and sets its numberOfLoops property to -1 to make the audio loop forever.
Note: If you decide to use a music file other than the provided one, do remember to add the new file to the Xcode project and to change the music file name (and perhaps the extension) in the above method.
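For example, if you added a hypothetical MySong.mp3 to the project, the URL line in configureAudioPlayer would become something like this:

NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:@"MySong" withExtension:@"mp3"];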
Add the following line to the end of viewDidLoad:
[self configureAudioPlayer];
By calling configureAudioPlayer in viewDidLoad, you set up the audio player as soon as the view loads, so you can press the play button on app start and have the app play your song.
Now add the following line inside playPause, just after the comment that reads // Pause audio here:
[_audioPlayer pause];
Next, add the following line in the same method, just after the comment that reads // Play audio here:
[_audioPlayer play];
Tapping the play/pause button calls playPause. The code you just added tells audioPlayer to play or pause according to its current state as defined by _isPlaying. As the name indicates, this property identifies whether the audio player is currently playing audio or not.
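For context, here's a rough sketch of how the complete playPause method might read once both lines are in place. The method signature, the _isPlaying bookkeeping, and the icon switching all belong to the starter project, so treat everything except the two lines you added as an approximation:

- (IBAction)playPause:(id)sender {
    if (_isPlaying) {
        // Pause audio here
        [_audioPlayer pause];
        // ... starter code swaps the button icon back to "play" ...
    } else {
        // Play audio here
        [_audioPlayer play];
        // ... starter code swaps the button icon to "pause" ...
    }
    _isPlaying = !_isPlaying;   // assumed: the starter toggles this flag here
}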
Now build and run. If you did everything correctly the app will look exactly the same. But now you can play/pause your music.
Take this brief moment to get your funk on! :]
Selecting a Song
A music player that just plays one song, no matter how cool that song may be, isn’t very useful. So you’ll add the ability to play audio from the device’s music library.
If you don’t plan on running on a device, or know how to set that up already, you can skip to the next section.
The starter project you downloaded is set up so that when the user chooses a song from the media picker, a URL for the selected song is passed to playURL: inside ViewController.m. Currently, playURL: just toggles the icon on the play/pause button.
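That handoff happens in the starter project's MPMediaPickerControllerDelegate callback. The delegate method and media item properties below are real MediaPlayer.framework API, but since the starter's code isn't shown in this tutorial, the body is only a sketch of how it is typically done:

- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    [self dismissViewControllerAnimated:YES completion:nil];
    MPMediaItem *item = [mediaItemCollection representativeItem];
    NSURL *url = [item valueForProperty:MPMediaItemPropertyAssetURL];
    if (url) {
        [self playURL:url];   // hand the selected song's URL to playURL:
    }
}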
Inside ViewController.m, add the following code to playURL:, just after the comment that reads // Add audioPlayer configurations here:
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[_audioPlayer setNumberOfLoops:-1];
[_audioPlayer play];   // assumed: picking a song starts playback right away
The above code is much the same as what you wrote in configureAudioPlayer. However, instead of hardcoding the filename, you create a new AVAudioPlayer instance with the URL passed into the method.
Build and run on a device, and you’ll be able to choose and play a song from your music library.
Note: If you have iTunes Match, you may see items in the media picker that are not actually on your device. If you choose a song that is not stored locally, the app dismisses the media picker and does not play the audio. So if you want to hear (and soon see) something, be sure to choose a file that’s actually there :]
While running the project on a device, press the home button. You’ll notice that your music is paused. This isn’t a very good experience for a music player application, if a music player is what you’re after.
You can configure your app so that the music will continue to play even when the app enters the background. Keep in mind that this is another feature not supported in the iPhone Simulator, so run the app on a device if you want to see how it works.
To play music in the background, you need to do two things: set the audio session category, then declare the app as supporting background execution.
First, set the audio session category.
An audio session is the intermediary between your application and iOS for configuring audio behavior. Configuring your audio session establishes basic audio behavior for your application. You set your audio session category according to what your app does and how you want it to interact with the device and the system.
Add the following new method to ViewController.m:
- (void)configureAudioSession {
    // Mark this app's audio session as playback so audio keeps playing in the background
    NSError *error = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error) {
        NSLog(@"Error setting audio session category: %@", error);
    }
}
In configureAudioSession, you get the audio session using [AVAudioSession sharedInstance] and set its category to AVAudioSessionCategoryPlayback. This identifies that the current audio session will be used for playing back audio (as opposed to recording or processing audio).
Add the following line to viewDidLoad, just before the call to [self configureAudioPlayer]:
[self configureAudioSession];
This calls configureAudioSession to configure the audio session.
Note: To learn more about audio sessions, read Apple’s Audio Session Programming Guide. Or take a look at our Background Modes in iOS Tutorial which also covers the topic, albeit not in as much detail.
Now you have to declare that your app supports background execution.
Open iPodVisualizer-Info.plist (it's in the Supporting Files folder), select the last line, and click the plus button to add a new item. Select Required background modes as the Key from the dropdown, and the type of the item will change to Array automatically. (If it does not automatically become Array, double-check the Key.)
Expand the item, set the value of Item0 to App plays audio. (If you have a wide Xcode window, you might not notice that the value is a dropdown list. But you can access the list by simply tapping the dropdown icon at the end of the field.)
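If you prefer to edit the raw plist source instead of the property list editor, the same setting corresponds to the UIBackgroundModes key (the raw name behind Required background modes):

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>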
When you are done, build and run on a device, pick a song and play it, press the home button, and this time your music should continue to play without interruption even if your app is in the background.
Visualizing with Music
Your music visualizer will be based on a UIKit particle system. If you don't know much about particle systems, you may want to read UIKit Particle Systems In iOS 5 or How To Make a Letter / Word Game with UIKit: Part 3/3 to familiarize yourself with the necessary background information; this tutorial does not go into detail explaining particle system basics.
First, add the QuartzCore.framework to your project (the same way you added the AVFoundation.framework).
Now choose File/New/File…, and select the iOS/Cocoa Touch/Objective-C class template. Name the class VisualizerView, make it a subclass of UIView, click Next and then Create.
Select VisualizerView.m in the Xcode Project Navigator and change its extension from .m to .mm. (You can rename it by clicking the file twice slowly in the Project Navigator. That is, do not click it fast enough to be considered a double-click.) The .mm extension tells Xcode that this file needs to be compiled as Objective-C++, which is necessary because it will later access the C++ class MeterTable.
Open VisualizerView.mm and replace its contents with the following:
#import "VisualizerView.h" |
The above code mainly configures a UIKit particle system, as follows:
- Overrides layerClass to return CAEmitterLayer, which allows this view to act as a particle emitter.
- Shapes the emitter as a rectangle that extends across most of the center of the screen. Particles are initially created within this area.
- Creates a CAEmitterCell that renders particles using particleTexture.png, included in the starter project.
- Sets the particle color, along with a range by which each of the red, green, and blue color components may vary.
- Sets the speed at which the color components change over the lifetime of the particle.
- Sets the scale and the amount by which the scale can vary for the generated particles.
- Sets the amount of time each particle will exist to between .75 and 1.25 seconds, and sets it to create 80 particles per second.
- Configures the emitter to create particles with a variable velocity, and to emit them in any direction.
- Adds the emitter cell to the emitter layer.
Again, read the previously mentioned tutorials if you would like to know more about the fun things you can do with UIKit particle systems and how the above configuration values affect the generated particles.
Next open ViewController.m and make the following changes:
//Add with the other imports
#import "VisualizerView.h"

//Add a property for the visualizer (the strong, nonatomic attributes are assumed)
@property (strong, nonatomic) VisualizerView *visualizer;
Now add the following to viewDidLoad, just before the line that reads [self configureAudioPlayer];:
self.visualizer = [[VisualizerView alloc] initWithFrame:self.view.frame];
[_backgroundView addSubview:_visualizer];
This creates a VisualizerView instance that will fill its parent view and adds it to _backgroundView. (_backgroundView was defined as part of the starter project, and is just a view layered behind the music controls.)
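If you want to double-check the placement of everything added so far, viewDidLoad should now read roughly like this sketch (the starter project's own setup is elided with a comment):

- (void)viewDidLoad {
    [super viewDidLoad];
    // ... existing starter-project setup ...

    self.visualizer = [[VisualizerView alloc] initWithFrame:self.view.frame];
    [_backgroundView addSubview:_visualizer];

    [self configureAudioSession];
    [self configureAudioPlayer];
}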
Build and run, and you will see the particle system in action immediately:
While that looks very cool indeed, you want the particles to “beat” in sync with your music. This is done by changing the size of particles when the decibel level of the music changes.
First, open VisualizerView.h and make the following changes:
//Add with the other imports
#import <AVFoundation/AVFoundation.h>

//Add inside the @interface (the strong, nonatomic attributes are assumed)
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;
The new property will give your visualizer access to the app’s audio player, and hence the audio levels, but before you can use that information, you need to set up one more thing.
Switch to ViewController.m and search for setNumberOfLoops. If you skipped the section about running on the device, it will appear only once (in configureAudioPlayer); otherwise, it will appear twice (in configureAudioPlayer and in playURL:).
Add the following code just after each occurrence of the [_audioPlayer setNumberOfLoops:-1]; line:
[_audioPlayer setMeteringEnabled:YES];
[_visualizer setAudioPlayer:_audioPlayer];
With the above code, you instruct the AVAudioPlayer instance to make audio-level metering data available. You then pass _audioPlayer to the _visualizer so that it can access that data.
Now switch to VisualizerView.mm and modify it as follows:
// Add with the other imports |
The above code gives you access to a MeterTable instance named meterTable. The starter project includes the C++ class MeterTable, which you'll use to help process the audio levels from AVAudioPlayer.
What’s all this talk about metering? It should be easy to understand once you see the image below:
You’ve most likely seen something similar on the front of a sound system, bouncing along to the music. It simply shows you the relative intensity of the audio at any given time. MeterTable
is a helper class that can be used to divide decibel values into ranges used to produce images like the one above.
You will use MeterTable to convert values into a range from 0 to 1, and you will use that new value to adjust the size of the particles in your music visualizer.
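To make that mapping concrete, here's a small sketch of how the class gets used; the constructor defaults are assumptions based on Apple's avTouch sample rather than anything this tutorial specifies:

MeterTable meterTable;                     // assumed default: roughly -80 dB and below maps to 0
float loud  = meterTable.ValueAt(-5.0f);   // a strong signal maps close to 1.0
float quiet = meterTable.ValueAt(-60.0f);  // a quiet signal maps close to 0.0

Because MeterTable is plain C++, any file that includes it (like VisualizerView.mm) has to be compiled as Objective-C++, which is why you renamed the file earlier.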
Add the following method to VisualizerView.mm:
- (void)update
{
    // Default scale when nothing is playing
    float scale = 0.5;
    if (_audioPlayer.playing) {
        // Refresh AVAudioPlayer's metering data for the current moment
        [_audioPlayer updateMeters];

        // Average the decibel power across all audio channels
        float power = 0.0f;
        for (int i = 0; i < (int)[_audioPlayer numberOfChannels]; i++) {
            power += [_audioPlayer averagePowerForChannel:i];
        }
        power /= [_audioPlayer numberOfChannels];

        // Map the decibel value into the 0..1 range and accentuate it
        float level = meterTable.ValueAt(power);
        scale = level * 5.0f;
    }

    // Apply the new scale to the emitter's particles
    [emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.scale"];
}
Each time the above method is called, it updates the size of the visualizer’s particles. Here’s how it works:
- You set scale to a default value of 0.5 and then check whether or not _audioPlayer is playing.
- If it is playing, you call updateMeters on _audioPlayer, which refreshes the AVAudioPlayer data based on the current audio.
- This is the meat of the method. For each audio channel (e.g. two for a stereo file), the average power for that channel is added to power. The average power is a decibel value. After the powers of all the channels have been added together, power is divided by the number of channels. This means power now holds the average power, or decibel level, for all of the audio.
- Here you pass the calculated average power value to meterTable's ValueAt method. It returns a value from 0 to 1, which you multiply by 5 and then set as the scale. Multiplying by 5 accentuates the music's effect on the scale.
- Finally, the scale of the emitter's particles is set to the new scale value. (If _audioPlayer was not playing, this will be the default scale of 0.5; otherwise, it will be some value based on the current audio levels.)

Note: Why use meterTable to convert power's value? Because it simplifies the code you have to write. Otherwise, your code would have to cover the broad range of values returned by averagePowerForChannel:. A return value of 0 indicates full scale, or maximum power; a return value of -160 indicates minimum power (that is, near silence). But the signal provided to the audio player may actually exceed the range of what's considered full scale, so values can still go beyond those limits. Using meterTable gives you a nice value from 0 to 1. No fuss, no muss.
Right now your app doesn’t call update
and so the new code has no effect. Fix that by modifyinginitWithFrame:
in VisualizerView.mm by adding the following lines just after emitterLayer.emitterCells = @[cell];
(but still inside the closing curly brace):
CADisplayLink *dpLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(update)];
[dpLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
Here you set up a CADisplayLink. A CADisplayLink is a timer that allows your application to synchronize its drawing to the refresh rate of the display. That is, it behaves much like an NSTimer with a 1/60-second time interval, except that it's guaranteed to fire each time the device prepares to redraw the screen, which is usually at a rate of 60 times per second.
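For comparison, the same job done with a plain NSTimer would look like the sketch below, although it wouldn't be locked to the display's refresh the way CADisplayLink is:

[NSTimer scheduledTimerWithTimeInterval:1.0 / 60.0
                                 target:self
                               selector:@selector(update)
                               userInfo:nil
                                repeats:YES];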
The first of the two lines you added creates an instance of CADisplayLink set up to call update on the target self. That means it will call the update method you just defined during each screen refresh.
The second line calls addToRunLoop:forMode:, which starts the display link timer.
Note: Adding the CADisplayLink to a run loop is a low-level concept related to threading. For this tutorial, you just need to understand that the CADisplayLink will be called for every screen update. But if you want to learn more, you can check out the class references for CADisplayLink or NSRunLoop, or read through the Run Loops chapter in Apple's Threading Programming Guide.
Now build, run, and play some music. You will notice that the particles change their size, but they don't "beat" with the music. This is because the change you make cannot affect the particles that already exist on the screen; only newly created particles pick up the new scale.
This needs to be fixed.
Open VisualizerView.mm and modify initWithFrame: as follows:
// Remove this line from the cell configuration (the texture will now belong to the child cell):
cell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];

// Then add the following just before the emitterLayer.emitterCells = @[cell]; line:
CAEmitterCell *childCell = [CAEmitterCell emitterCell];
childCell.name = @"childCell";
childCell.lifetime = 1.0f / 60.0f;
childCell.birthRate = 60.0f;
childCell.velocity = 0.0f;   // the child just renders in place at its parent's position
childCell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];

cell.emitterCells = @[childCell];
Like CAEmitterLayer, CAEmitterCell also has a property named emitterCells. This means that a CAEmitterCell can contain another CAEmitterCell, which results in particles emitting particles. That's right, folks, it's particles all the way down! :]
Also notice that you set the child’s lifetime
to 1/60 seconds. This means that particles emitted bychildCell
will have a lifetime which is the same length as a screen refresh. You set birthRate
to 60, which means that there will be 60 particles emitted per second. Since each dies in 1/60th of a second, there will always be a particle created when the previous particle dies. And you thought your day was short :]
Build and run, and you'll see that the particle system works the same as it did before, but it still doesn't beat to the music. You can try setting birthRate to 30 to help you understand how the setting works (just don't forget to set it back to 60).
So how do you get the particle system to beat to the music?
The last line of update currently looks like this:
[emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.scale"];
Replace that line with the following:
[emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.emitterCells.childCell.scale"];
Build and run once more, and you'll see that all the particles now beat with your music.
So what did the above change do?
Particles are created and destroyed at the same rate as a screen refresh. That means that every time the screen is redrawn, a new set of particles is created and the previous set is destroyed. Since new particles are always created with a size calculated from the audio-levels at that moment, the particles appear to pulse with the music.
Congratulations, you have just made a cool music visualizer application!
Where to go from here?
Here is the complete example project with all of the code from the above tutorial.
This tutorial gave you a basic idea of how to add a music visualization system to your app. But you can take it further:
- You can add more music controls to make the project a fully functional music player.
- You could create a slightly more sophisticated visualizer that modifies a separate particle system for each audio channel, rather than blending all audio channels into a single value.
- Try creating different kinds of particle systems (this tool, UIEffectDesigner, may help).
- Or maybe try changing the shape of your emitter layer and moving it around within the view.
While you’re at it, check out Apple’s sample project aurioTouch2. It’s an advanced use of music visualization and a great way to learn more about the subject.
Have fun!