Background: I found an Apple WWDC session called "AVAudioEngine in Practice" and am trying to make something similar to the last demo shown at 43:35 (https://youtu.be/FlMaxen2eyw?t=2614). I'm using SpriteKit instead of SceneKit, but the principle is the same: I want to spawn spheres, throw them around, and when they collide with each other, play a sound unique to each sphere. The details of how I'm using AVAudioEngine are below.
Problems:
I want an AVAudioPlayerNode attached to each SpriteKitNode, so that I can play a different sound for each sphere. Right now, if I create two spheres and set a different pitch for each of their AudioPlayerNodes, only the most recently created AudioPlayerNode seems to be playing, even when the original sphere makes contact. During the demo, he mentions "I'm tying a player, a dedicated player to every ball." How would I go about doing that?
There are audio clicks/artifacts every time a new collision happens. I'm assuming this has to do with the AVAudioPlayerNodeBufferOptions, and/or the fact that I'm trying to create, schedule, and consume a buffer every time contact occurs, which is not the most efficient approach. What would be a good workaround for this?
Code: As mentioned in the video, "...and for every ball that is born into this world, a new player node is also created." I have a separate class for the spheres, with a method that returns a SpriteKitNode and creates an AudioPlayerNode every time it is called:
class Sphere {

    var sphere: SKSpriteNode = SKSpriteNode(color: UIColor(), size: CGSize())
    var sphereScale: CGFloat = CGFloat(0.01)
    var spherePlayer = AVAudioPlayerNode()
    let audio = Audio()
    let sphereCollision: UInt32 = 0x1 << 0

    func createSphere(position: CGPoint, pitch: Float) -> SKSpriteNode {

        let texture = SKTexture(imageNamed: "Slice")
        let collisionTexture = SKTexture(imageNamed: "Collision")

        // Define the node
        sphere = SKSpriteNode(texture: texture, size: texture.size())
        sphere.position = position
        sphere.name = "sphere"
        sphere.physicsBody = SKPhysicsBody(texture: collisionTexture, size: sphere.size)
        sphere.physicsBody?.dynamic = true
        sphere.physicsBody?.mass = 0
        sphere.physicsBody?.restitution = 0.5
        sphere.physicsBody?.usesPreciseCollisionDetection = true
        sphere.physicsBody?.categoryBitMask = sphereCollision
        sphere.physicsBody?.contactTestBitMask = sphereCollision
        sphere.zPosition = 1

        // Create the dedicated AudioPlayerNode for this sphere
        spherePlayer = audio.createPlayer(pitch)

        return sphere
    }
}
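For context, here is a minimal usage sketch of how the class above might be driven from an SKScene (the `spawnSphere` function and the `spheres` array are hypothetical names of my own, not from the original code). Retaining each Sphere instance in an array keeps its dedicated player node reachable:

```swift
// Hypothetical usage sketch, in Swift 2 era syntax to match the code above.
// Each Sphere is retained in an array so that its dedicated
// AVAudioPlayerNode stays alive and addressable after spawning.
var spheres: [Sphere] = []

func spawnSphere(position: CGPoint, pitch: Float) {
    let ball = Sphere()
    let node = ball.createSphere(position, pitch: pitch)
    spheres.append(ball)   // retain the Sphere and, with it, its player
    addChild(node)         // assumes this runs inside an SKScene
}
```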
And here is my Audio class, which creates the AudioPCMBuffers and AudioPlayerNodes:
class Audio {

    let engine: AVAudioEngine = AVAudioEngine()

    func createBuffer(name: String, type: String) -> AVAudioPCMBuffer {
        let audioFilePath = NSBundle.mainBundle().URLForResource(name, withExtension: type)!
        let audioFile = try! AVAudioFile(forReading: audioFilePath)
        let buffer = AVAudioPCMBuffer(PCMFormat: audioFile.processingFormat, frameCapacity: UInt32(audioFile.length))
        try! audioFile.readIntoBuffer(buffer)
        return buffer
    }

    func createPlayer(pitch: Float) -> AVAudioPlayerNode {
        let player = AVAudioPlayerNode()
        let buffer = self.createBuffer("PianoC1", type: "wav")
        let pitcher = AVAudioUnitTimePitch()
        let delay = AVAudioUnitDelay()

        pitcher.pitch = pitch
        delay.delayTime = 0.2
        delay.feedback = 90
        delay.wetDryMix = 0

        engine.attachNode(pitcher)
        engine.attachNode(player)
        engine.attachNode(delay)

        engine.connect(player, to: pitcher, format: buffer.format)
        engine.connect(pitcher, to: delay, format: buffer.format)
        engine.connect(delay, to: engine.mainMixerNode, format: buffer.format)

        engine.prepare()
        try! engine.start()

        return player
    }
}
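As an aside on the click/artifact problem described above, one possibility (a sketch under my own assumptions, not code from the question) is to load the PCM buffer once up front and reuse it for every collision, rather than re-reading the file on each contact. Scheduling without the `.Interrupts` option also lets the currently playing buffer finish instead of being cut off mid-sample:

```swift
// Sketch (assumption): cache one buffer instead of calling createBuffer
// on every contact, and schedule without .Interrupts so the previous hit
// is not cut off mid-sample, which is a common source of clicks.
let audio = Audio()
let cachedBuffer = audio.createBuffer("PianoC1", type: "wav")

func playCollisionSound(player: AVAudioPlayerNode) {
    player.scheduleBuffer(cachedBuffer, atTime: nil, options: [], completionHandler: nil)
    player.play()
}
```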
In my GameScene class I then test for collisions, and if contact has occurred, I schedule a buffer and play the AudioPlayerNode:
func didBeginContact(contact: SKPhysicsContact) {

    let firstBody: SKPhysicsBody = contact.bodyA

    if (firstBody.categoryBitMask & sphere.sphereCollision != 0) {
        let buffer1 = audio.createBuffer("PianoC1", type: "wav")
        sphere.spherePlayer.scheduleBuffer(buffer1, atTime: nil, options: AVAudioPlayerNodeBufferOptions.Interrupts, completionHandler: nil)
        sphere.spherePlayer.play()
    }
}
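One hedged sketch of the "dedicated player per ball" binding (the `sphereForNode` dictionary is a hypothetical construct of my own, not from the demo or the original code): keep a lookup from each SKNode back to the Sphere that owns it, so the contact handler plays that sphere's own player rather than whichever player was created last.

```swift
// Hypothetical variant of the handler above. A dictionary maps each SKNode
// back to its owning Sphere, so a contact triggers that sphere's dedicated
// player instead of only the most recently created one.
// Populate it when spawning: sphereForNode[node] = ball
var sphereForNode: [SKNode: Sphere] = [:]

func didBeginContact(contact: SKPhysicsContact) {
    // Swift 2 style multiple optional binding
    if let node = contact.bodyA.node, ball = sphereForNode[node] {
        let buffer = ball.audio.createBuffer("PianoC1", type: "wav")
        ball.spherePlayer.scheduleBuffer(buffer, atTime: nil,
            options: .Interrupts, completionHandler: nil)
        ball.spherePlayer.play()
    }
}
```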
I'm new to Swift and have only basic programming knowledge, so any suggestion/criticism is welcome.