I have a plane node whose point coordinate is projected to the screen:
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor,
          let arFrame = sceneView.session.currentFrame else {
        return
    }
    let capturedImage = arFrame.capturedImage
    node.enumerateChildNodes { childNode, _ in
        guard let plane = childNode.geometry as? SCNPlane else { return }
        let width = plane.width
        let height = plane.height
        let topLeft = childNode.convertPosition(SCNVector3(width/2, -height/2, 0), to: nil)
        let screenTopLeft = self.sceneView.projectPoint(topLeft)
        print(screenTopLeft)
        print(capturedImage)
    }
}
How do I convert the coordinate of screenTopLeft to that of capturedImage, which is a CVPixelBuffer? For instance, logging the sizes of each shows the following:
let pixelBuffer = capturedImage
let bufferWidth = CVPixelBufferGetWidth(pixelBuffer) // 1440
let bufferHeight = CVPixelBufferGetHeight(pixelBuffer) // 1080
// sceneView.bounds.size: (375.0, 530.0)
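Presumably a naive per-axis scale like the sketch below cannot be correct, since the aspect ratios of the buffer (1440×1080) and the view (375×530) don't even match, which suggests the camera image is rotated and/or cropped relative to the view:
// Naive scaling from view points to buffer pixels; this ignores any
// rotation/cropping between the camera image and the view, so it is
// presumably wrong, but it shows the raw size mismatch.
let naiveX = CGFloat(screenTopLeft.x) * CGFloat(bufferWidth) / sceneView.bounds.width
let naiveY = CGFloat(screenTopLeft.y) * CGFloat(bufferHeight) / sceneView.bounds.height
print(CGPoint(x: naiveX, y: naiveY))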
Another caveat is that the capturedImage appears to be in a landscapeLeft orientation, for some reason.
I attempted to transform the coordinates as follows:
let displayTransform = arFrame.displayTransform(for: .landscapeLeft, viewportSize: sceneView.bounds.size)
let topLeftPoint = CGPoint(x: CGFloat(screenTopLeft.x), y: CGFloat(screenTopLeft.y))
let topLeftPointInCapturedImage = topLeftPoint.applying(displayTransform.inverted())
print(topLeftPointInCapturedImage)
However, the resulting coordinates seem to be incorrect.
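My current guess is that displayTransform(for:viewportSize:) works in normalized (0...1) coordinates rather than in points or pixels, so the view point probably has to be normalized by the view size before applying the inverted transform, and the result then scaled up by the buffer dimensions. A rough sketch of what I mean (untested; I'm also unsure whether the orientation argument should be .portrait to match the view or .landscapeLeft to match the buffer):
// Normalize the projected point into the view's unit coordinate space (0...1).
let viewSize = sceneView.bounds.size
let normalizedViewPoint = CGPoint(x: CGFloat(screenTopLeft.x) / viewSize.width,
                                  y: CGFloat(screenTopLeft.y) / viewSize.height)

// If displayTransform maps normalized image coordinates to normalized view
// coordinates, its inverse should map the view point back into the image.
// (.portrait is a guess based on the view being 375x530.)
let transform = arFrame.displayTransform(for: .portrait, viewportSize: viewSize)
let normalizedImagePoint = normalizedViewPoint.applying(transform.inverted())

// Scale up to pixel coordinates in capturedImage (1440x1080 here).
let imagePoint = CGPoint(
    x: normalizedImagePoint.x * CGFloat(CVPixelBufferGetWidth(capturedImage)),
    y: normalizedImagePoint.y * CGFloat(CVPixelBufferGetHeight(capturedImage))
)
print(imagePoint)
Is this the right general approach, or is there a more direct way to map a projected SceneKit point into capturedImage pixel coordinates?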