Main Camera Access

This is a setup guide on how to access the main camera feed in a visionOS app using the Main Camera Access Enterprise API.

Setting Up a New visionOS Xcode Project

Follow these steps to create your first visionOS app:

  1. Open Xcode.
  2. Navigate to File -> New -> Project -> visionOS -> App.
  3. Enter your Product Name and click Next.

For more details, refer to the official guide: Creating your first visionOS app.
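
For reference, the template (when an immersive space is included) generates an app entry point roughly like the sketch below. Treat this as a sketch only: MyApp, AppModel, ImmersiveView, and immersiveSpaceID are the template's default names, and the generated file differs slightly between Xcode versions.

// MyAppApp.swift (sketch of what the visionOS app template generates)
import SwiftUI

@main
struct MyAppApp: App {
    @State private var appModel = AppModel()

    var body: some Scene {
        // The main 2D window that hosts ContentView.
        WindowGroup {
            ContentView()
                .environment(appModel)
        }

        // The immersive space that the toggle button in ContentView opens.
        ImmersiveSpace(id: appModel.immersiveSpaceID) {
            ImmersiveView()
                .environment(appModel)
        }
    }
}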

Enterprise API Setup

  1. Add the License File:

    • Go to File -> Add Files to "MyApp".
    • Select the Enterprise.license file and check the box to Copy files to destination.
    • Click Finish.
  2. Add Entitlement for Main Camera Access:

    • Select your project in the Project Navigator.
    • Go to the Signing & Capabilities tab for the target.
    • Click the + button to add a new capability.
    • Search for Main Camera Access and double-click to add it. This creates a .entitlements file containing the com.apple.developer.arkit.main-camera-access.allow entitlement (see the sketch after this list).
  3. Edit Info.plist:

    • Add the key NSEnterpriseMCAMUsageDescription with a string describing why the app uses the main camera (see the sketch after this list).
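
After these steps, the entitlements file and the added Info.plist entry should look roughly like the following. The file name and the usage description string are placeholders; use whatever fits your project.

MyApp.entitlements:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>com.apple.developer.arkit.main-camera-access.allow</key>
    <true/>
</dict>
</plist>

Info.plist (added key):

<key>NSEnterpriseMCAMUsageDescription</key>
<string>Describe here why your app needs access to the main camera feed.</string>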


For more details, check out: Building Spatial Experiences for Business Apps with Enterprise APIs.

Main Camera Access API

Here is an example of how to modify ContentView.swift to integrate the example code from WWDC24 session 10139.

// ContentView.swift
import SwiftUI
import RealityKit
import RealityKitContent
import ARKit

struct ContentView: View {
    @State private var pixelBuffer: CVPixelBuffer? = nil

    var body: some View {
        VStack {
            Model3D(named: "Scene", bundle: realityKitContentBundle)
                .padding(.bottom, 50)

            Text("Hello, world!")

            ToggleImmersiveSpaceButton()
        }
        .padding()
        .task {
            await startCameraSession()
        }
    }
    
    private func startCameraSession() async {
        // Main Camera Feed Access example adapted from WWDC24 session 10139
        let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions: [.left])
        let cameraFrameProvider = CameraFrameProvider()

        let arKitSession = ARKitSession()

        // Check the current camera-access authorization status.
        await arKitSession.queryAuthorization(for: [.cameraAccess])

        do {
            try await arKitSession.run([cameraFrameProvider])
        } catch {
            // The session failed to run, e.g. missing entitlement or license file.
            return
        }

        guard let cameraFrameUpdates =
            cameraFrameProvider.cameraFrameUpdates(for: formats[0]) else {
            return
        }

        // Keep the most recent pixel buffer from the left main camera.
        for await cameraFrame in cameraFrameUpdates {
            guard let mainCameraSample = cameraFrame.sample(for: .left) else {
                continue
            }
            self.pixelBuffer = mainCameraSample.pixelBuffer
        }
    }
}

#Preview(windowStyle: .automatic) {
    ContentView()
        .environment(AppModel())
}
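
The sample above ignores the dictionary returned by queryAuthorization. If you want to bail out early when camera access has been denied, you can check it explicitly; here is a small sketch (isCameraAccessAllowed is a hypothetical helper, not part of the API):

import ARKit

/// Returns true if main camera access is currently allowed.
/// Sketch only: call this at the top of startCameraSession() and return early if it is false.
func isCameraAccessAllowed(_ session: ARKitSession) async -> Bool {
    let result = await session.queryAuthorization(for: [.cameraAccess])
    return result[.cameraAccess] == .allowed
}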

Running on Vision Pro

  1. Build and run the project on a Vision Pro device.

  2. When you first open the app, you’ll be prompted to allow access to the Main Camera.

    Note: Main Camera Access is only available in an Immersive Space. Once you toggle the Immersive Space on, you’ll see the camera being accessed, indicated by the green dot at the top of the screen. (A simplified sketch of such a toggle follows below.)
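
The template's ToggleImmersiveSpaceButton tracks additional app state, but the core of toggling an immersive space looks roughly like this sketch; the "ImmersiveSpace" ID is an assumption and must match the ID declared in your App's ImmersiveSpace scene.

import SwiftUI

/// Simplified stand-in for the template's ToggleImmersiveSpaceButton.
struct SimpleImmersiveToggle: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace
    @State private var isOpen = false

    var body: some View {
        Button(isOpen ? "Hide Immersive Space" : "Show Immersive Space") {
            Task {
                if isOpen {
                    await dismissImmersiveSpace()
                    isOpen = false
                } else {
                    // "ImmersiveSpace" is an assumed scene ID.
                    switch await openImmersiveSpace(id: "ImmersiveSpace") {
                    case .opened: isOpen = true
                    default: isOpen = false
                    }
                }
            }
        }
    }
}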

