Overview


Reincubate's Camo SDK provides the ability to send real-time audiovisual data from your application to Camo Studio on macOS and Windows. It functions over USB and is optimised for low latency and performance, consuming as few resources in your app as possible.

Currently, the SDK supports only iOS applications running on a physical iOS or iPadOS device (excluding simulators and Mac Catalyst) on iOS 12 or later. The Camo SDK does not have Objective-C compatibility out of the box, so if your app is primarily Objective-C, you may need to create a wrapper.
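A minimal sketch of such a wrapper: `CamoController`, `start()`, and `stop()` are the SDK calls described later in this guide, while the `CamoBridge` class name and its method names are illustrative, not part of the SDK.

```swift
import Foundation
import CamoProducerKit

/// Hypothetical wrapper exposing the Swift-only CamoController to Objective-C.
/// Mark it @objc and subclass NSObject so it is visible from Objective-C code.
@objc public class CamoBridge: NSObject {
    private let controller = CamoController()

    /// Starts the Camo service so Camo Studio can connect.
    @objc public func startCamoService() {
        controller.start()
    }

    /// Stops the Camo service and tears down any connection.
    @objc public func stopCamoService() {
        controller.stop()
    }
}
```

Objective-C code can then call `[bridge startCamoService]` without importing any Swift-only symbols.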


Known Issues

As the Camo SDK is currently an early alpha, be aware of the following issues:

  • Accessing the CamoController's state is currently inefficient.
  • camoController(_:stateDidChangeTo:) is not currently called.
  • Only 30 FPS video is currently supported.
  • Building for x86_64 architecture is not currently supported. A future build will allow embedding CamoProducerKit in an x86_64 app, with the ability to start the Camo service disabled.
  • The SDK is currently larger than intended. This will be resolved in future builds.

API Reference

If you would like to jump right in, you can view the full API reference, which documents every symbol provided by the Camo SDK. Otherwise, this document will walk you through a sample integration with the SDK.


Installing the Camo SDK

The Camo SDK is distributed as a file called CamoProducerKit.xcframework. This self-contained framework has everything you need to get started and includes no non-Apple dependencies, so installation is easy.

  1. Select your application's project in the Project Navigator.
  2. Choose your application's target from the project's target sidebar. If you don't see a sidebar, it might be in a dropdown at the top of the page.
  3. Make sure you're on the 'General' tab, and scroll down to the "Frameworks, Libraries, and Embedded Content" header.
  4. Drag the CamoProducerKit.xcframework file and drop it into the file list underneath that header.
  5. Make sure the SDK is set to "Embed & Sign".

When you are done, you should see the framework listed under that header, set to "Embed & Sign".

Integrating with your app

Controlling the Camo service

At the core of CamoProducerKit is the CamoController, which provides your app with one centralized interface for controlling the Camo SDK.

In your app, you should initialise an instance of the CamoController. You can choose to do this at app launch or save it for later. Until explicitly started with start(), the Camo controller will use very few resources. However, it will prepare for audio and video encoding upon initialisation.

import CamoProducerKit

class MyVideoManager {
    let controller = CamoController()

    // ...
}

When you want to activate the Camo integration in your app, starting up the Camo service to facilitate connections, you can call start(). Once you're done with it, you can stop it by calling stop().

controller.start()
// ...
controller.stop()

This code is a start, but it still can't accept new connections from Camo Studio.

Responding to new connections

Before your app will be able to accept new connections from Camo Studio, you will need to implement the CamoControllerDelegate. By implementing these two delegate methods, you can be notified about changes to the connection state and decide whether to accept or reject a connection.

extension MyVideoManager: CamoControllerDelegate {
    // Upon receiving a connection request, you can decide whether or not to accept it.
    func camoControllerShouldAcceptIncomingConnection(_ controller: CamoController) -> Bool {
        // You could return false if you aren't ready to accept connections, such as during onboarding.
        return true
    }

    // Called whenever the connection state changes, such as if the Camo service starts or stops, or if a new connection is made.
    func camoController(_ controller: CamoController, stateDidChangeTo state: CamoControllerState?) {
        // From here, you can update your UI and other state
        print("New state:", state)
    }
}

With that done, you should now be able to open your app, connect to USB, and view your device in Camo Studio.

Determining connection state

The CamoController exposes its current state as a CamoControllerState, which is also passed to the state change delegate method as seen above. This enum provides information useful to your app, such as the connected computer's name for display in the UI.

Here's an example of how you could update your UI to reflect the connection state:

func camoController(_ controller: CamoController, stateDidChangeTo state: CamoControllerState?) {
    let statusText = { () -> String in
        guard case let .running(serviceState) = state else {
            return "Camo service not running"
        }
        switch serviceState {
        case .active(let connection): return "Connected to \(connection.name)"
        case .paused(let connection): return "Paused, but connected to \(connection.name)"
        case .notConnected: return "Camo service running, but no connection"
        }
    }()

    DispatchQueue.main.async {
        self.statusTextLabel.text = statusText
    }
}

Dispatching audiovisual data

To allow clients as much flexibility as possible, the Camo SDK does not provide or control any capture. Instead, you are responsible for providing the CamoController with audio and video data. This can be done with two simple API calls:

// upon receiving video from the camera or elsewhere
camoController.enqueueVideo(sampleBuffer: sampleBuffer)

// upon receiving audio from the microphone or elsewhere
camoController.enqueuePCMAudio(data: chunk)

Sending video

If you have access to a CMSampleBuffer in your video pipeline, it's trivial to pass this data to the Camo SDK.

Because of this, setting this up with a basic video AVCaptureSession is extremely simple. Here's an example of that in action.

import AVFoundation
import CamoProducerKit

class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let captureSession = AVCaptureSession()
    let videoDataOutputQueue = DispatchQueue(label: "video-data-output")

    // In a real app, share your existing CamoController instance rather than creating a new one here.
    let camoController = CamoController()

    func startSession() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { fatalError("No camera found") }

        let input = try AVCaptureDeviceInput(device: camera)
        captureSession.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: videoDataOutputQueue)
        captureSession.addOutput(output)

        captureSession.startRunning()
    }

    // Called on videoDataOutputQueue for each frame the session captures.
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        camoController.enqueueVideo(sampleBuffer: sampleBuffer)
    }
}

When sending video frames, ensure that:

  • The frame rate is 30 FPS (i.e. a frame duration of 1/30 of a second).
  • The pixel format is kCVPixelFormatType_32BGRA.
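These requirements can be met at the capture layer. The sketch below uses standard AVFoundation APIs to request BGRA pixel buffers and lock the camera to 30 FPS; the function itself is illustrative, not part of the Camo SDK.

```swift
import AVFoundation

/// Configures a capture output and camera to match the requirements above:
/// BGRA pixel buffers at a fixed 30 FPS.
func configureForCamo(output: AVCaptureVideoDataOutput, camera: AVCaptureDevice) throws {
    // Request kCVPixelFormatType_32BGRA frames from the capture pipeline.
    output.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]

    // Lock the device to a 1/30 s frame duration, i.e. 30 FPS.
    try camera.lockForConfiguration()
    camera.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 30)
    camera.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 30)
    camera.unlockForConfiguration()
}
```

Call this before `captureSession.startRunning()`, after the output has been added to the session.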

You can view more details in the API reference.

Sending audio

Sending audio is similar to video, but may require a few more steps depending on how your application receives it. For a sample implementation, see the demo app included with the Camo SDK.

When sending audio data, ensure that:

  • Your sample rate is 48 kHz.
  • The audio codec is LPCM.
  • The number of channels is 2.
  • The bit depth is 32.
  • The number of samples per audio packet is 256.
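The requirements above can be expressed as an AVAudioFormat. Note one assumption in this sketch: whether the 32-bit samples should be float or integer is not stated here, so `.pcmFormatFloat32` is a guess to verify against the API reference.

```swift
import AVFoundation

// An LPCM format matching the list above: 48 kHz, stereo, 32-bit samples.
// Assumption: 32-bit float; check the API reference for the exact sample type.
let camoAudioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                    sampleRate: 48_000,
                                    channels: 2,
                                    interleaved: true)

// With 256 samples per packet, each interleaved packet is
// 256 samples × 2 channels × 4 bytes per sample = 2,048 bytes.
let bytesPerPacket = 256 * 2 * (32 / 8)
```

If your capture pipeline produces audio in a different format, an AVAudioConverter can resample and re-chunk it before calling `enqueuePCMAudio(data:)`.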

You can view more details in the API reference.
