
Camera2 and You: Leveraging Android Lollipop’s New Camera


Custom cameras! Chances are, if you’re an Android developer, something about that phrase makes you shudder. Thanks to all of the hardware variations out there, adding a customized camera feature to your app has been a notoriously fragile and time-consuming effort that, in the end, often provides little to no benefit over the stock camera app.

Enter Lollipop: this past year saw the release of Android 5.0, and with it the brand new Camera2 framework. These new components are Google’s attempt to give developers much more granular control over the phone’s camera functionality. DSLR-like levels of customization, such as native control over exposure and raw sensor capture, are finally possible. And while this new approach requires a little more thought and legwork, the level of control you get in return is well worth the effort.

Let’s dive into the implementation. First, you’ll need access to a TextureView in your layout, and its associated surface. This is where your camera preview will be displayed:

TextureView previewView = (TextureView) findViewById(R.id.preview);
previewView.setSurfaceTextureListener(new SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        mPreviewSurfaceTexture = surface;
        // to next step...
    }


    // ...
});

Once the SurfaceTexture is available, you can then go about fetching your camera data:

CameraManager cameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
String cameraId = cameraManager.getCameraIdList()[0]; // index 0 is typically the back-facing camera
CameraCharacteristics cc = cameraManager.getCameraCharacteristics(cameraId);
StreamConfigurationMap streamConfigs = cc.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] rawSizes = streamConfigs.getOutputSizes(ImageFormat.RAW_SENSOR);
Size[] jpegSizes = streamConfigs.getOutputSizes(ImageFormat.JPEG);

That CameraCharacteristics object contains everything you could hope to know about the camera’s lens, sensor, and supported controls; you just have to access the specific values using the get() method. A key component that you’ll need is the StreamConfigurationMap, which contains the image formats and sizes that are supported by the camera. You’ll use these to set up your surface(s) for capturing images.
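
The ImageReader setup below expects concrete dimensions (rawWidth/rawHeight and jpegWidth/jpegHeight) chosen from those size arrays. As a minimal sketch, one reasonable policy is simply to take the largest size each format offers; note that getOutputSizes(ImageFormat.RAW_SENSOR) can return null on cameras that don’t support raw capture, so check for that first:

Size largestRaw = rawSizes[0]; // assumes rawSizes is non-null, i.e., the camera supports RAW_SENSOR
for (Size size : rawSizes) {
    if ((long) size.getWidth() * size.getHeight() > (long) largestRaw.getWidth() * largestRaw.getHeight()) {
        largestRaw = size;
    }
}
int rawWidth = largestRaw.getWidth();
int rawHeight = largestRaw.getHeight();

Size largestJpeg = jpegSizes[0];
for (Size size : jpegSizes) {
    if ((long) size.getWidth() * size.getHeight() > (long) largestJpeg.getWidth() * largestJpeg.getHeight()) {
        largestJpeg = size;
    }
}
int jpegWidth = largestJpeg.getWidth();
int jpegHeight = largestJpeg.getHeight();

With sizes in hand, create an ImageReader for each output format and hold on to its Surface: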

ImageReader rawImageReader = ImageReader.newInstance(rawWidth, rawHeight, ImageFormat.RAW_SENSOR, 1);
rawImageReader.setOnImageAvailableListener(new OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        // save raw
    }
});


ImageReader jpegImageReader = ImageReader.newInstance(jpegWidth, jpegHeight, ImageFormat.JPEG, 1);
jpegImageReader.setOnImageAvailableListener(new OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        // save jpeg
    }
});


Surface previewSurface = new Surface(mPreviewSurfaceTexture);
Surface rawCaptureSurface = rawImageReader.getSurface();
Surface jpegCaptureSurface = jpegImageReader.getSurface();

Now it’s finally time to open the camera, with a simple call to openCamera() (make sure your app declares the CAMERA permission in its manifest). If successful, this will call back with a CameraDevice object; hold on to that.

cameraManager.openCamera(cameraId, new CameraDevice.StateCallback() {
    @Override
    public void onOpened(CameraDevice camera) {
        mCamera = camera;
        // to next step...
    }


    // ...
}, null);

The last thing you need to do before all the fun stuff starts is to open a capture session, which you’ll use to send requests to the device’s camera.

IMPORTANT: The list of Surfaces that you pass to createCaptureSession() MUST contain every Surface that you’ll be using, including your preview Surface and any capture Surfaces that you created earlier. A successful configuration will give you a CameraCaptureSession, to which you’ll also want to keep a reference.

List<Surface> surfaces = Arrays.asList(previewSurface, rawCaptureSurface, jpegCaptureSurface);
mCamera.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(CameraCaptureSession session) {
        mSession = session;
        // to next step...
    }


    // ...
}, null);

That all wasn’t so bad, was it? That capture session is the last piece of the puzzle, and now you can finally start the preview and take some pictures! This is where you’re given full control over image capture. Everything here is based around the CaptureRequest, which you build using the CameraDevice object that you saved earlier. Create a request using one of the available templates (here we’ll use “preview,” but there are also others for still image and video capture), specify a target surface (in this case, the preview surface we created), tweak any settings that you’d like with the set() method, and send the request off using the capture session.

CaptureRequest.Builder request = mCamera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
request.addTarget(previewSurface);


// set capture options: fine-tune manual focus, white balance, etc.


mSession.setRepeatingRequest(request.build(), new CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
        // updated values can be found here
    }
}, null);

Since this request is meant to continuously update the camera’s preview display, we make use of the session’s setRepeatingRequest(), rather than the one-off capture() method.

IMPORTANT: Note that there can only be one repeating request at any given time. With this in mind, if anything changes that should be reflected in the preview (e.g., the user adjusts the focus), simply create a new request as outlined above with the new values, then set it as the session’s repeating request. The new request will simply replace the old one.
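
For instance, here’s a minimal sketch of what that might look like when the user drags a manual focus slider (focusDistanceDiopters is a hypothetical float coming from your UI, and manual focus is only honored on lenses that support it):

CaptureRequest.Builder previewRequest = mCamera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequest.addTarget(previewSurface);
previewRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF); // turn off autofocus
previewRequest.set(CaptureRequest.LENS_FOCUS_DISTANCE, focusDistanceDiopters); // hypothetical value from your UI

// the new repeating request replaces the previous one
mSession.setRepeatingRequest(previewRequest.build(), null, null);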

Of course, what good would a camera app be if it couldn’t take pictures? An image capture request is made in much the same way as a preview request, with a few small differences. First, initialize the request using the “still capture” template; then, set the target of the request to be the desired image capture surface, rather than the preview surface; and finally, send the request using the session’s capture() method.

CaptureRequest.Builder request = mCamera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
request.addTarget(rawCaptureSurface);
request.set(CaptureRequest.STATISTICS_LENS_SHADING_MAP_MODE,
        CaptureRequest.STATISTICS_LENS_SHADING_MAP_MODE_ON); // required for raw


// set capture options


mSession.capture(request.build(), new CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
        mCaptureResult = result;
    }
}, null);
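
If your still-capture request targets jpegCaptureSurface (instead of, or in addition to, the raw surface), filling in the “// save jpeg” placeholder from earlier is straightforward, since a JPEG Image delivers its fully compressed bytes in a single plane. Here’s a minimal sketch, writing to a placeholder path:

jpegImageReader.setOnImageAvailableListener(new OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage();
        // a JPEG Image has a single plane containing the compressed file data
        ByteBuffer buffer = image.getPlanes()[0].getBuffer();
        byte[] bytes = new byte[buffer.remaining()];
        buffer.get(bytes);
        try (FileOutputStream output = new FileOutputStream("filePath")) {
            output.write(bytes);
        } catch (IOException e) {
            // handle the write failure
        }
        image.close();
    }
});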

This brings us to (in my opinion) the most exciting part about the new framework: native raw image capture. This is HUGE. Most photography fanatics can go on for days about the benefits of shooting in raw over JPEG; for the rest of us, we’ll just say that a raw image offers much more flexibility for editing. Someone at Google has realized this and thrown in a handy utility class called DngCreator with the latest Android release (a DNG, or “digital negative,” is simply Adobe’s open raw image format).

Using the DngCreator is surprisingly simple; you just construct a new one using a CameraCharacteristics (that you received from the CameraManager earlier) and a CaptureResult (obtained from the CaptureCallback; see code block above).

IMPORTANT: Notice that TotalCaptureResult is a subclass of CaptureResult; when you construct the DngCreator, you’ll want to use the exact result object that is returned from the capture callback. After that, it’s just a call to writeImage() using the Image that you receive from the capture surface’s associated ImageReader:

rawImageReader.setOnImageAvailableListener(new OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage();
        DngCreator dngCreator = new DngCreator(cc, mCaptureResult);
        // can set some metadata, like orientation, here
        try {
            dngCreator.writeImage(new FileOutputStream("filePath"), image);
        } catch (IOException e) {
            // handle the write failure
        } finally {
            dngCreator.close();
            image.close();
        }
    }
});

Unfortunately, there aren’t a ton of apps out there just yet that are able to edit raw files; but now that Lollipop makes it so easy to generate raw images on-device, and with leading manufacturers like Samsung and HTC embracing the format on their flagship devices, we may be seeing much more support for editing raw images in the very near future.

Happy camera making! Hopefully this will help get you started on developing the next great camera app.
