Introduction

The UVCCamera project drives USB cameras on Android: it opens and closes an externally attached USB camera and performs operations such as video recording and still capture.

Overview of the libuvccamera call flow

The project depends on two main libraries, libuvccamera and usbCameraCommon. libuvccamera is the low-level library: it contains the C++ implementation plus a few basic control classes such as USBMonitor and UVCCamera.

USBMonitor manages the attached USB devices; through it you can pick out the USB camera device you need.

UVCCamera represents a single USB camera. Its basic methods are all native and call straight into the C++ implementation via JNI. It is a bridge class: it forwards the upper layer's control events down to the native implementation.

Focus, brightness, zoom, white balance, preview, open, close, and so on are all ultimately performed through this class.

CameraDialog presents USBMonitor through a dialog, used to discover and pick the device you want to operate on. It relies on the DeviceFilter class to filter devices; without a filter, every USB device is a candidate. The devices themselves come from Android's UsbManager and are then filtered, e.g.:

HashMap<String, UsbDevice> deviceList = mUsbManager.getDeviceList();

DeviceFilter is the actual filtering class. It filters according to entries declared in an XML resource: you can declare the device classes you want (or the ones you want to exclude), parse that XML, and then apply the resulting filters to narrow the deviceList down to the devices you actually want.
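The filtering step can be sketched in plain Java. The snippet below is only an illustrative stand-in for DeviceFilter: UsbDevice and UsbManager are Android-only classes, so a mock device record and a hypothetical vendor-ID/class check are used here instead of the library's real XML-driven matching.

```java
import java.util.HashMap;
import java.util.Map;

public class DeviceFilterSketch {
    // Stand-in for android.hardware.usb.UsbDevice: just the fields we filter on.
    static final class MockUsbDevice {
        final int vendorId;
        final int deviceClass;
        MockUsbDevice(int vendorId, int deviceClass) {
            this.vendorId = vendorId;
            this.deviceClass = deviceClass;
        }
    }

    // Stand-in for one filter entry parsed from the XML resource.
    static final class Filter {
        final int vendorId;    // -1 means "match any vendor"
        final int deviceClass; // -1 means "match any class"
        Filter(int vendorId, int deviceClass) {
            this.vendorId = vendorId;
            this.deviceClass = deviceClass;
        }
        boolean matches(MockUsbDevice d) {
            return (vendorId < 0 || vendorId == d.vendorId)
                && (deviceClass < 0 || deviceClass == d.deviceClass);
        }
    }

    // Analogue of filtering mUsbManager.getDeviceList() with a DeviceFilter.
    static Map<String, MockUsbDevice> filter(Map<String, MockUsbDevice> all, Filter f) {
        final Map<String, MockUsbDevice> out = new HashMap<>();
        for (Map.Entry<String, MockUsbDevice> e : all.entrySet()) {
            if (f.matches(e.getValue())) out.put(e.getKey(), e.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        final Map<String, MockUsbDevice> devices = new HashMap<>();
        devices.put("/dev/bus/usb/001/002", new MockUsbDevice(0x1234, 14)); // class 14 = video
        devices.put("/dev/bus/usb/001/003", new MockUsbDevice(0x5678, 9));  // class 9 = hub
        // Keep only video-class devices, any vendor.
        final Map<String, MockUsbDevice> cams = filter(devices, new Filter(-1, 14));
        System.out.println(cams.size()); // prints 1
    }
}
```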

Overview of the usbCameraCommon call flow

Call-flow analysis

usbCameraCommon is the upper-layer wrapper library. It operates the camera by driving USBMonitor, UVCCamera, and related classes.

The main entry point at this layer is AbstractUVCCameraHandler, which wraps essentially every call into UVCCamera and at the same time exposes an API for Activities to use.

AbstractUVCCameraHandler is a Handler; its declaration is:

abstract class AbstractUVCCameraHandler extends Handler { ... }

In other words, it is mainly responsible for sending and receiving messages: every method called on it is forwarded to an inner class. What it really maintains internally is a thread, CameraThread, declared as:

static final class CameraThread extends Thread { ... }

CameraThread is a static nested class. Every method invoked on AbstractUVCCameraHandler is delivered to CameraThread as a message; when there are no messages, the thread simply idles. The forwarding can be seen here:

public void handleMessage(final Message msg) {
    final CameraThread thread = mWeakThread.get();
    if (thread == null) return;
    switch (msg.what) {
        case MSG_OPEN:
            thread.handleOpen((USBMonitor.UsbControlBlock)msg.obj);
            break;
        case MSG_CLOSE:
            thread.handleClose();
            break;
        case MSG_PREVIEW_START:
            thread.handleStartPreview(msg.obj);
            break;
        case MSG_PREVIEW_STOP:
            thread.handleStopPreview();
            break;
        case MSG_CAPTURE_STILL:
            thread.handleCaptureStill((String)msg.obj);
            break;
        case MSG_CAPTURE_START:
            thread.handleStartRecording();
            break;
        case MSG_CAPTURE_STOP:
            thread.handleStopRecording();
            break;
        case MSG_MEDIA_UPDATE:
            thread.handleUpdateMedia((String)msg.obj);
            break;
        case MSG_RELEASE:
            thread.handleRelease();
            break;
        default:
            throw new RuntimeException("unsupported message:what=" + msg.what);
    }
}
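The sending side is symmetric: each public method on the handler simply posts a message with the matching `what` code. Since Android's Handler/Looper machinery is not available off-device, here is a minimal plain-Java analogue of the same forwarding pattern, with a BlockingQueue standing in for the message queue; all names in this sketch are illustrative, not the library's API.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CountDownLatch;

public class ForwardingSketch {
    static final int MSG_OPEN = 0;
    static final int MSG_PREVIEW_START = 2;
    static final int MSG_RELEASE = 9;

    // Message analogue: a "what" code plus an optional payload.
    static final class Msg {
        final int what;
        final Object obj;
        Msg(int what, Object obj) { this.what = what; this.obj = obj; }
    }

    // CameraThread analogue: drains the queue and dispatches on msg.what.
    static final class WorkerThread extends Thread {
        final BlockingQueue<Msg> queue = new ArrayBlockingQueue<>(16);
        final StringBuilder log = new StringBuilder();
        final CountDownLatch released = new CountDownLatch(1);

        @Override public void run() {
            try {
                while (true) {
                    final Msg msg = queue.take(); // idle while there are no messages
                    switch (msg.what) {
                        case MSG_OPEN:          log.append("open;"); break;
                        case MSG_PREVIEW_START: log.append("preview(").append(msg.obj).append(");"); break;
                        case MSG_RELEASE:       log.append("release;"); released.countDown(); return;
                        default: throw new RuntimeException("unsupported message:what=" + msg.what);
                    }
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }

    // Handler analogue: the public API methods only enqueue messages.
    static final class HandlerFacade {
        final WorkerThread thread;
        HandlerFacade(WorkerThread thread) { this.thread = thread; }
        void open()                       { thread.queue.add(new Msg(MSG_OPEN, null)); }
        void startPreview(Object surface) { thread.queue.add(new Msg(MSG_PREVIEW_START, surface)); }
        void release()                    { thread.queue.add(new Msg(MSG_RELEASE, null)); }
    }

    public static void main(String[] args) throws InterruptedException {
        final WorkerThread worker = new WorkerThread();
        worker.start();
        final HandlerFacade handler = new HandlerFacade(worker);
        handler.open();
        handler.startPreview("surface1");
        handler.release();
        worker.released.await();
        System.out.println(worker.log); // prints open;preview(surface1);release;
    }
}
```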

CameraThread in turn holds a UVCCamera instance, and every call dispatched to CameraThread is carried out on that instance. For example:

Event flow for starting a preview:

public void handleStartPreview(final Object surface) {
    if (DEBUG) Log.v(TAG_THREAD, "handleStartPreview:");
    if ((mUVCCamera == null) || mIsPreviewing) return;
    try {
        mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 31, mPreviewMode, mBandwidthFactor);
    } catch (final IllegalArgumentException e) {
        try {
            // fallback to YUV mode
            mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 31, UVCCamera.DEFAULT_PREVIEW_MODE, mBandwidthFactor);
        } catch (final IllegalArgumentException e1) {
            callOnError(e1);
            return;
        }
    }
    if (surface instanceof SurfaceHolder) {
        mUVCCamera.setPreviewDisplay((SurfaceHolder)surface);
    } else if (surface instanceof Surface) { // note: the quoted source is missing this "else", so a SurfaceHolder would fall into the SurfaceTexture branch below
        mUVCCamera.setPreviewDisplay((Surface)surface);
    } else {
        mUVCCamera.setPreviewTexture((SurfaceTexture)surface);
    }
    mUVCCamera.startPreview();
    mUVCCamera.updateCameraParams();
    synchronized (mSync) {
        mIsPreviewing = true;
    }
    callOnStartPreview();
}

Starting a recording:

public void handleStartRecording() {
    if (DEBUG) Log.v(TAG_THREAD, "handleStartRecording:");
    try {
        if ((mUVCCamera == null) || (mMuxer != null)) return;
        final MediaMuxerWrapper muxer = new MediaMuxerWrapper(".mp4"); // if you record audio only, ".m4a" is also OK.
        MediaVideoBufferEncoder videoEncoder = null;
        switch (mEncoderType) {
            case 1: // for video capturing using MediaVideoEncoder
                new MediaVideoEncoder(muxer, getWidth(), getHeight(), mMediaEncoderListener);
                break;
            case 2: // for video capturing using MediaVideoBufferEncoder
                videoEncoder = new MediaVideoBufferEncoder(muxer, getWidth(), getHeight(), mMediaEncoderListener);
                break;
            // case 0: // for video capturing using MediaSurfaceEncoder
            default:
                new MediaSurfaceEncoder(muxer, getWidth(), getHeight(), mMediaEncoderListener);
                break;
        }
        if (true) {
            // for audio capturing
            new MediaAudioEncoder(muxer, mMediaEncoderListener);
        }
        muxer.prepare();
        muxer.startRecording();
        if (videoEncoder != null) {
            mUVCCamera.setFrameCallback(mIFrameCallback, UVCCamera.PIXEL_FORMAT_NV21);
        }
        synchronized (mSync) {
            mMuxer = muxer;
            mVideoEncoder = videoEncoder;
        }
        callOnStartRecording();
    } catch (final IOException e) {
        callOnError(e);
        Log.e(TAG, "startCapture:", e);
    }
}

As you can see, everything ultimately goes through mUVCCamera, and UVCCamera then calls straight into the native layer.

So an event is actually processed along this path:

Activity-->UVCCameraHandler--->AbstractUVCCameraHandler--->CameraThread--->UVCCamera--->native

To see why the Activity first talks to UVCCameraHandler, look at the construction process.

The sample Activity references UVCCameraHandler, so UVCCameraHandler is the class that actually drives the camera. AbstractUVCCameraHandler is abstract and cannot be instantiated; UVCCameraHandler extends it:

public class UVCCameraHandler extends AbstractUVCCameraHandler { ... }

So once the Activity has created a UVCCameraHandler, every event sent to it is delivered as a message to CameraThread, which then forwards it to the UVCCamera instance.

The instantiation goes as follows.

The Activity creates the UVCCameraHandler like this:

final View view = findViewById(R.id.camera_view);
mUVCCameraView = (CameraViewInterface)view;
mUSBMonitor = new USBMonitor(this, mOnDeviceConnectListener);
mCameraHandler = UVCCameraHandler.createHandler(this, mUVCCameraView,
        USE_SURFACE_ENCODER ? 0 : 1, PREVIEW_WIDTH, PREVIEW_HEIGHT, PREVIEW_MODE);

A few extra lines are quoted here for later reference. Look at this call first:

mCameraHandler = UVCCameraHandler.createHandler(this, mUVCCameraView,
        USE_SURFACE_ENCODER ? 0 : 1, PREVIEW_WIDTH, PREVIEW_HEIGHT, PREVIEW_MODE);

which resolves to:

public static final UVCCameraHandler createHandler(
        final Activity parent, final CameraViewInterface cameraView,
        final int encoderType, final int width, final int height, final int format) {

    return createHandler(parent, cameraView, encoderType, width, height, format, UVCCamera.DEFAULT_BANDWIDTH);
}

which in turn resolves to:

public static final UVCCameraHandler createHandler(
        final Activity parent, final CameraViewInterface cameraView,
        final int encoderType, final int width, final int height, final int format, final float bandwidthFactor) {

    final CameraThread thread = new CameraThread(UVCCameraHandler.class, parent, cameraView, encoderType, width, height, format, bandwidthFactor);
    thread.start();
    return (UVCCameraHandler)thread.getHandler();
}

In other words, instantiating a UVCCameraHandler in the Activity really means constructing a CameraThread and handing back the UVCCameraHandler that the thread references; as soon as the handler is created, the CameraThread has been started as well. You can see that CameraThread does indeed reference the handler:

static final class CameraThread extends Thread {
    private static final String TAG_THREAD = "CameraThread";
    private final Object mSync = new Object();
    private final Class<? extends AbstractUVCCameraHandler> mHandlerClass;
    private final WeakReference<Activity> mWeakParent;
    private final WeakReference<CameraViewInterface> mWeakCameraView;
    private final int mEncoderType;
    private final Set<CameraCallback> mCallbacks = new CopyOnWriteArraySet<CameraCallback>();
    private int mWidth, mHeight, mPreviewMode;
    private float mBandwidthFactor;
    private boolean mIsPreviewing;
    private boolean mIsRecording;
    /**
     * shutter sound
     */
    private SoundPool mSoundPool;
    private int mSoundId;
    private AbstractUVCCameraHandler mHandler;
    /**
     * for accessing UVC camera
     */
    private UVCCamera mUVCCamera;
    /**
     * muxer for audio/video recording
     */
    private MediaMuxerWrapper mMuxer;
    private MediaVideoBufferEncoder mVideoEncoder;

    /**
     * @param clazz Class extends AbstractUVCCameraHandler
     * @param parent parent Activity
     * @param cameraView for still capturing
     * @param encoderType 0: use MediaSurfaceEncoder, 1: use MediaVideoEncoder, 2: use MediaVideoBufferEncoder
     * @param width
     * @param height
     * @param format either FRAME_FORMAT_YUYV(0) or FRAME_FORMAT_MJPEG(1)
     * @param bandwidthFactor
     */
    CameraThread(final Class<? extends AbstractUVCCameraHandler> clazz,
            final Activity parent, final CameraViewInterface cameraView,
            final int encoderType, final int width, final int height, final int format,
            final float bandwidthFactor) {

        super("CameraThread");
        mHandlerClass = clazz;
        mEncoderType = encoderType;
        mWidth = width;
        mHeight = height;
        mPreviewMode = format;
        mBandwidthFactor = bandwidthFactor;
        mWeakParent = new WeakReference<Activity>(parent);
        mWeakCameraView = new WeakReference<CameraViewInterface>(cameraView);
        loadShutterSound(parent);
    }
    // ...
}

Here mHandlerClass is UVCCameraHandler.
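The remaining question is how createHandler can call thread.getHandler() immediately after thread.start(): the handler is created on the new thread, so getHandler() has to block until it exists. Below is a simplified plain-Java sketch of that thread/handler handshake pattern; the real library builds the handler on the thread's Looper, and all names here are illustrative.

```java
public class HandshakeSketch {
    // Thread that constructs its "handler" object and publishes it to its creator.
    static final class CameraThreadSketch extends Thread {
        private final Object mSync = new Object();
        private Object mHandler; // stands in for AbstractUVCCameraHandler

        @Override public void run() {
            synchronized (mSync) {
                mHandler = new Object(); // real code: instantiate the handler class on this thread
                mSync.notifyAll();       // wake anyone blocked in getHandler()
            }
            // real code: the message loop would run here, processing messages
        }

        // Blocks until run() has published the handler, mirroring thread.getHandler().
        Object getHandler() throws InterruptedException {
            synchronized (mSync) {
                while (mHandler == null) {
                    mSync.wait();
                }
                return mHandler;
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        final CameraThreadSketch thread = new CameraThreadSketch();
        thread.start();                             // as in createHandler(...)
        final Object handler = thread.getHandler(); // safe even if run() has not executed yet
        System.out.println(handler != null);        // prints true
    }
}
```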

In the Activity initialization code quoted earlier, mUSBMonitor scans for and detects the target USB camera device, while mUVCCameraView provides the preview surface; that view is handed to UVCCameraHandler and on to CameraThread, as this weak-reference field shows:

private final WeakReference<CameraViewInterface> mWeakCameraView;

CameraThread thus plays the bridging role: it serves the API calls from above and performs the actual operations below, covering preview display, shutter-sound playback, control of UVCCamera, recording, and the rest.

The native layer

The native layer is entered from the Java class UVCCamera; the call passes jni --> UVCCamera --> UVCCamera.cpp --> UVCPreview.cpp.

For example, the startPreview and stopPreview methods delegate to UVCPreview, because UVCCamera.cpp holds a pointer to it (UVCPreview *mPreview;), as the code shows:

int UVCCamera::startPreview() {
    ENTER();

    int result = EXIT_FAILURE;
    if (mDeviceHandle) {
        return mPreview->startPreview();
    }
    RETURN(result, int);
}

int UVCCamera::stopPreview() {
    ENTER();
    if (LIKELY(mPreview)) {
        mPreview->stopPreview();
    }
    RETURN(0, int);
}

The startPreview and stopPreview methods of UVCPreview.cpp look like this:

int UVCPreview::startPreview() {
    ENTER();

    int result = EXIT_FAILURE;
    if (!isRunning()) {
        mIsRunning = true;
        pthread_mutex_lock(&preview_mutex);
        {
            if (LIKELY(mPreviewWindow)) {
                result = pthread_create(&preview_thread, NULL, preview_thread_func, (void *)this);
            }
        }
        pthread_mutex_unlock(&preview_mutex);
        if (UNLIKELY(result != EXIT_SUCCESS)) {
            LOGW("UVCCamera::window does not exist/already running/could not create thread etc.");
            mIsRunning = false;
            pthread_mutex_lock(&preview_mutex);
            {
                pthread_cond_signal(&preview_sync);
            }
            pthread_mutex_unlock(&preview_mutex);
        }
    }
    RETURN(result, int);
}

int UVCPreview::stopPreview() {
    ENTER();
    bool b = isRunning();
    if (LIKELY(b)) {
        mIsRunning = false;
        pthread_cond_signal(&preview_sync);
        pthread_cond_signal(&capture_sync);
        if (pthread_join(capture_thread, NULL) != EXIT_SUCCESS) {
            LOGW("UVCPreview::terminate capture thread: pthread_join failed");
        }
        if (pthread_join(preview_thread, NULL) != EXIT_SUCCESS) {
            LOGW("UVCPreview::terminate preview thread: pthread_join failed");
        }
        clearDisplay();
    }
    clearPreviewFrame();
    clearCaptureFrame();
    pthread_mutex_lock(&preview_mutex);
    if (mPreviewWindow) {
        ANativeWindow_release(mPreviewWindow);
        mPreviewWindow = NULL;
    }
    pthread_mutex_unlock(&preview_mutex);
    pthread_mutex_lock(&capture_mutex);
    if (mCaptureWindow) {
        ANativeWindow_release(mCaptureWindow);
        mCaptureWindow = NULL;
    }
    pthread_mutex_unlock(&capture_mutex);
    RETURN(0, int);
}
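The shutdown logic above follows a common pattern: clear the running flag, signal any condition variables the worker might be waiting on, join the thread, and only then release shared resources. A plain-Java analogue of that sequence (illustrative names, not the library's API):

```java
public class ShutdownSketch {
    static final class PreviewLoop extends Thread {
        private final Object lock = new Object();
        volatile boolean running = true;

        @Override public void run() {
            synchronized (lock) {
                while (running) {
                    try {
                        lock.wait(10); // stand-in for waiting on preview_sync for the next frame
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                }
            }
        }

        // Mirrors UVCPreview::stopPreview(): flag, signal, join, then release.
        void shutdown() throws InterruptedException {
            running = false;          // mIsRunning = false
            synchronized (lock) {
                lock.notifyAll();     // pthread_cond_signal(&preview_sync)
            }
            join();                   // pthread_join(preview_thread, NULL)
            // resources tied to the loop (e.g. the ANativeWindows) are only safe to release here
        }
    }

    public static void main(String[] args) throws InterruptedException {
        final PreviewLoop loop = new PreviewLoop();
        loop.start();
        Thread.sleep(30);
        loop.shutdown();
        System.out.println(loop.isAlive()); // prints false
    }
}
```

Releasing the windows only after pthread_join matters: joining guarantees the preview thread can no longer touch them, which is why the C++ code releases mPreviewWindow and mCaptureWindow last.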

The UVCCameraTextureView class displays the camera preview and is a very important piece:

public class UVCCameraTextureView extends AspectRatioTextureView
        implements TextureView.SurfaceTextureListener, CameraViewInterface {
    //...
}

Further reading on UVCCamera

Reference: https://blog.csdn.net/bawang_cn/article/details/126271084