HarmonyOS Developer Media Development Guide

丶龙八夷
Published 2023-03-26 20:40

Video Playback Development Guide

Introduction

The core of video playback is transcoding video data and outputting it to the device for rendering, while managing the playback task: starting, pausing, and stopping playback, releasing resources, setting the volume, seeking, setting the playback speed, obtaining track information, and other controls. This document describes the full playback flow, switching between videos, and looping a single video.

How It Works

This section provides a video playback state diagram and a diagram of the interactions between the video playback module and external modules.

Figure 1 Video playback state diagram

Figure 2 Interaction between the video playback module and external modules

Note: When a third-party application implements a feature through the JS APIs exposed by the JS interface layer, the framework layer goes through the media service of the Native Framework: the audio component outputs software-decoded audio data to the audio HDI, and the graphics subsystem outputs image data decoded by the codec HDI component at the hardware interface layer to the display HDI. Together these implement video playback.

Caution: Video playback requires display, audio, and codec hardware capabilities.

  1. The third-party application obtains a SurfaceID from the XComponent.
  2. The third-party application passes the SurfaceID to the VideoPlayer JS object.
  3. The media service flushes frame data to the Surface buffer.

Compatibility

Use mainstream playback formats and resolutions. Developers are advised not to craft non-standard or malformed streams, which can cause compatibility problems such as playback failure, freezing, or visual artifacts. If such a problem occurs, it does not affect the system; simply exit playback of that stream.

The mainstream playback formats and resolutions are as follows:

| Video container | Specification | Resolution |
| --- | --- | --- |
| mp4 | Video: H264/MPEG2/MPEG4/H263; Audio: AAC/MP3 | Mainstream resolutions such as 1080p/720p/480p/270p |
| mkv | Video: H264/MPEG2/MPEG4/H263; Audio: AAC/MP3 | Mainstream resolutions such as 1080p/720p/480p/270p |
| ts | Video: H264/MPEG2/MPEG4; Audio: AAC/MP3 | Mainstream resolutions such as 1080p/720p/480p/270p |
| webm | Video: VP8; Audio: VORBIS | Mainstream resolutions such as 1080p/720p/480p/270p |
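Application code that validates media before handing it to the player can mirror the table above in a small lookup. The helper below is hypothetical (not part of the HarmonyOS API) and only encodes the matrix from this table:

```javascript
// Supported container/codec combinations from the compatibility table (hypothetical helper).
const SUPPORTED = {
  mp4:  { video: ['H264', 'MPEG2', 'MPEG4', 'H263'], audio: ['AAC', 'MP3'] },
  mkv:  { video: ['H264', 'MPEG2', 'MPEG4', 'H263'], audio: ['AAC', 'MP3'] },
  ts:   { video: ['H264', 'MPEG2', 'MPEG4'],         audio: ['AAC', 'MP3'] },
  webm: { video: ['VP8'],                            audio: ['VORBIS'] },
};

// Returns true when the container supports both the video and the audio codec.
function isSupported(container, videoCodec, audioCodec) {
  const spec = SUPPORTED[container];
  return !!spec && spec.video.includes(videoCodec) && spec.audio.includes(audioCodec);
}
```

For example, `isSupported('mp4', 'H264', 'AAC')` is true, while H263 video inside a ts container is not listed and should be avoided.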

Development Guide

For detailed API semantics, see the media service API reference: VideoPlayer

Full-Flow Scenario

The full video playback flow covers: creating an instance, setting the url, setting the SurfaceID, preparing for playback, playing, pausing, obtaining track information, seeking, setting the volume, setting the playback speed, stopping, resetting, and releasing resources.

For the url media source types supported by VideoPlayer, see the url attribute description.

For how to create an XComponent, see the XComponent creation guide.
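The calls in the demo below walk the player state machine from Figure 1 (idle, initialized, prepared, playing, paused, stopped). A minimal sketch of the legal transitions, simplified from my reading of the diagram and not a complete or authoritative model:

```javascript
// Simplified sketch of VideoPlayer state transitions (illustrative, not the full Figure 1 diagram).
const TRANSITIONS = {
  idle:        ['initialized'],       // assign url
  initialized: ['prepared'],          // prepare()
  prepared:    ['playing'],           // play()
  playing:     ['paused', 'stopped'], // pause() / stop()
  paused:      ['playing', 'stopped'],// play() / stop()
  stopped:     ['prepared', 'idle'],  // prepare() / reset()
};

// Returns true when a direct transition between the two states is allowed in this sketch.
function canTransition(from, to) {
  return (TRANSITIONS[from] || []).includes(to);
}
```

This is why the demo calls prepare() before play(): jumping straight from setting the url to play() is not a legal transition.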

import media from '@ohos.multimedia.media'
import fs from '@ohos.file.fs'
export class VideoPlayerDemo {
  // Reports error information when a call returns an error
  failureCallback(error) {
    console.info(`error happened,error Name is ${error.name}`);
    console.info(`error happened,error Code is ${error.code}`);
    console.info(`error happened,error Message is ${error.message}`);
  }

  // Reports error information when a call throws an exception
  catchCallback(error) {
    console.info(`catch error happened,error Name is ${error.name}`);
    console.info(`catch error happened,error Code is ${error.code}`);
    console.info(`catch error happened,error Message is ${error.message}`);
  }

  // Prints video track information
  printfDescription(obj) {
    for (let item in obj) {
      let property = obj[item];
      console.info('key is ' + item);
      console.info('value is ' + property);
    }
  }

  async videoPlayerDemo() {
    let videoPlayer = undefined;
    let SurfaceID = 'test' // The SurfaceID identifies the display surface; obtain the real value through the XComponent (see the XComponent creation guide above)
    let fdPath = 'fd://'
    // Push the stream to the device with "hdc file send D:\xxx\H264_AAC.mp4 /data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile", replacing ohos.acts.multimedia.video.videoplayer with the actual bundleName
    let path = '/data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile/H264_AAC.mp4';
    let file = await fs.open(path);
    fdPath = fdPath + '' + file.fd;
    // Call createVideoPlayer to obtain a videoPlayer instance
    await media.createVideoPlayer().then((video) => {
      if (typeof (video) != 'undefined') {
        console.info('createVideoPlayer success!');
        videoPlayer = video;
      } else {
        console.info('createVideoPlayer fail!');
      }
    }, this.failureCallback).catch(this.catchCallback);
    // Set the playback source
    videoPlayer.url = fdPath;

    // Set the SurfaceID on which to render the video frames
    await videoPlayer.setDisplaySurface(SurfaceID).then(() => {
      console.info('setDisplaySurface success');
    }, this.failureCallback).catch(this.catchCallback);

    // Call prepare to complete preparations for playback
    await videoPlayer.prepare().then(() => {
      console.info('prepare success');
    }, this.failureCallback).catch(this.catchCallback);

    // Call play to start playback
    await videoPlayer.play().then(() => {
      console.info('play success');
    }, this.failureCallback).catch(this.catchCallback);

    // Pause playback
    await videoPlayer.pause().then(() => {
      console.info('pause success');
    }, this.failureCallback).catch(this.catchCallback);

    // Obtain the video track information via the promise callback
    let arrayDescription;
    await videoPlayer.getTrackDescription().then((arrlist) => {
      if (typeof (arrlist) != 'undefined') {
        arrayDescription = arrlist;
      } else {
        console.log('video getTrackDescription fail');
      }
    }, this.failureCallback).catch(this.catchCallback);

    for (let i = 0; i < arrayDescription.length; i++) {
      this.printfDescription(arrayDescription[i]);
    }

    // Seek to the 50 s position; see the API reference for the parameter semantics
    let seekTime = 50000;
    await videoPlayer.seek(seekTime, media.SeekMode.SEEK_NEXT_SYNC).then((seekDoneTime) => {
      console.info('seek success');
    }, this.failureCallback).catch(this.catchCallback);

    // Set the volume; see the API reference for the parameter semantics
    let volume = 0.5;
    await videoPlayer.setVolume(volume).then(() => {
      console.info('setVolume success');
    }, this.failureCallback).catch(this.catchCallback);

    // Set the playback speed; see the API reference for the parameter semantics
    let speed = media.PlaybackSpeed.SPEED_FORWARD_2_00_X;
    await videoPlayer.setSpeed(speed).then(() => {
      console.info('setSpeed success');
    }, this.failureCallback).catch(this.catchCallback);

    // Stop playback
    await videoPlayer.stop().then(() => {
      console.info('stop success');
    }, this.failureCallback).catch(this.catchCallback);

    // Reset the playback configuration
    await videoPlayer.reset().then(() => {
      console.info('reset success');
    }, this.failureCallback).catch(this.catchCallback);

    // Release playback resources
    await videoPlayer.release().then(() => {
      console.info('release success');
    }, this.failureCallback).catch(this.catchCallback);

    // Set related objects to undefined
    videoPlayer = undefined;
    SurfaceID = undefined;
  }
}
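The demos build the playback URL by concatenating 'fd://' with the file descriptor returned by fs.open. A tiny helper (hypothetical, not part of the HarmonyOS API) makes that intent explicit and rejects bad descriptors early:

```javascript
// Builds the "fd://<fd>" URL that the demos assign to videoPlayer.url (hypothetical helper).
function makeFdUrl(fd) {
  if (!Number.isInteger(fd) || fd < 0) {
    throw new Error('invalid file descriptor: ' + fd);
  }
  return 'fd://' + fd;
}
```

With it, `fdPath = fdPath + '' + file.fd` becomes `videoPlayer.url = makeFdUrl(file.fd)`.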
Normal Playback Scenario

import media from '@ohos.multimedia.media'
import fs from '@ohos.file.fs'
export class VideoPlayerDemo {
  // Reports error information when a call returns an error
  failureCallback(error) {
    console.info(`error happened,error Name is ${error.name}`);
    console.info(`error happened,error Code is ${error.code}`);
    console.info(`error happened,error Message is ${error.message}`);
  }

  // Reports error information when a call throws an exception
  catchCallback(error) {
    console.info(`catch error happened,error Name is ${error.name}`);
    console.info(`catch error happened,error Code is ${error.code}`);
    console.info(`catch error happened,error Message is ${error.message}`);
  }

  // Prints video track information
  printfDescription(obj) {
    for (let item in obj) {
      let property = obj[item];
      console.info('key is ' + item);
      console.info('value is ' + property);
    }
  }

  async videoPlayerDemo() {
    let videoPlayer = undefined;
    let SurfaceID = 'test' // The SurfaceID identifies the display surface; obtain the real value through the XComponent (see the XComponent creation guide)
    let fdPath = 'fd://'
    // Push the stream to the device with "hdc file send D:\xxx\H264_AAC.mp4 /data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile"
    let path = '/data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile/H264_AAC.mp4';
    let file = await fs.open(path);
    fdPath = fdPath + '' + file.fd;
    // Call createVideoPlayer to obtain a videoPlayer instance
    await media.createVideoPlayer().then((video) => {
      if (typeof (video) != 'undefined') {
        console.info('createVideoPlayer success!');
        videoPlayer = video;
      } else {
        console.info('createVideoPlayer fail!');
      }
    }, this.failureCallback).catch(this.catchCallback);
    // Set the playback source
    videoPlayer.url = fdPath;

    // Set the SurfaceID on which to render the video frames
    await videoPlayer.setDisplaySurface(SurfaceID).then(() => {
      console.info('setDisplaySurface success');
    }, this.failureCallback).catch(this.catchCallback);

    // Call prepare to complete preparations for playback
    await videoPlayer.prepare().then(() => {
      console.info('prepare success');
    }, this.failureCallback).catch(this.catchCallback);

    // Call play to start playback
    await videoPlayer.play().then(() => {
      console.info('play success');
    }, this.failureCallback).catch(this.catchCallback);

    // Stop playback
    await videoPlayer.stop().then(() => {
      console.info('stop success');
    }, this.failureCallback).catch(this.catchCallback);

    // Release playback resources
    await videoPlayer.release().then(() => {
      console.info('release success');
    }, this.failureCallback).catch(this.catchCallback);

    // Set related objects to undefined
    videoPlayer = undefined;
    SurfaceID = undefined;
  }
}
Video Switching Scenario

import media from '@ohos.multimedia.media'
import fs from '@ohos.file.fs'
export class VideoPlayerDemo {
  // Reports error information when a call returns an error
  failureCallback(error) {
    console.info(`error happened,error Name is ${error.name}`);
    console.info(`error happened,error Code is ${error.code}`);
    console.info(`error happened,error Message is ${error.message}`);
  }

  // Reports error information when a call throws an exception
  catchCallback(error) {
    console.info(`catch error happened,error Name is ${error.name}`);
    console.info(`catch error happened,error Code is ${error.code}`);
    console.info(`catch error happened,error Message is ${error.message}`);
  }

  // Prints video track information
  printfDescription(obj) {
    for (let item in obj) {
      let property = obj[item];
      console.info('key is ' + item);
      console.info('value is ' + property);
    }
  }

  async videoPlayerDemo() {
    let videoPlayer = undefined;
    let SurfaceID = 'test' // The SurfaceID identifies the display surface; obtain the real value through the XComponent (see the XComponent creation guide)
    let fdPath = 'fd://'
    // Push the stream to the device with "hdc file send D:\xxx\H264_AAC.mp4 /data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile"
    let path = '/data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile/H264_AAC.mp4';
    let nextPath = '/data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile/MP4_AAC.mp4';
    let file = await fs.open(path);
    fdPath = fdPath + '' + file.fd;
    // Call createVideoPlayer to obtain a videoPlayer instance
    await media.createVideoPlayer().then((video) => {
      if (typeof (video) != 'undefined') {
        console.info('createVideoPlayer success!');
        videoPlayer = video;
      } else {
        console.info('createVideoPlayer fail!');
      }
    }, this.failureCallback).catch(this.catchCallback);
    // Set the playback source
    videoPlayer.url = fdPath;

    // Set the SurfaceID on which to render the video frames
    await videoPlayer.setDisplaySurface(SurfaceID).then(() => {
      console.info('setDisplaySurface success');
    }, this.failureCallback).catch(this.catchCallback);

    // Call prepare to complete preparations for playback
    await videoPlayer.prepare().then(() => {
      console.info('prepare success');
    }, this.failureCallback).catch(this.catchCallback);

    // Call play to start playback
    await videoPlayer.play().then(() => {
      console.info('play success');
    }, this.failureCallback).catch(this.catchCallback);

    // Reset the playback configuration
    await videoPlayer.reset().then(() => {
      console.info('reset success');
    }, this.failureCallback).catch(this.catchCallback);

    // Get the fd of the next video
    fdPath = 'fd://'
    let nextFile = await fs.open(nextPath);
    fdPath = fdPath + '' + nextFile.fd;
    // Set the second video as the playback source
    videoPlayer.url = fdPath;

    // Call prepare to complete preparations for playback
    await videoPlayer.prepare().then(() => {
      console.info('prepare success');
    }, this.failureCallback).catch(this.catchCallback);

    // Call play to start playback
    await videoPlayer.play().then(() => {
      console.info('play success');
    }, this.failureCallback).catch(this.catchCallback);

    // Release playback resources
    await videoPlayer.release().then(() => {
      console.info('release success');
    }, this.failureCallback).catch(this.catchCallback);

    // Set related objects to undefined
    videoPlayer = undefined;
    SurfaceID = undefined;
  }
}
Single-Video Loop Scenario

import media from '@ohos.multimedia.media'
import fs from '@ohos.file.fs'
export class VideoPlayerDemo {
  // Reports error information when a call returns an error
  failureCallback(error) {
    console.info(`error happened,error Name is ${error.name}`);
    console.info(`error happened,error Code is ${error.code}`);
    console.info(`error happened,error Message is ${error.message}`);
  }

  // Reports error information when a call throws an exception
  catchCallback(error) {
    console.info(`catch error happened,error Name is ${error.name}`);
    console.info(`catch error happened,error Code is ${error.code}`);
    console.info(`catch error happened,error Message is ${error.message}`);
  }

  // Prints video track information
  printfDescription(obj) {
    for (let item in obj) {
      let property = obj[item];
      console.info('key is ' + item);
      console.info('value is ' + property);
    }
  }

  async videoPlayerDemo() {
    let videoPlayer = undefined;
    let SurfaceID = 'test' // The SurfaceID identifies the display surface; obtain the real value through the XComponent (see the XComponent creation guide)
    let fdPath = 'fd://'
    // Push the stream to the device with "hdc file send D:\xxx\H264_AAC.mp4 /data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile"
    let path = '/data/app/el1/bundle/public/ohos.acts.multimedia.video.videoplayer/ohos.acts.multimedia.video.videoplayer/assets/entry/resources/rawfile/H264_AAC.mp4';
    let file = await fs.open(path);
    fdPath = fdPath + '' + file.fd;
    // Call createVideoPlayer to obtain a videoPlayer instance
    await media.createVideoPlayer().then((video) => {
      if (typeof (video) != 'undefined') {
        console.info('createVideoPlayer success!');
        videoPlayer = video;
      } else {
        console.info('createVideoPlayer fail!');
      }
    }, this.failureCallback).catch(this.catchCallback);
    // Set the playback source
    videoPlayer.url = fdPath;

    // Set the SurfaceID on which to render the video frames
    await videoPlayer.setDisplaySurface(SurfaceID).then(() => {
      console.info('setDisplaySurface success');
    }, this.failureCallback).catch(this.catchCallback);

    // Call prepare to complete preparations for playback
    await videoPlayer.prepare().then(() => {
      console.info('prepare success');
    }, this.failureCallback).catch(this.catchCallback);
    // Enable the loop attribute
    videoPlayer.loop = true;
    // Call play to start looping playback
    await videoPlayer.play().then(() => {
      console.info('play success, loop value is ' + videoPlayer.loop);
    }, this.failureCallback).catch(this.catchCallback);
  }
}

Image Development Guide

Scenarios

Image development mainly involves decoding obtained images and encoding the decoded pixelmaps into supported formats. This document describes image decoding and encoding scenarios.

API Description

For detailed API semantics, see the image processing API reference.

Development Procedure

Full-Flow Scenario

The flow covers: creating an instance, reading image information, reading and writing pixelmaps, updating data, packing pixels, and releasing resources.
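The 96-byte buffer in the sample below is simply large enough for the 3×2 image; the minimum size depends on the pixel format. Assuming a 4-byte-per-pixel format such as RGBA_8888 (an assumption; check the PixelMapFormat reference for the actual format you use), the required size is a simple product:

```javascript
// Minimum pixel buffer size for a pixelmap: width * height * bytesPerPixel.
// Assumes a 4-byte-per-pixel format (e.g. RGBA_8888); other formats differ.
function minPixelBufferBytes(width, height, bytesPerPixel = 4) {
  return width * height * bytesPerPixel;
}
```

For the 3×2 pixelmap in the sample this gives 24 bytes, so the sample's 96-byte ArrayBuffer is a comfortable over-allocation.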

import image from '@ohos.multimedia.image'

const color = new ArrayBuffer(96); // Buffer holding the image pixel data
let opts = { alphaType: 0, editable: true, pixelFormat: 4, scaleMode: 1, size: { height: 2, width: 3 } } // Pixelmap initialization options

// Create a pixelmap object
image.createPixelMap(color, opts, (err, pixelmap) => {
    // Creating the pixelmap object failed
    if (err) {
        console.info('create pixelmap failed, err' + err);
        return
    }
    console.log('Succeeded in creating pixelmap.');

    // Read pixels from a region
    const area = {
        pixels: new ArrayBuffer(8),
        offset: 0,
        stride: 8,
        region: { size: { height: 1, width: 2 }, x: 0, y: 0 }
    }
    pixelmap.readPixels(area,() => {
        let bufferArr = new Uint8Array(area.pixels);
        let res = true;
        for (let i = 0; i < bufferArr.length; i++) {
            console.info(' buffer ' + bufferArr[i]);
            if(res) {
                if(bufferArr[i] == 0) {
                    res = false;
                    console.log('readPixels end.');
                    break;
                }
            }
        }
    })
 
    // Read the pixels into a buffer
    const readBuffer = new ArrayBuffer(96);
    pixelmap.readPixelsToBuffer(readBuffer,() => {
        let bufferArr = new Uint8Array(readBuffer);
        let res = true;
        for (let i = 0; i < bufferArr.length; i++) {
            if(res) {
                if (bufferArr[i] !== 0) {
                    res = false;
                    console.log('readPixelsToBuffer end.');
                    break;
                }
            }
        }
    })
    
    // Write pixels to a region
    pixelmap.writePixels(area,() => {
        const readArea = { pixels: new ArrayBuffer(20), offset: 0, stride: 8, region: { size: { height: 1, width: 2 }, x: 0, y: 0 }}
        pixelmap.readPixels(readArea,() => {
            let readArr = new Uint8Array(readArea.pixels);
            let res = true;
            for (let i = 0; i < readArr.length; i++) {
                if(res) {
                    if (readArr[i] !== 0) {
                        res = false;
                        console.log('readPixels end.please check buffer');
                        break;
                    }
                }
            }
        })
    })

    const writeColor = new ArrayBuffer(96); // Image pixel data to write
    // Write the buffer into the pixelmap
    pixelmap.writeBufferToPixels(writeColor).then(() => {
        const readBuffer = new ArrayBuffer(96);
        pixelmap.readPixelsToBuffer(readBuffer).then (() => {
            let bufferArr = new Uint8Array(readBuffer);
            let res = true;
            for (let i = 0; i < bufferArr.length; i++) {
                if(res) {
                    if (bufferArr[i] !== i) {
                        res = false;
                        console.log('readPixels end.please check buffer');
                        break;
                    }
                }
            }
        })
    })

    // Get the image information
    pixelmap.getImageInfo((err, imageInfo) => {
        // Getting the image information failed
        if (err || imageInfo == null) {
            console.info('getImageInfo failed, err' + err);
            return
        }
        if (imageInfo !== null) {
            console.log('Succeeded in getting imageInfo');
        } 
    })

    // Release the pixelmap
    pixelmap.release(()=>{
        console.log('Succeeded in releasing pixelmap');
    })
})

// Create an imagesource from a uri
let path = '/data/local/tmp/test.jpg';
const imageSourceApi1 = image.createImageSource(path);

// Create an imagesource from a fd
let fd = 29;
const imageSourceApi2 = image.createImageSource(fd);

// Create an imagesource from an ArrayBuffer
const data = new ArrayBuffer(96);
const imageSourceApi3 = image.createImageSource(data);

// Release the imagesource
imageSourceApi3.release(() => {
    console.log('Succeeded in releasing imagesource');
})

// Encode (pack) the image
const imagePackerApi = image.createImagePacker();
const imageSourceApi = image.createImageSource(0);
let packOpts = { format: "image/jpeg", quality: 98 };
imagePackerApi.packing(imageSourceApi, packOpts, (err, data) => {
    if (err) {
        console.info('packing from imagePackerApi failed, err' + err);
        return
    }
    console.log('Succeeded in packing');
})

// Release the imagepacker
imagePackerApi.release();
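The `area` objects above combine `offset`, `stride`, and `region` to describe which pixels are read or written. How these fields map into a byte buffer can be sketched in plain JavaScript (assuming 4 bytes per pixel; illustrative only, not the image API's implementation):

```javascript
// Copies a rectangular region out of a row-major pixel buffer.
// stride is the number of bytes per source row; 4 bytes per pixel assumed.
function readRegion(srcBuffer, stride, region, bytesPerPixel = 4) {
  const src = new Uint8Array(srcBuffer);
  const out = new Uint8Array(region.size.width * region.size.height * bytesPerPixel);
  for (let row = 0; row < region.size.height; row++) {
    // Byte offset of this region row inside the source buffer.
    const srcStart = (region.y + row) * stride + region.x * bytesPerPixel;
    const rowBytes = region.size.width * bytesPerPixel;
    out.set(src.subarray(srcStart, srcStart + rowBytes), row * rowBytes);
  }
  return out;
}
```

This mirrors why the sample's 1×2-pixel region needs an 8-byte `pixels` buffer: 2 pixels × 4 bytes each.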
Decoding Scenario

import image from '@ohos.multimedia.image'

let path = '/data/local/tmp/test.jpg'; // Path used to create the imagesource

// Create an imagesource from a path
const imageSourceApi = image.createImageSource(path); // '/data/local/tmp/test.jpg'

// Set the decoding options
let decodingOptions = {
    sampleSize: 1, // Thumbnail sampling size
    editable: true, // Whether the decoded pixelmap is editable
    desiredSize: { width: 1, height: 2 }, // Desired output size
    rotateDegrees: 10, // Rotation angle
    desiredPixelFormat: 2, // Pixel format of the decoded output
    desiredRegion: { size: { height: 1, width: 2 }, x: 0, y: 0 }, // Region to decode
    index: 0 // Image index
};

// Create a pixelmap with a callback
imageSourceApi.createPixelMap(decodingOptions, (err, pixelmap) => {
    // Creating the pixelmap object failed
    if (err) {
        console.info('create pixelmap failed, err' + err);
        return
    }
    console.log('Succeeded in creating pixelmap.');
})

// Create a pixelmap with a promise
imageSourceApi.createPixelMap().then(pixelmap => {
    console.log('Succeeded in creating pixelmap.');

    // Get the number of bytes per row of pixels
    let num = pixelmap.getBytesNumberPerRow();

    // Get the total number of pixel bytes
    let pixelSize = pixelmap.getPixelBytesNumber();

    // Get the pixelmap information
    pixelmap.getImageInfo().then( imageInfo => {});

    // Release the pixelmap
    pixelmap.release(()=>{
        console.log('Succeeded in releasing pixelmap');
    })
}).catch(error => {
    console.log('Failed in creating pixelmap.' + error);
})
Encoding Scenario

import image from '@ohos.multimedia.image'

let path = '/data/local/tmp/test.png'; // Path used to create the imagesource

// Create the imagesource
const imageSourceApi = image.createImageSource(path); // '/data/local/tmp/test.png'

// If creating the imagesource failed, print an error message
if (imageSourceApi == null) {
    console.log('Failed in creating imageSource.');
}

// If the imagesource was created, create an imagepacker
const imagePackerApi = image.createImagePacker();

// If creation failed, print an error message
if (imagePackerApi == null) {
    console.log('Failed in creating imagePacker.');
}

// If the imagepacker was created, set the encoding options
let packOpts = { format: "image/jpeg", // Encoding format: jpeg
                 quality: 98 } // Image quality, 0-100

// Encode (pack) the image
imagePackerApi.packing(imageSourceApi, packOpts)
.then( data => {
    console.log('Succeeded in packing');
})
         
// Encoding is complete; release the imagepacker
imagePackerApi.release();

// Get the imagesource information
imageSourceApi.getImageInfo((err, imageInfo) => {
    console.log('Succeeded in getting imageInfo');
})

const array = new ArrayBuffer(100); // Incremental data
// Update with incremental data
imageSourceApi.updateData(array, false, 0, 10, (error, data) => {})
Using ImageReceiver

Example scenario: the camera, as the client, sends capture data to the server.

public async init(SurfaceId: any) {

    // Server-side code: create an ImageReceiver
    let receiver = image.createImageReceiver(8 * 1024, 8, image.ImageFormat.JPEG, 1);

    // Get the Surface ID
    receiver.getReceivingSurfaceId((err, SurfaceId) => {
        // Getting the Surface ID failed
        if (err) {
            console.info('getReceivingSurfaceId failed, err' + err);
            return
        }
        console.info("receiver getReceivingSurfaceId success");
    });
    // Register a Surface listener, triggered when the Surface buffer is ready
    receiver.on('imageArrival', () => {
        // Fetch the latest buffer from the Surface
        receiver.readNextImage((err, img) => {
            img.getComponent(4, (err, component) => {
                // Consume component.byteBuffer, for example by saving its content as an image file.
            })
        })
    })

    // Call the Camera API to pass the SurfaceId to the camera; the camera obtains the Surface from the SurfaceId and produces Surface buffers.
}




Reprinted from: https://developer.harmonyos.com/cn/docs/documentation/doc-guides-V3/image-0000001478181277-V3?catalogVersion=V3
