
Audio capturer

伊普洛先生

Capturing audio data with AudioCapturer

Core API used

@ohos.multimedia.audio (audio management)

The audio management module provides basic capabilities for managing audio, including managing audio volume and audio devices, as well as capturing and rendering audio data.

This module provides the following commonly used audio features:

  • AudioManager: audio management.
  • AudioRenderer: audio rendering, used to play PCM (Pulse Code Modulation) audio data.
  • AudioCapturer: audio capture, used to record PCM audio data.
  • TonePlayer: used to manage and play DTMF (Dual Tone Multi-Frequency) tones, such as dial tones and ringback tones.

Import the module before use:

import audio from '@ohos.multimedia.audio';
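
The code excerpt below comes from a larger component, so it also depends on a few pieces that are not shown: the file-system and error-type imports, the capturer/renderer option objects, and constants and helpers (TOTAL_SECOND, INTERVAL_TIME, READ_TIME_OUT, RANDOM_NUM, getDate, getTimesBySecond) defined elsewhere in the sample. The following is a minimal sketch of those imports and a typical stream configuration; the concrete sampling rate, channel layout, and stream usage are assumptions rather than values taken from the original post.

// Additional imports used by the excerpt below.
import fs from '@ohos.file.fs';
import { BusinessError } from '@ohos.base';

// Shape of the read options passed to fs.read() in the excerpt.
interface Options {
  offset: number;
  length: number;
}

// Assumed definitions of this.audioCapturerOptions / this.audioRendererOptions,
// which the excerpt references but does not show.
const audioStreamInfo: audio.AudioStreamInfo = {
  samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
  channels: audio.AudioChannel.CHANNEL_2,
  sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
  encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
};

const audioCapturerOptions: audio.AudioCapturerOptions = {
  streamInfo: audioStreamInfo,
  capturerInfo: {
    source: audio.SourceType.SOURCE_TYPE_MIC, // record from the microphone
    capturerFlags: 0
  }
};

const audioRendererOptions: audio.AudioRendererOptions = {
  streamInfo: audioStreamInfo,
  rendererInfo: {
    usage: audio.StreamUsage.STREAM_USAGE_MUSIC, // play back the recorded PCM data
    rendererFlags: 0
  }
};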

The core code is as follows:

@Builder TabBuilder(index: number, btnId: string) { 
    Column() { 
      Text(index === 0 ? $r('app.string.NORMAL_CAPTURER') : $r('app.string.PARALLEL_CAPTURER')) 
        .fontColor(this.currentIndex === index ? this.selectedFontColor : this.fontColor) 
        .opacity(this.currentIndex === index ? 1 : 0.6) 
        .fontSize(16) 
        .fontWeight(this.currentIndex === index ? 500 : 400) 
        .lineHeight(22) 
        .margin({ top: 17, bottom: 7 }); 
      Divider() 
        .strokeWidth(2) 
        .color('#007DFF') 
        .opacity(this.currentIndex === index ? 1 : 0); 
    }.width(78).id('btn_' + btnId); 
  } 
  
  async aboutToAppear(): Promise<void> { 
    console.log('NormalCapturer aboutToAppear'); 
    await this.initResource(); 
  } 
  
  async initResource(): Promise<void> { 
    console.log('initResource 0'); 
    try { 
      console.log('initResource 1'); 
      this.audioCapturer = await audio.createAudioCapturer(this.audioCapturerOptions); 
      console.log('initResource 2'); 
      this.bufferSize = await this.audioCapturer.getBufferSize(); 
      this.recordState = 'init'; 
      this.title = `${this.getDate(2)}_${Math.floor(Math.random() * RANDOM_NUM)}`; 
      this.path = `/data/storage/el2/base/haps/entry/files/normal_capturer_${this.title}.pcm`; 
      this.date = this.getDate(1); 
      console.log('initResource 3'); 
      await this.openFile(this.path); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`NormalCapturer:createAudioCapturer err=${JSON.stringify(error)}`); 
    } 
  } 
  
  async releaseResource(): Promise<void> { 
    if (this.fd > 0) { 
      this.closeFile(); 
      this.fd = 0; 
    } 
    if (this.interval) { 
      clearInterval(this.interval); 
    } 
    if (this.audioCapturer) { 
      console.log('NormalCapturer,audioCapturer released'); 
      await this.audioCapturer.release(); 
      this.audioCapturer = undefined; 
      this.recordState = 'init'; 
      clearInterval(this.interval); 
    } 
    if (this.audioRenderer) { 
      console.log('NormalCapturer,audioRenderer released'); 
      await this.audioRenderer.release(); 
      this.audioRenderer = undefined; 
    } 
  } 
  
  async aboutToDisappear(): Promise<void> { 
    console.log('NormalCapturer,aboutToDisappear is called'); 
    await this.releaseResource(); 
  } 
  
  async openFile(path: string): Promise<void> { 
    console.log(path); 
    try { 
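      // 0o100 is fs.OpenMode.CREATE: create the file if it does not exist. 
      // The file is then reopened with 0o2 (fs.OpenMode.READ_WRITE) to get a readable/writable fd. 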
      await fs.open(path, 0o100); 
      console.log('file created success'); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log('file created err:' + JSON.stringify(error)); 
      return; 
    } 
  
    try { 
      let file = await fs.open(path, 0o2); 
      this.fd = file.fd; 
      console.log(`file open success for read and write mode,fd=${file.fd}`); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log('file open err:' + JSON.stringify(error)); 
      return; 
    } 
  } 
  
  async closeFile(): Promise<void> { 
    try { 
      await fs.close(this.fd); 
      console.log('file close success'); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log('file close err:' + JSON.stringify(error)); 
      return; 
    } 
  } 
  
  async capturerStart(): Promise<void> { 
    if (!this.audioCapturer) { 
      console.log(`NormalCapturer,capturerStart:audioCapturer is null`); 
      return; 
    } 
  
    try { 
      await this.audioCapturer.start(); 
      // when start,init recordSec 
      this.recordSec = 0; 
      this.recordState = 'started'; 
      console.log('audioCapturer start ok'); 
      clearInterval(this.interval); 
      this.interval = setInterval(async () => { 
        if (this.recordSec >= TOTAL_SECOND) { 
          // over TOTAL_SECOND,need to stop auto 
          clearInterval(this.interval); 
          if (this.audioCapturer && this.audioCapturer.state === audio.AudioState.STATE_RUNNING) { 
            await this.capturerStop(); 
          } 
          return; 
        } 
        this.recordSec++; 
        this.showTime = this.getTimesBySecond(this.recordSec); 
  
      }, INTERVAL_TIME); 
      setTimeout(async () => { 
        await this.readCapturer(); 
      }, READ_TIME_OUT); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`NormalCapturer:audioCapturer start err=${JSON.stringify(error)}`); 
    } 
  } 
  
  async renderCreate(): Promise<void> { 
    try { 
      this.audioRenderer = await audio.createAudioRenderer(this.audioRendererOptions); 
      this.renderState = this.audioRenderer.state; 
      this.audioRenderer.on('stateChange', (state) => { 
        this.renderState = state; 
      }); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`createAudioRenderer err=${JSON.stringify(error)}`); 
    } 
  } 
  
  async renderStart(): Promise<void> { 
    if (!this.audioRenderer) { 
      return; 
    } 
    let bufferSize = 0; 
    try { 
      bufferSize = await this.audioRenderer.getBufferSize(); 
      await this.audioRenderer.start(); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`start err:${JSON.stringify(error)}`); 
    } 
  
    try { 
      let stat = await fs.stat(this.path); 
      let buf = new ArrayBuffer(bufferSize); 
      console.log(`audioRenderer write start..........`); 
      let startOffset = this.start; 
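      // Feed the PCM file to the renderer chunk by chunk; this.start is persisted so that 
      // playback can resume from the same offset after a pause. 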
      while (startOffset <= stat.size) { 
        if (this.audioRenderer.state === audio.AudioState.STATE_PAUSED) { 
          break; 
        } 
        // change tag,to stop 
        if (this.audioRenderer.state === audio.AudioState.STATE_STOPPED) { 
          break; 
        } 
        if (this.audioRenderer.state === audio.AudioState.STATE_RELEASED) { 
          return; 
        } 
        let options: Options = { 
          offset: startOffset, 
          length: bufferSize 
        }; 
        console.log('renderStart,options=' + JSON.stringify(options)); 
  
        let readLen = await fs.read(this.fd, buf, options); 
        // Only write the bytes actually read, so the final chunk is not padded with stale buffer data. 
        await this.audioRenderer.write(readLen < bufferSize ? buf.slice(0, readLen) : buf); 
        this.playSec = Math.round(startOffset / stat.size * this.recordSec); 
        startOffset = startOffset + bufferSize; 
        this.start = startOffset; 
      } 
      console.log(`audioRenderer write end..........`) 
      if (this.audioRenderer.state === audio.AudioState.STATE_RUNNING) { 
        this.start = 0; 
        await this.renderStop(); 
      } 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`write err:${JSON.stringify(error)}`); 
    } 
  } 
  
  async renderPause(): Promise<void> { 
    if (!this.audioRenderer) { 
      return; 
    } 
    try { 
      await this.audioRenderer.pause(); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`pause err:${JSON.stringify(error)}`); 
    } 
  } 
  
  async renderStop(): Promise<void> { 
    if (!this.audioRenderer) { 
      return; 
    } 
    try { 
      await this.audioRenderer.stop(); 
      this.start = 0; 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`stop err:${JSON.stringify(error)}`); 
    } 
  }
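
The excerpt calls readCapturer() and capturerStop(), which are not included above. The sketches below show one way they might be implemented, assuming blocking reads of bufferSize bytes that are appended to the opened PCM file; they are illustrations consistent with the rest of the excerpt, not the original sample code.

  async readCapturer(): Promise<void> { 
    if (!this.audioCapturer || this.fd <= 0) { 
      return; 
    } 
    try { 
      // Keep pulling PCM frames while the capturer is running and append them to the file. 
      while (this.audioCapturer.state === audio.AudioState.STATE_RUNNING) { 
        let buffer = await this.audioCapturer.read(this.bufferSize, true); 
        await fs.write(this.fd, buffer); 
      } 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`NormalCapturer:readCapturer err=${JSON.stringify(error)}`); 
    } 
  } 

  async capturerStop(): Promise<void> { 
    if (!this.audioCapturer) { 
      return; 
    } 
    try { 
      await this.audioCapturer.stop(); 
      this.recordState = 'stopped'; 
      clearInterval(this.interval); 
      console.log('audioCapturer stop ok'); 
    } catch (err) { 
      let error = err as BusinessError; 
      console.log(`NormalCapturer:audioCapturer stop err=${JSON.stringify(error)}`); 
    } 
  } 

The 'stopped' state value and the extra clearInterval call simply mirror the state handling already used in capturerStart(); adapt them to whatever state model the full component uses.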

Result

Version compatibility

1. This sample uses the Stage model and supports API version 10.

2. This sample must be built and run with DevEco Studio 3.1 Release or later.
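
One prerequisite the post does not mention: recording from the microphone requires the ohos.permission.MICROPHONE permission, which must be declared under requestPermissions in module.json5 and granted by the user at runtime. A minimal sketch of the runtime request follows; the function name and context parameter are illustrative.

import abilityAccessCtrl from '@ohos.abilityAccessCtrl';
import common from '@ohos.app.ability.common';

// Ask the user for the microphone permission before creating the AudioCapturer.
async function requestMicPermission(context: common.UIAbilityContext): Promise<boolean> {
  let atManager = abilityAccessCtrl.createAtManager();
  let result = await atManager.requestPermissionsFromUser(context, ['ohos.permission.MICROPHONE']);
  // An authResults entry of 0 means the corresponding permission was granted.
  return result.authResults.every((status: number) => status === 0);
}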
