# WebAudioContext

Supported since base library version 2.19.0. Lower versions require compatibility handling.

WebAudioContext instance, obtained via the wx.createWebAudioContext interface.

# Attributes

# string state

The state of the current WebAudio context. Possible values: suspended, running, closed. Note that you should not access the state property after the audioContext has been closed.

# function onstatechange

A writable attribute. Developers can set a listener function on it; when the Web Audio state changes, the listener is triggered.

# number currentTime

Gets the timestamp of the current context.

# WebAudioContextNode destination

The final destination node of the current context, typically the audio rendering device.

# AudioListener listener

The spatial audio listener.

# number sampleRate

The sample rate of the context. Sample rates typically range from 8000 to 96000 Hz; 44100 Hz is the most common.

# Methods

# Promise WebAudioContext.close()

Close the WebAudioContext.

# Promise WebAudioContext.resume()

Resume a suspended WebAudioContext context.

# Promise WebAudioContext.suspend()

Suspend the WebAudioContext context.

# IIRFilterNode WebAudioContext.createIIRFilter(Array.<number> feedforward, Array.<number> feedback)

Create an IIRFilterNode
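The feedforward and feedback arrays hold the filter's numerator and denominator coefficients. As an illustrative sketch, a one-pole low-pass filter could be wired up like this (the filter design and the helper names `onePoleLowpass` / `applyLowpass` are assumptions, not part of this API):

```javascript
// y[n] = (1 - k) * x[n] + k * y[n - 1]
// => feedforward (numerator) = [1 - k], feedback (denominator) = [1, -k]
function onePoleLowpass(k) {
  return { feedforward: [1 - k], feedback: [1, -k] }
}

// Mini-Program-only usage (requires the WeChat runtime; not runnable elsewhere):
function applyLowpass(ctx, sourceNode) {
  const { feedforward, feedback } = onePoleLowpass(0.9)
  const filter = ctx.createIIRFilter(feedforward, feedback)
  sourceNode.connect(filter)
  filter.connect(ctx.destination)
  return filter
}
```

Larger k values push the cutoff lower; k must stay below 1 for the filter to be stable.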

# WaveShaperNode WebAudioContext.createWaveShaper()

Create a WaveShaperNode

# ConstantSourceNode WebAudioContext.createConstantSource()

Create a ConstantSourceNode

# OscillatorNode WebAudioContext.createOscillator()

Create an OscillatorNode
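A minimal sketch of playing a short beep with an oscillator, assuming the standard Web Audio node interface (`type`, `frequency`, `start`, `stop`); `midiToFreq` and `playBeep` are hypothetical helper names:

```javascript
// Standard MIDI-note-to-frequency conversion (A4 = MIDI note 69 = 440 Hz).
function midiToFreq(note) {
  return 440 * Math.pow(2, (note - 69) / 12)
}

// Mini-Program-only usage (requires the WeChat runtime; not runnable elsewhere):
function playBeep(note = 69, seconds = 0.3) {
  const ctx = wx.createWebAudioContext()
  const osc = ctx.createOscillator()
  osc.type = 'sine'                       // sine | square | sawtooth | triangle
  osc.frequency.value = midiToFreq(note)
  osc.connect(ctx.destination)
  osc.start()
  osc.stop(ctx.currentTime + seconds)
}
```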

# GainNode WebAudioContext.createGain()

Create a GainNode

# PeriodicWaveNode WebAudioContext.createPeriodicWave(Float32Array real, Float32Array imag, object constraints)

Create a PeriodicWaveNode
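The real and imag arrays are the cosine and sine Fourier coefficients of the custom waveform. A sketch building a band-limited square wave (the helper name is an assumption; the constraints object follows the standard Web Audio PeriodicWaveConstraints shape, which may differ here):

```javascript
// A square wave contains only odd harmonics, with amplitude 4 / (k * PI)
// in the sine (imag) part; the cosine (real) part and DC term are zero.
function squareWaveCoefficients(harmonics) {
  const real = new Float32Array(harmonics + 1)
  const imag = new Float32Array(harmonics + 1)
  for (let k = 1; k <= harmonics; k += 2) {
    imag[k] = 4 / (k * Math.PI)
  }
  return { real, imag }
}

// Mini-Program-only usage (requires the WeChat runtime; not runnable elsewhere).
// Assumes the oscillator supports the standard setPeriodicWave method.
function playSquare(ctx) {
  const { real, imag } = squareWaveCoefficients(15)
  const wave = ctx.createPeriodicWave(real, imag, { disableNormalization: false })
  const osc = ctx.createOscillator()
  osc.setPeriodicWave(wave)
  osc.connect(ctx.destination)
  osc.start()
}
```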

# BiquadFilterNode WebAudioContext.createBiquadFilter()

Create a BiquadFilterNode

# BufferSourceNode WebAudioContext.createBufferSource()

Create a BufferSourceNode instance to play the audio data through the AudioBuffer object.

# ChannelMergerNode WebAudioContext.createChannelMerger(number numberOfInputs)

Create a ChannelMergerNode

# ChannelSplitterNode WebAudioContext.createChannelSplitter(number numberOfOutputs)

Create a ChannelSplitterNode

# DelayNode WebAudioContext.createDelay(number maxDelayTime)

Create a DelayNode
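A common use of a DelayNode is a feedback echo, where a GainNode feeds the delayed signal back into the delay. The sketch below assumes the standard Web Audio node interfaces (`delayTime`, `gain` AudioParams); `feedbackGain` and `buildEcho` are hypothetical helper names:

```javascript
// Feedback gain so the echo decays by 60 dB after rt60 seconds, given that
// each pass through the loop takes delayTime seconds: g = 10^(-3 * d / rt60).
function feedbackGain(delayTime, rt60) {
  return Math.pow(10, (-3 * delayTime) / rt60)
}

// Mini-Program-only usage (requires the WeChat runtime; not runnable elsewhere):
function buildEcho(ctx, input, delayTime = 0.25, rt60 = 2) {
  const delay = ctx.createDelay(1)        // maxDelayTime of 1 second
  delay.delayTime.value = delayTime
  const feedback = ctx.createGain()
  feedback.gain.value = feedbackGain(delayTime, rt60)
  input.connect(delay)
  delay.connect(feedback)
  feedback.connect(delay)                 // the feedback loop
  delay.connect(ctx.destination)
}
```

The feedback gain must stay below 1, which the formula guarantees for positive delayTime and rt60.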

# DynamicsCompressorNode WebAudioContext.createDynamicsCompressor()

Create a DynamicsCompressorNode

# ScriptProcessorNode WebAudioContext.createScriptProcessor(number bufferSize, number numberOfInputChannels, number numberOfOutputChannels)

Create a ScriptProcessorNode
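A sketch of generating white noise in a ScriptProcessorNode, assuming the standard Web Audio `onaudioprocess` event with an `outputBuffer`; `fillNoise` and `startNoise` are hypothetical helper names. (In the standard Web Audio spec this node type is deprecated, so treat this as illustrative.)

```javascript
// Fill a Float32Array with uniform white noise in [-1, 1].
function fillNoise(samples) {
  for (let i = 0; i < samples.length; i++) {
    samples[i] = Math.random() * 2 - 1
  }
  return samples
}

// Mini-Program-only usage (requires the WeChat runtime; not runnable elsewhere):
function startNoise(ctx) {
  // bufferSize 4096, 1 input channel, 1 output channel
  const node = ctx.createScriptProcessor(4096, 1, 1)
  node.onaudioprocess = (event) => {
    fillNoise(event.outputBuffer.getChannelData(0))
  }
  node.connect(ctx.destination)
}
```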

# PannerNode WebAudioContext.createPanner()

Create a PannerNode

# AudioBuffer WebAudioContext.createBuffer(number numOfChannels, number length, number sampleRate)

Create an AudioBuffer that represents a short audio segment that resides in memory
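A sketch tying createBuffer and createBufferSource together to play a synthesized tone. The sample math is plain JS; `sineSamples` and `playTone` are hypothetical helper names, and the buffer-filling step assumes the standard `getChannelData` method:

```javascript
// One channel of PCM samples for a sine tone at the given frequency.
function sineSamples(freq, sampleRate, length) {
  const out = new Float32Array(length)
  for (let i = 0; i < length; i++) {
    out[i] = Math.sin(2 * Math.PI * freq * (i / sampleRate))
  }
  return out
}

// Mini-Program-only usage (requires the WeChat runtime; not runnable elsewhere):
function playTone(freq = 440, seconds = 1) {
  const ctx = wx.createWebAudioContext()
  const length = Math.round(seconds * ctx.sampleRate)
  const buffer = ctx.createBuffer(1, length, ctx.sampleRate)
  buffer.getChannelData(0).set(sineSamples(freq, ctx.sampleRate, length))
  const source = ctx.createBufferSource()
  source.buffer = buffer
  source.connect(ctx.destination)
  source.start()
}
```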

# AudioBuffer WebAudioContext.decodeAudioData()

Asynchronously decodes a resource as an AudioBuffer.

# Sample code

```js
// Listen for state changes
const audioCtx = wx.createWebAudioContext()
audioCtx.onstatechange = () => {
  console.log(audioCtx.state)
}
setTimeout(() => audioCtx.suspend(), 1000)
setTimeout(() => audioCtx.resume(), 2000)
```