# Operator Support List

The Mini Program's AI inference engine runs your model in an optimized way and uses the hardware acceleration of the specific device when available. This page lists which operators are currently supported on each backend.

Notes:
1. Unless otherwise noted in the Remarks column, the operators below follow the ONNX operator definitions. Reference: https://github.com/onnx/onnx/blob/Main/docs/Operators.md
2. GPU inference is not yet available externally; stay tuned.
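
For context, here is a minimal sketch of loading a model and opting in to hardware acceleration from a Mini Program. It assumes the `wx.createInferenceSession` API; option names such as `precisionLevel` and `allowNPU`, the model path, and the tensor layout shown here are illustrative and may differ across base library versions. Whether a model can actually run on the NPU or GPU backend depends on the operator coverage listed below.

```ts
// Minimal sketch, not a definitive implementation.
// Assumes the Mini Program inference API (wx.createInferenceSession);
// option names and the tensor layout below are illustrative assumptions.
declare const wx: any; // provided by the Mini Program runtime

const session = wx.createInferenceSession({
  model: `${wx.env.USER_DATA_PATH}/model.onnx`, // hypothetical model path
  precisionLevel: 4,   // assumed: highest precision level
  allowNPU: true,      // assumed: allow the iOS NPU backend when available
  allowQuantize: false,
});

session.onError((err: unknown) => console.error('session error', err));

session.onLoad(async () => {
  // Hypothetical single input named "input" with shape [1, 3, 224, 224].
  const data = new Float32Array(1 * 3 * 224 * 224);
  const output = await session.run({
    input: { shape: [1, 3, 224, 224], data: data.buffer, type: 'float32' },
  });
  console.log(output);
  session.destroy();
});
```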

| Operator | CPU | iOS NPU | iOS GPU | Android GPU | Remarks |
|---|---|---|---|---|---|
| Activation | ✔ | ✔ | ✔ | ✔ | See the Activation list below for the supported activation types |
| ArgMax | ✔ | ✔ | ✔ | | |
| ArgMin | ✔ | ✔ | ✔ | | |
| BatchNorm | ✔ | ✔ | ✔ | ✔ | |
| Bias | ✔ | ✔ | ✔ | ✔ | |
| Binary | ✔ | ✔ | ✔ | ✔ | See the Binary list below for the supported binary operations |
| Bucketize | ✔ | | | | |
| Cast | ✔ | ✔ | | | |
| Concat | ✔ | ✔ | ✔ | ✔ | |
| Const | ✔ | ✔ | | | |
| ConstOfShape | ✔ | | | | |
| Conv1D | ✔ | ✔ | ✔ | | |
| Conv1DTranspose | ✔ | ✔ | | | |
| Conv2D | ✔ | ✔ | ✔ | ✔ | |
| Conv2DTranspose | ✔ | ✔ | ✔ | ✔ | |
| Conv3D | ✔ | | | | |
| Conv3DTranspose | ✔ | | | | |
| Crop | ✔ | | | | |
| CropAndResize | ✔ | | | | |
| CumSum | ✔ | | | | |
| DepthToSpace | ✔ | ✔ | | | |
| Dropout | ✔ | | | | |
| ElementWise | ✔ | ✔ | ✔ | ✔ | |
| Expand | ✔ | ✔ | | | |
| FakeQuantize | ✔ | | | | |
| Flatten | ✔ | ✔ | | | |
| FullyConnected | ✔ | ✔ | ✔ | ✔ | |
| Gather | ✔ | ✔ | | | |
| GatherND | ✔ | | | | |
| Gemm | ✔ | ✔ | ✔ | | |
| GlobalPooling | ✔ | ✔ | ✔ | ✔ | |
| GroupNorm | ✔ | | | | |
| Gru | ✔ | | | | |
| InstanceNorm | ✔ | ✔ | ✔ | | |
| LayerNorm | ✔ | | | | |
| LpNorm | ✔ | | | | |
| Lrn | ✔ | | | | |
| Lstm | ✔ | | | | |
| MatMul | ✔ | ✔ | | | |
| Normalize | ✔ | ✔ | ✔ | | |
| OneHot | ✔ | ✔ | | | |
| Pad | ✔ | ✔ | ✔ | | |
| Permute | ✔ | ✔ | ✔ | | |
| Pooling1D | ✔ | ✔ | ✔ | | |
| Pooling2D | ✔ | ✔ | ✔ | ✔ | |
| Pooling3D | ✔ | | | | |
| PriorBox | ✔ | | | | |
| Range | ✔ | | | | |
| Reduce | ✔ | ✔ | ✔ | ✔ | |
| Reshape | ✔ | ✔ | | | |
| Resize2D | ✔ | ✔ | ✔ | ✔ | |
| Rnn | ✔ | | | | |
| Scale | ✔ | ✔ | ✔ | | |
| ScatterND | ✔ | ✔ | | | |
| Shape | ✔ | | | | |
| ShuffleChannel | ✔ | | | | |
| SpaceToDepth | ✔ | ✔ | | | |
| Split | ✔ | ✔ | | | |
| Slice | ✔ | ✔ | ✔ | | |
| Softmax | ✔ | ✔ | ✔ | ✔ | |
| Squeeze | ✔ | ✔ | ✔ | ✔ | |
| Tile | ✔ | ✔ | | | |
| Unary | ✔ | ✔ | ✔ | ✔ | See the Unary list below for the supported unary types |
| Unsqueeze | ✔ | ✔ | ✔ | ✔ | |
| Where | ✔ | | | | |


# Activation List

| Name | Definition |
|---|---|
| None | f(x) = x |
| Abs | f(x) = \|x\| |
| Clip | f(x) = min(max(x, constA), constB) |
| HardSigmoid | f(x) = min(max(x * constA + constB, 0), 1) |
| HardSwish | f(x) = min(max(x * constA + constB, 0), 1) * x |
| HSigmoid | f(x) = ReLU6(x + 3) / 6 |
| HSwish | f(x) = (ReLU6(x + 3) / 6) * x |
| LeakyReLU | f(x) = min(x, 0) * constA + max(x, 0) |
| Linear | f(x) = x * constA + constB |
| PReLU | f(x) = min(x, 0) * weight + max(x, 0) (Caffe1-style PReLU) |
| ReLU | f(x) = max(x, 0) |
| ReLU6 | f(x) = min(x, 0) * constA + min(max(x, 0), constB) |
| SELU | f(x) = (x >= 0 ? x : (exp(x) - 1) * constA) * constB |
| Sigmoid | f(x) = 1 / (1 + exp(-x)), a.k.a. Logistic |
| SoftPlus | f(x) = log(1 + exp(x * constB)) * constA |
| SoftSign | f(x) = x / (1 + \|x\|) |
| Swish | f(x) = x / (1 + exp(-x * constA)) |
| Tanh | f(x) = tanh(x * constB) * constA |
| Threshold | f(x) = (x > constA ? 1 : 0) |
| ThrReLU | f(x) = (x > constA ? x : 0) (thresholded ReLU) |
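
To make the constA/constB convention concrete, here is a small reference sketch of a few of the parameterized activations above (plain TypeScript, independent of the inference engine). The constants used in the usage examples are the commonly published ones, not engine defaults.

```ts
// Reference sketch of a few parameterized activations from the table above.
// constA/constB correspond to the operator attributes.

const relu6 = (x: number, constA = 0, constB = 6): number =>
  Math.min(x, 0) * constA + Math.min(Math.max(x, 0), constB);

const hardSwish = (x: number, constA: number, constB: number): number =>
  Math.min(Math.max(x * constA + constB, 0), 1) * x;

const selu = (x: number, constA: number, constB: number): number =>
  (x >= 0 ? x : (Math.exp(x) - 1) * constA) * constB;

const swish = (x: number, constA = 1): number =>
  x / (1 + Math.exp(-x * constA));

// Usage examples:
console.log(relu6(8));                    // 6 (positive side clipped at constB = 6)
console.log(hardSwish(2.0, 1 / 6, 0.5));  // ≈ 1.667 (constA = 1/6, constB = 0.5 gives the usual HardSwish)
console.log(selu(-1, 1.67326, 1.0507));   // ≈ -1.111 (standard SELU alpha and lambda)
console.log(swish(1.0));                  // ≈ 0.731 (constA = 1 gives x * sigmoid(x))
```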


# Binary List

| Name | Definition |
|---|---|
| Add | f(x, y) = x + y |
| Sub | f(x, y) = x - y |
| Mul | f(x, y) = x * y |
| Div | f(x, y) = x / y |
| Pow | f(x, y) = pow(x, y) |
| Max | f(x, y) = max(x, y) |
| Min | f(x, y) = min(x, y) |
| Mean | f(x, y) = (x + y) / 2 |
| And | f(x, y) = x & y |
| Or | f(x, y) = x \| y |
| Xor | f(x, y) = x ^ y |
| BitShiftLeft | f(x, y) = x << y |
| BitShiftRight | f(x, y) = x >> y |
| Equal | f(x, y) = (x == y) |
| NotEqual | f(x, y) = (x != y) |
| Greater | f(x, y) = (x > y) |
| GreaterEqual | f(x, y) = (x >= y) |
| Less | f(x, y) = (x < y) |
| LessEqual | f(x, y) = (x <= y) |


# Unary List

| Name | Definition |
|---|---|
| Abs | f(x) = \|x\| |
| Neg | f(x) = -x |
| Ceil | f(x) = ceil(x) |
| Floor | f(x) = floor(x) |
| Reciprocal | f(x) = 1 / x |
| Sqrt | f(x) = sqrt(x) |
| Exp | f(x) = exp(x) |
| Log | f(x) = log(x) |
| Erf | f(x) = erf(x) |
| Acos | f(x) = acos(x) |
| Acosh | f(x) = acosh(x) |
| Cos | f(x) = cos(x) |
| Cosh | f(x) = cosh(x) |
| Sin | f(x) = sin(x) |
| Sinh | f(x) = sinh(x) |
| Atan | f(x) = atan(x) |
| Atanh | f(x) = atanh(x) |
| Tan | f(x) = tan(x) |
| Tanh | f(x) = tanh(x) |
| ExpM1 | f(x) = expm1(x) |
| Log1P | f(x) = log1p(x) |