# Supported Operator List

Weixin Mini Program AI inference is responsible for running your model in an optimized manner, leveraging hardware acceleration on specific devices when available. This page lists which operators (Ops) are currently supported on each backend.

Note:

1. Unless otherwise noted in the Remarks column, the operators below follow the ONNX operator definitions; see https://github.com/onnx/onnx/blob/main/docs/Operators.md

2. GPU inference is not yet publicly available; stay tuned.
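For context, the sketch below shows how a model containing these operators might be loaded and run from a Mini Program. It assumes the `wx.createInferenceSession` inference session API; the option names, model path, and input tensor layout shown here are illustrative assumptions, so consult the inference session API reference for the exact fields.

```ts
// Minimal sketch: create an inference session and run a single input tensor.
// Option names and the tensor format are assumptions for illustration only.
declare const wx: any; // provided by the Mini Program runtime

const session = wx.createInferenceSession({
  model: `${wx.env.USER_DATA_PATH}/model.onnx`, // hypothetical model path
  allowNPU: true,       // assumed switch: use the NPU when the device supports it
  allowQuantize: false, // assumed switch: keep floating-point weights
});

session.onLoad(() => {
  // Input name, shape, and dtype depend on the exported model.
  const data = new Float32Array(1 * 3 * 224 * 224);
  session
    .run({ input: { shape: [1, 3, 224, 224], data: data.buffer, type: 'float32' } })
    .then((outputs: any) => console.log(outputs)); // map of output name -> tensor
});

session.onError((err: any) => console.error('inference session error', err));
```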

| Operator | CPU | iOS NPU | iOS GPU | Android GPU | Remarks |
| --- | --- | --- | --- | --- | --- |
| Activation | | | | | Refer to the Activation List below for the supported activation types |
| ArgMax | | | | | |
| ArgMin | | | | | |
| BatchNorm | | | | | |
| Bias | | | | | |
| Binary | | | | | See the Binary List below for the supported binary operations |
| Bucketize | | | | | |
| Cast | | | | | |
| Concat | | | | | |
| Const | | | | | |
| ConstOfShape | | | | | |
| Conv1D | | | | | |
| Conv1DTranspose | | | | | |
| Conv2D | | | | | |
| Conv2DTranspose | | | | | |
| Conv3D | | | | | |
| Conv3DTranspose | | | | | |
| Crop | | | | | |
| CropAndResize | | | | | |
| CumSum | | | | | |
| DepthToSpace | | | | | |
| Dropout | | | | | |
| ElementWise | | | | | |
| Expand | | | | | |
| FakeQuantize | | | | | |
| Flatten | | | | | |
| FullyConnected | | | | | |
| Gather | | | | | |
| GatherND | | | | | |
| Gemm | | | | | |
| GlobalPooling | | | | | |
| GroupNorm | | | | | |
| Gru | | | | | |
| InstanceNorm | | | | | |
| LayerNorm | | | | | |
| LpNorm | | | | | |
| Lrn | | | | | |
| Lstm | | | | | |
| MatMul | | | | | |
| NMS | | | | | |
| Normalize | | | | | |
| OneHot | | | | | |
| Pad | | | | | |
| Permute | | | | | |
| Pooling1D | | | | | |
| Pooling2D | | | | | |
| Pooling3D | | | | | |
| PriorBox | | | | | |
| Range | | | | | |
| Reduce | | | | | |
| Reshape | | | | | |
| Resize2D | | | | | |
| Rnn | | | | | |
| Scale | | | | | |
| ScatterND | | | | | |
| Shape | | | | | |
| ShuffleChannel | | | | | |
| SpaceToDepth | | | | | |
| Split | | | | | |
| Slice | | | | | |
| Softmax | | | | | |
| Squeeze | | | | | |
| Tile | | | | | |
| TopK | | | | | |
| Unary | | | | | See the Unary List below for the supported unary operations |
| Unsqueeze | | | | | |
| Where | | | | | |



# Activation List:

| Name | Description |
| --- | --- |
| None | f(x) = x |
| Abs | f(x) = \|x\| |
| Clip | f(x) = min(max(x, constA), constB) |
| HardSigmoid | f(x) = min(max(x * constA + constB, 0), 1) |
| HardSwish | f(x) = min(max(x * constA + constB, 0), 1) * x |
| HSigmoid | f(x) = ReLU6(x + 3) / 6 |
| HSwish | f(x) = (ReLU6(x + 3) / 6) * x |
| LeakyReLU | f(x) = min(x, 0) * constA + max(x, 0) |
| Linear | f(x) = x * constA + constB |
| PReLU | f(x) = min(x, 0) * weight + max(x, 0) (Caffe-style) |
| ReLU | f(x) = max(x, 0) |
| ReLUN | f(x) = min(x, 0) * constA + min(max(x, 0), constB) |
| SELU | f(x) = (x >= 0 ? x : (exp(x) - 1) * constA) * constB |
| Sigmoid | f(x) = 1 / (1 + exp(-x)), a.k.a. Logistic |
| SoftPlus | f(x) = log(1 + exp(x * constB)) * constA |
| SoftSign | f(x) = x / (1 + \|x\|) |
| Swish | f(x) = x / (1 + exp(-x * constA)) |
| Tanh | f(x) = tanh(x * constB) * constA |
| Threshold | f(x) = (x > constA ? 1 : 0) |
| ThrReLU | f(x) = (x > constA ? x : 0) (Thresholded ReLU) |
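To make the constA/constB convention concrete, here is a small illustrative sketch of a few of the parameterized activations above; the constant values in the example comment are arbitrary.

```ts
// Illustrative reference implementations of a few activations from the table
// above; constA/constB correspond to the operator's constant attributes.
function leakyReLU(x: number, constA: number): number {
  return Math.min(x, 0) * constA + Math.max(x, 0);
}

function hardSigmoid(x: number, constA: number, constB: number): number {
  return Math.min(Math.max(x * constA + constB, 0), 1);
}

function swish(x: number, constA: number): number {
  return x / (1 + Math.exp(-x * constA));
}

// Example values: leakyReLU(-2, 0.1) === -0.2, hardSigmoid(0, 0.2, 0.5) === 0.5.
```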



# Binary List:

| Name | Description |
| --- | --- |
| Add | f(x, y) = x + y |
| Sub | f(x, y) = x - y |
| Mul | f(x, y) = x * y |
| Div | f(x, y) = x / y |
| Pow | f(x, y) = pow(x, y) |
| Max | f(x, y) = max(x, y) |
| Min | f(x, y) = min(x, y) |
| Mean | f(x, y) = (x + y) / 2 |
| And | f(x, y) = x & y |
| Or | f(x, y) = x \| y |
| Xor | f(x, y) = x ^ y |
| BitShiftLeft | f(x, y) = x << y |
| BitShiftRight | f(x, y) = x >> y |
| Equal | f(x, y) = (x == y) |
| NotEqual | f(x, y) = (x != y) |
| Greater | f(x, y) = (x > y) |
| GreaterEqual | f(x, y) = (x >= y) |
| Less | f(x, y) = (x < y) |
| LessEqual | f(x, y) = (x <= y) |
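Following the ONNX definitions referenced in the notes above, the arithmetic and bit operations produce numeric results while the comparison operations produce boolean results. A small elementwise sketch over plain arrays (broadcasting is not shown):

```ts
// Apply a scalar binary op elementwise to two equally shaped arrays,
// mirroring the per-element formulas in the table above.
function map2<T>(x: number[], y: number[], f: (a: number, b: number) => T): T[] {
  return x.map((xi, i) => f(xi, y[i]));
}

map2([1, 2, 3], [4, 5, 6], (a, b) => (a + b) / 2); // Mean         -> [2.5, 3.5, 4.5]
map2([1, 2, 3], [1, 1, 2], (a, b) => a << b);      // BitShiftLeft -> [2, 4, 12]
map2([1, 2, 3], [3, 2, 1], (a, b) => a > b);       // Greater      -> [false, false, true]
```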



# Unary List:

| Name | Description |
| --- | --- |
| Abs | f(x) = \|x\| |
| Neg | f(x) = -x |
| Ceil | f(x) = ceil(x) |
| Floor | f(x) = floor(x) |
| Reciprocal | f(x) = 1 / x |
| Sqrt | f(x) = sqrt(x) |
| Exp | f(x) = exp(x) |
| Log | f(x) = log(x) |
| Erf | f(x) = erf(x) |
| Acos | f(x) = acos(x) |
| Acosh | f(x) = acosh(x) |
| Cos | f(x) = cos(x) |
| Cosh | f(x) = cosh(x) |
| Sin | f(x) = sin(x) |
| Sinh | f(x) = sinh(x) |
| Atan | f(x) = atan(x) |
| Atanh | f(x) = atanh(x) |
| Tan | f(x) = tan(x) |
| Tanh | f(x) = tanh(x) |
| ExpM1 | f(x) = expm1(x) |
| Log1P | f(x) = log1p(x) |