# Operator Support List
Mini Program AI inference runs your model in the most optimized way and, when available, leverages device-specific hardware acceleration. This page describes which operators (Ops) are currently supported on each device.
Notes:
1. Unless stated otherwise in the remarks, the operators below follow the ONNX operator definitions; see https://github.com/onnx/onnx/blob/main/docs/Operators.md
2. GPU inference is not yet publicly available; stay tuned.
Operator | CPU | iOS NPU | iOS GPU | Android GPU | Notes |
---|---|---|---|---|---|
Activation | ✔ | ✔ | ✔ | ✔ | See the Activation list below for the supported activation types |
ArgMax | ✔ | ✔ | ✔ | ||
ArgMin | ✔ | ✔ | ✔ | ||
BatchNorm | ✔ | ✔ | ✔ | ✔ | |
Bias | ✔ | ✔ | ✔ | ✔ | |
Binary | ✔ | ✔ | ✔ | ✔ | See the Binary list below for the supported binary operations |
Bucketize | ✔ | ||||
Cast | ✔ | ✔ | |||
Concat | ✔ | ✔ | ✔ | ✔ | |
Const | ✔ | ✔ | |||
ConstOfShape | ✔ | ||||
Conv1D | ✔ | ✔ | ✔ | ||
Conv1DTranspose | ✔ | ✔ | |||
Conv2D | ✔ | ✔ | ✔ | ✔ | |
Conv2DTranspose | ✔ | ✔ | ✔ | ✔ | |
Conv3D | ✔ | ||||
Conv3DTranspose | ✔ | ||||
Crop | ✔ | ||||
CropAndResize | ✔ | ||||
CumSum | ✔ | ||||
DepthToSpace | ✔ | ✔ | |||
Dropout | ✔ | ||||
ElementWise | ✔ | ✔ | ✔ | ✔ | |
Expand | ✔ | ✔ | |||
FakeQuantize | ✔ | ||||
Flatten | ✔ | ✔ | |||
FullyConnected | ✔ | ✔ | ✔ | ✔ | |
Gather | ✔ | ✔ | |||
GatherND | ✔ | ||||
Gemm | ✔ | ✔ | ✔ | ||
GlobalPooling | ✔ | ✔ | ✔ | ✔ | |
GroupNorm | ✔ | ||||
Gru | ✔ | ||||
InstanceNorm | ✔ | ✔ | ✔ | ||
LayerNorm | ✔ | ||||
LpNorm | ✔ | ||||
Lrn | ✔ | ||||
Lstm | ✔ | ||||
MatMul | ✔ | ✔ | |||
Normalize | ✔ | ✔ | ✔ | ||
OneHot | ✔ | ✔ | |||
Pad | ✔ | ✔ | ✔ | ||
Permute | ✔ | ✔ | ✔ | ||
Pooling1D | ✔ | ✔ | ✔ | ||
Pooling2D | ✔ | ✔ | ✔ | ✔ | |
Pooling3D | ✔ | ||||
PriorBox | ✔ | ||||
Range | ✔ | ||||
Reduce | ✔ | ✔ | ✔ | ✔ | |
Reshape | ✔ | ✔ | |||
Resize2D | ✔ | ✔ | ✔ | ✔ | |
Rnn | ✔ | ||||
Scale | ✔ | ✔ | ✔ | ||
ScatterND | ✔ | ✔ | |||
Shape | ✔ | ||||
ShuffleChannel | ✔ | ||||
SpaceToDepth | ✔ | ✔ | |||
Split | ✔ | ✔ | |||
Slice | ✔ | ✔ | ✔ | ||
Softmax | ✔ | ✔ | ✔ | ✔ | |
Squeeze | ✔ | ✔ | ✔ | ✔ | |
Tile | ✔ | ✔ | |||
Unary | ✔ | ✔ | ✔ | ✔ | See the Unary list below for the supported unary operations |
Unsqueeze | ✔ | ✔ | ✔ | ✔ | |
Where | ✔ | ||||
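Before converting a model, it can be handy to check its op types against the table above. The following sketch is purely illustrative (the function name and usage are hypothetical, not part of any official API); the set mirrors the CPU column, and the other backend columns can be encoded the same way:

```python
# Hypothetical helper: check a model's op types against the support table above.
# CPU_OPS mirrors the "CPU" column of the operator table.
CPU_OPS = {
    "Activation", "ArgMax", "ArgMin", "BatchNorm", "Bias", "Binary",
    "Bucketize", "Cast", "Concat", "Const", "ConstOfShape", "Conv1D",
    "Conv1DTranspose", "Conv2D", "Conv2DTranspose", "Conv3D",
    "Conv3DTranspose", "Crop", "CropAndResize", "CumSum", "DepthToSpace",
    "Dropout", "ElementWise", "Expand", "FakeQuantize", "Flatten",
    "FullyConnected", "Gather", "GatherND", "Gemm", "GlobalPooling",
    "GroupNorm", "Gru", "InstanceNorm", "LayerNorm", "LpNorm", "Lrn",
    "Lstm", "MatMul", "Normalize", "OneHot", "Pad", "Permute",
    "Pooling1D", "Pooling2D", "Pooling3D", "PriorBox", "Range", "Reduce",
    "Reshape", "Resize2D", "Rnn", "Scale", "ScatterND", "Shape",
    "ShuffleChannel", "SpaceToDepth", "Split", "Slice", "Softmax",
    "Squeeze", "Tile", "Unary", "Unsqueeze", "Where",
}

def unsupported_ops(model_op_types, backend_ops=CPU_OPS):
    """Return the op types in `model_op_types` missing from `backend_ops`."""
    return sorted(set(model_op_types) - backend_ops)

# Example: a model using Conv2D, Softmax and a (hypothetical) custom op.
print(unsupported_ops(["Conv2D", "Softmax", "MyCustomOp"]))  # ['MyCustomOp']
```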
# Activation List
Name | Description |
---|---|
None | f(x) = x |
Abs | f(x) = abs(x) |
Clip | f(x) = min(max(x, constA), constB) |
HardSigmoid | f(x) = min(max(x * constA + constB, 0), 1) |
HardSwish | f(x) = min(max(x * constA + constB, 0), 1) * x |
HSigmoid | f(x) = (ReLU6(x + 3) / 6) |
HSwish | f(x) = (ReLU6(x + 3) / 6) * x |
LeakyReLU | f(x) = min(x, 0) * constA + max(x, 0) |
Linear | f(x) = x * constA + constB |
PReLU | f(x) = min(x, 0) * weight + max(x, 0) (as in Caffe) |
ReLU | f(x) = max(x, 0) |
ReLUN | f(x) = min(x, 0) * constA + min(max(x, 0), constB) |
SELU | f(x) = (x >= 0 ? x : (exp(x)-1) * constA) * constB |
Sigmoid | f(x) = 1 / (1 + exp(-x)), a.k.a. Logistic |
SoftPlus | f(x) = log(1 + exp(x * constB)) * constA |
SoftSign | f(x) = x / (1 + abs(x)) |
Swish | f(x) = x / (1 + exp(-x * constA)) |
Tanh | f(x) = tanh(x * constB) * constA |
Threshold | f(x) = (x > constA ? 1 : 0) |
ThrReLU | f(x) = (x > constA ? x : 0) (Thresholded ReLU) |
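The formulas above translate directly to code. Below is a minimal Python sketch of a few entries, with `constA`/`constB` passed explicitly; it is an illustration of the listed definitions, not the engine's actual implementation:

```python
import math

# Reference implementations of a few activations from the table above.

def hard_sigmoid(x, const_a, const_b):
    # f(x) = min(max(x * constA + constB, 0), 1)
    return min(max(x * const_a + const_b, 0.0), 1.0)

def hswish(x):
    # f(x) = (ReLU6(x + 3) / 6) * x, where ReLU6(v) = min(max(v, 0), 6)
    relu6 = min(max(x + 3.0, 0.0), 6.0)
    return relu6 / 6.0 * x

def selu(x, const_a, const_b):
    # f(x) = (x >= 0 ? x : (exp(x) - 1) * constA) * constB
    return (x if x >= 0 else (math.exp(x) - 1.0) * const_a) * const_b

print(hswish(3.0))  # 3.0, since ReLU6(6) / 6 * 3 = 3
```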
# Binary List
Name | Description |
---|---|
Add | f(x, y) = x + y |
Sub | f(x, y) = x - y |
Mul | f(x, y) = x * y |
Div | f(x, y) = x / y |
Pow | f(x, y) = pow(x, y) |
Max | f(x, y) = max(x, y) |
Min | f(x, y) = min(x, y) |
Mean | f(x, y) = (x + y) / 2 |
And | f(x, y) = x & y |
Or | f(x, y) = x \| y |
Xor | f(x, y) = x ^ y |
BitShiftLeft | f(x, y) = x << y |
BitShiftRight | f(x, y) = x >> y |
Equal | f(x, y) = (x == y) |
NotEqual | f(x, y) = (x != y) |
Greater | f(x, y) = (x > y) |
GreaterEqual | f(x, y) = (x >= y) |
Less | f(x, y) = (x < y) |
LessEqual | f(x, y) = (x <= y) |
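Two details worth noting: Mean here is a two-input elementwise average (not a reduction over a tensor), and the comparison ops produce a boolean result per element. A small scalar sketch (function names are illustrative only):

```python
# Scalar sketch of two binary ops from the table above.

def binary_mean(x, y):
    # f(x, y) = (x + y) / 2 -- a two-input average, not a reduction
    return (x + y) / 2

def binary_greater(x, y):
    # Comparison ops yield a boolean per element
    return x > y

print(binary_mean(2, 4))     # 3.0
print(binary_greater(3, 1))  # True
```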
# Unary List
Name | Description |
---|---|
Abs | f(x) = abs(x) |
Neg | f(x) = -x |
Ceil | f(x) = ceil(x) |
Floor | f(x) = floor(x) |
Reciprocal | f(x) = 1 / x |
Sqrt | f(x) = sqrt(x) |
Exp | f(x) = exp(x) |
Log | f(x) = log(x) |
Erf | f(x) = erf(x) |
Acos | f(x) = acos(x) |
Acosh | f(x) = acosh(x) |
Cos | f(x) = cos(x) |
Cosh | f(x) = cosh(x) |
Sin | f(x) = sin(x) |
Sinh | f(x) = sinh(x) |
Atan | f(x) = atan(x) |
Atanh | f(x) = atanh(x) |
Tan | f(x) = tan(x) |
Tanh | f(x) = tanh(x) |
ExpM1 | f(x) = expm1(x) |
Log1P | f(x) = log1p(x) |
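ExpM1 and Log1P exist alongside Exp and Log because they stay accurate for inputs near zero, where computing `exp(x) - 1` or `log(1 + x)` naively loses precision to floating-point cancellation. A quick illustration with Python's standard `math` module:

```python
import math

x = 1e-17
# The naive form rounds exp(1e-17) to 1.0, so the difference collapses to 0.0...
print(math.exp(x) - 1.0)   # 0.0 (catastrophic cancellation)
# ...while the dedicated functions keep the leading term.
print(math.expm1(x))       # ~1e-17
print(math.log1p(x))       # ~1e-17
```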