Interface List

Operator Domains

Operator interfaces are grouped into the following domains:

  • aclnn_ops_infer: NN operator inference library
  • aclnn_ops_train: NN operator training library
  • aclnn_math: math operator library
  • aclnn_rand: random number operator library

The inference library depends on the math library; the training library depends on the inference, math, and random number libraries.
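
Whichever library an interface belongs to, it is invoked through the same two-phase convention: aclnnXxxGetWorkspaceSize first computes the required workspace size and returns an executor, and aclnnXxx then launches the operator on a stream. The sketch below illustrates this pattern with aclnnAbs from the table that follows. It is a minimal illustration under stated assumptions, not a complete program: device, context and stream setup, tensor creation and data transfer are omitted, the wrapper name run_abs is invented for this example, and the header path should be checked against the installed CANN version.

#include "acl/acl.h"
#include "aclnnop/aclnn_abs.h"   /* header name assumed; check the include directory of your CANN install */

/* Minimal sketch of the two-phase aclnn call pattern, using aclnnAbs as the example.
 * "self" and "out" are assumed to be valid device tensors created beforehand. */
int run_abs(aclrtStream stream, aclTensor *self, aclTensor *out)
{
    /* Phase 1: query the workspace size and obtain an executor for this call. */
    uint64_t workspaceSize = 0;
    aclOpExecutor *executor = NULL;
    int ret = aclnnAbsGetWorkspaceSize(self, out, &workspaceSize, &executor);
    if (ret != ACL_SUCCESS) {
        return ret;
    }

    /* Allocate device memory for the workspace if the operator needs one. */
    void *workspace = NULL;
    if (workspaceSize > 0) {
        ret = aclrtMalloc(&workspace, workspaceSize, ACL_MEM_MALLOC_HUGE_FIRST);
        if (ret != ACL_SUCCESS) {
            return ret;
        }
    }

    /* Phase 2: launch the operator on the stream, then wait for it to finish. */
    ret = aclnnAbs(workspace, workspaceSize, executor, stream);
    if (ret == ACL_SUCCESS) {
        ret = aclrtSynchronizeStream(stream);
    }

    if (workspace != NULL) {
        (void)aclrtFree(workspace);
    }
    return ret;
}

The same pattern applies to every aclnnXxx entry in the table below; only the parameter list of the corresponding GetWorkspaceSize call differs from operator to operator.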

The domain to which each operator interface belongs is shown in the table below.

aclnn API    aclnn_ops_infer    aclnn_ops_train    aclnn_math    aclnn_rand
aclnnAbs
aclnnAcos
aclnnAcosh
aclnnAdaptiveAvgPool2d
aclnnAdaptiveAvgPool2dBackward
aclnnAdaptiveAvgPool3d
aclnnAdaptiveAvgPool3dBackward
aclnnAdaptiveMaxPool2d
aclnnAdd
aclnnAddbmm
aclnnAddcdiv
aclnnAddcmul
aclnnAddmm
aclnnAddmv
aclnnAddr
aclnnAdds
aclnnAffineGrid
aclnnAll
aclnnAmax
aclnnAmin
aclnnAminmax
aclnnAminmaxAll
aclnnAminmaxDim
aclnnAny
aclnnApplyAdamWV2
aclnnArange
aclnnArgMax
aclnnArgMin
aclnnArgsort
aclnnAscendAntiQuant
aclnnAscendQuant
aclnnAscendQuantV3
aclnnAsin
aclnnAsinh
aclnnAtan
aclnnAtan2
aclnnAtanh
aclnnAvgPool2d
aclnnAvgPool2dBackward
aclnnAvgPool3d
aclnnAvgPool3dBackward
aclnnBackgroundReplace
aclnnBaddbmm
aclnnBatchMatMul
aclnnBatchMatMulQuant
aclnnBatchNorm
aclnnBatchNormBackward
aclnnBatchNormElemt
aclnnBatchNormElemtBackward
aclnnBatchNormGatherStatsWithCounts
aclnnBatchNormReduceBackward
aclnnBatchNormStats
aclnnBernoulli
aclnnBernoulliTensor
aclnnBidirectionLSTM
aclnnBinaryCrossEntropy
aclnnBinaryCrossEntropyBackward
aclnnBinaryCrossEntropyWithLogits
aclnnBinaryCrossEntropyWithLogitsBackward
aclnnBincount
aclnnBitwiseAndScalar
aclnnBitwiseAndTensor
aclnnBitwiseAndTensorOut
aclnnBitwiseNot
aclnnBitwiseOrScalar
aclnnBitwiseOrTensor
aclnnBitwiseXorScalar
aclnnBitwiseXorTensor
aclnnBlendImagesCustom
aclnnCast
aclnnCat
aclnnCeil
aclnnCelu
aclnnChannelShuffle
aclnnClamp
aclnnClampMax
aclnnClampMaxTensor
aclnnClampMin
aclnnClampMinTensor
aclnnClampTensor
aclnnComplex
aclnnConstantPadNd
aclnnConvDepthwise2d
aclnnConvertWeightToINT4Pack
aclnnConvolution
aclnnConvolutionBackward
aclnnConvTbc
aclnnConvTbcBackward
aclnnCos
aclnnCosh
aclnnCtcLoss
aclnnCtcLossBackward
aclnnCummax
aclnnCummin
aclnnCumsum
aclnnCumsumV2
aclnnDiag
aclnnDiagFlat
aclnnDigamma
aclnnDiv
aclnnDivMod
aclnnDivMods
aclnnDivs
aclnnDot
aclnnDropout
aclnnDropoutBackward
aclnnDropoutDoMask
aclnnDropoutGenMask
aclnnDropoutGenMaskV2
aclnnElu
aclnnEluBackward
aclnnEmbedding
aclnnEmbeddingBag
aclnnEmbeddingDenseBackward
aclnnEmbeddingRenorm
aclnnEqScalar
aclnnEqTensor
aclnnEqual
aclnnErf
aclnnErfc
aclnnErfinv
aclnnExp
aclnnExp2
aclnnExpand
aclnnExpm1
aclnnEye
aclnnFakeQuantPerChannelAffineCachemask
aclnnFakeQuantPerTensorAffineCachemask
aclnnFlatten
aclnnFlip
aclnnFloor
aclnnFloorDivide
aclnnFloorDivides
aclnnFmodScalar
aclnnFmodTensor
aclnnForeachAbs
aclnnForeachAcos
aclnnForeachAddcdivList
aclnnForeachAddcdivScalar
aclnnForeachAddcdivScalarList
aclnnForeachAddcdivScalarV2
aclnnForeachAddcmulList
aclnnForeachAddcmulScalar
aclnnForeachAddcmulScalarList
aclnnForeachAddcmulScalarV2
aclnnForeachAddList
aclnnForeachAddListV2
aclnnForeachAddScalar
aclnnForeachAddScalarList
aclnnForeachAddScalarV2
aclnnForeachAsin
aclnnForeachAtan
aclnnForeachCopy
aclnnForeachCos
aclnnForeachCosh
aclnnForeachDivList
aclnnForeachDivScalar
aclnnForeachDivScalarList
aclnnForeachDivScalarV2
aclnnForeachErf
aclnnForeachErfc
aclnnForeachExp
aclnnForeachExpm1
aclnnForeachLerpList
aclnnForeachLerpScalar
aclnnForeachLog
aclnnForeachLog1p
aclnnForeachLog2
aclnnForeachLog10
aclnnForeachMaximumList
aclnnForeachMaximumScalar
aclnnForeachMaximumScalarList
aclnnForeachMaximumScalarV2
aclnnForeachMinimumList
aclnnForeachMinimumScalar
aclnnForeachMinimumScalarList
aclnnForeachMinimumScalarV2
aclnnForeachMulList
aclnnForeachMulScalar
aclnnForeachMulScalarList
aclnnForeachMulScalarV2
aclnnForeachNeg
aclnnForeachNorm
aclnnForeachPowList
aclnnForeachPowScalar
aclnnForeachPowScalarAndTensor
aclnnForeachPowScalarList
aclnnForeachPowScalarV2
aclnnForeachReciprocal
aclnnForeachRoundOffNumber
aclnnForeachRoundOffNumberV2
aclnnForeachSigmoid
aclnnForeachSign
aclnnForeachSin
aclnnForeachSinh
aclnnForeachSqrt
aclnnForeachSubList
aclnnForeachSubListV2
aclnnForeachSubScalar
aclnnForeachSubScalarList
aclnnForeachSubScalarV2
aclnnForeachTan
aclnnForeachTanh
aclnnForeachZeroInplace
aclnnFrac
aclnnGather
aclnnGatherNd
aclnnGatherV2
aclnnGcd
aclnnGeGlu
aclnnGeGluBackward
aclnnGelu
aclnnGeluBackward
aclnnGeluBackwardV2
aclnnGemm
aclnnGer
aclnnGeScalar
aclnnGeTensor
aclnnGlobalAveragePool
aclnnGlu
aclnnGluBackward
aclnnGridSampler2D
aclnnGridSampler2DBackward
aclnnGridSampler3D
aclnnGridSampler3DBackward
aclnnGroupedBiasAddGrad
aclnnGroupedBiasAddGradV2
aclnnGroupNorm
aclnnGroupNormBackward
aclnnGroupNormSilu
aclnnGroupNormSiluV2
aclnnGroupQuant
aclnnGtScalar
aclnnGtTensor
aclnnHardshrink
aclnnHardshrinkBackward
aclnnHardsigmoid
aclnnHardsigmoidBackward
aclnnHardswish
aclnnHardswishBackward
aclnnHardtanh
aclnnHardtanhBackward
aclnnHistc
aclnnIm2col
aclnnIm2colBackward
aclnnIndex
aclnnIndexAdd
aclnnIndexCopy
aclnnIndexFillTensor
aclnnIndexPutImpl
aclnnIndexSelect
aclnnInplaceAcos
aclnnInplaceAcosh
aclnnInplaceAdd
aclnnInplaceAddbmm
aclnnInplaceAddcdiv
aclnnInplaceAddcmul
aclnnInplaceAddmm
aclnnInplaceAddr
aclnnInplaceAdds
aclnnInplaceAsin
aclnnInplaceAsinh
aclnnInplaceAtan
aclnnInplaceAtan2
aclnnInplaceAtanh
aclnnInplaceBaddbmm
aclnnInplaceBernoulli
aclnnInplaceBernoulliTensor
aclnnInplaceBitwiseAndScalar
aclnnInplaceBitwiseAndTensor
aclnnInplaceBitwiseOrScalar
aclnnInplaceBitwiseOrTensor
aclnnInplaceBitwiseXorScalar
aclnnInplaceBitwiseXorTensor
aclnnInplaceCeil
aclnnInplaceCelu
aclnnInplaceClampMax
aclnnInplaceClampMaxTensor
aclnnInplaceClampMinTensor
aclnnInplaceCopy
aclnnInplaceCos
aclnnInplaceCosh
aclnnInplaceDiv
aclnnInplaceDivMod
aclnnInplaceDivMods
aclnnInplaceDivs
aclnnInplaceElu
aclnnInplaceEqScalar
aclnnInplaceEqTensor
aclnnInplaceErf
aclnnInplaceErfc
aclnnInplaceErfinv
aclnnInplaceExp
aclnnInplaceExp2
aclnnInplaceExpm1
aclnnInplaceFillDiagonal
aclnnInplaceFillScalar
aclnnInplaceFillTensor
aclnnInplaceFloor
aclnnInplaceFloorDivide
aclnnInplaceFloorDivides
aclnnInplaceFmodScalar
aclnnInplaceFmodTensor
aclnnInplaceFrac
aclnnInplaceGeScalar
aclnnInplaceGeTensor
aclnnInplaceGtScalar
aclnnInplaceGtTensor
aclnnInplaceHardsigmoid
aclnnInplaceHardswish
aclnnInplaceHardtanh
aclnnInplaceIndexCopy
aclnnInplaceIndexFillTensor
aclnnInplaceLeakyRelu
aclnnInplaceLerp
aclnnInplaceLerps
aclnnInplaceLeScalar
aclnnInplaceLeTensor
aclnnInplaceLog
aclnnInplaceLog10
aclnnInplaceLog1p
aclnnInplaceLog2
aclnnInplaceLogicalAnd
aclnnInplaceLogicalNot
aclnnInplaceLogicalOr
aclnnInplaceLtScalar
aclnnInplaceLtTensor
aclnnInplaceMaskedFillScalar
aclnnInplaceMaskedFillTensor
aclnnInplaceMaskedScatter
aclnnInplaceMish
aclnnInplaceMul
aclnnInplaceMuls
aclnnInplaceNanToNum
aclnnInplaceNeg
aclnnInplaceNeScalar
aclnnInplaceNeTensor
aclnnInplaceNormal
aclnnInplaceOne
aclnnInplacePowTensorScalar
aclnnInplacePowTensorTensor
aclnnInplacePut
aclnnInplaceRandom
aclnnInplaceReciprocal
aclnnInplaceRelu
aclnnInplaceRemainderTensorScalar
aclnnInplaceRemainderTensorTensor
aclnnInplaceRenorm
aclnnInplaceRound
aclnnInplaceRoundDecimals
aclnnInplaceRReluWithNoise
aclnnInplaceRsqrt
aclnnInplaceScatter
aclnnInplaceScatterUpdate
aclnnInplaceScatterValue
aclnnInplaceSelu
aclnnInplaceSigmoid
aclnnInplaceSin
aclnnInplaceSinc
aclnnInplaceSinh
aclnnInplaceSqrt
aclnnInplaceSub
aclnnInplaceSubs
aclnnInplaceTan
aclnnInplaceTanh
aclnnInplaceThreshold
aclnnInplaceTril
aclnnInplaceTriu
aclnnInplaceTrunc
aclnnInplaceUniform
aclnnInplaceXLogYScalarOther
aclnnInplaceXLogYTensor
aclnnInplaceZero
aclnnInverse
aclnnIsClose
aclnnIsFinite
aclnnIsInScalarTensor
aclnnIsInTensorScalar
aclnnIsNegInf
aclnnIsPosInf
aclnnKlDiv
aclnnKlDivBackward
aclnnKthvalue
aclnnLgamma
aclnnL1Loss
aclnnL1LossBackward
aclnnLayerNorm
aclnnLayerNormBackward
aclnnLayerNormWithImplMode
aclnnLeakyRelu
aclnnLeakyReluBackward
aclnnLerp
aclnnLerps
aclnnLeScalar
aclnnLeTensor
aclnnLinalgCross
aclnnLinalgQr
aclnnLinalgVectorNorm
aclnnLinspace
aclnnLog
aclnnLog10
aclnnLog1p
aclnnLog2
aclnnLogAddExp
aclnnLogAddExp2
aclnnLogdet
aclnnLogicalAnd
aclnnLogicalNot
aclnnLogicalOr
aclnnLogicalXor
aclnnLogSigmoid
aclnnLogSigmoidBackward
aclnnLogSigmoidForward
aclnnLogSoftmax
aclnnLogSoftmaxBackward
aclnnLogSumExp
aclnnLtScalar
aclnnLtTensor
aclnnMaskedSelect
aclnnMaskedSoftmaxWithRelPosBias
aclnnMatmul
aclnnMatmulCompressDequant
aclnnMax
aclnnMaxDim
aclnnMaximum
aclnnMaxPool
aclnnMaxPool2dWithIndices
aclnnMaxPool2dWithIndicesBackward
aclnnMaxPool2dWithMask
aclnnMaxPool2dWithMaskBackward
aclnnMaxN
aclnnMaxUnpool2d
aclnnMaxUnpool2dBackward
aclnnMaxUnpool3d
aclnnMaxUnpool3dBackward
aclnnMaxV2
aclnnMean
aclnnMeanV2
aclnnMedian
aclnnMedianDim
aclnnMin
aclnnMinDim
aclnnMinimum
aclnnMinN
aclnnMish
aclnnMishBackward
aclnnMm
aclnnMseLoss
aclnnMseLossBackward
aclnnMseLossOut
aclnnMul
aclnnMuls
aclnnMultilabelMarginLoss
aclnnMultinomial
aclnnMv
aclnnNanMedian
aclnnNanMedianDim
aclnnNanToNum
aclnnNeg
aclnnNeScalar
aclnnNeTensor
aclnnNLLLoss
aclnnNLLLoss2d
aclnnNLLLoss2dBackward
aclnnNLLLossBackward
aclnnNonMaxSuppression
aclnnNonzero
aclnnNonzeroV2
aclnnNorm
aclnnNormalFloatFloat
aclnnNormalFloatTensor
aclnnNormalTensorFloat
aclnnNormalTensorTensor
aclnnOneHot
aclnnPdist
aclnnPdistForward
aclnnPermute
aclnnPolar
aclnnPowScalarTensor
aclnnPowTensorScalar
aclnnPowTensorTensor
aclnnPrecisionCompare
aclnnPrelu
aclnnPreluBackward
aclnnProd
aclnnProdDim
aclnnQr
aclnnQuantMatmul
aclnnQuantMatmulV2
aclnnQuantMatmulV3
aclnnQuantMatmulV4
aclnnRandperm
aclnnRange
aclnnReal
aclnnReciprocal
aclnnReduceNansum
aclnnReduceSum
aclnnReflectionPad1d
aclnnReflectionPad1dBackward
aclnnReflectionPad2d
aclnnReflectionPad2dBackward
aclnnReflectionPad3d
aclnnReflectionPad3dBackward
aclnnRelu
aclnnRemainderScalarTensor
aclnnRemainderTensorScalar
aclnnRemainderTensorTensor
aclnnRenorm
aclnnRepeat
aclnnRepeatInterleave
aclnnRepeatInterleaveInt
aclnnRepeatInterleaveIntWithDim
aclnnRepeatInterleaveTensor
aclnnRepeatInterleaveWithDim
aclnnReplicationPad1d
aclnnReplicationPad1dBackward
aclnnReplicationPad2d
aclnnReplicationPad2dBackward
aclnnReplicationPad3d
aclnnReplicationPad3dBackward
aclnnResize
aclnnRoiAlign
aclnnRoll
aclnnRound
aclnnRoundDecimals
aclnnRReluWithNoise
aclnnRsqrt
aclnnRsub
aclnnRsubs
aclnnScale
aclnnScatter
aclnnScatterAdd
aclnnScatterNd
aclnnScatterNdUpdate
aclnnScatterValue
aclnnSearchSorted
aclnnSearchSorteds
aclnnSelu
aclnnSeluBackward
aclnnSigmoid
aclnnSigmoidBackward
aclnnSign
aclnnSignbit
aclnnSilentCheck
aclnnSilu
aclnnSiluBackward
aclnnSin
aclnnSinc
aclnnSinh
aclnnSlice
aclnnSliceV2
aclnnSlogdet
aclnnSmoothL1Loss
aclnnSmoothL1LossBackward
aclnnSoftMarginLoss
aclnnSoftMarginLossBackward
aclnnSoftmax
aclnnSoftmaxBackward
aclnnSoftplus
aclnnSoftplusBackward
aclnnSoftshrink
aclnnSoftshrinkBackward
aclnnSort
aclnnSplitTensor
aclnnSplitWithSize
aclnnSqrt
aclnnStack
aclnnStd
aclnnStdMeanCorrection
aclnnStridedSliceAssignV2
aclnnSub
aclnnSubs
aclnnSum
aclnnSWhere
aclnnSwiGlu
aclnnSwiGluGrad
aclnnTake
aclnnTan
aclnnTanh
aclnnTanhBackward
aclnnThreshold
aclnnThresholdBackward
aclnnTopk
aclnnTrace
aclnnTransConvolutionWeight
aclnnTransQuantParam
aclnnTransQuantParamV2
aclnnTriangularSolve
aclnnTril
aclnnTriu
aclnnTrunc
aclnnUnique
aclnnUnique2
aclnnUniqueDim
aclnnUniqueConsecutive
aclnnUpsampleBicubic2d
aclnnUpsampleBicubic2dBackward
aclnnUpsampleBicubic2dAA
aclnnUpsampleBicubic2dAAGrad
aclnnUpsampleBilinear2d
aclnnUpsampleBilinear2dBackward
aclnnUpsampleBilinear2dAA
aclnnUpsampleBilinear2dAABackward
aclnnUpsampleLinear1d
aclnnUpsampleLinear1dBackward
aclnnUpsampleNearest1d
aclnnUpsampleNearest1dBackward
aclnnUpsampleNearest2d
aclnnUpsampleNearest2dBackward
aclnnUpsampleNearest3d
aclnnUpsampleNearest3dBackward
aclnnUpsampleTrilinear3d
aclnnUpsampleTrilinear3dBackward
aclnnVar
aclnnVarCorrection
aclnnVarMean
aclnnWeightQuantBatchMatmul
aclnnXLogYScalarOther
aclnnXLogYScalarSelf
aclnnXLogYTensor

Deprecated Interfaces

Deprecated interface           Description
aclnnWeightQuantBatchMatmul    This interface will be deprecated in a future version. Use the aclnnWeightQuantBatchMatmulV2 interface instead.