Operator Domains

Operator interfaces are grouped into the following domains:

  • aclnn_ops_infer: NN operator inference library
  • aclnn_ops_train: NN operator training library
  • aclnn_math: math operator library
  • aclnn_rand: random-number operator library

The inference library depends on the math library; the training library depends on the inference, math, and random-number libraries.
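
As a minimal sketch of how a listed interface is invoked, the C example below calls aclnnAbs (an elementwise math operator) using the two-phase aclnn convention: a GetWorkspaceSize query followed by the launch call. Header paths, the exact shared-library file produced by each domain (for example, whether aclnn_math maps to a name such as libaclnn_math.so), and error handling are assumptions that depend on the installed CANN version; an interface from the training library would additionally require the inference, math, and random-number libraries at link time, per the dependency order above.

/*
 * Minimal sketch (error handling omitted): calling aclnnAbs via the
 * two-phase aclnn convention. Header paths and per-domain shared-library
 * names (e.g. a math-domain library such as libaclnn_math.so) are
 * assumptions -- check the installed CANN package before building, and
 * link the dependent domain libraries in the order described above.
 */
#include <stdint.h>
#include <stdio.h>
#include "acl/acl.h"
#include "aclnnop/aclnn_abs.h"

int main(void) {
    aclInit(NULL);
    aclrtSetDevice(0);
    aclrtStream stream = NULL;
    aclrtCreateStream(&stream);

    /* A 1-D float tensor with 4 elements, backed by device memory. */
    int64_t shape[1] = {4};
    int64_t strides[1] = {1};
    float hostIn[4] = {-1.0f, 2.0f, -3.0f, 4.0f};
    float hostOut[4] = {0};
    void *devIn = NULL, *devOut = NULL;
    aclrtMalloc(&devIn, sizeof(hostIn), ACL_MEM_MALLOC_HUGE_FIRST);
    aclrtMalloc(&devOut, sizeof(hostOut), ACL_MEM_MALLOC_HUGE_FIRST);
    aclrtMemcpy(devIn, sizeof(hostIn), hostIn, sizeof(hostIn),
                ACL_MEMCPY_HOST_TO_DEVICE);

    aclTensor *self = aclCreateTensor(shape, 1, ACL_FLOAT, strides, 0,
                                      ACL_FORMAT_ND, shape, 1, devIn);
    aclTensor *out = aclCreateTensor(shape, 1, ACL_FLOAT, strides, 0,
                                     ACL_FORMAT_ND, shape, 1, devOut);

    /* Phase 1: query the workspace size and obtain an executor. */
    uint64_t workspaceSize = 0;
    aclOpExecutor *executor = NULL;
    aclnnAbsGetWorkspaceSize(self, out, &workspaceSize, &executor);

    /* Phase 2: allocate the workspace (if any) and launch on the stream. */
    void *workspace = NULL;
    if (workspaceSize > 0) {
        aclrtMalloc(&workspace, workspaceSize, ACL_MEM_MALLOC_HUGE_FIRST);
    }
    aclnnAbs(workspace, workspaceSize, executor, stream);
    aclrtSynchronizeStream(stream);

    aclrtMemcpy(hostOut, sizeof(hostOut), devOut, sizeof(hostOut),
                ACL_MEMCPY_DEVICE_TO_HOST);
    printf("|x| = %.1f %.1f %.1f %.1f\n",
           hostOut[0], hostOut[1], hostOut[2], hostOut[3]);

    aclDestroyTensor(self);
    aclDestroyTensor(out);
    if (workspace != NULL) { aclrtFree(workspace); }
    aclrtFree(devIn);
    aclrtFree(devOut);
    aclrtDestroyStream(stream);
    aclrtResetDevice(0);
    aclFinalize();
    return 0;
}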

The domain of each operator interface is shown in the table below.

aclnn API | aclnn_ops_infer | aclnn_ops_train | aclnn_math | aclnn_rand
aclnnAbs
aclnnAcos
aclnnAcosh
aclnnAdaptiveAvgPool2d
aclnnAdaptiveAvgPool2dBackward
aclnnAdaptiveAvgPool3dBackward
aclnnAdaptiveMaxPool2d
aclnnAdd
aclnnAddbmm
aclnnAddcdiv
aclnnAddcmul
aclnnAddmm
aclnnAddmv
aclnnAddr
aclnnAddRmsNorm
aclnnAdds
aclnnAddLayerNorm
aclnnAffineGrid
aclnnAll
aclnnAllGatherMatmul
aclnnAmax
aclnnAmin
aclnnAminmax
aclnnAminmaxAll
aclnnAminmaxDim
aclnnAny
aclnnApplyAdamWV2
aclnnApplyRotaryPosEmb
aclnnArange
aclnnArgMax
aclnnArgMin
aclnnArgsort
aclnnAsin
aclnnAsinh
aclnnAtan
aclnnAtan2
aclnnAtanh
aclnnAvgPool2d
aclnnAvgPool2dBackward
aclnnBaddbmm
aclnnBackgroundReplace
aclnnBatchMatMul
aclnnBatchNorm
aclnnBatchNormBackward
aclnnBatchNormElemt
aclnnBatchNormElemtBackward
aclnnBatchNormGatherStatsWithCounts
aclnnBatchNormReduceBackward
aclnnBatchNormStats
aclnnBernoulli
aclnnBernoulliTensor
aclnnBinaryCrossEntropy
aclnnBinaryCrossEntropyBackward
aclnnBinaryCrossEntropyWithLogits
aclnnBinaryCrossEntropyWithLogitsBackward
aclnnBincount
aclnnBitwiseAndScalar
aclnnBitwiseAndTensor
aclnnBitwiseAndTensorOut
aclnnBitwiseNot
aclnnBitwiseOrScalar
aclnnBitwiseOrTensor
aclnnBitwiseXorScalar
aclnnBitwiseXorTensor
aclnnCast
aclnnCat
aclnnCeil
aclnnCelu
aclnnChannelShuffle
aclnnClamp
aclnnClampMax
aclnnClampMaxTensor
aclnnClampMin
aclnnClampMinTensor
aclnnClampTensor
aclnnComplex
aclnnConstantPadNd
aclnnConvDepthwise2d
aclnnConvertWeightToINT4Pack
aclnnConvolution
aclnnConvolutionBackward
aclnnConvTbc
aclnnConvTbcBackward
aclnnCos
aclnnCosh
aclnnCtcLoss
aclnnCtcLossBackward
aclnnCummax
aclnnCummin
aclnnCumsum
aclnnCumsumV2
aclnnDeepNorm
aclnnDeepNormGrad
aclnnDiag
aclnnDiagFlat
aclnnDigamma
aclnnDiv
aclnnDivMod
aclnnDivMods
aclnnDivs
aclnnDot
aclnnDropout
aclnnDropoutBackward
aclnnDropoutDoMask
aclnnDropoutGenMask
aclnnDropoutGenMaskV2
aclnnEinsum
aclnnElu
aclnnEluBackward
aclnnEmbedding
aclnnEmbeddingDenseBackward
aclnnEmbeddingRenorm
aclnnEqScalar
aclnnEqTensor
aclnnEqual
aclnnErf
aclnnErfc
aclnnErfinv
aclnnExp
aclnnExp2
aclnnExpand
aclnnExpm1
aclnnEye
aclnnFFN
aclnnFFNV2
aclnnFFNV3
aclnnFlashAttentionScore
aclnnFlashAttentionScoreGrad
aclnnFlashAttentionScoreV2
aclnnFlashAttentionScoreGradV2
aclnnFlashAttentionVarLenScoreV2
aclnnFlashAttentionUnpaddingScoreGradV2
aclnnFlatten
aclnnFlip
aclnnFloor
aclnnFloorDivide
aclnnFloorDivides
aclnnFmodScalar
aclnnFmodTensor
aclnnFrac
aclnnFusedInferAttentionScore
aclnnFusedInferAttentionScoreV2
aclnnGather
aclnnGatherNd
aclnnGatherV2
aclnnGcd
aclnnGeGlu
aclnnGeGluBackward
aclnnGelu
aclnnGeluBackward
aclnnGemm
aclnnGer
aclnnGeScalar
aclnnGeTensor
aclnnGlobalAveragePool
aclnnGlu
aclnnGluBackward
aclnnGridSampler2D
aclnnGridSampler2DBackward
aclnnGridSampler3D
aclnnGroupedMatmul
aclnnGroupedMatMulAllReduce
aclnnGroupedMatmulV2
aclnnGroupedMatmulV3
aclnnGroupNorm
aclnnGroupNormBackward
aclnnGroupNormSilu
aclnnGroupNormSiluV2
aclnnGtScalar
aclnnGtTensor
aclnnHardshrink
aclnnHardshrinkBackward
aclnnHardsigmoid
aclnnHardsigmoidBackward
aclnnHardswish
aclnnHardswishBackward
aclnnHardtanh
aclnnHardtanhBackward
aclnnHistc
aclnnIm2col
aclnnIm2colBackward
aclnnIncreFlashAttention
aclnnIncreFlashAttentionV2
aclnnIncreFlashAttentionV3
aclnnIncreFlashAttentionV4
aclnnIndex
aclnnIndexAdd
aclnnIndexCopy
aclnnIndexFillTensor
aclnnIndexPutImpl
aclnnIndexSelect
aclnnInplaceAcos
aclnnInplaceAcosh
aclnnInplaceAdd
aclnnInplaceAddbmm
aclnnInplaceAddcdiv
aclnnInplaceAddcmul
aclnnInplaceAddmm
aclnnInplaceAddr
aclnnInplaceAdds
aclnnInplaceAsin
aclnnInplaceAsinh
aclnnInplaceAtan
aclnnInplaceAtan2
aclnnInplaceAtanh
aclnnInplaceBaddbmm
aclnnInplaceBernoulli
aclnnInplaceBernoulliTensor
aclnnInplaceBitwiseAndScalar
aclnnInplaceBitwiseAndTensor
aclnnInplaceBitwiseAndTensorOut
aclnnInplaceBitwiseOrScalar
aclnnInplaceBitwiseOrTensor
aclnnInplaceBitwiseXorScalar
aclnnInplaceBitwiseXorTensor
aclnnInplaceCeil
aclnnInplaceCelu
aclnnInplaceClampMax
aclnnInplaceClampMaxTensor
aclnnInplaceClampMinTensor
aclnnInplaceCopy
aclnnInplaceCos
aclnnInplaceCosh
aclnnInplaceDiv
aclnnInplaceDivMod
aclnnInplaceDivMods
aclnnInplaceDivs
aclnnInplaceElu
aclnnInplaceEqScalar
aclnnInplaceEqTensor
aclnnInplaceErf
aclnnInplaceErfc
aclnnInplaceErfinv
aclnnInplaceExp
aclnnInplaceExp2
aclnnInplaceExpm1
aclnnInplaceFillDiagonal
aclnnInplaceFillScalar
aclnnInplaceFillTensor
aclnnInplaceFloor
aclnnInplaceFloorDivide
aclnnInplaceFloorDivides
aclnnInplaceFmodScalar
aclnnInplaceFmodTensor
aclnnInplaceFrac
aclnnInplaceGeScalar
aclnnInplaceGeTensor
aclnnInplaceGtScalar
aclnnInplaceGtTensor
aclnnInplaceHardsigmoid
aclnnInplaceHardswish
aclnnInplaceHardtanh
aclnnInplaceIndexCopy
aclnnInplaceIndexFillTensor
aclnnInplaceLeakyRelu
aclnnInplaceLerp
aclnnInplaceLerps
aclnnInplaceLeScalar
aclnnInplaceLeTensor
aclnnInplaceLog
aclnnInplaceLog10
aclnnInplaceLog1p
aclnnInplaceLog2
aclnnInplaceLogicalAnd
aclnnInplaceLogicalNot
aclnnInplaceLogicalOr
aclnnInplaceLtScalar
aclnnInplaceLtTensor
aclnnInplaceMaskedFillScalar
aclnnInplaceMaskedFillTensor
aclnnInplaceMaskedScatter
aclnnInplaceMatmulAllReduceAddRmsNorm
aclnnInplaceMish
aclnnInplaceMul
aclnnInplaceMuls
aclnnInplaceNanToNum
aclnnInplaceNeg
aclnnInplaceNeScalar
aclnnInplaceNeTensor
aclnnInplaceNormal
aclnnInplaceOne
aclnnInplacePowTensorScalar
aclnnInplacePowTensorTensor
aclnnInplacePut
aclnnInplaceQuantMatmulAllReduceAddRmsNorm
aclnnInplaceQuantScatter
aclnnInplaceRandom
aclnnInplaceReciprocal
aclnnInplaceRelu
aclnnInplaceRemainderTensorScalar
aclnnInplaceRemainderTensorTensor
aclnnInplaceRenorm
aclnnInplaceRound
aclnnInplaceRoundDecimals
aclnnInplaceRReluWithNoise
aclnnInplaceRsqrt
aclnnInplaceScatter
aclnnInplaceScatterUpdate
aclnnInplaceScatterValue
aclnnInplaceSelu
aclnnInplaceSigmoid
aclnnInplaceSin
aclnnInplaceSinc
aclnnInplaceSinh
aclnnInplaceSqrt
aclnnInplaceSub
aclnnInplaceSubs
aclnnInplaceTan
aclnnInplaceTanh
aclnnInplaceThreshold
aclnnInplaceTril
aclnnInplaceTriu
aclnnInplaceTrunc
aclnnInplaceUniform
aclnnInplaceWeightQuantMatmulAllReduceAddRmsNorm
aclnnInplaceXLogYScalarOther
aclnnInplaceXLogYTensor
aclnnInplaceZero
aclnnInverse
aclnnIsClose
aclnnIsFinite
aclnnIsInScalarTensor
aclnnIsInTensorScalar
aclnnIsNegInf
aclnnIsPosInf
aclnnKlDiv
aclnnKlDivBackward
aclnnKthvalue
aclnnLgamma
aclnnL1Loss
aclnnL1LossBackward
aclnnLayerNorm
aclnnLayerNormBackward
aclnnLayerNormWithImplMode
aclnnLeakyRelu
aclnnLeakyReluBackward
aclnnLerp
aclnnLerps
aclnnLeScalar
aclnnLeTensor
aclnnLinalgCross
aclnnLinalgQr
aclnnLinalgVectorNorm
aclnnLinspace
aclnnLog
aclnnLog10
aclnnLog1p
aclnnLog2
aclnnLogdet
aclnnLogicalAnd
aclnnLogicalNot
aclnnLogicalOr
aclnnLogicalXor
aclnnLogSigmoid
aclnnLogSigmoidBackward
aclnnLogSoftmax
aclnnLogSoftmaxBackward
aclnnLogSumExp
aclnnLtScalar
aclnnLtTensor
aclnnMaskedSelect
aclnnMatmul
aclnnMatmulAllReduce
aclnnMatmulAllReduceAddRmsNorm
aclnnMatmulAllReduceV2
aclnnMatmulCompressDequant
aclnnMatmulReduceScatter
aclnnMax
aclnnMaxDim
aclnnMaximum
aclnnMaxPool2dWithIndices
aclnnMaxPool2dWithIndicesBackward
aclnnMaxUnpool2d
aclnnMaxUnpool2dBackward
aclnnMaxUnpool3d
aclnnMaxUnpool3dBackward
aclnnMaxV2
aclnnMean
aclnnMeanV2
aclnnMedian
aclnnMedianDim
aclnnMin
aclnnMinDim
aclnnMinimum
aclnnMish
aclnnMishBackward
aclnnMm
aclnnMoeFinalizeRouting
aclnnMoeGatingTopKSoftmax
aclnnMoeInitRouting
aclnnMrgbaCustom
aclnnMseLoss
aclnnMseLossBackward
aclnnMseLossOut
aclnnMul
aclnnMuls
aclnnMultilabelMarginLoss
aclnnMultinomial
aclnnMultiScaleDeformableAttentionGrad
aclnnMv
aclnnNanMedian
aclnnNanMedianDim
aclnnNanToNum
aclnnNeg
aclnnNeScalar
aclnnNeTensor
aclnnNLLLoss
aclnnNLLLoss2d
aclnnNLLLoss2dBackward
aclnnNLLLossBackward
aclnnNonMaxSuppression
aclnnNonzero
aclnnNonzeroV2
aclnnNorm
aclnnNormalFloatFloat
aclnnNormalFloatTensor
aclnnNormalTensorFloat
aclnnNormalTensorTensor
aclnnOneHot
aclnnPdist
aclnnPdistForward
aclnnPermute
aclnnPowScalarTensor
aclnnPowTensorScalar
aclnnPowTensorTensor
aclnnPrelu
aclnnPreluBackward
aclnnProd
aclnnProdDim
aclnnPromptFlashAttention
aclnnPromptFlashAttentionV2
aclnnQr
aclnnQuantMatmul
aclnnQuantMatmulAllReduce
aclnnQuantMatmulAllReduceV2
aclnnQuantMatmulAllReduceAddRmsNorm
aclnnQuantMatmulV2
aclnnQuantMatmulV3
aclnnQuantMatmulV4
aclnnRandperm
aclnnRange
aclnnReal
aclnnReciprocal
aclnnReduceNansum
aclnnReduceSum
aclnnReflectionPad1d
aclnnReflectionPad1dBackward
aclnnReflectionPad2d
aclnnReflectionPad2dBackward
aclnnReflectionPad3d
aclnnRelu
aclnnRemainderScalarTensor
aclnnRemainderTensorScalar
aclnnRemainderTensorTensor
aclnnRenorm
aclnnRepeat
aclnnRepeatInterleave
aclnnRepeatInterleaveInt
aclnnRepeatInterleaveIntWithDim
aclnnRepeatInterleaveTensor
aclnnRepeatInterleaveWithDim
aclnnReplicationPad1d
aclnnReplicationPad1dBackward
aclnnReplicationPad2d
aclnnReplicationPad2dBackward
aclnnResize
aclnnRmsNorm
aclnnRmsNormGrad
aclnnRoiAlign
aclnnRoll
aclnnRound
aclnnRoundDecimals
aclnnRReluWithNoise
aclnnRsqrt
aclnnRsub
aclnnRsubs
aclnnScale
aclnnScatter
aclnnScatterAdd
aclnnScatterValue
aclnnScatterNd
aclnnScatterNdUpdate
aclnnSearchSorted
aclnnSearchSorteds
aclnnSelu
aclnnSeluBackward
aclnnSigmoid
aclnnSigmoidBackward
aclnnSign
aclnnSignbit
aclnnSilu
aclnnSiluBackward
aclnnSin
aclnnSinc
aclnnSinh
aclnnSlice
aclnnSliceV2
aclnnSlogdet
aclnnSmoothL1Loss
aclnnSmoothL1LossBackward
aclnnSoftMarginLoss
aclnnSoftMarginLossBackward
aclnnSoftmax
aclnnSoftmaxBackward
aclnnSoftplus
aclnnSoftplusBackward
aclnnSoftshrink
aclnnSoftshrinkBackward
aclnnSort
aclnnSplitTensor
aclnnSplitWithSize
aclnnSqrt
aclnnStack
aclnnStd
aclnnStdMeanCorrection
aclnnStridedSliceAssignV2
aclnnSub
aclnnSubs
aclnnSum
aclnnSWhere
aclnnTake
aclnnTan
aclnnTanh
aclnnTanhBackward
aclnnThreshold
aclnnThresholdBackward
aclnnTopk
aclnnTrace
aclnnTransConvolutionWeight
aclnnTransQuantParam
aclnnTransQuantParamV2
aclnnTriangularSolve
aclnnTril
aclnnTriu
aclnnTrunc
aclnnUnique
aclnnUnique2
aclnnUniqueConsecutive
aclnnUpsampleBicubic2d
aclnnUpsampleBilinear2d
aclnnUpsampleBilinear2dBackward
aclnnUpsampleLinear1d
aclnnUpsampleLinear1dBackward
aclnnUpsampleNearest1d
aclnnUpsampleNearest1dBackward
aclnnUpsampleNearest2d
aclnnUpsampleNearest2dBackward
aclnnUpsampleNearest3d
aclnnUpsampleNearest3dBackward
aclnnUpsampleTrilinear3d
aclnnUpsampleTrilinear3dBackward
aclnnVar
aclnnVarCorrection
aclnnVarMean
aclnnWeightQuantMatmulAllReduce
aclnnWeightQuantMatmulAllReduceAddRmsNorm
aclnnXLogYScalarOther
aclnnXLogYScalarSelf
aclnnXLogYTensor
aclnnBlendImagesCustom
aclRfft1D
aclnnDynamicQuant
aclnnAscendQuant
aclnnSwinTransformerLnQkvQuant
aclnnSwinAttentionScoreQuant