PAI & TVM Meetup - Shanghai 20191116
threadIdx.y / warpDim.y * warpDim.y, with warpDim.y = 32 / warpDim.x = 32 / blockDim.x; loop scaling. No need to modify or add any line of code.
Loss Scaling in TF:
    loss = loss_fn()
    opt = tf.train.AdamOptimizer(learning_rate=...)
    # minimize() on the loss scale optimizer.
    train_op = loss_scale_optimizer.minimize(loss)
Loss Scaling in PAI-TF: scale the loss by a factor S, run backward propagation in mixed precision to produce scaled gradients, then unscale the gradients before applying the update.
26 pages | 5.82 MB | 5 months ago
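The excerpt above never shows where loss_scale_optimizer comes from. The sketch below fills that gap under stated assumptions: it is TF 1.x graph-mode code, it substitutes TensorFlow's tf.train.experimental.MixedPrecisionLossScaleOptimizer for the slide's (possibly PAI-TF-specific) wrapper, and the tiny placeholder model and names (features, labels, w) are hypothetical, included only so the snippet is self-contained.

    import tensorflow as tf  # assumes TF 1.14/1.15 graph mode

    # Hypothetical toy model, only here so the example runs end to end.
    x = tf.placeholder(tf.float16, shape=[None, 10], name='features')
    y = tf.placeholder(tf.float32, shape=[None, 1], name='labels')
    w = tf.get_variable('w', shape=[10, 1], dtype=tf.float32)
    pred = tf.matmul(tf.cast(x, tf.float32), w)
    loss = tf.reduce_mean(tf.square(pred - y))

    opt = tf.train.AdamOptimizer(learning_rate=1e-3)
    # The wrapper multiplies the loss by a scale S before gradients are computed
    # and divides (unscales) the resulting gradients by S before they are applied,
    # matching the "scale the loss / scaled gradients / unscale" flow above.
    loss_scale_optimizer = tf.train.experimental.MixedPrecisionLossScaleOptimizer(
        opt, loss_scale='dynamic')

    # minimize() on the loss scale optimizer, as in the slide excerpt.
    train_op = loss_scale_optimizer.minimize(loss)

The slide's "no need to modify or add any line of code" claim presumably refers to an automatic mixed precision graph rewrite; stock TensorFlow exposes a comparable switch, tf.train.experimental.enable_mixed_precision_graph_rewrite(opt), which wraps the optimizer and inserts the casts without source changes.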
Curve Cloud Native
… metrics, alerts, log processing and workload analysis. Auto Pilot: plan to support horizontal/vertical scaling, auto config tuning, abnormal detection, schedule tuning. Cloud native feature list: features for …
9 pages | 2.85 MB | 6 months ago
Curve for CNCF Main
Kubernetes (in Plan) • Support Operator capability level 5 (in Plan) • horizontal/vertical scaling, auto config tuning, abnormal detection and schedule tuning. Storage Engine Comparison (vs. Ceph) …
21 pages | 4.56 MB | 6 months ago













