Experiment 1: Linear Regression
…that your implementation of gradient descent had found? Visualize the relationship with both the surf and contour commands. Remarks: for surf(x, y, z), if x and y are vectors, then length(x) must match the number of columns of z and length(y) the number of rows, so that the surface passes through the points (x(j), y(i), z(i, j)); when x and y are omitted, x = 1:columns(z) and y = 1:rows(z) are used. The same rule applies to the contour function. The number and spacing of the contours can be specified in the contour call by passing a differently spaced vector of levels, e.g.
(7 pages, 428.11 KB, 1 year ago)
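A minimal matplotlib sketch of the surface-plus-contours visualization described in the excerpt above; the quadratic bowl used here is a stand-in cost surface, not the experiment's actual cost function:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, safe for headless runs
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (registers the 3d projection)

# Stand-in cost surface over a parameter grid (hypothetical, for illustration)
theta0 = np.linspace(-10, 10, 100)
theta1 = np.linspace(-1, 4, 100)
T0, T1 = np.meshgrid(theta0, theta1)
cost = T0**2 + 10 * T1**2

fig = plt.figure()
ax3d = fig.add_subplot(121, projection="3d")
ax3d.plot_surface(T0, T1, cost)  # analogue of MATLAB's surf

# Passing a vector of levels controls both the number and the spacing of the
# contours; log-spaced levels crowd near the minimum, as the remarks suggest.
levels = np.logspace(-1, 2, 15)
ax2d = fig.add_subplot(122)
cs = ax2d.contour(T0, T1, cost, levels=levels)
```

With evenly spaced levels (np.linspace) the contours spread uniformly; log spacing makes the basin around the minimum easier to read.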
《Efficient Deep Learning Book》[EDL] Chapter 1 - Introduction
Figure 1-12: Bayesian Optimization over two dimensions x1 and x2. Red contour lines denote a high loss value, and blue contour lines denote a low loss value. The contours are unknown to the algorithm
(21 pages, 3.17 MB, 1 year ago)
【PyTorch深度学习-龙龙老师】-测试版202112
    from mpl_toolkits.mplot3d import Axes3D
    fig = plt.figure()
    ax = Axes3D(fig)  # set up the 3D axes
    # plot the 3D surface of the sinc function from the grid points
    ax.contour3D(x.numpy(), y.numpy(), z.numpy(), 50)
    plt.show()
5.7 Loading Classic Datasets — up to this point we have covered the common tensor operations and are ready to implement most deep networks…
    plt.contourf(XX, YY, preds.reshape(XX.shape), 25, alpha=1, cmap=cm.Spectral)
    plt.contour(XX, YY, preds.reshape(XX.shape), levels=[.5], cmap="Greys", vmin=0, vmax=.6)
    # draw the scatter plot, coloring points by label
    …
    preds.reshape(XX.shape), 25, alpha=0.08, cmap=cm.Spectral)
    plt.contour(XX, YY, preds.reshape(XX.shape), levels=[.5], cmap="Greys", vmin=0, vmax=
(439 pages, 29.91 MB, 1 year ago)
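The contourf/contour pair in the excerpt above draws a filled probability map plus a 0.5 decision boundary. A self-contained sketch, with a hypothetical circular classifier standing in for the network's preds:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt
from matplotlib import cm

# Hypothetical predictor: sigmoid of a circular score (positive inside the
# unit circle), standing in for the trained network's preds in the excerpt.
xx = np.linspace(-2, 2, 200)
XX, YY = np.meshgrid(xx, xx)
score = 1.0 - (XX**2 + YY**2)
preds = 1.0 / (1.0 + np.exp(-4.0 * score))  # probabilities in (0, 1)

plt.contourf(XX, YY, preds, 25, alpha=1, cmap=cm.Spectral)  # filled class regions
cs = plt.contour(XX, YY, preds, levels=[0.5], colors="k")   # 0.5 decision boundary
```

The levels=[0.5] argument draws only the boundary where the predicted probability crosses one half, which is why the excerpt overlays it on top of the filled regions.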
动手学深度学习 (Dive into Deep Learning) v2.0
    x1, x2 = torch.meshgrid(torch.arange(-5.5, 1.0, 0.1),
                            torch.arange(-3.0, 1.0, 0.1), indexing='ij')
    d2l.plt.contour(x1, x2, f(x1, x2), colors='#1f77b4')
    d2l.plt.xlabel('x1')
    d2l.plt.ylabel('x2')
Next, we observe the learning rate η =
(797 pages, 29.45 MB, 1 year ago)
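The excerpt above depends on torch, the book's d2l helper, and an f defined elsewhere in that chapter. A self-contained NumPy/matplotlib equivalent, assuming a simple quadratic objective f(x1, x2) = x1² + 2·x2² (an assumption here, not taken from the excerpt):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt

def f(x1, x2):
    # assumed objective; the book defines its own f in the surrounding section
    return x1**2 + 2 * x2**2

# indexing="ij" matches the torch.meshgrid call in the excerpt
x1, x2 = np.meshgrid(np.arange(-5.5, 1.0, 0.1),
                     np.arange(-3.0, 1.0, 0.1), indexing="ij")
plt.contour(x1, x2, f(x1, x2), colors="#1f77b4")
plt.xlabel("x1")
plt.ylabel("x2")
```

With indexing="ij", x1 varies along the first axis and x2 along the second, mirroring torch.meshgrid's matrix-style indexing rather than NumPy's default "xy" Cartesian indexing.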
4 results













