PyTorch Tutorial: Miniconda3-latest-Linux-x86_64.sh • After Miniconda is installed: conda install pytorch -c pytorch • Writing code: up to you; feel free to use emacs, vim, PyCharm, etc. if you want • Our recommendations: Jupyter Notebook, or VS Code with the Python extension and the Remote Development extension installed • SSH port forwarding: 1234:localhost:1234 __@__.cs.princeton.edu (first blank is the username, second is the hostname) • In VS Code, run Python files as Jupyter-style notebooks by delimiting cells/sections with #%% • Debugging PyTorch code is just like debugging any other Python code: see Piazza @108 for info. Also try JupyterLab! Why talk about libraries... 0 credits | 38 pages | 4.09 MB | 1 year ago
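The `#%%` cell convention mentioned in the excerpt looks like this in an ordinary Python file (a minimal sketch; the file contents are invented for illustration). VS Code and similar tools split the file into independently runnable, Jupyter-style cells at each marker:

```python
# A plain .py file that cell-aware editors split into runnable cells
# wherever a "#%%" marker appears. Each cell can be executed on its own.

#%% Cell 1: build some data
data = [x * x for x in range(5)]

#%% Cell 2: inspect the result of cell 1
total = sum(data)
print(total)  # 30
```

The advantage over a notebook file is that this remains a plain `.py` file, so it diffs cleanly under version control.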
PyTorch Release Notes: ...architectures and an automatic mixed precision (AMP)-like API that can be used seamlessly with your PyTorch code. ‣ A preview of Torch-TensorRT (1.4.0dev0) is now included; Torch-TRT is the TensorRT integration... (PyTorch Release 23.06, PyTorch RN-08516-001_v23.07, p. 15) ‣ NVIDIA Deep Learning Profiler (DLProf) v1.8, which was included in the 21.12 container, was the... 0 credits | 365 pages | 2.94 MB | 1 year ago
《Efficient Deep Learning Book》[EDL] Chapter 5 - Advanced Compression Techniques: ...required libraries to start with. We will use the gzip Python module to demonstrate compression. The code for this exercise is available as a Jupyter notebook here. %%capture import gzip import operator ... same number of weights pruned. Phew! It feels like we have gone through a lot of talk without much code! In chapter four, we trained a model to predict masks for pets to build Snapchat-like filters. Let's ... prunable block using magnitude-based pruning. Note that the code below is in addition to the original segmentation project in chapter four. The code for this project is available as a Jupyter notebook here. 0 credits | 34 pages | 3.18 MB | 1 year ago
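The combination hinted at in the excerpt, magnitude-based pruning followed by gzip compression, can be sketched with the standard library alone. This is a stand-in for the book's notebook, not its actual code, and the 50% pruning ratio is an arbitrary choice:

```python
import gzip
import random
import struct

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(10_000)]

# Magnitude-based pruning: zero out the half of the weights closest to zero.
threshold = sorted(abs(w) for w in weights)[len(weights) // 2]
pruned = [0.0 if abs(w) < threshold else w for w in weights]

def gzip_size(values):
    """Size in bytes of the float32-serialized array after gzip."""
    raw = struct.pack(f"{len(values)}f", *values)
    return len(gzip.compress(raw))

# Runs of zero bytes make the pruned array far more compressible,
# while the dense random array barely compresses at all.
print(gzip_size(pruned) < gzip_size(weights))  # True
```

The model's accuracy impact is a separate question; this sketch only shows why sparsity translates into smaller serialized files.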
深度学习与PyTorch入门实战 - 54. AutoEncoder ("Deep Learning and PyTorch in Practice, Lecture 54: Autoencoders"), lecturer: Long Liangqu. Outline: • Supervised learning: https://towardsdatascience.com/supervised-vs-unsupervised-learning-14f68e32ea8d • Massive unlabeled data • Unsupervised learning: https://medium.com/ ... to-learn-better-dropout-in-deep-machine-learning-74334da4bfc5 • Adversarial autoencoders: distribution of the hidden code, https://towardsdatascience.com/a-wizards-guide-to-adversarial-autoencoders-part-2-exploring-lat... 0 credits | 29 pages | 3.49 MB | 1 year ago
【PyTorch深度学习-龙龙老师】-测试版202112 ("PyTorch Deep Learning, by Teacher Long Long", preview edition 202112): ...the "Anaconda to my PATH environment variable" option, so that Anaconda programs can be invoked from the command line. As shown in Figure 1.23, the installer asks whether to also install VS Code; choose Skip. The whole installation takes about 5 minutes, depending on the performance of the machine. (Preview edition 202112, Chapter 1 "Introduction to Artificial Intelligence", p. 18) Figure 1.22 Anaconda ... There are many ways to write programs: interactively with ipython or ipython notebook, or with a full IDE such as Sublime Text, PyCharm, or VS Code for medium and large projects. This book recommends PyCharm for writing and debugging and VS Code for interactive development; both are free to use. Download, install, and configure a Python interpreter yourself; space does not permit further detail here. ... Henderson, R. E. Howard, W. Hubbard and L. D. Jackel, "Backpropagation Applied to Handwritten Zip Code Recognition," Neural Comput., vol. 1, pp. 541-551, Dec. 1989. [3] A. Krizhevsky, I. Sutskever and G... 0 credits | 439 pages | 29.91 MB | 1 year ago
星际争霸与人工智能 ("StarCraft and Artificial Intelligence"): ...without Collision: 3 Marines (ours) vs. 1 Super Zergling (enemy) • Hit-and-Run Tactics: 3 Marines (ours) vs. 1 Zealot (enemy) • Coordinated Cover Attack: 3 Marines (ours) vs. 1 Super Zergling (enemy) • Focus Fire without Overkill: 15 Marines (ours) vs. 16 Marines (enemy) • Coordinated Heterogeneous Agents: 2 Dropships and 2 Tanks vs. 1 Ultralisk • Hierarchical Reinforcement Learning • Strategy & Planning • Combat... 0 credits | 24 pages | 2.54 MB | 1 year ago
Lecture 1: Overview: ...can be used to infer uncertainty. A one-vs-one SVM approach can be used to tackle multiple classes. (Feng Li (SDU), Overview, September 6, 2023, slide 47 of 57) • Parametric vs non-parametric models. Parametric model: e.g. linear regression, ŷ = β̂_0 + Σ_{j=1}^{p} β̂_j x_j • Non-parametric model: the nearest-neighbor method makes predictions for ... that aren't relevant to the test case. • These two methods are opposite w.r.t. computation: NN-like methods... 0 credits | 57 pages | 2.41 MB | 1 year ago
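The parametric/non-parametric contrast in the excerpt can be made concrete with a toy sketch (pure Python; the data points and coefficients are invented): a fitted linear model predicts from its coefficients alone, while nearest-neighbor prediction must consult the stored training set.

```python
# Parametric: prediction uses only a fixed set of fitted coefficients.
def linear_predict(beta0, betas, x):
    """y_hat = beta0 + sum_j beta_j * x_j"""
    return beta0 + sum(b * xj for b, xj in zip(betas, x))

# Non-parametric: prediction consults the training data itself (1-NN).
def one_nn_predict(train, x):
    """Return the label of the nearest training point (squared Euclidean)."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    nearest = min(train, key=lambda pair: dist2(pair[0], x))
    return nearest[1]

train = [((0.0, 0.0), 0.0), ((1.0, 1.0), 2.0), ((2.0, 2.0), 4.0)]
print(linear_predict(0, [2, 3], (1, 1)))   # 0 + 2*1 + 3*1 = 5
print(one_nn_predict(train, (1.1, 1.0)))   # nearest point is (1, 1) -> 2.0
```

This also illustrates the computational trade-off the slides mention: the linear model's prediction cost is independent of training-set size, while 1-NN's grows with it.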
《TensorFlow 2项目进阶实战》("TensorFlow 2 Advanced Projects in Practice") video course, Fundamentals: the design philosophy of TensorFlow 2. Contents: • TensorFlow 2 design principles • TensorFlow 2 core modules • TensorFlow 2 vs TensorFlow 1.x • TensorFlow 2 in production ... TensorFlow 2 design principles: TensorFlow - Infra of AI ... TensorFlow 2 (compatibility table: Support / Limited Support) ... SavedModel: the production-grade TensorFlow model format • TensorFlow 2 vs TensorFlow 1.x; Keras vs TensorFlow 1.x: the TensorFlow 1.x workflow is full of abstract notions, while the TensorFlow 2 workflow is native... 0 credits | 40 pages | 9.01 MB | 1 year ago
Lecture 3: Logistic Regression: ...(Feng Li (SDU), Logistic Regression, September 20, 2023, slide 24 of 29) • Transformation to binary: the one-vs.-rest (one-vs.-all, OvA or OvR, one-against-all, OAA) strategy is to train a single classifier per class... classifier ... The one-vs.-one (OvO) reduction is to train K(K − 1)/2 binary classifiers; for the (s, t)-th classifier: positive... 0 credits | 29 pages | 660.51 KB | 1 year ago
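The K(K − 1)/2 count in the excerpt is simply the number of unordered class pairs, one binary classifier per pair (s, t). A quick sketch (K = 5 chosen arbitrarily):

```python
from itertools import combinations

K = 5
classes = list(range(K))

# One-vs-one: one binary classifier per unordered pair (s, t) of classes.
pairs = list(combinations(classes, 2))
print(len(pairs))    # K * (K - 1) // 2 = 10

# One-vs-rest: one binary classifier per class.
print(len(classes))  # K = 5
```

For small K the two strategies cost about the same, but OvO's quadratic growth in the number of classifiers matters once K is large.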
深度学习与PyTorch入门实战 - 01. 初见PyTorch ("Deep Learning and PyTorch in Practice, Lecture 1: First Look at PyTorch"): https://towardsdatascience.com/battle-of-the-deep-learning-frameworks-part-i-cff0e3841750 • Survival of the fittest: the battle for the crown, https://www.edureka.co/blog/pytorch-vs-tensorflow/ • Dynamic graphs: https://towardsdatascience.com/battle-of-the-deep-learning-frameworks-part-i-cff0e3841750 • Static graphs • Overall comparison of PyTorch, TensorFlow 1, and TensorFlow 2: performance, ecosystem, industry adoption, academic adoption, learning curve, ease of use, compatibility, outlook • Summary • The PyTorch ecosystem: TorchVision • What can PyTorch do? 1. GPU acceleration 2. Automatic differentiation (autograd) 3. Common network layers (nn) ... 0 credits | 19 pages | 1.06 MB | 1 year ago
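The "automatic differentiation" item above is what PyTorch's autograd provides. As a toy illustration of the idea only (forward-mode differentiation via dual numbers; PyTorch itself uses reverse-mode on a recorded graph, which this does not show):

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; carries value and derivative."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        # Product rule happens automatically: (uv)' = u'v + uv'.
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def grad(f, x):
    """Derivative of f at x, computed mechanically, with no symbolic math."""
    return f(Dual(x, 1.0)).deriv

# d/dx (3x^2 + 2x) at x = 4 is 6x + 2 = 26
print(grad(lambda x: 3 * x * x + 2 * x, 4.0))  # 26.0
```

The point is the same as in PyTorch: the user writes ordinary arithmetic, and derivatives fall out of the rules encoded in the operations.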
38 results in total