Keras Tutorial
…through the installation of Keras, basics of deep learning, Keras models, Keras layers, and Keras modules, and finally concludes with some real-time applications. Audience: this tutorial is prepared … Contents include: Core Modules; 6. Keras ― Modules; Available Modules.
PyTorch Release Notes
…with lower memory utilization. Transformer Engine also includes a collection of highly optimized modules for popular Transformer architectures and an automatic mixed precision-like API that can be used…
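A hypothetical sketch (not from the release notes) of how Transformer Engine's optimized modules combine with its automatic-mixed-precision-like FP8 API. It requires an FP8-capable GPU (e.g., Hopper); the module choice and shapes are illustrative assumptions:

```python
import torch
import transformer_engine.pytorch as te

# TE-optimized drop-in replacement for nn.Linear
layer = te.Linear(768, 768, bias=True).cuda()
x = torch.randn(16, 768, device="cuda", dtype=torch.bfloat16)

# Run the module in FP8 for higher throughput and lower memory use
with te.fp8_autocast(enabled=True):
    y = layer(x)
```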
AI大模型千问 qwen 中文文档

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LoraArguments:
    lora_r: int = 64
    lora_alpha: int = 16
    lora_dropout: float = 0.05
    lora_target_modules: List[str] = field(
        default_factory=lambda: [
            "q_proj", "k_proj", "v_proj", "o_proj", "up_proj",
            # … (excerpt truncated)
        ]
    )
```

- lora_alpha: the alpha value for LoRA;
- lora_dropout: the dropout rate for LoRA;
- lora_target_modules: the target modules for LoRA. By default we tune all linear layers;
- lora_weight_path: the path to the weight…

```python
lora_config = LoraConfig(
    r=lora_args.lora_r,
    lora_alpha=lora_args.lora_alpha,
    target_modules=lora_args.lora_target_modules,
    lora_dropout=lora_args.lora_dropout,
    bias=lora_args.lora_bias,
    task_type="CAUSAL_LM",
    # … (excerpt truncated)
)
```
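A minimal sketch of applying such a config with Hugging Face peft's `get_peft_model`; the checkpoint name is a small model chosen for illustration, not specified by the excerpt:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Hypothetical small checkpoint for illustration
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen1.5-0.5B")

lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj", "up_proj"],
    bias="none",
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapters remain trainable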
深度学习与PyTorch入门实战 - 43. nn.Module
1. Basic layers: Conv2d, ConvTranspose2d, Dropout, etc.
2. Container: net(x)
3. parameters
4. modules: modules() returns all nodes; children() returns direct children only
5. to(device)
6. save and load
7. train/test
8. … (truncated; a runnable sketch of these features follows this list)
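A minimal sketch exercising the nn.Module features listed above (container call, parameters(), modules() vs. children(), device moves, save/load, and train/test switching); the network shape is an assumption:

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

x = torch.randn(1, 4)
y = net(x)                                  # 2. container: net(x) runs forward

n_params = sum(p.numel() for p in net.parameters())  # 3. parameters
print(list(net.children()))                 # 4. direct children only
print(list(net.modules()))                  # 4. all nodes, including net itself

device = "cuda" if torch.cuda.is_available() else "cpu"
net = net.to(device)                        # 5. move parameters/buffers

torch.save(net.state_dict(), "net.pt")      # 6. save
net.load_state_dict(torch.load("net.pt"))   # 6. load

net.train()                                 # 7. training mode (Dropout active)
net.eval()                                  # 7. evaluation mode
```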
PyTorch Tutorial
Sample Code in Practice. Complex Models:
- Complex model class
- Predefined 'layer' modules
- 'Sequential' layer modules
Dataset:
- In PyTorch, a dataset is represented by a regular Python class (a minimal sketch follows this list).
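A minimal sketch of such a dataset class, assuming simple in-memory tensors; PyTorch only requires `__len__` and `__getitem__`, and the class name here is hypothetical:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class PairDataset(Dataset):
    def __init__(self, features, labels):
        self.features = features
        self.labels = labels

    def __len__(self):
        # number of samples
        return len(self.features)

    def __getitem__(self, idx):
        # return one (feature, label) pair
        return self.features[idx], self.labels[idx]

ds = PairDataset(torch.randn(100, 4), torch.randint(0, 2, (100,)))
loader = DataLoader(ds, batch_size=16, shuffle=True)
```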
全连接神经网络实战 (pytorch 版)
…brings great benefits to training. Define a function inside NeuralNetwork:

```python
def weight_init(self):
    # iterate over every layer of the network
    for m in self.modules():
        # if this layer is a fully connected (Linear) layer
        if isinstance(m, nn.Linear):
            print(m.weight.shape)
```

```python
    # … (excerpt truncated; preceding layers elided)
    nn.Linear(..., 8),
    nn.ReLU(),
    nn.Linear(8, 4),
)

def weight_init(self):
    for m in self.modules():
        if isinstance(m, nn.Linear):
            m.weight.data.normal_(0.0, 1.0)  # … (excerpt truncated)
```
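A minimal self-contained sketch of how `weight_init` plugs into a full module; the class structure and layer sizes are assumptions for illustration:

```python
import torch
from torch import nn

class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.stack = nn.Sequential(
            nn.Linear(16, 8),
            nn.ReLU(),
            nn.Linear(8, 4),
        )
        self.weight_init()  # initialize weights right after construction

    def weight_init(self):
        for m in self.modules():
            if isinstance(m, nn.Linear):
                m.weight.data.normal_(0.0, 1.0)

    def forward(self, x):
        return self.stack(x)

net = NeuralNetwork()
print(net(torch.randn(2, 16)).shape)  # torch.Size([2, 4])
```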
《Efficient Deep Learning Book》[EDL] Chapter 7 - Automation
For HyperBand, the recommended factor is 3. We will use the same. Now, let's go on and load the required modules and the dataset:

```python
import tensorflow as tf
import tensorflow_datasets as tfds
import keras_tuner as …  # (excerpt truncated)
```

…contains an implementation of the NASCell:

```python
!pip install tensorflow-addons
```

Next, we load the required modules and initialize the random seeds:

```python
import random
import matplotlib.pyplot as plt
import numpy as np
```
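A minimal sketch of keras_tuner's Hyperband with the factor of 3 mentioned above; the model, search space, and epoch budget are assumptions, not the book's setup:

```python
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    # one tunable hyperparameter: the hidden layer width
    units = hp.Int("units", min_value=32, max_value=256, step=32)
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(units, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

tuner = kt.Hyperband(
    build_model,
    objective="val_accuracy",
    max_epochs=30,
    factor=3,  # the reduction factor recommended above
)
# tuner.search(x_train, y_train, validation_split=0.2)  # assumed data
```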
Machine Learning PyTorch Tutorial
expand_dims(x, 1); ref: https://github.com/wkentaro/pytorch-for-numpy-users
Tensors - Device:
- Tensors & modules are computed on the CPU by default. Use .to() to move tensors to the appropriate device.
- CPU: x… (truncated; a sketch follows this list)
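A minimal sketch of moving tensors and modules between devices with `.to()`, using standard PyTorch device names:

```python
import torch

x = torch.randn(3, 4)
x = x.to("cpu")                      # explicit CPU placement (the default)
if torch.cuda.is_available():
    x = x.to("cuda")                 # move to the first GPU

model = torch.nn.Linear(4, 2).to(x.device)  # module parameters follow too
y = model(x)                         # inputs and parameters must share a device
```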
深度学习下的图像视频处理技术 - 沈小勇
[Sun et al., 2015], [Schuler et al., 2016], [Xiao et al., 2016], etc.: substitute a few traditional modules with learned parameters. More recent: [Nah et al., 2017], [Kim et al., 2017], [Su et al., 2017], [Wieschollek…
动手学深度学习 v2.0

```python
# … (excerpt truncated)
from matplotlib import pyplot as plt
from matplotlib_inline import backend_inline
d2l = sys.modules[__name__]
```

Most of the code in this book is based on PyTorch, an open-source deep learning framework that is very popular in the research community. All of the code in this book has been tested under the latest version of PyTorch. However…

```python
for idx, module in enumerate(args):
    # Here, module is an instance of a Module subclass. We save it in the
    # Module class's member variable _modules; the type of _modules is OrderedDict.
    self._modules[str(idx)] = module

def forward(self, X):
    # The OrderedDict guarantees that members are traversed in the order
    # in which they were added.
    for block in self._modules.values():
        X = block(X)
    return X
```

The `__init__` function adds each module, one by one, to the ordered dictionary `_modules`. Readers may wonder why every Module has a `_modules` attribute, and why we use it rather than defining a Python list ourselves. In short, the main advantage of `_modules` is that during module parameter initialization, the system knows to look in the `_modules` dictionary for the sub-blocks whose parameters need to be initialized.
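The fragments above omit the surrounding class. A minimal self-contained completion (the class name MySequential follows the book's convention; the example shapes are assumptions):

```python
import torch
from torch import nn

class MySequential(nn.Module):
    def __init__(self, *args):
        super().__init__()
        for idx, module in enumerate(args):
            # registering in _modules lets parameter init and .parameters()
            # discover the sub-blocks
            self._modules[str(idx)] = module

    def forward(self, X):
        # traverse sub-modules in insertion order
        for block in self._modules.values():
            X = block(X)
        return X

net = MySequential(nn.Linear(20, 256), nn.ReLU(), nn.Linear(256, 10))
print(net(torch.rand(2, 20)).shape)  # torch.Size([2, 10])
```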
共 12 条
- 1
- 2













