PyTorch Release Notes
… R418, R440, R460, and R520 drivers, which are not forward-compatible with CUDA 12.1. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Volta™, NVIDIA Turing™, NVIDIA Ampere architecture, and NVIDIA Hopper™ architecture families. For a list of GPUs to which this compute capability corresponds, see CUDA GPUs. For additional support details …
0 credits | 365 pages | 2.94 MB | 1 year ago

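A quick way to check which CUDA build and compute capability a given environment actually has before consulting the compatibility tables referenced above (a minimal sketch, not part of the release notes):

    import torch

    print(torch.version.cuda)                        # CUDA version the installed wheel was built against
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))         # e.g. "NVIDIA A100-SXM4-40GB"
        print(torch.cuda.get_device_capability(0))   # e.g. (8, 0) for the Ampere architecture
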
AI大模型千问 qwen 中文文档 (Qwen Chinese documentation)
… "user", "content": "Show me the python code for quick sorting a list of integers." } ], "max_tokens": 512 }' | jq -r '.choices[0].message.content'
1.11.5 Using the Chat GUI …
    LoraArguments:
        lora_r: int = 64
        lora_alpha: int = 16
        lora_dropout: float = 0.05
        lora_target_modules: List[str] = field(
            default_factory=lambda: [
                "q_proj", "k_proj", "v_proj", "o_proj", "up_proj", "gate_proj", …
    # Check this issue https://github.com/huggingface/peft/issues/746 for more information.
    if (
        list(pathlib.Path(training_args.output_dir).glob("checkpoint-*"))
        and not training_args.use_lora
    ): trainer …
0 credits | 56 pages | 835.78 KB | 1 year ago

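The curl fragment above targets an OpenAI-compatible chat endpoint; the same request can be issued from Python. A minimal sketch, assuming a local server on port 8000 serving a Qwen chat model (the base URL and model name are assumptions, not taken from the excerpt):

    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
    response = client.chat.completions.create(
        model="Qwen/Qwen1.5-7B-Chat",  # hypothetical model name; use whatever your server registers
        messages=[{"role": "user",
                   "content": "Show me the python code for quick sorting a list of integers."}],
        max_tokens=512,
    )
    print(response.choices[0].message.content)  # the same field the jq filter extracts
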
keras tutorial
… Keras in this chapter. Available modules: let us first see the list of modules available in Keras.
Initializers: provides a list of initializer functions. Covered in detail in the Keras Layers chapter.
Regularizers: provides a list of regularizer functions. Covered in detail in the Keras Layers chapter.
Constraints: provides a list of constraint functions. Covered in detail in the Keras Layers chapter.
Activations: provides a list of activation functions. Covered in detail in the Keras Layers chapter.
Losses: provides a list of loss functions. Covered in detail …
0 credits | 98 pages | 1.57 MB | 1 year ago

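To make the role of these modules concrete, here is a small sketch (not from the tutorial) wiring an initializer, regularizer, constraint, activation, and loss into one layer and the compile step, assuming a recent TensorFlow 2.x install:

    from tensorflow import keras

    layer = keras.layers.Dense(
        32,
        kernel_initializer=keras.initializers.GlorotUniform(),   # Initializers module
        kernel_regularizer=keras.regularizers.L2(1e-4),          # Regularizers module
        kernel_constraint=keras.constraints.MaxNorm(3.0),        # Constraints module
        activation=keras.activations.relu,                       # Activations module
    )
    model = keras.Sequential([keras.Input(shape=(16,)), layer, keras.layers.Dense(1)])
    model.compile(optimizer="adam", loss=keras.losses.MeanSquaredError())   # Losses module
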
【PyTorch深度学习-龙龙老师】-测试版202112 (PyTorch Deep Learning, Teacher Longlong, beta 2021-12)
… cmd.exe. Alternatively, click the Start menu and type "cmd" to find and open cmd.exe. Running the conda list command shows the libraries installed in the Python environment; in a freshly installed environment the listed libraries are the ones bundled with Anaconda, as shown in Figure 1.24. If conda list prints a normal listing of libraries, Anaconda was installed successfully; if the conda command is not recognized, the installation failed …
… Exporting a PyTorch tensor's data to NumPy array format: Out[3]: array([1. , 2. , 3.3], dtype=float32). Vectors, matrices, and tensors can be created by passing a Python list to the torch.tensor() function. For example, to create a one-element vector: In [4]: a = torch.tensor([1.2])  # create a one-element vector …
… The Python list is a very important data container in Python programs: data is commonly loaded into an array or list, converted to the Tensor type, processed with PyTorch operations, and then exported back to an array or list for other modules to consume. The torch.tensor() function creates a new Tensor from a Python list object or a NumPy …
0 credits | 439 pages | 29.91 MB | 1 year ago

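A short, runnable sketch of the list to tensor to NumPy round trip described in this excerpt (values chosen to match the printed output shown above):

    import torch

    a = torch.tensor([1.0, 2.0, 3.3])   # build a tensor from a Python list
    print(a.numpy())                    # export to NumPy: array([1. , 2. , 3.3], dtype=float32)

    b = torch.tensor([1.2])             # a one-element vector
    print(b, b.shape)                   # tensor([1.2000]) torch.Size([1])
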
《Efficient Deep Learning Book》[EDL] Chapter 7 - Automation
… the blocks which could produce more complex cells. For primitive operations, NASNet chooses from a list of 13 frequently used operations in convolution networks such as regular convolutions, max pooling …
    cells=2, initial_width=1, initial_channels=4)
    STATE_SPACE = [
        dict(name='hidden_state', values=list(range(2)), count=2),
        dict(
            name=primitives,
            values=['sep_3x3', 'sep_5x5', 'sep_7x7', 'avg_3x3', 'max_3x3', …
… the input feature space using a convolution layer with stride 2 """
    if self.stride == 2:
        inputs = list(
            map(
                lambda inp: layers.Conv2D(
                    self.channels, 3, strides=(2,2), padding='same'
                )(inp), inputs …
0 credits | 33 pages | 2.48 MB | 1 year ago

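The STATE_SPACE fragment above describes a search space as a list of named choices. A hypothetical sketch of sampling one candidate cell from such a space (the helper and its output format are illustrative assumptions, not the book's controller code):

    import random

    STATE_SPACE = [
        dict(name='hidden_state', values=list(range(2)), count=2),
        dict(name='primitives',
             values=['sep_3x3', 'sep_5x5', 'sep_7x7', 'avg_3x3', 'max_3x3'],
             count=2),
    ]

    def sample_block(state_space):
        """Pick one value for every slot of every state variable."""
        return {spec['name']: [random.choice(spec['values']) for _ in range(spec['count'])]
                for spec in state_space}

    print(sample_block(STATE_SPACE))
    # e.g. {'hidden_state': [0, 1], 'primitives': ['sep_5x5', 'avg_3x3']}
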
机器学习课程-温州大学-时间序列总结 (Machine Learning Course, Wenzhou University: Time Series Summary)
… You can also pass a list containing several datetime objects as the index argument to create a Series with a timestamp index:
    date_list = [datetime(2018, 1, 1), datetime(2018, 1, 15), …]
    time_se = pd.Series(np.arange(6), index=date_list)
Creating a time series: if you want the DataFrame object to have a timestamp index, it can be created the same way:
    data_demo = [[11, 22, 33], [44, 55, 66]]
    date_list = [datetime(2018, 1, 23), datetime(2018, 2, 15)]
    time_df = pd.DataFrame(data_demo, index=date_list)
Selecting subsets by the timestamp index: the simplest way to select a subset is to use positional indexing to fetch specific data …
Creating period objects: besides the approach above, a PeriodIndex can also be created by passing a list of date strings directly to the PeriodIndex constructor:
    str_list = ['2010', '2011', '2012']
    pd.PeriodIndex(str_list, freq='A-DEC')
DatetimeIndex is an index structure that refers to a sequence of time points, whereas PeriodIndex is used …
0 credits | 67 pages | 1.30 MB | 1 year ago

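A self-contained version of the pandas fragments in this excerpt (shortened to two dates so the lengths match; pandas 1.0 or later assumed):

    from datetime import datetime
    import numpy as np
    import pandas as pd

    date_list = [datetime(2018, 1, 1), datetime(2018, 1, 15)]
    time_se = pd.Series(np.arange(2), index=date_list)        # Series with a timestamp index

    data_demo = [[11, 22, 33], [44, 55, 66]]
    time_df = pd.DataFrame(data_demo,
                           index=[datetime(2018, 1, 23), datetime(2018, 2, 15)])

    periods = pd.PeriodIndex(['2010', '2011', '2012'], freq='A-DEC')  # yearly periods ending in December
    print(time_se, time_df, periods, sep="\n")
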
《Efficient Deep Learning Book》[EDL] Chapter 5 - Advanced Compression Techniques
… zero.
    def sparsify_smallest(w, sparsity_rate):
        w = w.copy()
        w_1d = np.reshape(w, (-1))
        # Create a list of indices sorted by the absolute magnitude of the weights.
        w_1d_sorted_indices = np.argsort(np.abs(w_1d)) …
… code prepares the input arguments to create a model for pruning. The prunable_blocks variable is the list of names of prunable convolution blocks. We prune all convolution blocks from second (zero indexed) to … the pet segmentation model from chapter four.
    # Pruning start and end blocks
    prunable_blocks = list(map(lambda l: l.name, model.layers[2:13]))
    model_for_pruning = create_model_for_pruning(model, prunable_blocks) …
0 credits | 34 pages | 3.18 MB | 1 year ago

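The excerpt cuts off before sparsify_smallest finishes; a hedged reconstruction of the magnitude-pruning idea it starts (the zeroing step below is an assumption, not the book's exact code):

    import numpy as np

    def sparsify_smallest(w, sparsity_rate):
        """Return a copy of w with the smallest-magnitude fraction of weights set to zero."""
        w = w.copy()
        w_1d = np.reshape(w, (-1))
        # Indices sorted by the absolute magnitude of the weights, smallest first.
        w_1d_sorted_indices = np.argsort(np.abs(w_1d))
        n_to_zero = int(sparsity_rate * w_1d.size)
        w_1d[w_1d_sorted_indices[:n_to_zero]] = 0.0
        return np.reshape(w_1d, w.shape)

    print(sparsify_smallest(np.random.randn(4, 4), 0.5))  # roughly half the entries are zeroed
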
《Efficient Deep Learning Book》[EDL] Chapter 3 - Learning Techniques
… data. We used an image of the whale to demonstrate the effects of transformations visually. The above list is not exhaustive, rather we have used it as a guide to help make better transformation choices. A …
    … LookupError as e:
        import nltk
        nltk.download('wordnet')
    """
    It returns a list of synonyms of the input word.
    The output list may contain the original word.
    """
    def synonyms(word):
        results = set()
        for syn in wordnet.synsets(word):
            for lemma in syn.lemmas():
                results.add(lemma.name())
        return list(results)
    """
    It handles the cases when the synonyms for a word are unavailable. It returns the …
0 credits | 56 pages | 18.93 MB | 1 year ago

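A self-contained variant of the synonym lookup shown above, with the LookupError handling folded into the function (a minimal sketch; the book's surrounding code differs):

    import nltk
    from nltk.corpus import wordnet

    def synonyms(word):
        """Return a list of WordNet synonyms; the list may contain the original word."""
        try:
            synsets = wordnet.synsets(word)
        except LookupError:
            nltk.download('wordnet')   # one-time corpus download
            synsets = wordnet.synsets(word)
        results = set()
        for syn in synsets:
            for lemma in syn.lemmas():
                results.add(lemma.name())
        return list(results)

    print(synonyms('quick'))
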
动手学深度学习 v2.0 (Dive into Deep Learning v2.0)
… Preliminaries (continued from previous page)
    return (hasattr(X, "ndim") and X.ndim == 1 or
            isinstance(X, list) and not hasattr(X[0], "__len__"))
    if has_one_axis(X): X = [X]
    if Y is None: X, Y = [[]] * len(X) …
… defining the shape of the output tensor. Can be a variable number of arguments or a collection like a list or tuple. Keyword arguments: out (Tensor, optional): the output tensor. dtype (torch …
… tensor([1., 1., 1., 1.]) In a Jupyter notebook we can use the ? command to display documentation in another browser window. For example, list? produces content almost identical to help(list) and shows it in a new browser window. In addition, using two question marks, as in list??, displays the Python code that implements the function. 2.7 Consulting the documentation. Summary: the official documentation provides a wealth of descriptions and examples beyond this book …
0 credits | 797 pages | 29.45 MB | 1 year ago

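Outside Jupyter, the same lookups can be done with plain Python (a tiny sketch, not from the book):

    import torch

    print(torch.ones(4))    # tensor([1., 1., 1., 1.])
    help(torch.ones)        # roughly what `torch.ones?` shows in a notebook
    # In a notebook, `list?` shows the docstring and `list??` also shows the source where available.
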
PyTorch Tutorial
… returns a dictionary of trainable parameters with their current values
• model.parameters() - returns a list of all trainable parameters in the model
• model.train() or model.eval()
Putting things together …
… a regular Python class that inherits from the Dataset class. You can think of it as a kind of a Python list of tuples, each tuple corresponding to one point (features, label)
• 3 components:
• __init__(self) …
0 credits | 38 pages | 4.09 MB | 1 year ago

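A minimal custom Dataset along the lines of this excerpt: a thin wrapper that behaves like a Python list of (features, label) tuples (illustrative only; the tutorial's own class may differ):

    import torch
    from torch.utils.data import Dataset

    class TupleDataset(Dataset):
        def __init__(self, features, labels):    # keep references to the raw tensors
            self.features = features
            self.labels = labels

        def __getitem__(self, index):            # return one (features, label) tuple
            return self.features[index], self.labels[index]

        def __len__(self):                       # number of data points
            return len(self.features)

    ds = TupleDataset(torch.randn(10, 3), torch.randint(0, 2, (10,)))
    print(len(ds), ds[0])
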
共 24 条
- 1
- 2
- 3













