《Efficient Deep Learning Book》[EDL] Chapter 5 - Advanced Compression Techniques
… performance tradeoff. Next, the chapter goes over weight sharing using clustering. Weight sharing, and in particular clustering, is a generalization of quantization. As you might have noticed, quantization creates equal-sized quantization ranges (bins), regardless of the frequency of the data. Clustering helps solve that problem by adapting the allocation of precision to match the distribution of the … a convolutional layer which receives a 3-channel input. Each individual 3x3 matrix is a kernel. A column of 3 kernels represents a channel. As you might notice, with such structured sparsity we can obtain …
34 pages | 3.18 MB | 1 year ago
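The snippet contrasts quantization's equal-sized bins with clustering's data-adaptive centroids. A minimal NumPy sketch of weight sharing via 1-D k-means follows; the function name `cluster_weights` and the Lloyd-style loop are illustrative assumptions, not the book's implementation:

```python
import numpy as np

def cluster_weights(weights, n_clusters=4, n_iter=20, seed=0):
    """Weight sharing via 1-D k-means: each weight is replaced by its
    nearest cluster centroid, so only n_clusters distinct float values
    (plus a small integer code per weight) need to be stored.
    Illustrative sketch, not the book's implementation."""
    rng = np.random.default_rng(seed)
    w = weights.ravel()
    # Initialize centroids by sampling distinct weights.
    centroids = rng.choice(w, size=n_clusters, replace=False)
    for _ in range(n_iter):
        # Assignment step: nearest centroid for every weight.
        codes = np.argmin(np.abs(w[:, None] - centroids[None, :]), axis=1)
        # Update step: centroid = mean of the weights assigned to it.
        for k in range(n_clusters):
            if np.any(codes == k):
                centroids[k] = w[codes == k].mean()
    return centroids[codes].reshape(weights.shape), codes

weights = np.random.default_rng(1).normal(size=(3, 3)).astype(np.float32)
shared, codes = cluster_weights(weights, n_clusters=4)
print(np.unique(shared).size)  # at most 4 distinct weight values remain
```

Unlike fixed-width quantization bins, the centroids here drift toward regions where the weight values are dense, which is the adaptivity the snippet describes.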
《Efficient Deep Learning Book》[EDL] Chapter 4 - Efficient Architectures
… the petting zoo. If we revisit the plot in Figure 4-1 with the newly assigned labels in the third column of Table 4-2, we can see a pattern. It is possible to linearly separate the data points belonging … described in chapter 2. We could also incorporate compression techniques such as sparsity, k-means clustering, etc., which will be discussed in the later chapters. 2. Even after compression, the vocabulary … mechanism. On the left is the list of tokens. The tokens are hashed using the hash function in the center column. The output of the hash function % vocab_size is an index in [0, vocab_size - 1] and is used to refer …
53 pages | 3.92 MB | 1 year ago
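The hashing mechanism this snippet describes, hash(token) % vocab_size, can be sketched in a few lines. Using `hashlib.md5` for a stable hash is an assumption here, since the book's exact hash function is not shown:

```python
import hashlib

def token_to_index(token: str, vocab_size: int) -> int:
    """Map an arbitrary token to an index in [0, vocab_size - 1]
    by hashing it and taking the remainder modulo vocab_size.
    Collisions are possible by design: several tokens may share
    one embedding row, trading accuracy for a bounded table size."""
    digest = hashlib.md5(token.encode("utf-8")).digest()
    return int.from_bytes(digest, "big") % vocab_size

vocab_size = 1000
for tok in ["cat", "dog", "supercalifragilistic"]:
    print(tok, "->", token_to_index(tok, vocab_size))
```

Because the mapping is deterministic, no token-to-index table needs to be stored at all; only the fixed-size embedding table remains.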
Lecture 7: K-Means
Outline: 1 Clustering; 2 K-Means Method; 3 K-Means Optimization Problem; 4 Kernel K-Means; 5 Hierarchical Clustering. Feng Li (SDU), December 28, 2021. … Clustering: usually an unsupervised … labels. A good clustering is one that achieves high within-cluster similarity and low inter-cluster similarity. … Similarity can be subjective; clustering only looks … important in clustering. It is also important to define/ask: "Clustering based on what?" … Clustering, some examples: document/image/webpage clustering, image segmentation …
46 pages | 9.78 MB | 1 year ago
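As a hedged illustration of the "K-Means Optimization Problem" item in the outline, the following NumPy sketch evaluates the within-cluster sum-of-squares objective and runs Lloyd iterations, checking that the objective never increases. Variable names are mine, not the lecture's notation:

```python
import numpy as np

def kmeans_objective(X, centroids, codes):
    """Within-cluster sum of squares J = sum_i ||x_i - mu_{c(i)}||^2,
    the quantity that Lloyd's algorithm monotonically decreases."""
    return float(((X - centroids[codes]) ** 2).sum())

def lloyd_step(X, centroids):
    """One assignment step plus one centroid-update step."""
    # Squared distance from every point to every centroid.
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    codes = d2.argmin(axis=1)
    # Each centroid moves to the mean of its assigned points.
    new_centroids = np.array([
        X[codes == k].mean(axis=0) if np.any(codes == k) else centroids[k]
        for k in range(len(centroids))
    ])
    return new_centroids, codes

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(5, 1, (20, 2))])
centroids = X[rng.choice(len(X), size=2, replace=False)]
prev = np.inf
for _ in range(10):
    centroids, codes = lloyd_step(X, centroids)
    J = kmeans_objective(X, centroids, codes)
    assert J <= prev + 1e-9  # the objective never increases
    prev = J
```

Both the assignment step and the update step can only lower (or keep) J, which is why the algorithm converges to a local optimum of the optimization problem.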
Lecture 1: Overview
… Discovering Clusters … Unsupervised Learning: Clustering … Unsupervised Learning: Discovering … Feng Li (SDU), September 6, 2023. … required compared to supervised learning. At the same time, improving the results of unsupervised clustering toward the expectations of the user. With lots of unlabeled data the decision boundary becomes apparent … Constrained Clustering; Distance Metric Learning; Manifold-based Learning; Sparsity-based Learning (Compressed Sensing) … Constrained Clustering: when we have …
57 pages | 2.41 MB | 1 year ago
机器学习课程-温州大学-10机器学习-聚类 (Machine Learning Course, Wenzhou University, 10: Clustering)
… In contrast, in unsupervised learning our data does not come with any labels y; unsupervised learning is mainly divided into clustering, dimensionality reduction, association rules, recommender systems, and so on. The difference between supervised and unsupervised learning … 1. Overview of unsupervised learning methods. Clustering: how would you partition the students in a classroom into 5 groups by interests and height? Dimensionality reduction: how would you map data points from the original high-dimensional space into a lower-dimensional space? Association rules: … … is inside, then the set S is called a convex set; otherwise, it is a non-convex set. … Density-based clustering: DBSCAN. Unlike partitioning and hierarchical clustering methods, DBSCAN (Density-Based Spatial Clustering of Applications with Noise) is a fairly representative density-based clustering algorithm. It defines a cluster as the maximal set of density-connected points, can partition regions of sufficiently high density into clusters, and can discover clusters of arbitrary shape in noisy spatial databases. … the better the clustering result agrees with the ground truth. Broadly speaking, ARI measures the agreement between two data distributions. … References: [1] Hartigan, J. A. and Wong, M. A. "Algorithm AS 136: A K-Means Clustering Algorithm." Journal of the Royal Statistical Society, 1979, 28(1):100-108. [2] Ester, M. …
48 pages | 2.59 MB | 1 year ago
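The DBSCAN definition in the snippet (clusters as maximal sets of density-connected points, with sparse points labeled noise) can be sketched with a minimal NumPy implementation. The parameter names `eps` and `min_pts` follow common convention and are assumptions, not taken from the slides:

```python
import numpy as np

def dbscan(X, eps=0.8, min_pts=4):
    """Minimal DBSCAN sketch: grow each cluster from a core point
    (a point with >= min_pts neighbors within eps) by density
    reachability; points belonging to no cluster stay labeled -1."""
    n = len(X)
    labels = np.full(n, -1)  # -1 marks noise / unassigned points
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i]:
            continue
        visited[i] = True
        if len(neighbors[i]) < min_pts:
            continue  # not a core point; may still join a cluster later
        labels[i] = cluster
        queue = list(neighbors[i])
        while queue:  # breadth-first expansion of the dense region
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border or core point joins cluster
            if not visited[j]:
                visited[j] = True
                if len(neighbors[j]) >= min_pts:
                    queue.extend(neighbors[j])  # expand through core points
        cluster += 1
    return labels

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.2, (20, 2)),   # dense blob 1
               rng.normal(3, 0.2, (20, 2)),   # dense blob 2
               [[10.0, 10.0]]])               # isolated outlier
labels = dbscan(X, eps=0.8, min_pts=4)
print(sorted(set(labels.tolist())))  # two clusters plus -1 noise
```

Note how, as the snippet says, no number of clusters is specified in advance: the count falls out of the density structure, and the outlier is reported as noise rather than forced into a cluster.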
《Efficient Deep Learning Book》[EDL] Chapter 7 - Automation
… between quantization and clustering, which one is preferable? What is the performance impact when both are used together? We have four options: none, quantization, clustering, and both. We would need to … for choosing quantization and/or clustering techniques for model optimization. We have a search space which has two boolean-valued parameters: quantization and clustering. A True value means that the …
33 pages | 2.48 MB | 1 year ago
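The four-option search space described here (quantization and clustering, each True or False) amounts to a tiny grid search. In this sketch, `evaluate` is a hypothetical stand-in for actually training and measuring a model with the chosen optimizations:

```python
from itertools import product

def evaluate(quantization: bool, clustering: bool) -> float:
    """Hypothetical scoring function standing in for a real
    train-and-measure step; the numbers are placeholders."""
    return 0.90 + 0.03 * quantization + 0.02 * clustering

# Two boolean-valued parameters give 2 x 2 = 4 candidate configurations.
search_space = {"quantization": [False, True], "clustering": [False, True]}
trials = [dict(zip(search_space, combo))
          for combo in product(*search_space.values())]
best = max(trials, key=lambda cfg: evaluate(**cfg))
print(len(trials), best)
```

With only four configurations, exhaustive enumeration is trivial; the chapter's automation techniques matter once the search space grows beyond what a grid can cover.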
《TensorFlow 快速入门与实战》7-实战TensorFlow人脸识别 (TensorFlow Quick Start and Practice, 7: Face Recognition with TensorFlow)
… Schroff, F., Kalenichenko, D. and Philbin, J., 2015. FaceNet: A unified embedding for face recognition and clustering. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 815-823). … Facebook DeepFace …
81 pages | 12.64 MB | 1 year ago
机器学习课程-温州大学-10深度学习-人脸识别与风格迁移 (Machine Learning Course, Wenzhou University, 10 Deep Learning: Face Recognition and Style Transfer)
… Face recognition: there is a face database of K people; take an input image; decide whether the image is one of the K people (or unknown). Face verification: given an input image together with a person's ID or name, verify whether the input image is that person. Face Clustering: cluster the faces in the database; K-Means can be applied directly. … 1. Overview of face recognition. Steps of face detection: face localization, which determines whether a face is present along with its position, extent, etc.; face alignment, which transforms the many face images to a unified angle and pose …
34 pages | 2.49 MB | 1 year ago
Lecture 6: Support Vector Machine
… regression, etc. Many of the unsupervised learning algorithms can be kernelized too (e.g., K-means clustering, Principal Component Analysis, etc.). Feng Li (SDU), December 28, 2021. … Kernelized SVM …
82 pages | 773.97 KB | 1 year ago
机器学习课程-温州大学-01机器学习-引言 (Machine Learning Course, Wenzhou University, 01: Introduction)
… Classification: given a tumor's volume and the patient's age, determine whether it is benign or malignant? Regression/prediction: how to predict house prices in Pudong, Shanghai? The future direction of the stock market? 2. Types of machine learning: supervised learning … Clustering: how would you partition the students in a classroom into 5 groups by interests and height? Dimensionality reduction: how would you map data points from the original high-dimensional space into a lower-dimensional space? …
78 pages | 3.69 MB | 1 year ago
19 documents in total













