《Efficient Deep Learning Book》[EDL] Chapter 1 - Introduction
…often tolerate approximate responses, since often there are no exact answers. Machine learning algorithms help build models which, as the name suggests, are approximate mathematical models of, for example, which outputs you would end up clicking on at a particular moment. With more data and more sophisticated algorithms, these models can be trained to be fairly accurate over the longer term. Figure 1-1: Relation between training algorithms. There has been substantial progress in machine learning algorithms over the past two decades. Stochastic Gradient Descent (SGD) and Backpropagation were the well-known algorithms designed…
21 pages | 3.17 MB | 1 year ago
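The snippet above names SGD as the workhorse training algorithm. A minimal sketch of one SGD update for a one-parameter linear model y ≈ w·x with squared-error loss; the data points and learning rate are made-up illustrations, not from the book:

```python
# One Stochastic Gradient Descent step for a 1-D linear model y ≈ w * x.
def sgd_step(w, x, y, lr=0.1):
    pred = w * x
    grad = 2 * (pred - y) * x   # d/dw of (w*x - y)^2
    return w - lr * grad

w = 0.0
for x, y in [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]:  # samples from y = 2x
    w = sgd_step(w, x, y)
# after three steps w has moved toward the true slope 2.0
print(round(w, 3))
```

Each step nudges the parameter against the gradient of the loss on a single example, which is what makes the procedure "stochastic".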
《Efficient Deep Learning Book》[EDL] Chapter 4 - Efficient Architectures
…number of examples / more than two features? In those cases, we could use classical machine learning algorithms like the Support Vector Machine4 (SVM) to learn classifiers that would do this for us. We could … models: 1. Embedding Table Generation: Generate the embeddings for the inputs using machine learning algorithms of your choice. 2. Embedding Lookup: Look up the embeddings for the inputs in the embedding table. … embeddings. One example of an automated embedding generation technique is the word2vec family of algorithms6 (apart from others like GloVe7), which can learn embeddings for word tokens in NLP tasks. The…
53 pages | 3.92 MB | 1 year ago
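The two-step recipe in the snippet (generate an embedding table, then look embeddings up) can be sketched as follows; the vocabulary, dimensions, and random initialization are illustrative stand-ins for a table learned with word2vec or GloVe:

```python
import numpy as np

# Step 1: Embedding Table Generation (random here; learned in practice).
vocab = {"cat": 0, "dog": 1, "car": 2}
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(len(vocab), 4))

# Step 2: Embedding Lookup by token id.
def embed(token):
    return embedding_table[vocab[token]]

print(embed("dog").shape)  # a 4-dimensional embedding vector
```

The lookup is a plain row index, which is why inference with a precomputed table is cheap regardless of how the embeddings were trained.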
《Efficient Deep Learning Book》[EDL] Chapter 7 - Automation
…rest of the states (unimportant parameters). Figure 7-2: A comparison of hyperparameter search algorithms for two hyperparameters. The blue contours show the regions with positive results, while the red … search approach on the budget allocation to cap the resource utilization. Multi-armed bandit based algorithms allocate a finite amount of resources to a set of hyperparameter configurations. The trials … for HyperBand to terminate the runs sooner if they do not show improvements for a number of epochs. Algorithms like HyperBand bring the field of HPO closer to the evolutionary approaches, which are based on…
33 pages | 2.48 MB | 1 year ago
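The multi-armed-bandit budgeting the snippet describes is the core of successive halving, which HyperBand builds on: spend a small budget on many configurations, then repeatedly keep the better half and double the budget. The quadratic "validation loss" below is a made-up stand-in for real training runs:

```python
import random

def loss(cfg, budget):
    # Pretend the true loss (cfg - 0.1)^2 is revealed more clearly
    # as more budget (e.g., epochs) is spent on the trial.
    return (cfg - 0.1) ** 2 + random.uniform(0, 1.0 / budget)

random.seed(0)
configs = [random.uniform(0, 1) for _ in range(8)]  # candidate learning rates
budget = 1
while len(configs) > 1:
    scored = sorted(configs, key=lambda c: loss(c, budget))
    configs = scored[: len(configs) // 2]  # terminate the weaker half early
    budget *= 2
print(f"surviving config: {configs[0]:.3f}")
```

Weak trials are cut off after a small budget, so most of the total compute goes to the few promising configurations.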
QCon Beijing 2018 - "From Keyboard Input to Neural Networks: Deep Learning at Bloomberg" (《从键盘输入到神经网络--深度学习在彭博的应用》) - 李碧野
…https://commons.wikimedia.org/wiki/Category:Machine_learning_algorithms#/media/File:Moving_From_unknown_to_known_feature_spaces_based_on_TS-ELM_with_random_kernels_and_connections.tif
https://commons.wikimedia.org/wiki/Category:Machine_learning_algorithms#/media/File:OPTICS.svg
May be re-distributed in accordance with the terms of the CC-SA 4.0 license…
64 pages | 13.45 MB | 1 year ago
Lecture 1: Overview
…Research Fellow, National University of Singapore, Singapore. Research interests: Distributed Algorithms and Systems, Wireless Networks, Mobile Computing, Internet of Things. Feng Li (SDU), Overview, September 6, 2023. … Course Information: We will investigate fundamental concepts, techniques, and algorithms in machine learning. The topics include linear regression, logistic regression, regularization… Active Learning. Basic idea: traditional supervised learning algorithms passively accept training data. Instead, query for annotations on informative images from the unlabeled…
57 pages | 2.41 MB | 1 year ago
Machine Learning PyTorch Tutorial
…Validation, Testing. Step 4: torch.optim. Load Data. torch.optim: gradient-based optimization algorithms that adjust network parameters to reduce error (see the Adaptive Learning Rate lecture video). … call optimizer.step() to adjust model parameters. See the official documentation for more optimization algorithms. Training & Testing Neural Networks in PyTorch: Define Neural Network, Loss Function, Optimization…
48 pages | 584.86 KB | 1 year ago
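The optimizer loop the snippet outlines (zero gradients, compute loss, loss.backward(), optimizer.step()) can be sketched with a tiny linear model; the model, data, and learning rate are illustrative choices, not from the tutorial:

```python
import torch

model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.tensor([[1.0], [2.0], [3.0]])
y = 2 * x  # targets from y = 2x

for _ in range(500):
    optimizer.zero_grad()                                 # clear old gradients
    loss = torch.nn.functional.mse_loss(model(x), y)      # compute loss
    loss.backward()                                       # backpropagate
    optimizer.step()                                      # adjust parameters

print(f"final loss: {loss.item():.6f}")
```

Swapping `torch.optim.SGD` for `torch.optim.Adam` (one of the other optimizers in the official documentation) leaves the rest of the loop unchanged.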
keras tutorial
…one of the major subfields of the machine learning framework. Machine learning is the study of the design of algorithms, inspired by the model of the human brain. Deep learning is becoming more popular in data science… open-source machine learning library. It is used for classification, regression, and clustering algorithms. Before moving to the installation, it requires the following: Python version 3.5 or higher… process huge amounts of features, which makes deep learning a very powerful tool. Deep learning algorithms are also useful for the analysis of unstructured data. Let us go through the basics of deep learning…
98 pages | 1.57 MB | 1 year ago
Lecture 6: Support Vector Machine
…by φ(x(i))ᵀφ(x(j)) = K(x(i), x(j)). Most learning algorithms are like that: SVM, linear regression, etc. Many of the unsupervised learning algorithms too can be kernelized (e.g., k-means clustering, Principal…
82 pages | 773.97 KB | 1 year ago
Lecture Notes on Support Vector Machine
…x(j) with K(x(i), x(j)). Actually, most learning algorithms are like that, such as SVM, linear regression, etc. Many of the unsupervised learning algorithms (e.g., k-means clustering, Principal Component…
18 pages | 509.37 KB | 1 year ago
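Both SVM snippets above describe the kernel trick: replacing the explicit inner product φ(x)ᵀφ(z) with a kernel K(x, z). A minimal sketch for the degree-2 polynomial kernel (no bias term), where φ maps (x1, x2) to (x1², √2·x1·x2, x2²) and K(x, z) = (xᵀz)² gives the same value without ever computing φ:

```python
import math

def phi(v):
    # explicit degree-2 feature map for a 2-D input
    x1, x2 = v
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

def K(x, z):
    # polynomial kernel: (x . z)^2
    return (x[0] * z[0] + x[1] * z[1]) ** 2

x, z = (1.0, 2.0), (3.0, 0.5)
explicit = sum(a * b for a, b in zip(phi(x), phi(z)))
print(explicit, K(x, z))  # identical values, but K never builds φ
```

For higher-degree kernels the explicit feature space grows combinatorially, while evaluating K stays O(d) in the input dimension, which is the whole point of kernelizing SVM, regression, k-means, and PCA.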
《Efficient Deep Learning Book》[EDL] Chapter 5 - Advanced Compression Techniques
…arXiv:1510.00149 (2015). [16] Bottou, Léon, and Yoshua Bengio. "Convergence properties of the k-means algorithms." Advances in Neural Information Processing Systems 7 (1994). [15] David Arthur and Sergei Vassilvitskii. "k-means++: The advantages of careful seeding." In Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA '07), Society for Industrial and Applied Mathematics, USA, 1027–1035. … with tf.GradientTape()…
34 pages | 3.18 MB | 1 year ago
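The k-means references above underpin weight clustering for compression: cluster a layer's weights into k centroids and store only the centroid values plus a small per-weight index. A minimal sketch with a toy weight vector and plain NumPy (the chapter itself works with TensorFlow; k, the weights, and the seed are illustrative):

```python
import numpy as np

def kmeans_1d(weights, k, iters=20, seed=0):
    # Lloyd's algorithm on scalar weights: assign to nearest centroid,
    # then move each centroid to the mean of its cluster.
    rng = np.random.default_rng(seed)
    centroids = rng.choice(weights, size=k, replace=False)
    for _ in range(iters):
        assign = np.abs(weights[:, None] - centroids[None, :]).argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                centroids[j] = weights[assign == j].mean()
    return centroids, assign

weights = np.array([0.1, 0.12, 0.5, 0.52, -0.3, -0.28])
centroids, assign = kmeans_1d(weights, k=3)
compressed = centroids[assign]  # quantized weights: k values + 6 indices
print(np.round(compressed, 2))
```

Storage drops from one float per weight to one small integer index per weight plus k floats, at the cost of the quantization error visible in `compressed`.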
共 18 条
- 1
- 2













