Lecture 6: Support Vector Machine

Maximizing the margin is equivalent to minimizing ∥ω∥² = ωᵀω:

    min_{ω,b}  ωᵀω
    s.t.  y⁽ⁱ⁾(ωᵀx⁽ⁱ⁾ + b) ≥ 1,  ∀i

This is a quadratic programming (QP) problem! It can be solved by generic QP methods, e.g., the interior point method (https://en.wikipedia.org/wiki/Interior-point_method).

The problem is convex, so strong duality (p∗ = d∗) holds and the KKT conditions are satisfied at the optimum. The dual is also a quadratic programming problem, in the variables α.

Several off-the-shelf solvers exist to solve such QPs. Some examples: quadprog (MATLAB).
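The primal QP above can be solved numerically with a general-purpose constrained optimizer. Below is a minimal sketch using SciPy's SLSQP method on a tiny, hypothetical linearly separable dataset (the data, variable names, and solver choice are illustrative assumptions, not part of the lecture):

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable data (hypothetical example, not from the lecture)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Decision variables z = [w1, w2, b]; objective is w^T w (the SVM primal)
def objective(z):
    w = z[:2]
    return w @ w

# One inequality constraint per training point: y_i (w^T x_i + b) - 1 >= 0
constraints = [
    {"type": "ineq", "fun": (lambda z, i=i: y[i] * (z[:2] @ X[i] + z[2]) - 1.0)}
    for i in range(len(y))
]

res = minimize(objective, x0=np.zeros(3), constraints=constraints, method="SLSQP")
w, b = res.x[:2], res.x[2]
print("w =", w, "b =", b)

# Every training point should satisfy the margin constraint at the optimum
print(np.all(y * (X @ w + b) >= 1 - 1e-6))
```

A dedicated QP solver (e.g., quadprog in MATLAB, or the dual formulation in α) would be preferred at scale; SLSQP is used here only because it handles small generic constrained problems out of the box.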
共 7 条