QCon Beijing 2018 - "Future Cities: Smart Cities and Deep-Learning-Based Machine Vision" - speaker: Chen Yuheng (陈宇恒)
Outline: • Who we are • Machine vision applications in smart cities • How we build a city-scale AI + smart city system • Lessons from large-scale deep learning systems in production
Speaker bio: • Co-founder and architect, SenseTime (商汤科技) • C++/Go/Rust/Ruby developer • Contributor to several open-source projects • Author of NIPS conference papers • @chyh1990
[Slide figure: company milestone timeline, mid-2011 to 2017]
PyTorch Release Notes
…requires NVIDIA Driver release 530 or later. However, if you are running on a data center GPU (for example, T4 or any other data center GPU), you can use NVIDIA driver release 450.51 (or later R450), 470… A workaround is to manually install a Conda package manager and add the conda path to your PYTHONPATH, for example using export PYTHONPATH="/opt/conda/lib/python3.8/site-packages" if your Conda package manager was… …convergence from NVIDIA Volta™ Tensor Cores by using the latest deep learning example networks and model scripts for training. Each example model trains with mixed precision Tensor Cores on NVIDIA Volta and NVIDIA…
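The mixed-precision training mentioned here is typically driven through torch.cuda.amp. Below is a minimal sketch of such a loop; the tiny linear model and random batches are hypothetical stand-ins, not code from the release notes:

    import torch
    import torch.nn as nn

    # Toy stand-ins: a tiny linear "model" and random batches, just to make
    # the loop runnable; replace with a real network and dataloader.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = nn.Linear(32, 10).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

    for step in range(3):
        x = torch.randn(8, 32, device=device)
        y = torch.randint(0, 10, (8,), device=device)
        optimizer.zero_grad()
        # autocast runs eligible ops in FP16, which is what engages Tensor Cores
        with torch.cuda.amp.autocast(enabled=(device == "cuda")):
            loss = nn.functional.cross_entropy(model(x), y)
        scaler.scale(loss).backward()  # scale the loss to avoid FP16 underflow
        scaler.step(optimizer)         # unscale gradients, then apply the update
        scaler.update()                # adapt the loss scale for the next step

On a CPU-only machine the enabled=False flags make the scaler and autocast pass-throughs, so the same loop runs unchanged.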
《Efficient Deep Learning Book》[EDL] Chapter 3 - Learning Techniques
…samples to achieve the same performance, which makes it cheaper to train. Refer to Figure 3-1 for an example of such a model, and note how it achieves accuracy similar to the baseline, but does so with fewer… …labelers look at each example and assign it the label they believe describes it best. The assigned labels are subjective: they depend on each labeler's perception. For example, one human labeler might perceive the digit in Figure 3-2 as a 1 and another might see it as a 7. Figure 3-2: An example of a handwritten digit that can lead human labelers to choose either a 1 or a 7 as the target…
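One common way to encode such labeling ambiguity (a standard technique in this space, not shown in the excerpt itself) is to train with soft targets instead of hard one-hot labels. A minimal Keras sketch, where the 0.6/0.4 split between classes 1 and 7 is a made-up illustration:

    import numpy as np
    import tensorflow as tf

    # Hypothetical ambiguous digit: labelers split 60/40 between "1" and "7",
    # so the target is a soft distribution rather than a one-hot vector.
    soft_target = np.zeros((1, 10), dtype="float32")
    soft_target[0, 1], soft_target[0, 7] = 0.6, 0.4

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    # CategoricalCrossentropy accepts soft (non one-hot) target distributions.
    model.compile(optimizer="adam", loss=tf.keras.losses.CategoricalCrossentropy())
    x = np.random.rand(1, 28, 28).astype("float32")  # stand-in for the image
    model.fit(x, soft_target, epochs=1, verbose=0)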
《Efficient Deep Learning Book》[EDL] Chapter 4 - Efficient Architectures
…we are excited to explain how they work. In the following section we explain them through a toy example, but feel free to jump ahead if you are familiar with the motivation behind them. …We explain these techniques in further detail in Chapter 6. A Petting Zoo for Kids: let's go back to our example of cute and dangerous animals, and represent each animal using two features, say "cute" and "dangerous"… The purpose of this toy example is to illustrate how embeddings work, and we encourage you to construct your own example to understand it better. …represented on the x-axis…
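To make the toy concrete, here is a minimal sketch of an embedding table that maps each animal to a learnable 2-D vector; the animal names and dimensions are hypothetical, not the book's actual values:

    import tensorflow as tf

    # Toy embedding table: 4 hypothetical animals, each mapped to a learnable
    # 2-D vector that training could shape into "cute" and "dangerous" axes.
    animals = ["kitten", "puppy", "tiger", "shark"]
    embedding = tf.keras.layers.Embedding(input_dim=len(animals), output_dim=2)

    ids = tf.constant([0, 2])   # look up "kitten" and "tiger"
    vectors = embedding(ids)    # shape (2, 2): one 2-D vector per animal
    print(vectors.numpy())      # random at initialization; meaningful once trained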
《Efficient Deep Learning Book》[EDL] Chapter 6 - Advanced Learning Techniques - Technical Review
…unlabeled dataset of animal images. The pre-trained model is then fine-tuned for downstream tasks, for example object detection for tigers, segmentation for pets, etc., where the labeled data might be sparse. Figure 6-1: Pre-training and fine-tuning stages, with an example of a large unlabeled animal-image dataset used for pre-training. The pre-trained model… …train models that learn general representations. Doersch et al. extract two patches from a training example and train the model to predict their relative position in the image (refer to Figure 6-4 (a)). They…
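A rough sketch of that pretext task, following the usual relative-patch-position formulation rather than the book's exact code; the patch size, image size, and network are arbitrary stand-ins:

    import numpy as np
    import tensorflow as tf

    # Pretext-task sketch: take a center patch and one of its 8 neighbors from
    # an image, and train a classifier to predict the relative position used.
    def make_pair(image, patch=16):
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                   (0, 1), (1, -1), (1, 0), (1, 1)]
        pos = np.random.randint(8)
        dy, dx = offsets[pos]
        cy, cx = image.shape[0] // 2, image.shape[1] // 2
        center = image[cy:cy + patch, cx:cx + patch]
        neighbor = image[cy + dy * patch:cy + (dy + 1) * patch,
                         cx + dx * patch:cx + (dx + 1) * patch]
        # stack the two patches along channels as a single model input
        return np.concatenate([center, neighbor], axis=-1), pos

    image = np.random.rand(64, 64, 3).astype("float32")  # stand-in for a photo
    x, y = make_pair(image)

    model = tf.keras.Sequential([
        tf.keras.Input(shape=x.shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(8, activation="softmax"),  # 8 relative positions
    ])
    model.compile("adam", "sparse_categorical_crossentropy")
    model.fit(x[None], np.array([y]), verbose=0)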
《Efficient Deep Learning Book》[EDL] Chapter 1 - Introduction
…expensive, since it requires training and paying humans to do the arduous job of going through each example and matching it to the given guidelines. The ImageNet dataset was a big boon in this respect. It has… …How well does the model need to perform on the given task it is solving? For example, when a model is trained to predict whether a given tweet contains offensive text, the user should be… …RAM consumption during inference, inference latency, etc. Using the tweet-classifier example: during the deployment phase, the user will be concerned about inference efficiency and should…
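For the latency metric mentioned above, a quick measurement sketch; a hypothetical tiny model stands in for the tweet classifier, and the batch size and repeat count are arbitrary:

    import time
    import tensorflow as tf

    # Hypothetical tiny model standing in for the deployed tweet classifier.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(128,)),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])
    batch = tf.random.normal((1, 128))

    model(batch)  # warm-up call, so one-time setup cost is excluded
    start = time.perf_counter()
    for _ in range(100):
        model(batch)
    latency_ms = (time.perf_counter() - start) / 100 * 1e3
    print(f"mean inference latency: {latency_ms:.2f} ms per example")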
Lecture 1: Overview
What is Machine Learning? (Contd.)
Example 1. T: playing checkers; P: percentage of games won against an arbitrary opponent; E: playing practice games against itself.
Example 2. T: recognizing hand-written… (truncated)
Example 3. T: categorizing email messages as spam or legitimate; P: percentage of email messages correctly classified; E: a database of emails, some with human-given labels.
Example 4. T: driving… (truncated)
…examples. The learner can query an oracle about the class of an unlabeled example in the environment; the learner can construct an arbitrary example and query an oracle for its label; the learner can design and run experiments…
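The first kind of oracle query corresponds to pool-based active learning with uncertainty sampling. A minimal sketch under that interpretation, with a synthetic pool and a stand-in oracle (all names hypothetical):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Uncertainty sampling: the learner asks the oracle to label the pool
    # example it is least sure about. The "oracle" here is a stand-in rule.
    rng = np.random.default_rng(0)
    X_labeled = rng.normal(size=(20, 2))
    y_labeled = (X_labeled[:, 0] > 0).astype(int)   # initial labeled set
    X_pool = rng.normal(size=(200, 2))              # unlabeled examples

    clf = LogisticRegression().fit(X_labeled, y_labeled)
    proba = clf.predict_proba(X_pool)[:, 1]
    query_idx = int(np.argmin(np.abs(proba - 0.5)))  # most uncertain example

    oracle_label = int(X_pool[query_idx, 0] > 0)     # stand-in oracle answer
    print(f"query example {query_idx}, oracle label: {oracle_label}")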
Keras tutorial
…the modules by using the syntax below. Syntax: conda install -c anaconda <module-name>. For example, to install pandas: conda install -c anaconda pandas. Other modules can be installed in the same way. …layers are stacked sequentially, and finally the output layer predicts something useful about the input data. For example, the input may be an image and the output may be the thing identified in the image, say a "Cat". …the past. In this case a bidirectional RNN is helpful, because it learns from both past and future context. For example, given handwriting samples spread over multiple inputs, if one input is ambiguous, then we…
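A minimal sketch of such a bidirectional RNN in Keras; the input shape and layer sizes are hypothetical, not taken from the tutorial:

    from tensorflow import keras
    from tensorflow.keras import layers

    # The Bidirectional wrapper runs one LSTM forward and one backward over
    # the sequence and concatenates their outputs, so each step sees both
    # past and future context.
    model = keras.Sequential([
        keras.Input(shape=(30, 8)),              # 30 timesteps, 8 features
        layers.Bidirectional(layers.LSTM(16)),
        layers.Dense(1, activation="sigmoid"),   # output layer
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    model.summary()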
《Efficient Deep Learning Book》[EDL] Chapter 7 - Automation
…even for experts. The simplest approach is to try the options and see which ones produce the best results. For example, between quantization and clustering, which one is preferable? What is the performance impact when… …and still there are the ones with continuous parameters. Some choices even have multiple parameters. For example, horizontal flip is a boolean choice, rotation requires a fixed angle or a range of rotation, and… …(or maximizes) an evaluation function f. Formally, writing S for the search space, the search can be defined as finding t* = arg max_{t ∈ S} f(t). Let's understand this using the earlier example of choosing quantization and/or clustering techniques for model optimization. We have a search space…
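A minimal random-search sketch over that kind of search space; the configuration options follow the text, but evaluate() here is a made-up stand-in for an actual train-and-measure run:

    import random

    # Search space: quantize or not, cluster or not (and with how many
    # clusters). A real evaluation function would apply the techniques,
    # fine-tune the model, and return validation accuracy.
    def evaluate(config):
        score = 0.90
        if config["quantize"]:
            score += 0.02
        if config["cluster"]:
            score += 0.01 * config["n_clusters"] / 256
        return score + random.gauss(0, 0.005)  # noise, like real training runs

    space = {
        "quantize": [False, True],
        "cluster": [False, True],
        "n_clusters": [16, 32, 64, 128, 256],
    }

    best, best_score = None, float("-inf")
    for _ in range(20):  # 20 random trials
        config = {k: random.choice(v) for k, v in space.items()}
        score = evaluate(config)
        if score > best_score:
            best, best_score = config, score
    print(best, round(best_score, 4))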
Lecture 7: K-Means
K-Means Algorithm (Lloyd, 1957). In each iteration, (re)assign each example xi to its closest cluster center (based on the smallest Euclidean distance): Ck = {xi | ∥xi − µk∥² ≤ ∥xi − µk′∥² for all k′}… (truncated)
Hierarchical Clustering
Agglomerative (bottom-up) clustering:
1. Start with each example in its own singleton cluster.
2. At each time-step, greedily merge the 2 most similar clusters.
3. Stop when… (truncated)
Divisive (top-down) clustering:
1. Start with all examples in a single cluster.
2. At each time-step, remove the "outsiders" from the least cohesive cluster.
3. Stop when each example is in its own singleton cluster; else go to 2.
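A compact NumPy sketch of Lloyd's iteration; the random toy data, k, seed, and stopping rule are arbitrary choices for illustration:

    import numpy as np

    # Lloyd's algorithm: alternate between assigning points to their nearest
    # center and recomputing each center as the mean of its assigned points.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 2))  # toy data
    k = 3
    centers = X[rng.choice(len(X), size=k, replace=False)]

    for _ in range(100):
        # assignment step: index of the closest center for every example
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # update step: move each center to the mean of its cluster (keep the
        # old center if a cluster happens to end up empty)
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):  # converged
            break
        centers = new_centers
    print(centers)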