Build Python Apps with Serverless — Fei Lianghong (费良宏), Technical Evangelist, Amazon Web Services (lianghon@amazon.com), 2019. A deck on developing serverless applications with Python: compute (AWS Lambda, AWS Fargate), integration (Amazon API Gateway, Amazon SNS, Amazon SQS, AWS Step Functions), and data storage (Amazon Aurora Serverless, Amazon S3, Amazon DynamoDB, AWS AppSync); the evolution of serverless compute platforms from AWS Lambda (2014.11) through OpenFaaS, Oracle Fn, and Project Riff; what makes serverless attractive (source: https://aws.amazon.com/cn/solutions/case-studies/autodesk-serverless/); and orchestrating complex parallel tasks as workflows. 35 pages | 7.81 MB
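The Lambda-centric workflows this deck describes come down to small Python handler functions. A minimal hedged sketch follows; the event shape and function name are illustrative assumptions, not a fixed AWS schema:

```python
import json

def handler(event, context):
    """Minimal AWS Lambda-style handler: return a greeting for the given name.

    The event shape ({"name": ...}) is an assumed example for illustration.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation; the context argument is unused here, so None suffices.
print(handler({"name": "serverless"}, None))
```

Locally you call the function directly, as above; deployed behind Amazon API Gateway, the same function would receive the HTTP request as its event.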
Python and the Cloud: A Brief Look at Python-Native Applications on AWS — Zhang Xiaofeng (张孝峰). Sets Python's release timeline (v2.6 2008/10, v3.0 2008/12, v2.7 2010/7, v3.7 2018/6) against AWS milestones (Bezos begins thinking about microservices in 2000; development starts in 2004; Amazon S3 2006/3, Amazon SQS 2006/7, Amazon EC2 2006/8; AWS Lambda 2014/11; 22 regions and 165 services by 2019/10). Python is called a "Swiss Army knife," covering compute, storage, databases, networking, analytics, robotics, machine learning and AI, IoT, mobile, security, hybrid cloud, VR/AR, media, and application development, deployment, and management. How to manage and use a vast cloud API surface (over 165 services and thousands of distinct APIs, e.g. Amazon Athena, Amazon Redshift) through the AWS Tools and SDKs: Python (boto3), C++, PHP, .NET, and more. Serverless on AWS goes beyond Lambda: AWS Lambda, Amazon SNS, Amazon SQS, AWS Step Functions, Amazon DynamoDB, Amazon Kinesis, AWS Amplify, AWS AppSync, Amazon S3, Amazon CloudFront, Amazon API Gateway, Amazon Athena; example: a media-analysis solution driven by user uploads. 42 pages | 8.12 MB
Deploying and Releasing Your Globally Facing Python Serverless Application on AWS — Xie Hong'en (谢洪恩). Covers packaging and distribution, taking serverless to the next level, and why serverless: Amazon's development transformation in 2001–2002 (a monolithic application and matching teams in 2001; the lesson learned was to decompose for agility). APIs are the front door of microservices: mobile apps, websites, and services reach publicly accessible endpoints through Amazon CloudFront and Amazon API Gateway (with caching and Amazon CloudWatch monitoring), and from there any other AWS service or endpoints on Amazon EC2 in your VPC. Managed messaging: Amazon Simple Queue Service (queues: simple, fully managed, any volume) and Amazon Simple Notification Service (pub/sub: simple, fully managed, flexible). 53 pages | 24.15 MB
Django Q Documentation, Release 0.6.4. Covers the supported brokers (Redis, Disque, IronMQ, Amazon SQS) and configuration options such as django_redis, disque_nodes, disque_auth, iron_mq, sqs, bulk, cache, and cpu_affinity, plus tasks (async, groups, synchronous testing, connection pooling) and schedules. Installation notes: hiredis improves parsing during high loads ($ pip install hiredis); Boto3 is used for the Amazon SQS broker in favor of the now-deprecated boto library ($ pip install boto3); see the iron-mq library for IronMQ configuration details. To use Amazon SQS as a broker you need to provide the AWS region and credentials. 53 pages | 512.86 KB
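The SQS broker connection the excerpt alludes to is configured in the Django settings module; a sketch based on the documented Q_CLUSTER format, with all credential values as placeholders:

```python
# Django settings.py fragment: a Django Q cluster backed by Amazon SQS.
# Every value below is an illustrative placeholder, not a working credential.
Q_CLUSTER = {
    "name": "SQSExample",  # cluster name (assumed example)
    "workers": 4,          # worker processes in the cluster
    "timeout": 60,         # seconds before a running task is killed
    "retry": 90,           # seconds before an unacknowledged task is retried
    "sqs": {
        "aws_region": "us-east-1",
        "aws_access_key_id": "AKIA...",        # placeholder
        "aws_secret_access_key": "secret...",  # placeholder
    },
}
```

With this in place, starting the cluster (`python manage.py qcluster`) would pull tasks from SQS instead of the default Redis broker.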
Scrapy 1.7 Documentation. Although designed for web scraping, Scrapy can also extract data using APIs (such as Amazon Associates Web Services) or serve as a general-purpose web crawler. Feed exports let you easily change the export format (XML or CSV, for example) or the storage backend (FTP or Amazon S3, for example), and you can write an item pipeline to store items in a database instead. For active FTP, set FEED_STORAGE_FTP_ACTIVE to True. With the S3 storage backend, feeds are stored on Amazon S3 using the s3 URI scheme, e.g. s3://mybucket/path/to/export.csv. 391 pages | 598.79 KB
Scrapy 2.6 Documentation — same feed-export excerpt as above; the S3 storage backend uses delayed file delivery. 475 pages | 667.85 KB
Scrapy 2.7 Documentation — same feed-export excerpt as above; the S3 storage backend uses delayed file delivery. 490 pages | 682.20 KB
Scrapy 1.8 Documentation — same feed-export excerpt as above. 451 pages | 616.57 KB
Scrapy 1.1 Documentation — same feed-export excerpt as above. 322 pages | 582.29 KB
Scrapy 1.2 Documentation — same feed-export excerpt as above. 330 pages | 548.25 KB
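All the Scrapy releases listed above document exporting feeds to Amazon S3. A hedged settings.py sketch, where the bucket name and credentials are placeholders; Scrapy 2.1+ uses the FEEDS dict shown here, while the 1.x releases above used the older FEED_URI/FEED_FORMAT settings noted in the comment:

```python
# Scrapy settings.py sketch: export scraped items to Amazon S3 as CSV.
# The bucket and credential values are placeholders, not real resources.
FEEDS = {
    "s3://mybucket/path/to/export.csv": {
        "format": "csv",
    },
}
AWS_ACCESS_KEY_ID = "AKIA..."      # placeholder
AWS_SECRET_ACCESS_KEY = "secret"   # placeholder

# Equivalent pre-2.1 settings (the Scrapy 1.x docs listed above):
# FEED_URI = "s3://mybucket/path/to/export.csv"
# FEED_FORMAT = "csv"
```

Swapping the URI scheme (e.g. to ftp:// or a local path) switches the storage backend without touching the spider code, which is the point of the feed-export excerpt repeated across these versions.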
176 results in total