Celery v5.0.1 Documentation
…following keys: database: the database name to connect to (defaults to celery). taskmeta_collection: the collection name used to store task metadata (defaults to celery_taskmeta). max_pool_size: passed as max_pool_size… result_backend = 'mongodb://localhost:27017/' … mongodb_backend_settings = { 'database': 'mydb', 'taskmeta_collection': 'my_taskmeta_collection', } … Redis backend settings: Configuring the backend URL. Note: the Redis backend… result_backend set to an ArangoDB URL: result_backend = 'arangodb://username:password@host:port/database/collection' … arangodb_backend_settings. Default: {} (empty mapping). This is a dict supporting the following…
0 credits | 2313 pages | 2.13 MB | 1 year ago

Celery v5.0.2 Documentation
(same excerpt as the v5.0.1 entry above)
0 credits | 2313 pages | 2.14 MB | 1 year ago

Celery v5.0.0 Documentation
(same excerpt as the v5.0.1 entry above)
0 credits | 2309 pages | 2.13 MB | 1 year ago

Celery v5.0.5 Documentation
(same excerpt as the v5.0.1 entry above)
0 credits | 2315 pages | 2.14 MB | 1 year ago
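The Celery excerpts above all describe the same MongoDB result-backend settings. As a minimal sketch, a `celeryconfig.py` module using the names from the excerpt could look like this (the URL and the database/collection names are the illustrative values quoted above, not defaults):

```python
# celeryconfig.py -- sketch of a MongoDB result-backend configuration,
# using the illustrative names from the documentation excerpt above.
result_backend = 'mongodb://localhost:27017/'

mongodb_backend_settings = {
    'database': 'mydb',                               # defaults to "celery"
    'taskmeta_collection': 'my_taskmeta_collection',  # defaults to "celery_taskmeta"
}
```

A Celery application would then load it with `app.config_from_object('celeryconfig')`.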
sqlalchemy tutorial
…object of the MetaData class from SQLAlchemy. MetaData is a collection of Table objects and their associated schema constructs. It holds a collection of Table objects as well as an optional binding to an… …internally as the interface between the mapped class and the table. Each Table object is a member of the larger collection known as MetaData, and this object is available via the .metadata attribute of the declarative… …the session. Its state is persisted in the database on the next flush operation. add_all(): adds a collection of objects to the session. commit(): flushes all items and any transaction in progress. delete()…
0 credits | 92 pages | 1.77 MB | 1 year ago
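The excerpt's point, that MetaData is a collection of Table objects, can be shown with a short SQLAlchemy Core sketch (assuming SQLAlchemy is installed; the table and column names are hypothetical):

```python
from sqlalchemy import MetaData, Table, Column, Integer, String, create_engine

# MetaData is a collection of Table objects and their schema constructs.
metadata = MetaData()

users = Table(
    'users', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(50)),
)

# Defining the Table registers it in the collection under its name.
print('users' in metadata.tables)  # True

# Bind to an in-memory SQLite database and emit CREATE TABLE for every
# table in the collection.
engine = create_engine('sqlite:///:memory:')
metadata.create_all(engine)
```

In the declarative (ORM) style described by the tutorial, the same MetaData object is reached through the `.metadata` attribute of the declarative base class.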
Celery 3.0 Documentation
…process. class celery.result.ResultSet(results, app=None, ready_barrier=None, **kwargs) [source]: A collection of results. Parameters: results (Sequence[AsyncResult]) – list of result instances. add(result) [source] … Password censored if disabled. cleanup() [source]: delete expired meta-data. collection [source]: get the meta-data task collection. database [source]: get the database from the MongoDB connection. … performs authentication. decode(data) [source], encode(data) [source], expires_delta [source], group_collection [source]: get the meta-data task collection. groupmeta_collection = u'celery_groupmeta', host = u'localhost', max_pool_size = 10 …
0 credits | 2110 pages | 2.23 MB | 1 year ago

Celery v4.0.0 Documentation
(same excerpt as the Celery 3.0 entry above)
0 credits | 2106 pages | 2.23 MB | 1 year ago
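The `celery.result.ResultSet` excerpt describes a container that aggregates several task results and reports on them collectively. A toy sketch of that idea (an illustration only, not Celery's actual implementation; `FakeResult` stands in for `AsyncResult`):

```python
# Simplified sketch of the idea behind celery.result.ResultSet: a collection
# of task results queried as a group. Illustrative only, not Celery's code.
class FakeResult:
    """Stand-in for AsyncResult: holds a value and a completion flag."""
    def __init__(self, value=None, done=False):
        self.value, self.done = value, done

    def ready(self):
        return self.done


class ResultSetSketch:
    def __init__(self, results):
        self.results = list(results)

    def add(self, result):
        # Append a result if it is not already a member of the set.
        if result not in self.results:
            self.results.append(result)

    def ready(self):
        # True once every member result has completed.
        return all(r.ready() for r in self.results)


rs = ResultSetSketch([FakeResult(1, done=True), FakeResult(2, done=False)])
print(rs.ready())  # False: one member is still pending
```

The real class additionally supports joining, iteration over results, and revocation; the sketch only shows the "collection of results" shape named in the excerpt.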
Scrapy 0.14 Documentation
…Built-in services: Logging: understand the simple logging facility provided by Scrapy. Stats Collection: collect statistics about your scraping crawler. Sending e-mail: send email notifications when… …from HTML and XML sources. Built-in support for cleaning and sanitizing the scraped data using a collection of reusable filters (called Item Loaders) shared between all the spiders. Built-in support for… …auto-detection, for dealing with foreign, non-standard and broken encoding declarations. Extensible stats collection for multiple spider metrics, useful for monitoring the performance of your spiders and detecting…
0 credits | 235 pages | 490.23 KB | 1 year ago

Scrapy 0.14 Documentation
…(table of contents) … 63; 4.2 Stats Collection … 65 … from HTML and XML sources. Built-in support for cleaning and sanitizing the scraped data using a collection of reusable filters (called Item Loaders) shared between all the spiders. Built-in support for… …for dealing with foreign, non-standard and broken encoding declarations. Extensible stats collection for multiple spider metrics, useful for monitoring the performance of your spiders and detecting…
0 credits | 179 pages | 861.70 KB | 1 year ago

Scrapy 0.16 Documentation
…(table of contents) … 63; 4.2 Stats Collection … 65 … from HTML and XML sources. Built-in support for cleaning and sanitizing the scraped data using a collection of reusable filters (called Item Loaders) shared between all the spiders. Built-in support for… …more consistent on large projects. See genspider command for more details. Extensible stats collection for multiple spider metrics, useful for monitoring the performance of your spiders and detecting…
0 credits | 203 pages | 931.99 KB | 1 year ago
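The Scrapy excerpts repeatedly mention "extensible stats collection for multiple spider metrics." A minimal sketch of that key/value-with-counters idea, mirroring the shape of Scrapy's StatsCollector API (`set_value`/`inc_value`/`get_value`), though not Scrapy's own class; the stat names are hypothetical:

```python
# Toy stats collector mirroring the shape of Scrapy's StatsCollector API.
# Illustrative sketch only; Scrapy provides its own implementations.
class StatsSketch:
    def __init__(self):
        self._stats = {}

    def set_value(self, key, value):
        self._stats[key] = value

    def inc_value(self, key, count=1, start=0):
        # Counter-style metric: initialise at `start`, then add `count`.
        self._stats[key] = self._stats.get(key, start) + count

    def get_value(self, key, default=None):
        return self._stats.get(key, default)

    def get_stats(self):
        return dict(self._stats)


stats = StatsSketch()
stats.set_value('spider_name', 'example')
for _ in range(3):
    stats.inc_value('item_scraped_count')
print(stats.get_stats())  # {'spider_name': 'example', 'item_scraped_count': 3}
```

In a real spider the equivalent object is reached through `crawler.stats`, and the collected values are dumped when the spider closes.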
329 results in total