Apache Karaf Decanter 2.x - Documentation
… 2.2. Custom Collector · 2.2.1. Event Driven Collector · 2.2.2. Polled Collector … the kind of notification that you want. Apache Karaf Decanter provides Karaf features for each collector, appender, and alerter. The first thing to do is to add the Decanter features repository in Karaf: …
64 pages | 812.01 KB | 1 year ago
Apache Karaf Decanter 1.x - Documentation
1.4.2. Alerters · 2. Developer Guide · 2.1. Architecture · 2.2. Custom Collector · 2.2.1. Event Driven Collector · 2.2.2. Polled Collector · 2.3. Custom Appender · 2.4. Custom SLA Alerter · 1. User Guide · 1.1. … the kind of notification that you want. Apache Karaf Decanter provides Karaf features for each collector, appender, and SLA alerter. The first thing to do is to add the Decanter features repository in Karaf: … collectors harvest the monitoring data and send it to the Decanter appenders. Two kinds of collector are available: Event Driven Collectors react to events and "broadcast" the data to the appenders …
67 pages | 213.16 KB | 1 year ago
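The Decanter excerpts above distinguish event-driven collectors (which react to events and broadcast them) from polled collectors (which are invoked on a schedule). Decanter's real collectors are Java/OSGi components, so the following is only a language-neutral sketch of the two styles in Python — every class, function, and key name below is hypothetical, not Decanter's API:

```python
from typing import Callable, Dict, List

Event = Dict[str, object]           # one harvested monitoring record
Appender = Callable[[Event], None]  # anything that stores or forwards records

class EventDrivenCollector:
    """Reacts to events as they happen and broadcasts them to the appenders."""
    def __init__(self, appenders: List[Appender]) -> None:
        self.appenders = appenders

    def on_event(self, event: Event) -> None:
        for append in self.appenders:  # "broadcast" to every appender
            append(event)

class PolledCollector:
    """Harvests data only when a scheduler calls poll() on a fixed interval."""
    def __init__(self, harvest: Callable[[], Event], appenders: List[Appender]) -> None:
        self.harvest = harvest
        self.appenders = appenders

    def poll(self) -> None:
        event = self.harvest()
        for append in self.appenders:
            append(event)

# Usage: in the real system a scheduler would call polled.poll() periodically.
polled = PolledCollector(lambda: {"heap_used_mb": 42}, [print])
polled.poll()
```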
OpenShift Container Platform 4.6 Distributed Tracing
… 3.2.5.2. Distributed tracing default configuration options · 3.2.5.3. Jaeger Collector configuration options · 3.2.5.4. Distributed tracing sampling configuration options · 3.2.5.5. Distributed tracing storage configuration options · 3.2.5.5.1. Automatically provisioning an Elasticsearch instance · 3.2.5.5.2. Connecting to an existing … sidecar · 3.2.6.1. Automatically injecting sidecars · 3.2.6.2. Manually injecting sidecars · 3.3. Configuring and deploying distributed tracing data collection · 3.3.1. OpenTelemetry Collector configuration options · 3.3.2. Verifying the deployment · 3.3.3. Accessing the Jaeger console · 3.4. Upgrading distributed tracing · 3.4.1. Changing the Operator channel for 2.0 · 3.5. Removing distributed tracing … Technology Preview of the … release. 1.5.2. Red Hat OpenShift distributed tracing 2.2.0 Technology Preview: the OpenTelemetry Collector components included in the 2.1 release have been removed. 1.5.3. Red Hat OpenShift distributed tracing 2.1.0 Technology Preview: this release introduces a breaking change related to how … in the OpenTelemetry …
59 pages | 572.03 KB | 1 year ago
OpenShift Container Platform 4.14 Distributed Tracing
… forwarded by the exporter and stored in user-workload-monitoring. Supports Operator maturity Level IV, Deep Insights, which enables upgrading and monitoring of OpenTelemetry Collector instances and of the Red Hat build of OpenTelemetry Operator. Reports traces and metrics from remote clusters using OTLP or HTTP and HTTPS. … via resourcedetection … the OpenTelemetry project. Important: Jaeger does not use FIPS-validated cryptographic modules. 1.11.2. Technology Preview features: the OpenTelemetry Collector components included in the 2.1 release have been removed. 1.11.3. Bug fixes: this Red Hat OpenShift distributed tracing platform release addresses CVE-reported security vulnerabilities and bugs. … mode: deployment config: | exporters: jaeger: endpoint: jaeger-production-collector-headless.tracing-system.svc:14250 ca_file: "/var/run/secrets/kubernetes.io/serviceaccount/service-ca …
100 pages | 928.24 KB | 1 year ago
Scrapy 0.9 Documentation
… per spider. It's called the Stats Collector, and it's a singleton which can be imported and used quickly, as illustrated by the examples in the Common Stats Collector uses section below. The stats collection is enabled by default but can be disabled through the STATS_ENABLED setting. However, the Stats Collector is always available, so you can always import it in your module and use its API (to increment or … simplifying the stats collector usage: you should spend no more than one line of code for collecting stats in your spider, Scrapy extension, or whatever code you're using the Stats Collector from. Another feature …
204 pages | 447.68 KB | 1 year ago
Scrapy 0.9 Documentation
… per spider. It's called the Stats Collector, and it's a singleton which can be imported and used quickly, as illustrated by the examples in the Common Stats Collector uses section below. The stats collection is enabled by default but can be disabled through the STATS_ENABLED setting. However, the Stats Collector is always available, so you can always import it in your module and use its API (to increment or … simplifying the stats collector usage: you should spend no more than one line of code for collecting stats in your spider, Scrapy extension, or whatever code you're using the Stats Collector from. Another feature …
156 pages | 764.56 KB | 1 year ago
Dependency Injection in C++
… consolidation for DI: class Builder { public: virtual void build(const Tick& tick) const { Data info = collector_.getData(tick, bid_, ask_, localAsk_, localBid_); //... } protected: // Sides info std::optional<…> … : public Builder { public: virtual void build(const Tick& tick) const override { Data info = collector_.getData(tick, bid_, ask_, localAsk_, localBid_, bidBroker_, … std::optional<…> askYield_; }; … Data structure/parameter consolidation for DI: class Collector { public: // Basic Data … Data getData(const Tick&, const std::optional<…>& bid, const std::optional<…>& …
106 pages | 1.76 MB | 6 months ago
Scrapy 0.12 Documentation
… per spider. It's called the Stats Collector, and it's a singleton which can be imported and used quickly, as illustrated by the examples in the Common Stats Collector uses section below. The stats collection is enabled by default but can be disabled through the STATS_ENABLED setting. However, the Stats Collector is always available, so you can always import it in your module and use its API (to increment or … simplifying the stats collector usage: you should spend no more than one line of code for collecting stats in your spider, Scrapy extension, or whatever code you're using the Stats Collector from. Another feature …
177 pages | 806.90 KB | 1 year ago
Scrapy 0.12 Documentation
… per spider. It's called the Stats Collector, and it's a singleton which can be imported and used quickly, as illustrated by the examples in the Common Stats Collector uses section below. The stats collection is enabled by default but can be disabled through the STATS_ENABLED setting. However, the Stats Collector is always available, so you can always import it in your module and use its API (to increment or … simplifying the stats collector usage: you should spend no more than one line of code for collecting stats in your spider, Scrapy extension, or whatever code you're using the Stats Collector from. Another feature …
228 pages | 462.54 KB | 1 year ago
Scrapy 0.16 Documentation
… Stats Collector, and can be accessed through the stats attribute of the Crawler API, as illustrated by the examples in the Common Stats Collector uses section below. However, the Stats Collector is always … simplifying the stats collector usage: you should spend no more than one line of code for collecting stats in your spider, Scrapy extension, or whatever code you're using the Stats Collector from. Another feature of the Stats Collector is that it's very efficient (when enabled) and extremely efficient (almost unnoticeable) when disabled. The Stats Collector keeps a stats table per open spider which is …
203 pages | 931.99 KB | 1 year ago
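Several of the Scrapy entries above stress that collecting a stat should cost one line of code, reached through the stats attribute of the Crawler API. A minimal sketch of that usage in a spider, assuming a current Scrapy install — the spider name, URL, and stat keys are hypothetical, while inc_value/set_value are Scrapy stats-collector methods:

```python
import scrapy

class StatsDemoSpider(scrapy.Spider):
    """Hypothetical spider showing one-line stats collection."""
    name = "stats_demo"
    start_urls = ["http://quotes.toscrape.com/"]

    def parse(self, response):
        # One line per stat, via the crawler's `stats` attribute.
        self.crawler.stats.inc_value("demo/pages_crawled")
        self.crawler.stats.set_value("demo/last_url", response.url)
        yield {"url": response.url}
```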
668 results in total
- 1
- 2
- 3
- 4
- 5
- 6
- 67













