IT文库

Category

All · Cloud Computing & Big Data (11) · Apache Flink (11)

Language

All · English (10) · Chinese, Simplified (1)

Format

All · PDF (11)

This search took 0.017 seconds and found about 11 matching results.
  • PDF document: Streaming in Apache Flink

    Excerpt: Flink provides tuple types Tuple1 through Tuple25. A POJO (plain old Java object) is any Java class that has an empty default constructor and whose fields are either public or have default getters and setters. Tuple fields are accessed by zero-based index (a hedged Java sketch follows below):
        Tuple2<String, Integer> person = new Tuple2<>("Fred", 35);
        String name = person.f0;   // zero-based index!
        Integer age = person.f1;

        public class Person {
            public String name;
            public Integer age;
            public Person() {}
        }
    The excerpt also mentions the total fare collected, Lab 1 -- Ride Cleansing, and transforming data:
        public static class EnrichedRide extends TaxiRide {
            public int startCell;
            public int endCell;
            public EnrichedRide() {}
        }
    0 credits | 45 pages | 3.00 MB | 1 year ago
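    As a hedged illustration of the Tuple and POJO types described in this excerpt, a minimal Flink Java sketch; the stream contents and the class name TupleAndPojoExample are invented for illustration and are not taken from the PDF:
        import org.apache.flink.api.common.typeinfo.Types;
        import org.apache.flink.api.java.tuple.Tuple2;
        import org.apache.flink.streaming.api.datastream.DataStream;
        import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

        public class TupleAndPojoExample {

            // POJO as described in the excerpt: empty default constructor, public fields.
            public static class Person {
                public String name;
                public Integer age;
                public Person() {}
                public Person(String name, Integer age) { this.name = name; this.age = age; }
            }

            public static void main(String[] args) throws Exception {
                StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

                DataStream<Person> people = env.fromElements(
                        new Person("Fred", 35), new Person("Wilma", 35));

                // Map each POJO to a Tuple2; tuple fields use zero-based indexes f0, f1.
                DataStream<Tuple2<String, Integer>> pairs = people
                        .map(p -> Tuple2.of(p.name, p.age))
                        .returns(Types.TUPLE(Types.STRING, Types.INT));

                pairs.print();
                env.execute("Tuple and POJO example");
            }
        }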
  • PDF document: Introduction to Apache Flink and Apache Kafka - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Excerpt (DataStream API Basics, "Example: Sensor Readings"; the same Scala skeleton is repeated across several slides, and a hedged Java sketch follows below):
        case class Reading(id: String, time: Long, temp: Double)

        object MaxSensorReadings {
          def main(args: Array[String]) = {
            // body truncated in this preview; the slides compute a max temperature reading
          }
        }
    0 credits | 26 pages | 3.33 MB | 1 year ago
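    The Scala skeleton above only hints at the job. As a hedged sketch, an equivalent per-sensor maximum in the Java DataStream API could look like this; the sample readings and the Reading POJO are invented for illustration:
        import org.apache.flink.streaming.api.datastream.DataStream;
        import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

        public class MaxSensorReadings {

            // Java POJO mirroring the Scala case class Reading(id, time, temp).
            public static class Reading {
                public String id;
                public long time;
                public double temp;
                public Reading() {}
                public Reading(String id, long time, double temp) {
                    this.id = id; this.time = time; this.temp = temp;
                }
            }

            public static void main(String[] args) throws Exception {
                StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

                DataStream<Reading> readings = env.fromElements(
                        new Reading("sensor_1", 1_000L, 21.5),
                        new Reading("sensor_1", 2_000L, 23.1),
                        new Reading("sensor_2", 1_500L, 19.8));

                readings
                        .keyBy(r -> r.id)   // one logical partition per sensor
                        .max("temp")        // rolling maximum temperature per sensor
                        .print();

                env.execute("Max Sensor Readings");
            }
        }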
  • PDF document: State management - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Excerpt (Registering state): the data types handled by the state are specified as Class or TypeInformation objects. State is registered in the operator class (here a FlatMap), in the open() method, where a name is assigned and the state handle is obtained (a hedged sketch of this pattern follows below). The slides show a Scala class TemperatureAlertFunction(val threshold: Double) extending a rich function, and a Java example:
        ... .flatMap(new MatchFunction());

        public static class EnrichmentFunction extends RichCoFlatMapFunction<...> { ... }
    0 credits | 24 pages | 914.13 KB | 1 year ago
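    A hedged Java sketch of the registration pattern the excerpt describes (name and type in a ValueStateDescriptor, handle obtained in open()); the alerting logic and the tuple field layout are assumptions for illustration, not taken from the slides:
        import org.apache.flink.api.common.functions.RichFlatMapFunction;
        import org.apache.flink.api.common.state.ValueState;
        import org.apache.flink.api.common.state.ValueStateDescriptor;
        import org.apache.flink.api.common.typeinfo.Types;
        import org.apache.flink.api.java.tuple.Tuple3;
        import org.apache.flink.configuration.Configuration;
        import org.apache.flink.util.Collector;

        // Keyed by sensor id; emits (id, temperature, delta) when the temperature
        // changes by more than `threshold` compared to the previous reading.
        public class TemperatureAlertFunction
                extends RichFlatMapFunction<Tuple3<String, Long, Double>, Tuple3<String, Double, Double>> {

            private final double threshold;
            private transient ValueState<Double> lastTemp;

            public TemperatureAlertFunction(double threshold) {
                this.threshold = threshold;
            }

            @Override
            public void open(Configuration parameters) {
                // Assign a name and a type, then obtain the state handle.
                lastTemp = getRuntimeContext().getState(
                        new ValueStateDescriptor<>("lastTemp", Types.DOUBLE));
            }

            @Override
            public void flatMap(Tuple3<String, Long, Double> reading,
                                Collector<Tuple3<String, Double, Double>> out) throws Exception {
                Double prev = lastTemp.value();
                double temp = reading.f2;
                if (prev != null && Math.abs(temp - prev) > threshold) {
                    out.collect(Tuple3.of(reading.f0, temp, temp - prev));
                }
                lastTemp.update(temp);
            }
        }
    It would typically be applied to a keyed stream, e.g. readings.keyBy(r -> r.f0).flatMap(new TemperatureAlertFunction(1.7)).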
  • PDF document: PyFlink 1.15 Documentation

    Excerpt (DataStream API):
        from pyflink.common import Row
        from pyflink.datastream import FlatMapFunction

        class MyFlatMapFunction(FlatMapFunction):
            def flat_map(self, value):
                for s in str(value.data).split('|'):
                    ...  # body truncated in this preview
    The rest of the excerpt is a JVM stack trace (scala.collection.Iterator / IterableLike / TraversableLike frames) from a troubleshooting section.
    0 credits | 36 pages | 266.77 KB | 1 year ago
  • PDF document: PyFlink 1.16 Documentation

    Excerpt (DataStream API): the same MyFlatMapFunction example and JVM stack-trace excerpt as in the PyFlink 1.15 entry above.
    0 credits | 36 pages | 266.80 KB | 1 year ago
  • PDF document: Windows and triggers - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Excerpt: computing the average temperature per sensor with an AggregateFunction whose accumulator holds the sum of temperatures and an event count (a hedged Java sketch of the same idea follows below):
        // The accumulator holds the sum of temperatures and an event count.
        class AvgTempFunction extends AggregateFunction[(String, Double), (String, Double, Int), (String, Double)]
    The excerpt also shows the abstract ProcessWindowFunction class, whose evaluation method takes a KEY, a Context exposing the window metadata, an Iterable of input values, and a Collector for output.
    0 credits | 35 pages | 444.84 KB | 1 year ago
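    A hedged Java counterpart to the Scala AvgTempFunction in the excerpt; the tuple-based input, accumulator, and output types are my choice, not the slides':
        import org.apache.flink.api.common.functions.AggregateFunction;
        import org.apache.flink.api.java.tuple.Tuple2;
        import org.apache.flink.api.java.tuple.Tuple3;

        // IN: (sensorId, temperature), ACC: (sensorId, sum, count), OUT: (sensorId, average)
        public class AvgTempFunction implements
                AggregateFunction<Tuple2<String, Double>, Tuple3<String, Double, Integer>, Tuple2<String, Double>> {

            @Override
            public Tuple3<String, Double, Integer> createAccumulator() {
                return Tuple3.of("", 0.0, 0);
            }

            @Override
            public Tuple3<String, Double, Integer> add(Tuple2<String, Double> in,
                                                       Tuple3<String, Double, Integer> acc) {
                return Tuple3.of(in.f0, acc.f1 + in.f1, acc.f2 + 1);
            }

            @Override
            public Tuple2<String, Double> getResult(Tuple3<String, Double, Integer> acc) {
                return Tuple2.of(acc.f0, acc.f1 / acc.f2);
            }

            @Override
            public Tuple3<String, Double, Integer> merge(Tuple3<String, Double, Integer> a,
                                                         Tuple3<String, Double, Integer> b) {
                return Tuple3.of(a.f0, a.f1 + b.f1, a.f2 + b.f2);
            }
        }
    It would typically be supplied to a windowed stream via .aggregate(new AvgTempFunction()).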
  • PDF document: Course introduction - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Excerpt: course logistics. Announcements, updates, and discussions; website vasia.github.io/dspa20 (syllabus at /syllabus.html, class schedule and today's slides at /lectures.html); Piazza at piazza.com/bu/spring2020/cs591k1/home. Grading scheme: no exam; 5 in-class quizzes (10%, each contributing 2% to the final grade); 3 hands-on assignments (40%). The schedule page marks deadlines, days with no class, guest lectures, and quizzes and announcements.
    0 credits | 34 pages | 2.53 MB | 1 year ago
  • PDF document: Scalable Stream Processing - Spark Streaming and Flink

    Excerpt (Input Operations - Custom Sources): to create a custom source, extend the Receiver class, implement onStart() and onStop(), and call store(data) to store received data inside Spark, e.g. class CustomReceiver(host: String, ...) (a hedged Java sketch of this receiver pattern follows below). Under Basic Operations, most operations on DataFrame/Dataset are supported for streaming:
        case class Call(action: String, time: Timestamp, id: Int)
        val df: DataFrame = spark.readStream.json("s3://logs")
    0 credits | 113 pages | 1.22 MB | 1 year ago
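    A hedged Java sketch of the custom-receiver pattern summarized in the excerpt (extend Receiver, implement onStart() and onStop(), call store()); the socket-reading details follow Spark's documented Receiver API and are not taken from this PDF:
        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.net.Socket;
        import java.nio.charset.StandardCharsets;
        import org.apache.spark.storage.StorageLevel;
        import org.apache.spark.streaming.receiver.Receiver;

        // Receives newline-delimited text from a socket and hands each line to Spark via store().
        public class CustomReceiver extends Receiver<String> {
            private final String host;
            private final int port;

            public CustomReceiver(String host, int port) {
                super(StorageLevel.MEMORY_AND_DISK_2());
                this.host = host;
                this.port = port;
            }

            @Override
            public void onStart() {
                // Start a thread that connects and reads data; onStart() must not block.
                new Thread(this::receive).start();
            }

            @Override
            public void onStop() {
                // Nothing to do here: receive() checks isStopped() and closes its socket itself.
            }

            private void receive() {
                try (Socket socket = new Socket(host, port);
                     BufferedReader reader = new BufferedReader(
                             new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8))) {
                    String line;
                    while (!isStopped() && (line = reader.readLine()) != null) {
                        store(line);   // hand the received record to Spark
                    }
                    restart("Trying to connect again");
                } catch (Exception e) {
                    restart("Error receiving data", e);
                }
            }
        }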
  • PDF document: Notions of time and progress - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Excerpt (Watermarks in Flink): the periodic watermark interval is configured with setAutoWatermarkInterval(5000). The slides show Scala assigners for both periodic and punctuated watermarks (a hedged Java sketch follows below):
        class PeriodicAssigner extends AssignerWithPeriodicWatermarks[Reading] {
          val bound: Long = 60 * 1000
          // ... extractTimestamp returns the record timestamp (r.timestamp)
        }

        class PunctuatedAssigner extends AssignerWithPunctuatedWatermarks[Reading] {
          val bound: Long = 60 * 1000
          // ...
        }
    0 credits | 22 pages | 2.22 MB | 1 year ago
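    A hedged Java version of the periodic assigner pattern in the excerpt; the Reading type is a stand-in, and note that newer Flink releases deprecate this interface in favour of WatermarkStrategy:
        import org.apache.flink.streaming.api.functions.AssignerWithPeriodicWatermarks;
        import org.apache.flink.streaming.api.watermark.Watermark;

        // Stand-in for the Reading type used on the slides (would normally live in its own file).
        class Reading {
            public String id;
            public long timestamp;
            public double temp;
        }

        // Emits watermarks that trail the highest timestamp seen so far by a fixed bound.
        public class PeriodicAssigner implements AssignerWithPeriodicWatermarks<Reading> {

            private final long bound = 60 * 1000;            // tolerated out-of-orderness (ms)
            private long maxTs = Long.MIN_VALUE + bound;     // highest timestamp seen so far

            @Override
            public Watermark getCurrentWatermark() {
                // Called periodically, at the interval set via setAutoWatermarkInterval(5000).
                return new Watermark(maxTs - bound);
            }

            @Override
            public long extractTimestamp(Reading r, long previousElementTimestamp) {
                maxTs = Math.max(maxTs, r.timestamp);
                return r.timestamp;                          // return the record timestamp
            }
        }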
  • PDF document: Stream processing fundamentals - CS 591 K1: Data Stream Processing and Analytics Spring 2020

    Excerpt: punctuations (the window fires, the post becomes inactive), followed by the same sensor-reading skeleton as in the DataStream API slides:
        case class Reading(id: String, time: Long, temp: Double)

        object MaxSensorReadings {
          def main(args: Array[String]) ...
        }
    0 credits | 45 pages | 1.22 MB | 1 year ago
11 results in total, shown across 2 pages.