Big Data (9d): Flink Transform Operators


    Environment

    Development environment: Windows 10 + IDEA

    pom.xml

    
    <properties>
        <maven.compiler.source>8</maven.compiler.source>
        <maven.compiler.target>8</maven.compiler.target>
        <flink.version>1.14.6</flink.version>
        <scala.binary.version>2.12</scala.binary.version>
        <slf4j.version>2.0.3</slf4j.version>
        <log4j.version>2.17.2</log4j.version>
        <fastjson.version>2.0.19</fastjson.version>
        <lombok.version>1.18.24</lombok.version>
    </properties>

    <dependencies>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-runtime-web_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>${slf4j.version}</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>${slf4j.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-to-slf4j</artifactId>
            <version>${log4j.version}</version>
        </dependency>
    </dependencies>
    

    log4j.properties

    log4j.rootLogger=error, stdout
    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    log4j.appender.stdout.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n
    

    Java template

    import org.apache.flink.streaming.api.datastream.DataStreamSource;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    
    public class Hello {
        public static void main(String[] args) throws Exception {
            //create the stream execution environment
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            //set the parallelism
            env.setParallelism(1);
            //create a streaming source
            DataStreamSource<Long> d = env.fromElements(1L, 2L, 3L, 4L);
            //--------------------------------- Transform ----------------------------------------
            d.print();
            //--------------------------------- Transform ----------------------------------------
            env.execute();
        }
    }
    
    print output
    1
    2
    3
    4

    Transform

    • Meaning: transformation operator
    • Function: turns one or more DataStreams into a new DataStream

    map

    • Consumes 1 element, produces 1 element
    d.map(s -> s + 1L).print();
    
    print output
    2
    3
    4
    5

    flatMap

    • Consumes 1 element, produces zero or more elements
    • When using a lambda expression, generic type erasure requires specifying the output type with returns
    import org.apache.flink.api.common.functions.FlatMapFunction;
    import org.apache.flink.api.common.typeinfo.Types;
    
    d
            .flatMap((FlatMapFunction<Long, Long>) (value, out) -> {
                out.collect(value * value);
                out.collect(-value * value);
            })
            .returns(Types.LONG)
            .print();
    
    print output
    1
    -1
    4
    -4
    9
    -9
    16
    -16

    filter

    • Defines a predicate: elements for which it returns true are kept, elements for which it returns false are dropped
    d.filter(i -> (i % 2 == 0)).print();
    
    print output
    2
    4

    union

    • Merges multiple streams into one stream
    • All streams must have the same element type
    //create 3 streams
    DataStreamSource<Integer> d1 = env.fromElements(1);
    DataStreamSource<Integer> d2 = env.fromElements(2, 2);
    DataStreamSource<Integer> d3 = env.fromElements(3, 3, 3);
    //union the 3 streams
    d1.union(d2).union(d3).print();
    
    One possible print output (the order is not deterministic)
    2
    2
    3
    3
    3
    1

    connect

    • Puts two streams into one stream; inside it the two streams remain independent
    • The two streams may hold different data types
    import org.apache.flink.streaming.api.datastream.ConnectedStreams;
    import org.apache.flink.streaming.api.datastream.DataStream;
    
    //create 2 streams
    DataStreamSource<Integer> d1 = env.fromElements(1, 2, 3, 4);
    DataStreamSource<String> d2 = env.fromElements("a", "b", "c");
    //connect the 2 streams
    ConnectedStreams<Integer, String> dd = d1.connect(d2);
    //retrieve each of the 2 streams
    DataStream<Integer> s1 = dd.getFirstInput();
    DataStream<String> s2 = dd.getSecondInput();
    //print
    s1.print("first");
    s2.print("second");
    
    print output
    second> a
    first> 1
    second> b
    first> 2
    second> c
    first> 3
    first> 4
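    In a real job, a ConnectedStreams is usually not split apart again but merged with map and a CoMapFunction, which supplies one method per input type (map1 for the first stream, map2 for the second). The per-input dispatch can be sketched in plain Java without Flink; the CoMap interface below is a hypothetical stand-in for Flink's CoMapFunction:

```java
import java.util.ArrayList;
import java.util.List;

public class CoMapSketch {
    // Hypothetical stand-in for Flink's CoMapFunction<IN1, IN2, OUT>
    interface CoMap<IN1, IN2, OUT> {
        OUT map1(IN1 value); // applied to elements of the first stream
        OUT map2(IN2 value); // applied to elements of the second stream
    }

    // Apply map1/map2 to the two inputs; a real job interleaves the calls
    static <IN1, IN2, OUT> List<OUT> run(List<IN1> first, List<IN2> second,
                                         CoMap<IN1, IN2, OUT> f) {
        List<OUT> out = new ArrayList<>();
        for (IN1 v : first) out.add(f.map1(v));
        for (IN2 v : second) out.add(f.map2(v));
        return out;
    }

    public static void main(String[] args) {
        // Merge an Integer stream and a String stream into one String stream
        List<String> merged = run(
                List.of(1, 2, 3, 4), List.of("a", "b", "c"),
                new CoMap<Integer, String, String>() {
                    public String map1(Integer value) { return "first> " + value; }
                    public String map2(String value) { return "second> " + value; }
                });
        merged.forEach(System.out::println);
    }
}
```

    The point of connect over union is exactly this: because the two inputs keep their own types, the merge function can treat them differently.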

    keyBy

    • Partitions the elements of the stream by key
    • DataStream => KeyedStream
    • After keyBy, methods such as reduce, sum, max, and min become available
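    The contract keyBy gives downstream operators is that all elements with the same key are processed together, in arrival order. That grouping can be illustrated in plain Java (no Flink; the % 2 key selector matches the examples that follow):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class KeyBySketch {
    // Group elements by key, preserving arrival order within each key
    static Map<Integer, List<Integer>> keyBy(List<Integer> stream) {
        Map<Integer, List<Integer>> partitions = new LinkedHashMap<>();
        for (int v : stream) {
            int key = v % 2; // key selector: odd -> 1, even -> 0
            partitions.computeIfAbsent(key, k -> new ArrayList<>()).add(v);
        }
        return partitions;
    }

    public static void main(String[] args) {
        // prints {1=[1, 3, 5], 0=[2, 4]}
        System.out.println(keyBy(List.of(1, 2, 3, 4, 5)));
    }
}
```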

    reduce

    import org.apache.flink.api.common.functions.ReduceFunction;
    import org.apache.flink.streaming.api.datastream.DataStreamSource;
    import org.apache.flink.streaming.api.datastream.KeyedStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    
    public class Hi {
        public static void main(String[] args) throws Exception {
            //create the stream execution environment
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            //set the parallelism
            env.setParallelism(1);
            //create a streaming source
            DataStreamSource<String> d = env.fromElements("1", "4", "5", "2", "3");
            //partition by key
            KeyedStream<String, Integer> k = d.keyBy(i -> (Integer.parseInt(i) % 2));
            //reduce
            k.reduce((ReduceFunction<String>) (value1, value2) -> value1 + "," + value2).print();
            //execute
            env.execute();
        }
    }
    
    print output
    1
    4
    1,5
    4,2
    1,5,3
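    The reduce here is a rolling one: Flink keeps one accumulated value per key and emits the updated value after every input element, which is why the output above interleaves the two keys. A plain-Java simulation (no Flink; same key selector and reduce function) reproduces the sequence:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ReduceSketch {
    // Emit the per-key accumulated value after each input element
    static List<String> rollingReduce(List<String> stream) {
        Map<Integer, String> acc = new HashMap<>();
        List<String> emitted = new ArrayList<>();
        for (String value : stream) {
            int key = Integer.parseInt(value) % 2;            // key selector
            acc.merge(key, value, (v1, v2) -> v1 + "," + v2); // reduce function
            emitted.add(acc.get(key));
        }
        return emitted;
    }

    public static void main(String[] args) {
        // One emitted line per input element: 1, 4, "1,5", "4,2", "1,5,3"
        rollingReduce(List.of("1", "4", "5", "2", "3")).forEach(System.out::println);
    }
}
```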

    sum, max, min

    import org.apache.flink.streaming.api.datastream.DataStreamSource;
    import org.apache.flink.streaming.api.datastream.KeyedStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    
    public class Hi {
        public static void main(String[] args) throws Exception {
            //create the stream execution environment
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            //set the parallelism
            env.setParallelism(1);
            //create a streaming source
            DataStreamSource<Integer> d = env.fromElements(1, 2, 3, 4, 5);
            //partition by key
            KeyedStream<Integer, Integer> k = d.keyBy(i -> (i % 2));
            //aggregate
            k.sum(0).print("sum");
            k.max(0).print("max");
            k.min(0).print("min");
            //execute
            env.execute();
        }
    }
    
    sum output
    sum> 1
    sum> 2
    sum> 4 (1+3)
    sum> 6 (2+4)
    sum> 9 (1+3+5)
    max output
    max> 1
    max> 2
    max> 3 (of 1, 3)
    max> 4 (of 2, 4)
    max> 5 (of 1, 3, 5)
    min output
    min> 1
    min> 2
    min> 1 (of 1, 3)
    min> 2 (of 2, 4)
    min> 1 (of 1, 3, 5)
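    sum, max, and min are rolling aggregations in the same sense as reduce: one stored value per key, with the key's updated value emitted after every element. A plain-Java simulation of the sum case (no Flink) reproduces the sum sequence above:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SumSketch {
    // Rolling per-key sum: emit the key's running total after each element
    static List<Integer> rollingSum(List<Integer> stream) {
        Map<Integer, Integer> totals = new HashMap<>();
        List<Integer> emitted = new ArrayList<>();
        for (int v : stream) {
            int key = v % 2;                  // key selector
            totals.merge(key, v, Integer::sum);
            emitted.add(totals.get(key));
        }
        return emitted;
    }

    public static void main(String[] args) {
        // prints [1, 2, 4, 6, 9]
        System.out.println(rollingSum(List.of(1, 2, 3, 4, 5)));
    }
}
```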

    process

    • The low-level operator

    ProcessFunction

    • ProcessFunction can be used without a preceding keyBy
    import org.apache.flink.streaming.api.functions.ProcessFunction;
    import org.apache.flink.util.Collector;
    
    d.process(new ProcessFunction<Long, String>() {
        @Override
        public void processElement(Long value, Context ctx, Collector<String> out) {
            out.collect(value + "L");
        }
    }).print();
    
    print output
    1L
    2L
    3L
    4L

    KeyedProcessFunction

    • KeyedProcessFunction must be used after keyBy
    import org.apache.flink.streaming.api.datastream.KeyedStream;
    import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
    import org.apache.flink.util.Collector;
    
    KeyedStream<Long, Integer> k = d.keyBy(i -> (int) (i % 2));
    k.process(new KeyedProcessFunction<Integer, Long, String>() {
        @Override
        public void processElement(Long value, Context ctx, Collector<String> out) {
            System.out.println("current key: " + ctx.getCurrentKey());
            out.collect(value + "L");
        }
    }).print("out");

    Output
    current key: 1
    out> 1L
    current key: 0
    out> 2L
    current key: 1
    out> 3L
    current key: 0
    out> 4L
  • Original article: https://blog.csdn.net/Yellow_python/article/details/127926400